iotindiana
Self-learning sensor chips won’t need networks


Tiny, intelligent microelectronics should perform as much sensor processing as possible on-chip, rather than wasting resources by sending often unneeded, duplicated raw data to the cloud or to computers. So say the scientists behind new machine-learning networks that aim to embed everything needed for artificial intelligence (AI) onto a processor.

“This opens the door for many new applications, starting from real-time evaluation of sensor data,” says the Fraunhofer Institute for Microelectronic Circuits and Systems on its website. With no delay from sending unnecessary data onwards, and with speedy processing, latency is theoretically zero.

Plus, on-microprocessor self-learning means the embedded, or sensor, devices can self-calibrate. They can even be “completely reconfigured to perform a totally different task afterwards,” the institute says. “An embedded system with different tasks is possible.”

Much internet of things (IoT) data sent through networks is redundant and wastes resources: a temperature reading taken every 10 minutes, say, when the ambient temperature hasn’t changed, is one example. In fact, one only needs to know when the temperature has changed, and perhaps only when it has crossed a threshold.
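The idea can be sketched in a few lines: transmit a reading only when it differs meaningfully from the last value reported, and drop everything else at the source. This is an illustrative sketch, not Fraunhofer's code; the function name and threshold are hypothetical.

```python
# Illustrative sketch (not Fraunhofer's code): report a sensor reading only
# when it moves past a change threshold, instead of streaming every raw sample.

def filter_readings(samples, threshold=0.5):
    """Return only the samples that differ from the last reported
    value by more than `threshold`; everything else is redundant."""
    reported = []
    last = None
    for value in samples:
        if last is None or abs(value - last) > threshold:
            reported.append(value)
            last = value
    return reported

# Six 10-minute temperature samples, but only three are worth transmitting.
print(filter_readings([21.0, 21.1, 21.0, 22.9, 22.8, 25.0]))
# [21.0, 22.9, 25.0]
```

A real deployment would also report on a heartbeat interval so a silent sensor can be distinguished from a dead one, but the bandwidth saving comes from the same change-detection idea.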

Neural network-on-sensor chip

The commercial German research organization says it’s developing a specific RISC-V microprocessor with a special hardware accelerator designed for a brain-copying, artificial neural network (ANN) it has developed. The architecture could ultimately be suitable for the condition-monitoring or predictive sensors of the kind we will likely see more of in the industrial internet of things (IIoT).

Key to Fraunhofer IMS’s Artificial Intelligence for Embedded Systems (AIfES) is that the self-learning takes place at chip level rather than in the cloud or on a computer, and that it is independent of “connectivity towards a cloud or a powerful and resource-hungry processing entity.” But it still offers a “full AI mechanism, like independent learning,” the institute says.
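To make "learning at chip level" concrete, here is a minimal sketch of on-device training, assuming nothing from the AIfES API: a single perceptron learns an alert threshold from locally gathered samples, using only arithmetic a small microcontroller could run. All names and data here are hypothetical.

```python
# Minimal on-device learning sketch (plain Python, no AIfES API assumed):
# a one-input perceptron learns a threshold rule from local samples,
# so the model adapts in place without shipping raw data anywhere.

def train_perceptron(data, epochs=20, lr=0.1):
    """data: list of (feature, label) pairs with label in {0, 1}."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:
            pred = 1 if w * x + b > 0 else 0
            err = y - pred              # perceptron update rule
            w += lr * err * x
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if w * x + b > 0 else 0

# Learn "alert when the reading is high" from six local observations.
samples = [(1.0, 0), (2.0, 0), (3.0, 0), (6.0, 1), (7.0, 1), (8.0, 1)]
w, b = train_perceptron(samples)
print(predict(w, b, 2.0), predict(w, b, 7.0))
```

The point of the sketch is the resource profile, not the model: training touches two floats and needs no connectivity, which is what makes "independent learning" plausible on a sensor-class processor.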

It’s “decentralized AI,” says Fraunhofer IMS. “It’s not focused towards big-data processing.”

Indeed, with these kinds of systems, no connection is actually required for the raw data, just for the post-analytical results, if indeed needed. Swarming can even replace that. Swarming lets sensors talk to one another, sharing relevant information without even getting a host network involved.

“It is possible to build a network from small and adaptive systems that share tasks among themselves,” Fraunhofer IMS says.
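The swarming idea above can be simulated in miniature: each node analyzes its own raw samples locally and hands only the compact result to its peers, so no host network ever sees the raw data. The class and field names below are illustrative, not from any Fraunhofer code.

```python
# Hypothetical "swarming" sketch: nodes exchange post-analysis results
# directly with peers; raw samples never leave the node that took them.

class SensorNode:
    def __init__(self, name):
        self.name = name
        self.peers = []
        self.inbox = []              # results received from peer nodes

    def link(self, other):
        """Create a bidirectional peer link, with no host network involved."""
        self.peers.append(other)
        other.peers.append(self)

    def analyze_and_share(self, raw_samples):
        # On-chip analysis reduces the raw data to one small result...
        result = {"node": self.name, "mean": sum(raw_samples) / len(raw_samples)}
        for peer in self.peers:      # ...and only that result is shared.
            peer.inbox.append(result)
        return result

a, b = SensorNode("a"), SensorNode("b")
a.link(b)
a.analyze_and_share([20.0, 22.0, 24.0])
print(b.inbox)   # [{'node': 'a', 'mean': 22.0}] -- a summary, not raw samples
```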

Decentralized neural networks have other benefits, too: they can be more secure than the cloud. Because all processing takes place on the microprocessor, “no sensitive data needs to be transferred,” Fraunhofer IMS explains.

Other edge computing research

The Fraunhofer researchers aren’t the only academics who believe entire networks could become redundant with neuristor-based, brain-like AI chips. Binghamton University and Georgia Tech are working together on similar edge-oriented tech.

“The idea is we want to have these chips that can do all the functioning in the chip, rather than messages back and forth with some sort of large server,” Binghamton said on its website when I wrote about the university’s work last year.

One of the advantages of forgoing a major communications link: not only do you avoid worrying about internet resilience, you also save the energy it takes to maintain the link. Energy efficiency is an ambition in the sensor world: replacing batteries is time-consuming, expensive and, in the case of remote locations, sometimes extremely difficult.

Nor does memory or storage have to be provided for swaths of raw data awaiting transfer to a data center or similar: the data has already been processed at the source, so it can be discarded.

Join the Network World communities on Facebook and LinkedIn to comment on topics that are top of mind.
About Us

Advanced IoT information site of Indiana, USA

© 2024 iotindiana.com.
