With the shift to edge architectures underway, embedded sensors are drawing growing attention in IoT.
Much of what comes together in the Internet of Things depends on the first node in the network, which is most often a sensor. The transformational analysis and automation of Industry 4.0 requires even better input, which has kept vendors busy trying to bring smarter, integrated sensors to market.
The role of smart sensors is under closer scrutiny now, as an industry-wide architectural shift moves more processing to the edge and reduces reliance on "cloud-only" processing. As sensors proliferate, it becomes impractical to manage all of their data in a remote cloud, so the sensors and the edge nodes in which they reside have to become smarter.
This is one of the main drivers behind Allied Market Research's projection that the global IoT sensor market will grow from $12.37 billion to $141.80 billion by 2030, a CAGR of 28.1% over that period.
As in the past, today's on-board sensors must measure temperature, humidity, pressure, proximity and a wide variety of other phenomena. Accelerometers, magnetometers and other miniature devices have come to form complex sensor fusions that merge disparate data sources.
But the options now include cutting-edge AI devices that handle machine learning at the source while using significantly less electrical power than cloud-based processing alternatives. Although still nascent, lidar and radio waves are also being used for IoT sensor measurements.
The world of sensors is expanding. At the same time, the limits of what is possible keep receding:
- Ruggedized IoT sensors connected to IoT gateways measure and monitor wine-grape crops in California's Napa Valley, as part of Cisco Systems' Industrial Asset Vision platform.
- Bosch Sensortec sensors with on-board AI act as “digital noses” to detect gases, particles and – a matter of growing concern – airborne viruses.
- Suffolk, Virginia, is reinventing its road services using Iteris' ClearMobility platform, which in turn uses intelligent Vantage Apex sensors that combine high-definition video and four-dimensional radar with integrated AI algorithms.
- NevadaNano’s MPS Mini combines an array of on-chip chemical sensors with an on-chip molecular properties spectrometer to detect combustible gases.
Smart sensors are starting to learn
"A typical smart sensor has four main sections – the sensor itself, an analog-to-digital conversion function, a computing unit – or microcontroller – and a communication engine, which today can be wireless or wired," according to Raymond Yin, director of technical content at Mouser Electronics and host of Mouser's "The Tech Between Us" podcast. There are many variations, he cautioned.
For example, many smart sensors include more than one type of individual sensor suited to a specific application. There are also variations in the role of the integrated microcontroller unit (MCU). Some built-in MCUs are simply state machines that control the data-conversion process and communication, while others run full sensor fusion algorithms, he said.
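Yin's four-section anatomy can be sketched in software. The sketch below is purely illustrative – the function names, reference voltage and threshold are invented, not any vendor's API – and the MCU stage is the bare state-machine variety Yin describes:

```python
# Hypothetical model of the four smart-sensor sections:
# sensing element -> A/D conversion -> MCU logic -> communication engine.
# All names and figures here are illustrative, not a real device's API.

def sense() -> float:
    """Sensing element: return a raw analog value (a fixed sample for the demo)."""
    return 2.5  # volts, illustrative

def adc(analog: float, vref: float = 3.3, bits: int = 12) -> int:
    """Analog-to-digital conversion: map 0..vref onto a 12-bit code."""
    code = round(analog / vref * ((1 << bits) - 1))
    return max(0, min(code, (1 << bits) - 1))

def mcu(code: int, threshold: int = 2000) -> dict:
    """Minimal state-machine MCU: flag whether the reading crosses a threshold."""
    return {"code": code, "event": code >= threshold}

def communicate(message: dict) -> str:
    """Communication-engine stand-in: serialize for a wired or wireless link."""
    return f"code={message['code']} event={message['event']}"

packet = communicate(mcu(adc(sense())))
```

A fusion-capable MCU would replace the threshold check with an algorithm combining several such channels; the pipeline shape stays the same.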
As an example, Yin cited the LSM6DSO32XTR iNEMO inertial module from STMicroelectronics, which integrates accelerometers with a gyroscope and temperature sensor, and includes a machine learning core that can detect activities such as walking, running and driving.
Conserving energy with purpose
Manuel Tagliavini, senior analyst covering MEMS and sensors at Omdia, said a smart sensor can be defined as an electronic component that is not only capable of reading and storing physical measurements – such as acceleration, light, flow and humidity – but is also able to perform more complex operations that serve different purposes.
"In a nutshell," he said, "being able to perform operations through an advanced ASIC or an integrated MCU is what allows a sensor to be called 'intelligent.'"
Increasingly, as sensors move to the edge of the Internet of Things, these goals revolve around energy conservation.
"People have to think about the 'sleep' functions that keep the entire sensor component, and maybe other systems related to it, in low-power mode until something happens in the physical world," Tagliavini said via email.
The obvious goal is to save electrical power and, for portable devices, battery power, which continual reporting to the cloud can quickly drain. Battery-less sensors that harvest energy from the ambient environment are increasingly used in certain applications.
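The savings from such sleep functions are easy to quantify with duty-cycle arithmetic. The figures below are assumptions for illustration, not measurements from any product: a node that reports continuously at 50 mW is compared with one that sleeps at 5 µW and wakes for 100 ms once a minute.

```python
# Illustrative duty-cycle arithmetic. All power figures are assumed values
# chosen for the example, not measurements from a real sensor node.

def average_power_mw(active_mw: float, sleep_mw: float,
                     active_s: float, period_s: float) -> float:
    """Average power of a node awake for active_s out of every period_s."""
    duty = active_s / period_s
    return active_mw * duty + sleep_mw * (1.0 - duty)

always_on = average_power_mw(50.0, 50.0, 60.0, 60.0)    # never sleeps: 50 mW
duty_cycled = average_power_mw(50.0, 0.005, 0.1, 60.0)  # 100 ms awake per minute

savings = always_on / duty_cycled  # several hundredfold reduction
```

Under these assumptions the duty-cycled node averages well under 0.1 mW, a reduction of more than 500x – which is why event-driven wake-up, rather than continuous cloud reporting, dominates edge sensor design.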
Always on the cutting edge
In critical edge applications, power consumption concerns extend beyond the sensor to include the processor. Concerns are acute for processors intended for machine learning.
In cloud data centers, rich power supplies are a given. This is not the case at the edge of IoT.
Such considerations are now playing out in the designs of next-generation edge AI chips, as shown by recent offerings such as the NDP102 Neural Decision Processor from Syntiant Corp.
It's meant to apply AI processing to audio and other inputs, ranging from the wake words used to rouse Siri or Alexa smart devices to the tilt angle of an oscillating punch press about to break down in a factory.
"We do a lot around sensors and vibration, condition-based monitoring and healthcare," Kurt Busch, CEO of Syntiant, told IoT World Today. He said vibration and temperature events that signal machine maintenance issues are best detected and addressed before costly downtime failure occurs.
Importantly, he noted, the Syntiant Neural Processor is designed to operate at less than 100 microwatts of power consumption in always-on sensor applications. Syntiant is one of a handful of companies working to achieve neural processing in the analog rather than digital realm, with speed and energy conservation as their primary goals.
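A back-of-the-envelope calculation shows why that sub-100-microwatt figure matters. Assuming a common CR2032 coin cell – roughly 225 mAh at 3 V, a typical datasheet figure, not one tied to any Syntiant product – a continuous 100 µW load could run for most of a year on the cell alone, ignoring self-discharge and radio bursts:

```python
# Back-of-the-envelope battery life for an always-on ~100 uW load.
# Cell capacity and voltage are typical CR2032 datasheet values (assumed);
# self-discharge and radio transmit bursts are ignored.

CAPACITY_AH = 0.225   # ~225 mAh CR2032 coin cell
VOLTAGE_V = 3.0
LOAD_W = 100e-6       # 100 microwatts, always on

energy_wh = CAPACITY_AH * VOLTAGE_V      # ~0.675 Wh available
lifetime_hours = energy_wh / LOAD_W      # hours of continuous runtime
lifetime_days = lifetime_hours / 24      # ~281 days under these assumptions
```

At cloud-class processor power draws, measured in watts, the same cell would last hours; at microwatts, maintenance-free multi-month deployments become plausible.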
Take a page from Fitbit
Consumer devices like the iPhone, AirPods, and Fitbit have played a big role in advancing sensor functionality and cutting prices. The gold rush-style push for assisted and autonomous vehicles can do the same, or more, including promoting sensor fusions that mix sensor techniques, according to Omdia’s Tagliavini.
Both today's assisted-driving systems and the autonomous ones to come require multiple, diverse sensor measurements for obvious safety reasons, he said.
This means that "the collection and computation of data read from radar, lidar, inertial units and GPS is a critical activity that requires reliability, redundancy and timely results," he said. Advances here will be felt in the wider world of IoT.
Yet advances in smart sensors for critical industrial applications – which update on longer cycles than fast-moving consumer applications – may take longer to spread, he advised.
Gauging sensor requirements
Despite the explosion of new technologies, the basic trade-offs remain similar. The move to the edge and the deployment of machine learning do little to change the fundamental decisions that have always shaped sensor system design, said Mouser's Raymond Yin. The questions remain:
- Do the sensors meet the requirements for resolution and accuracy?
- Are the sensor results consistent and reliable enough for the operational requirements?
- Are the sensors providing the data necessary to achieve the system goals?
- Do the sensors meet the power, size and timing requirements?
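Yin's checklist can be expressed as a simple screening function. Everything below is a hypothetical illustration – the field names, candidate spec and requirement figures are invented, and a real evaluation would weigh many more parameters:

```python
# Hypothetical screening of a sensor spec against system requirements,
# mirroring the four questions above. Field names and figures are invented.

def unmet_requirements(spec: dict, req: dict) -> list:
    """Return the names of the requirements the spec fails (empty = pass)."""
    failures = []
    if spec["resolution_bits"] < req["min_resolution_bits"]:
        failures.append("resolution")
    if spec["accuracy_pct"] > req["max_accuracy_pct"]:
        failures.append("accuracy")
    if spec["power_mw"] > req["max_power_mw"]:
        failures.append("power")
    if spec["sample_rate_hz"] < req["min_sample_rate_hz"]:
        failures.append("timing")
    return failures

candidate = {"resolution_bits": 12, "accuracy_pct": 0.5,
             "power_mw": 1.2, "sample_rate_hz": 100}
requirements = {"min_resolution_bits": 10, "max_accuracy_pct": 1.0,
                "max_power_mw": 5.0, "min_sample_rate_hz": 50}

gaps = unmet_requirements(candidate, requirements)  # empty list: candidate passes
```

A lower-resolution part would fail the same screen, which is the point of running the checklist before, not after, committing to a design.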
Likewise, specifications for the connectivity and compute portions of the overall system should be determined based on the application or use case, he said.
The role that embedded sensors and AI play in emerging IoT industries is constantly evolving. Advances in imaging, MEMS, lidar, Wi-Fi, UWB, radar and other sensing technologies are clearly numerous, as are the various machine learning cores working to "make sense of the sensors."
How system designers align these technologies with profitable use cases is likely to define the ultimate success of the next era of IoT.