Sensors are fast becoming a ubiquitous element of the IoT landscape, which in turn makes sensor fusion the technology to watch. What is sensor fusion? It's a technology that combines data from multiple sensors into a single data point for an application processor, in order to formulate context, intent, or location information in real time for mobile, wearable, and IoT devices. Sensor fusion is based on both hardware and software elements.
The hardware part of sensor fusion comprises a specialized processor called a sensor hub—usually a low-power MCU or ASSP—that aggregates data from various sensors and passes a compressed data stream on to an application processor for computationally intensive tasks. Then there is a tailored software framework, commonly known as the sensor fusion algorithm.
A view of sensor fusion solution enabled by Atmel's SAMD20 sensor hub and Hillcrest Labs' SH-1 algorithm
1. Sensor Software Framework
It's the algorithm that determines how the data is captured, so it's critical that developers get the algorithm right. A sensor fusion algorithm combines sensor outputs with maximum accuracy and efficiency and with minimal noise and power consumption. Here, engineers can take advantage of the tools and libraries available from sensor and chip makers when designing multi-sensor products.
Take NXP/Freescale, which is making its sensor fusion libraries open source as part of the Accelerated Innovation Community (AIC) initiative spearheaded by the MEMS Industry Group (MIG) to step up the adoption of sensor fusion algorithms. Analog Devices and PNI Sensor have also joined the AIC effort.
Then, there are firms like Dialog Semiconductor, which is offering the SmartFusion library as part of the smart sensor development kit for IoT applications alongside its DA14583 Bluetooth Low Energy (BLE) chipset. Dialog's SmartFusion library facilitates data acquisition, auto-calibration, and sensor data fusion while running on SmartBond chipset's embedded Cortex M0 processor.
Dialog's SmartFusion library supports gyroscope, accelerometer, magnetometer, and environmental sensors
2. Power Balancing Act
A rise in the number of sensors inevitably leads to more power usage, especially when these sensors carry out always-on operations to enable features like context awareness, gesture recognition, and indoor navigation. At the same time, however, ultra-low power operation is a key priority in battery-operated mobile, wearable, and IoT devices.
That makes the design work around sensor hubs a tightrope walk. Here, a new breed of processors offers design engineers options such as distinguishing between sensors that can be turned on and off and those that should remain in an always-on state.
The sensor hub processor can remain in sleep mode as long as there is no activity—motion, sound, etc. Once activity is detected, the processor wakes up, carries out the data-processing task, and returns to sleep mode. Next, the right choice of sensor interface can help bring down power consumption.
3. Choosing the Right Interface
There are a lot of sensors around, and they all come with specific constraints in terms of power efficiency, frequency, and latency. These sensors connect to an application processor via a sensor hub, and the right choice of peripheral for each sensor plays a crucial part in efficient sensor-fusion designs.
For instance, many sensors don't require high-speed connectivity; low-rate devices like pressure sensors can be served effectively by legacy serial interfaces such as UART. On the other hand, audio sensors, which demand greater bandwidth, justify faster interfaces such as I2C and SPI.
It's worth noting that the new IoT-centric microcontrollers are expanding the number of peripherals to accommodate different types of sensors. Atmel's SAM4S4A, for example, offers support for I2C and SPI interfaces with multiple channels.