

Sensor Fusion – The Future of Internet Of Things

  • Amruta Bhaskar
  • Jan 12, 2021

Sensor fusion is the art of combining multiple physical sensors to produce an accurate picture of "ground truth", even though each sensor might be unreliable on its own. In automotive applications, it is the ability to bring together inputs from multiple radars, lidars and cameras to form a single model or image of the environment around a vehicle. The resulting model is more accurate because it balances the strengths of the different sensors, and vehicle systems can then use it to support more intelligent actions.

Each sensor type, or “modality,” has inherent strengths and weaknesses. Radars are very strong at accurately determining distance and speed — even in challenging weather conditions — but can’t read street signs or “see” the colour of a stoplight. Cameras do very well reading signs or classifying objects, such as pedestrians, bicyclists or other vehicles. However, they can easily be blinded by dirt, sun, rain, snow or darkness. Lidars can accurately detect objects, but they don’t have the range or affordability of cameras or radar.

Sensor fusion brings the data from each of these sensor types together, using software algorithms to provide the most comprehensive, and therefore accurate, environmental model possible. It can also correlate this exterior data with data pulled from inside the cabin, a process known as interior and exterior sensor fusion.
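To make the idea concrete, here is a minimal sketch of one classic fusion technique, inverse-variance weighting, in Python. The sensor noise figures and distance readings below are illustrative assumptions, not data from any real vehicle:

    def fuse_estimates(estimates):
        """Fuse (value, variance) pairs into a single estimate.

        Sensors with lower noise variance receive proportionally more
        weight, so the fused value leans on the most reliable modality.
        """
        weights = [1.0 / var for _, var in estimates]
        total = sum(weights)
        value = sum(w * val for w, (val, _) in zip(weights, estimates)) / total
        variance = 1.0 / total  # lower than any single sensor's variance
        return value, variance

    # Hypothetical distance-to-obstacle readings as (metres, variance):
    radar  = (25.3, 0.04)   # strong at distance and speed
    camera = (24.1, 1.00)   # weak at range estimation
    lidar  = (25.0, 0.09)   # accurate object detection

    value, variance = fuse_estimates([radar, camera, lidar])
    print(f"fused distance: {value:.2f} m (variance {variance:.3f})")

Because less noisy sensors get proportionally more weight, the fused distance leans on the radar and lidar while still incorporating the camera, and its variance is lower than that of any single sensor.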

The most basic sensor fusion example is an e-compass, in which the combination of a 3D magnetometer and a 3D accelerometer provides compass functionality (a tilt-compensated heading sketch follows the list below). More complex sensor fusion technologies give users an enhanced experience by combining 3D accelerometers, 3D gyroscopes and 3D magnetometers (which measure the components of the magnetic field relative to the spatial orientation of a given device). Each of these sensor types provides unique functionality but also has limitations:

  • Accelerometer: x-, y- and z-axis linear motion sensing, but sensitive to vibration
  • Gyroscope: pitch, roll and yaw rotational sensing, but subject to zero-bias drift
  • Magnetometer: x-, y- and z-axis magnetic field sensing, but sensitive to magnetic interference
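As a concrete illustration of the e-compass mentioned above, here is a minimal tilt-compensated heading computation in Python. It assumes calibrated sensors, an x-forward / y-right / z-down axis convention, and an accelerometer that reads (0, 0, +1g) when the device is level; the sample values are invented for illustration:

    import math

    def heading_degrees(ax, ay, az, mx, my, mz):
        """Tilt-compensated heading, 0-360 degrees from magnetic north."""
        # Roll and pitch recovered from the accelerometer's gravity vector.
        roll = math.atan2(ay, az)
        pitch = math.atan2(-ax, math.hypot(ay, az))
        # Rotate the magnetic field back into the horizontal plane.
        mx_h = (mx * math.cos(pitch)
                + my * math.sin(pitch) * math.sin(roll)
                + mz * math.sin(pitch) * math.cos(roll))
        my_h = my * math.cos(roll) - mz * math.sin(roll)
        return (math.degrees(math.atan2(-my_h, mx_h)) + 360.0) % 360.0

    # Level device pointing roughly north-east (illustrative field values):
    print(heading_degrees(0.0, 0.0, 9.81, 20.0, -20.0, 30.0))  # ~45 degrees

Without the accelerometer's roll and pitch estimates, the magnetometer alone would report a badly skewed heading as soon as the device tilted, which is exactly the deficiency-cancelling effect described next.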

When combining these technologies, sensor fusion takes the simultaneous input from multiple sensors, processes it and creates an output that is greater than the sum of its parts: using special algorithms and filtering techniques, it compensates for the deficiencies of each individual sensor, much as the human body fuses input from its many senses.
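One of the simplest such filtering techniques is the complementary filter, sketched below in Python: the gyroscope tracks fast rotation but drifts, while the accelerometer is drift-free but noisy, so the filter high-passes one and low-passes the other. The 0.98 blend factor, sample rate and sensor values are illustrative assumptions:

    import math

    def complementary_filter(pitch, gyro_rate, ax, az, dt, alpha=0.98):
        """Blend a gyro-integrated angle with an accelerometer-derived one.

        pitch     -- previous pitch estimate (rad)
        gyro_rate -- angular rate about the pitch axis (rad/s)
        ax, az    -- accelerometer readings (m/s^2), z up
        dt        -- time step (s)
        """
        gyro_pitch = pitch + gyro_rate * dt  # responsive, but drifts
        accel_pitch = math.atan2(ax, az)     # drift-free, but noisy
        return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

    # 100 Hz loop, device held at a ~10-degree tilt, gyro slightly biased:
    pitch = 0.0
    for _ in range(500):
        pitch = complementary_filter(pitch, gyro_rate=0.001,
                                     ax=1.70, az=9.66, dt=0.01)
    print(f"estimated pitch: {math.degrees(pitch):.1f} degrees")

The gyro bias that would otherwise accumulate without bound is continually pulled back toward the accelerometer's gravity reference, while short-term vibration on the accelerometer is smoothed away by the gyro term.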

Sensor fusion provides a whole host of capabilities that can make our lives easier, and it enables a variety of services built on those capabilities.


The IoT encompasses many use cases, from connected homes and cities to connected cars and roads, to devices that track an individual's behaviour and use the collected data for "push" services. The IoT is a sort of universal "global neural network in the sky" that will touch every aspect of our lives. From a technology perspective, the IoT is smart machines interacting and communicating with other machines, objects, environments and infrastructures, generating volumes of data that are processed into useful actions which can "command and control" things and make life much easier for human beings.

Requirements common to all IoT use cases include the following (a toy sensing-node sketch follows the list):

  • Sensing and data collection capability (sensing nodes)
  • Layers of local embedded processing capability (local embedded processing nodes)
  • Wired and/or wireless communication capability (connectivity nodes)
  • Software to automate tasks and enable new classes of services
  • Remote network/cloud-based embedded processing capability (remote embedded processing nodes)
  • Full security across the signal path
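As a toy illustration of the first few requirements, the Python sketch below models a sensing node: a uniquely identified device that samples a sensor locally and emits a structured reading for a gateway or the cloud to process. The payload format and the read_temperature() stub are hypothetical:

    import json
    import time
    import uuid

    NODE_ID = str(uuid.uuid4())  # every node carries a unique ID

    def read_temperature():
        """Stand-in for a real sensor driver (hypothetical)."""
        return 21.7  # degrees Celsius

    def build_reading():
        """Package one sample for the connectivity layer to transmit."""
        return json.dumps({
            "node_id": NODE_ID,
            "timestamp": time.time(),
            "sensor": "thermometer",
            "value_c": read_temperature(),
        })

    # A real node would hand this payload to its connectivity layer
    # (Wi-Fi, cellular, LoRa, ...) bound for local or cloud processing:
    print(build_reading())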

The types of sensing nodes needed for the IoT vary widely, depending on the applications involved. Sensing nodes could include a camera system for image monitoring, water or gas flow meters for smart energy, radar vision when active safety is needed, RFID readers sensing the presence of an object or person, doors and locks with open/close circuits that indicate a building intrusion or a simple thermometer measuring temperature. Who could forget the heat-seeking mechanical bugs that kept track of the population of a building in the movie Minority Report? Those mechanical bugs represent potential sensing nodes of the future.

These nodes will all carry a unique ID and can be controlled separately via a remote command-and-control topology. Use cases exist today in which a smartphone with RFID and/or near-field communication (NFC) and GPS functionality can approach individual RFID/NFC-enabled "things" in a building, communicate with them and register their physical locations on the network. Hence, RFID and NFC will have a place in the remote registration and, ultimately, command and control of the IoT.

Healthcare and medical electronics are another area where sensor fusion from accelerometers and gyroscopes is enabling exciting new systems. Advances in the miniaturisation and power consumption of MEMS sensor devices and microcontrollers are enabling wearable sensor systems that can be used in a variety of medical environments. For example, body-worn systems that monitor the movement of limbs can be helpful in physiotherapy, to ensure exercises are being done correctly. Wearable activity trackers, already popular in the consumer wellness market, may in the future have their data fused with data from wearable heart rate monitors, temperature sensors and the like as part of telehealth services or remote monitoring of patient conditions. Uploading and analysing this data in the cloud means it can be accessed and reviewed by doctors at any time. Intelligent sensor fusion of vital-signs data gathered by body-worn sensors could even make it possible for electronic systems to diagnose common diseases without the patient seeing a doctor at all.

Behind these new sensor fusion applications are many innovations in both hardware and software. On the hardware side, MEMS sensors can be integrated, in any number of different combinations, into tiny, power-efficient packages. MEMS sensors have also fallen vastly in price in recent years thanks to miniaturisation and new automatic calibration techniques, while their limitations in accuracy have been offset by advanced sensor fusion techniques. These innovations are set to bring sensor fusion to more varied applications than ever before.

Using other technological advances such as digital signal processing, huge amounts of data can now be fused quickly enough for systems to respond in real time, while wireless internet access gives sensor systems access to huge computing power in the cloud. The eventual aim is to emulate in electronics the ultimate in sensor fusion hardware – the human body – which uses the brain as a processor to fuse data from the nervous system, visual system and other sensory inputs, allowing people to perform incredibly complex tasks.


