How is Sensor Fusion Involved in the IoT?

Sensors are now commonplace in consumer electronics, as they become cheaper and smaller, enabling novel applications. They are found in mobile phones, vehicles, industrial systems, healthcare, oil exploration, and climate monitoring. This widespread adoption is driven by sensor fusion, in which a microcontroller (a "brain") combines the acquired data into a complete picture of the environment.

Combining multiple sensors enables context awareness, which is potentially ground-breaking for the Internet of Things (IoT). Developments in sensor fusion for emotion sensing and processing could lead to smart healthcare. Nevertheless, these capabilities raise major privacy concerns that the IoT will need to address.

Huge quantities of context-aware information will become available as the use of sensor fusion and remote emotive computing (REC) technologies grows. This information will drive tremendous growth in context-aware services tailored to any specific situation.

Human Beings: The Ultimate Sensing Example

Human beings experience their environment through their various senses; the stimuli are converted to signals that travel through the peripheral nervous system (PNS) to the brain, which produces a response. Decisions are not made by the PNS but by the brain, which sends out motor commands. If you see a car driving towards you, your brain instructs your muscles to run to avoid a collision.

Therefore, the brain makes the decisions, but the PNS performs the equally important job of detecting the signals. For example, the smell of smoke will tell your brain that your car is on fire even before you can see the flames. This demonstrates the importance of having multiple senses.

Figure 1: Sensory information (vision, hearing, smell, taste, and touch) is gathered from one's surroundings and travels through the peripheral nervous system to the brain for processing and response.

Similarly, combining multiple sensors yields more accurate sensing and improved recognition, overcoming the individual limitations of each sensor.

Evolving Sensor Technology to Improve Everyday Life

Early pedometers worked by counting the number of times an internal pendulum swung back and forth. However, these older pedometers had to be worn vertically on the hip to avoid miscounting. The development of modern MEMS-based inertial sensors brought big improvements. The first generation of MEMS-based pedometers used accelerometers that detected an individual's acceleration along one, two, or three axes (3D), monitoring step counts more precisely.

Furthermore, they sampled the user's motion many times every second. Later pedometers added altimeters to monitor changes in altitude by measuring absolute air pressure. Temperature measurement further boosts the accuracy of the reading.
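As a sketch of the altimeter step, altitude can be estimated from absolute pressure with the international barometric formula for the standard atmosphere; the function name and default sea-level pressure below are illustrative, not taken from any particular pedometer design:

```python
def pressure_to_altitude(pressure_pa, sea_level_pa=101325.0):
    """Estimate altitude (m) from absolute air pressure (Pa) using the
    international barometric formula for the standard atmosphere."""
    return 44330.0 * (1.0 - (pressure_pa / sea_level_pa) ** (1.0 / 5.255))
```

Near sea level, a drop of roughly 12 Pa corresponds to about one metre of climb, which is why temperature compensation of the pressure sensor matters so much for accuracy.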

Figure 2: Pedometer Example.

These pedometers can also be worn on the arm rather than the hip, which requires a gyroscope to detect and compensate for parasitic movement. Overall, the blend of three different types of sensors (accelerometer, altimeter, and gyroscope) results in a highly accurate pedometer.
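To illustrate the accelerometer's role, here is a minimal, hypothetical step-counting sketch that applies a simple threshold crossing to the acceleration magnitude; real pedometers use far more sophisticated filtering, and the thresholds here are illustrative only:

```python
import math

def count_steps(samples, threshold=11.0, resting=9.81):
    """Count steps from 3-axis accelerometer samples (m/s^2).

    A step is registered each time the acceleration magnitude rises
    above `threshold` after having returned below the resting level,
    which rejects small vibrations between strides.
    """
    steps = 0
    armed = True  # ready to register the next peak
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if armed and magnitude > threshold:
            steps += 1
            armed = False
        elif magnitude < resting:
            armed = True
    return steps
```

Because only the magnitude of the acceleration vector is used, the count no longer depends on the device being held in a fixed orientation, which is what freed pedometers from the vertical-on-the-hip constraint.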

How Sensor Fusion Works

The most rudimentary sensor fusion case is an e-compass, in which the combination of a 3D magnetometer and a 3D accelerometer delivers excellent compass functionality. More intricate sensor fusion systems provide an improved experience by combining all three: a 3D accelerometer, a 3D gyroscope, and a 3D magnetometer. However, each sensor type has its own limitations:

  • Accelerometer: Sensitive to vibration
  • Gyroscope: Zero bias drift
  • Magnetometer: Sensitive to magnetic interference

By combining all of these sensors through algorithms and filtering techniques, fusion produces a total data set that is greater than the sum of its separate parts. It therefore has a whole range of potential uses to make life easier.
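The e-compass case can be sketched as a tilt-compensated heading calculation: the accelerometer supplies the gravity vector, from which pitch and roll are estimated, and the magnetometer reading is rotated back into the horizontal plane before the heading is taken. This is a simplified illustration (axis conventions, calibration, and magnetic declination are ignored), not a production algorithm:

```python
import math

def tilt_compensated_heading(accel, mag):
    """Compute a compass heading (degrees) from 3D accelerometer and
    3D magnetometer readings -- the basic e-compass fusion case."""
    ax, ay, az = accel
    mx, my, mz = mag

    # Estimate device attitude from the gravity vector.
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))

    # Rotate the magnetic field vector into the horizontal plane.
    mxh = mx * math.cos(pitch) + mz * math.sin(pitch)
    myh = (mx * math.sin(roll) * math.sin(pitch)
           + my * math.cos(roll)
           - mz * math.sin(roll) * math.cos(pitch))

    heading = math.degrees(math.atan2(-myh, mxh))
    return heading % 360.0
```

This pairing shows why fusion beats either sensor alone: the magnetometer cannot tell a heading change from a tilt by itself, and the accelerometer alone knows nothing about north.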

One issue facing the industry today is the absence of standardization across operating systems (OSs). Currently, the majority of OS drivers expose only very basic data, limiting the overall sensor capability.

Since sensor fusion is part of the Microsoft® strategy, the Windows® 8 OS uses sensor-class drivers based on commercial standards established in partnership with its ecosystem partners. Its runtime programming model permits lightweight executable calls that enable hardware sensor processing.

The combination of a 3D accelerometer, a 3D gyroscope, and a 3D magnetometer is termed a nine-axis system, giving the user nine degrees of freedom (9-DoF). Recently, however, Freescale presented a 12-axis Xtrinsic sensor platform that delivers a 12-DoF sensor fusion solution by adding a thermometer, a barometer, and ambient light sensing functionality.

Freescale 12-axis Xtrinsic Sensor Platform for the Windows 8 OS

One of the first platforms to receive Windows 8 certification, this all-inclusive hardware and software solution fuses data from accelerometers, magnetometers, and gyroscopes using a Freescale 32-bit MCU. Frequently used in tablets, slates, laptops, and other devices, Microsoft's Windows 8 OS extends the capabilities of smartphones and tablets with greater computing power.

Straightforward sensor fusion processing requires 10-12 MIPS, while 9-DoF sensor fusion applications can require up to 18-20 MIPS. There are several ways to meet these processing needs (each with advantages and disadvantages), such as using a dedicated coprocessor for sensor processing or using a robust MCU with enough headroom to add the new functionality.

Figure 3: 12-Axis Xtrinsic Sensor Data Flow for Windows 8.

Other Examples of Sensor Fusion

Sensor fusion has important uses in both medical and non-medical applications. For example, algorithms could detect emotions by sensing a variety of physiological signals, such as muscle relaxation and contraction, heart rate variability, perspiration, and posture. Among other applications, this could make games more challenging and enjoyable, enhancing the customer's experience.

Figure 4: Context Awareness Using Emotion Sensing.

Furthermore, these sensors could perceive emotion by gauging the manner in which a user grips a cell phone, or by analyzing the way the individual texts.

Leveraging Sensor Fusion for the Internet of Things

The IoT has many potential applications, from connected homes to connected cars and roads to emotion-detecting devices. The Internet of Things is defined as smart machines interacting and cooperating with other objects and machines, gathering and processing data to "command and control" things and make life much easier.

Figure 5: Internet of Things.

All IoT use cases require:

  • Sensing and data collection capability
  • Layers of local embedded processing capability
  • Wired and/or wireless communication capability
  • Software to automate tasks and enable new classes of services
  • Remote network/cloud-based embedded processing capability
  • Full security across the signal path

A wide range of sensing nodes is involved, for example: camera systems, water or gas flow meters, radar vision, RFID readers, doors and locks with open/close circuits, or basic thermometers. Each of these nodes carries a unique ID and can be managed discretely via a remote command and control topology. Today, a smartphone with RFID or near field communication (NFC) and GPS functionality can communicate with these nodes and record their physical positions in the system. Hence, RFID and NFC will have a home in remote registration and, overall, in the command and control of the IoT.

Figure 6: Functional View of Internet of Things Technologies.

'Box-level' View of IoT Building Blocks

If we step up from the humble building blocks of the IoT to a product-level view, we end up with sensing/edge nodes that use PAN/BAN/LAN communication topologies, linked to gateways at varying levels of hierarchy.

In turn, these gateways connect to the cloud via WAN communication technologies, routing data through a server for action and analysis.

Figure 7: 'Box-level' View of IoT Building Blocks.

What is "Context"?

Context is defined as the circumstances or facts that form the setting for an event, situation, statement, or idea. In software programming, the idea of developing context-aware applications has been around for a while. Context-aware apps examine who, where, what, and when; the software designer uses this contextual information to determine why a situation is happening and then encodes an appropriate action in the application.

When contextual information is used to produce a deterministic action, context interfaces exist between the human being, the ambient environment, and the equipment and structural elements in use. Nothing detects human emotion as well as sensors can, ultimately making an experience more "personal."

  1. The human being: motion, posture, and stride; reaction to stimuli; emotions under given conditions; biometrics at any given time
  2. The ambient environment: location, altitude, temperature, humidity, light, sound, smell
  3. Infrastructure/machines being used by the person: trajectory, impact, velocity, feedback, vibration and gyration, changes related to structural integrity
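As a toy illustration of how a designer might encode who/where/what/when context into a deterministic action, consider the sketch below; every key, value, and threshold in it is hypothetical, not drawn from any real framework:

```python
def suggest_action(ctx):
    """Map a dict of contextual readings to a deterministic action.

    The rules are illustrative only: they combine 'where' and 'when'
    context with a biometric reading from the human being.
    """
    if ctx.get("where") == "gym" and ctx.get("heart_rate", 0) > 180:
        return "suggest a rest break"
    if ctx.get("where") == "home" and ctx.get("hour", 0) >= 22:
        return "silence notifications"
    return "no action"
```

The interesting work in a real system lies in the sensing and fusion that fill in such a context dict reliably; once the context is trustworthy, the rules themselves can be quite simple.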

Once the sensing nodes have collected the data, either immediate feedback is generated locally or the data is passed on for further processing. This involves numerous connectivity mechanisms; for example, a mobile phone can act as a "gateway" to a wide area network (WAN).

Figure 8: Sensor Fusion Enables Context Awareness.

Figure 9: Context Data Transferred for Processing.

Sensor Fusion/Context-Aware and Remote Emotive Computing-Related Services

When a sensor fusion platform is used, both local embedded and cloud-based software processing monitor conditions and deliver a novel class of services. The potential uses of these services are limitless:

  • Tracking fruit and vegetable cartons to monitor location, temperature, vibration, and jerkiness
  • Giving customers directions, coupons, reality maps and deal recommendations
  • Improving hospitals by reducing the number of times a nurse must check patients' vital signs, monitoring handwashing, and tracking medical conditions
  • Monitoring bridge sturdiness to prevent accidents

No major technological breakthroughs are needed to make these scenarios a reality; only minor improvements are required. This demonstrates how sensors can improve the lives of many. However, the IoT also faces significant issues, such as privacy and security concerns.

Figure 10: Context Helping Big Data.

This information has been sourced, reviewed and adapted from materials provided by Mouser Electronics.

For more information on this source, please visit Mouser Electronics.
