Insights from industry

Motion Processing and Data Fusion Technology

Dave Rothenberg, Movea’s Director of Marketing and Partner Alliances, talks to AZoSensors about motion processing and data fusion technology.

How does the Movea SmartFusion™ Studio tool enable better, faster development and debugging of motion-enabled and context awareness features?

Movea’s SmartFusion Studio is a signal processing design toolkit that provides mobile device OEMs and ODMs with a set of high-performance calibration and debugging tools, enabling rapid integration of motion-based features for compelling context-aware applications.

Processing motion, whether on an application processor or a sensor hub architecture, is a fairly new capability, so there are not many intuitive tools for the ecosystem to leverage.

Using optical tools such as Codamotion to check calibration is very expensive; internally developed solutions have turned out to be less efficient and cost-effective than Movea’s SmartFusion Studio; and going without a calibration and validation tool has proven to result in mediocre output quality, ultimately delivering a poor customer experience.

Through SmartFusion Studio’s simple graphical user interface, customers and partners can quickly connect to sensors, gather data, and accurately calibrate and validate sensor data integration in the early development phase.

SmartFusion Studio lets developers test and validate the output of a MotionCore implementation at the engineering level against a gold reference, giving them confidence that they are producing the right data, maximizing efficiency, and significantly reducing the time and cost associated with testing and deploying motion-based features for mobile.
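As an illustration of gold-reference validation, the comparison often reduces to an error metric computed over synchronized output traces. The sketch below is hypothetical (the data and function are not part of Movea’s tooling); it scores a device’s fused heading against an optical reference using RMS error:

```python
import math

def rms_error(measured, reference):
    """Root-mean-square error between a device's fused output and a
    gold-reference trace, sample by sample."""
    assert len(measured) == len(reference)
    return math.sqrt(
        sum((m - r) ** 2 for m, r in zip(measured, reference)) / len(measured)
    )

# Hypothetical heading (yaw) traces in degrees: device output vs. optical reference.
device_yaw = [0.0, 10.2, 20.1, 29.8, 40.3]
reference_yaw = [0.0, 10.0, 20.0, 30.0, 40.0]

print(f"RMS heading error: {rms_error(device_yaw, reference_yaw):.2f} deg")
```

A pass/fail decision would then compare this error against whatever tolerance the gold reference permits.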

What are the motion-enabled and context awareness features of this new technology?

By making it easier to integrate motion and data fusion technology, Movea’s SmartFusion Studio tool allows mobile developers and manufacturers to deploy applications, such as indoor navigation (for use inside shopping malls, airports, and large retail stores, for example), gesture control (for home entertainment and immersive mobile gaming, for example), physical activity monitoring, augmented reality, contextual awareness and more.

How do these features benefit the end-user?

Indoor navigation will be just one of the valuable services that sensor-enabled devices and environments will deliver to users, helping us get where we’re going, find what we need, and save time doing it.

Gesture-controlled devices provide enhanced ease-of-use in many applications, and motion-enabled gaming and other entertainment content enable consumers to enjoy rich and immersive media experiences.

More generally, we see the following benefits across our three major markets:

Sports and Fitness

Through sensors, professional and amateur athletes alike are able to analyze their performance and improve their gameplay quickly and easily. For example, with Babolat’s Play and Connect tennis racket, which leverages Movea’s unique technology, tennis players can collect and display detailed, real-time game statistics such as stroke power, stroke type, ball spin, and impact location (i.e., sweet spot). And through the use of their mobile devices, users will be able to review and analyze their game stats immediately.

Movea has also collaborated with leading sporting goods manufacturer Oxylane to develop a waterproof MP3 player and lap counter, as well as a next-generation pedometer.

Home Entertainment

Today’s motion-sensing technology gives consumers a more intuitive and natural home entertainment experience, making it easy to navigate through interactive content. For example, Orange’s Livebox Play set-top box, which leverages Movea’s SmartMotion technology, integrates contextual awareness features that allow a more user-friendly experience for people of all ages and interests.

Consumers will be able to adjust the volume with a gentle twist of the wrist, make a “check” gesture to select an item, and use a “wipe up and back” gesture to close an application. In addition, MEMS motion sensors onboard the remote control let Orange subscribers easily point and click to navigate the user interface and play compelling motion-driven games with smooth, accurate 3D motion.

Mobile
As sensors become ubiquitous and cloud computing more pervasive, both our mobile devices and our environments will become context-aware, accessing public and private information such as emails, GPS location, weather, transit schedules, etc. to deliver smarter services to consumers.

Movea’s data fusion is capable of collecting and analyzing data from the user along with current information from the device and the cloud to make an informed decision and alert the mobile device user. For example, a passenger running to catch the train for work could be alerted: “Stop running; the train is 10 minutes delayed.”
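The train-delay example can be pictured as a simple fusion rule over motion-derived and cloud-derived inputs. The sketch below is purely illustrative; the `Context` fields and `advise` function are hypothetical and not Movea’s API:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Context:
    activity: str            # inferred from motion sensors, e.g. "running"
    destination: str         # hypothetical, e.g. from calendar or route history
    transit_delay_min: int   # hypothetical, from a cloud transit feed

def advise(ctx: Context) -> Optional[str]:
    """Fuse motion-derived activity with cloud data to decide whether to alert."""
    if (ctx.activity == "running"
            and ctx.destination == "train station"
            and ctx.transit_delay_min >= 10):
        return f"Stop running: your train is {ctx.transit_delay_min} minutes delayed."
    return None

print(advise(Context("running", "train station", 10)))
```

In a real system the inputs would come from classifiers and live feeds rather than hand-set fields, but the decision layer fusing them can stay this simple.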

Such fused alerting, along with more intuitive user interface navigation, advanced pedestrian navigation, activity monitoring, contextual awareness, and immersive motion-based mobile gaming, is among the technologies being integrated into our mobile devices today.

What sensor data is recorded with this new SmartFusion Studio tool?

SmartFusion Studio, installed on a computer, communicates with the phone over a simple Wi-Fi connection. Through this connection, the tool collects data from the motion sensors on the phone: a 3-axis accelerometer, a 3-axis gyroscope, and a 3-axis magnetometer. So we’re collecting information about acceleration, rotation, and the Earth’s magnetic field. The data is saved in files on the host for playback, analysis, and new feature development.
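A capture-and-save pipeline of this kind can be sketched as follows, with simulated 9-axis samples standing in for a live Wi-Fi stream (the CSV layout is an assumption for illustration, not SmartFusion Studio’s actual file format):

```python
import csv
import random
import time

FIELDS = ["t", "ax", "ay", "az", "gx", "gy", "gz", "mx", "my", "mz"]

def capture_samples(n):
    """Yield simulated 9-axis samples (accelerometer, gyroscope,
    magnetometer); a real tool would stream these from the phone."""
    for _ in range(n):
        yield [time.time()] + [random.uniform(-1.0, 1.0) for _ in range(9)]

# Save to a file on the host for later playback and analysis.
with open("motion_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(FIELDS)
    for sample in capture_samples(100):
        writer.writerow(sample)
```

Logging timestamped raw samples like this is what makes offline playback and new-feature development possible without re-collecting data on the device.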

What have been the main challenges in implementing data fusion capabilities for mobile devices?

The quantity and types of data we must collect for some applications are immense and widely varied.  

Some companies, being domain experts in a given field, are already bringing the first wave of this information to users in the form of specialized services and databases. Movea is part of that wave and we’ve developed data and processing models for many applications.

However, the future context-aware applications being imagined today will require integrating data from these various domains in ways that people are just now beginning to consider for commercial applications; hence Movea’s role as a market enabler. Movea fuses these pieces of data together to create intelligent services that can be used for context-aware applications, pedestrian navigation, and augmented reality in TV, mobile, and athletic devices.

How does this new tool reduce the time and cost associated with developing, testing, and deploying motion-based features?

SmartFusion Studio provides automation and analysis features that streamline calibration and integration. Without such tools, people often forgo calibration altogether, or they conduct simpler but less effective calibration, which can negatively impact the user experience.
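As a concrete example of the kind of step such tools automate, a common first-order magnetometer correction estimates the hard-iron bias as the midpoint of the per-axis extremes recorded while the device is rotated through all orientations. This is a generic textbook technique, not Movea’s algorithm, and the readings below are invented:

```python
def hard_iron_offset(samples):
    """Estimate a constant magnetometer bias (hard-iron offset) as the
    midpoint of the minimum and maximum field seen on each axis."""
    xs, ys, zs = zip(*samples)
    return tuple((max(axis) + min(axis)) / 2 for axis in (xs, ys, zs))

# Hypothetical magnetometer readings (uT) biased by (30, -12, 5).
readings = [(80, 38, 5), (-20, -62, 5), (30, -12, 55), (30, -12, -45)]
print(hard_iron_offset(readings))  # → (30.0, -12.0, 5.0)
```

Subtracting this offset recenters the measured field on the origin; more thorough calibrations also fit an ellipsoid to the samples to correct soft-iron distortion.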

What sort of capabilities can we expect in future developments of this tool?

SmartFusion Studio is actually part of a larger product: SmartFusion Lab.

SmartFusion Lab is an advanced signal processing studio and software development tool designed to accelerate the development and debugging of advanced data fusion features. We intend to make SmartFusion Lab available to strategic partners later this year.

SmartFusion Lab will accelerate the process of building complex signal processing data flows with sensors and cloud-based data sources, complemented by a full set of visualization tools. By putting this tool in the hands of developers, we’ll enable our partners to take the complexity out of motion processing and data fusion while still benefiting from state-of-the-art technology.

What new advanced features is Movea planning to implement in its latest technology?

With a world-class research and development team, we’re constantly developing new techniques and features. You can expect to see much more from us in the area of indoor pedestrian navigation, context awareness, and exciting new sports and entertainment applications.

How does Movea plan to develop this technology, and how does the company see it progressing over the next decade for its users?

We see a growing demand for smarter consumer devices. With the rise of the Internet of Things, OEMs, application developers, and system integrators are increasingly implementing context-aware, motion-sensing, and gesture-based applications in their electronics.

Whether it’s in the home, in your car or in your office, we know that consumers want to interact in a more natural way with their devices.

They are also starting to expect more from their gadgets: from getting accurate directions both outdoors and indoors (e.g., in a mall, airport, or train station), to capturing their activities throughout the day (e.g., miles walked, calories burned), to providing them with useful information without their having to ask for it (e.g., alerting them that their flight has been delayed before they even get to the airport). This, in turn, will fuel the rise of motion sensing and data fusion technologies, without which these innovations won’t be possible.

About Dave Rothenberg

Dave Rothenberg is Movea’s director of marketing and partner alliances and has more than 14 years of go-to-market experience productizing and commercializing new technologies for companies in Silicon Valley and Europe.

He takes an interdisciplinary approach to business, having held senior management roles in marketing, business development, and engineering across a range of markets including enterprise software, wireless and mobile, consumer electronics, health and fitness, and web services. Dave holds dual degrees in aerospace engineering and physics from the University of Colorado, Boulder.

Disclaimer: The views expressed here are those of the interviewee and do not necessarily represent the views of Limited (T/A) AZoNetwork, the owner and operator of this website. This disclaimer forms part of the Terms and Conditions of use of this website.


Please use one of the following formats to cite this article in your essay, paper or report:

  • APA

    Kaur, Kalwinder. (2020, October 02). Motion Processing and Data Fusion Technology. AZoSensors. Retrieved on May 22, 2024 from

