Researchers Develop Smart Necklace to Detect Facial Expressions

Human facial movements express emotions and enable nonverbal communication, as well as physical activities such as eating and drinking.

Facial recognition captured by a mobile phone camera (left), and facial reconstruction using NeckFace (right). Image Credit: Sci-Fi Lab/Provided.

Detecting facial movements and their causes is one of the proposed applications for NeckFace, one of the first necklace-style wearable sensing technologies.

A research team led by Cheng Zhang, assistant professor of information science in the Cornell Ann S. Bowers College of Computing and Information Science, has designed NeckFace, which can continuously track full facial expressions using infrared cameras that capture images of the chin and face from below the neck.

The study titled “NeckFace: Continuously Tracking Full Facial Expressions on Neck-mounted Wearables” was published in the journal Proceedings of the ACM on Interactive, Mobile, Wearable, and Ubiquitous Technologies on June 24th, 2021.

The co-lead authors of the study are Tuochao Chen of Peking University and Yaxuan Li of McGill University, both visiting students in the Smart Computer Interfaces for Future Interactions (SciFi) Lab, along with Cornell MPS student Songyun Tao.

The other contributors are HyunChul Lim, Mose Sakashita and Ruidong Zhang, Cornell PhD students in the field of information science, and François Guimbretière, professor of information science in the Cornell Bowers College.

NeckFace is the next generation of Zhang’s earlier research, which produced C-Face, a similar device worn as a headset.

According to Zhang, NeckFace offers significant improvements in privacy and performance and gives the wearer the option of a less obtrusive neck-mounted device.

Beyond emotion tracking, Zhang proposes several other applications, including facial expression detection in virtual reality, silent speech recognition and video conferencing even when a front-facing camera is unavailable.

The ultimate goal is having the user be able to track their own behaviors, through continuous tracking of facial movements. And this hopefully can tell us a lot of information about your physical activity and mental activities.

Ruidong Zhang, Study Principal Investigator, SciFi Lab

According to Guimbretière, NeckFace could transform video conferencing.

The user wouldn't need to be careful to stay in the field of view of a camera. Instead, NeckFace can recreate the perfect headshot as we move around in a classroom, or even walk outside to share a walk with a distant friend.

François Guimbretière, Professor of Information Science, Cornell Bowers College

Zhang and his collaborators assessed NeckFace's performance in a user study with 13 participants. Each participant was asked to make eight facial expressions while sitting and another eight while walking. While seated, participants were also asked to rotate their heads during the expressions and, in one session, to remove and remount the device.

NeckFace was tested in two designs: a necklace with a pendant-like infrared (IR) camera hanging beneath the neck, and a neckband draped around the neck with two cameras sitting just below the collarbone.

The team collected ground-truth facial movement data with the TrueDepth 3D camera on an iPhone X and compared it with the data gathered by NeckFace. Across the sitting, walking and remounting sessions, participants performed a total of 52 facial shapes.

Using deep learning to analyze the data, the team found that NeckFace detected facial movements with nearly the same accuracy as the direct measurements from the phone camera.
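The paper does not spell out its model in this article, but the general idea of learning a mapping from IR neck images to facial expression parameters, supervised by TrueDepth measurements, can be illustrated with a minimal sketch. The architecture, image size and the 52-dimensional output below are assumptions for illustration, not the authors' actual implementation.

```python
# Minimal sketch (assumed architecture, not NeckFace's actual model):
# a small CNN regresses facial expression parameters from an IR neck image,
# trained against TrueDepth-derived ground truth.
import torch
import torch.nn as nn

class NeckToFaceNet(nn.Module):
    """Toy CNN: maps a single-channel IR frame to a vector of
    facial expression parameters (dimension chosen here for illustration)."""
    def __init__(self, n_params: int = 52):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, n_params)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = NeckToFaceNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Placeholder tensors standing in for IR frames and TrueDepth targets.
ir_frames = torch.randn(8, 1, 64, 64)
ground_truth = torch.rand(8, 52)

# One training step: minimize the error between predictions and ground truth.
pred = model(ir_frames)
loss = nn.functional.l1_loss(pred, ground_truth)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

In this kind of setup, evaluation accuracy would come from comparing the model's predicted expression parameters against the phone camera's direct measurements on held-out data.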

The neckband detected expressions more accurately than the necklace, likely because its two cameras captured information from both sides of the face, whereas the necklace's single camera was mounted at the center.
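One plausible way to exploit the neckband's two views, shown below purely as an assumption about how such a fusion might look, is to stack the left and right IR frames as input channels so a model sees both sides of the chin at once.

```python
# Hedged illustration of dual-camera input fusion (not necessarily how
# NeckFace combines its views): stack the two IR frames channel-wise.
import torch

left_view = torch.randn(1, 1, 64, 64)   # placeholder left IR frame
right_view = torch.randn(1, 1, 64, 64)  # placeholder right IR frame

# A model whose first conv layer accepts two input channels would then
# receive information from both sides simultaneously, whereas the pendant
# design supplies only a single centered view.
fused = torch.cat([left_view, right_view], dim=1)
print(fused.shape)  # torch.Size([1, 2, 64, 64])
```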

According to Zhang, once optimized, the device could be especially useful in the mental health realm for tracking people's emotions over the course of a day. Although people do not always show emotion on their faces, how much their facial expressions vary over time can indicate emotional swings.

Can we actually see how your emotion varies throughout a day? With this technology, we could have a database on how you're doing physically and mentally throughout the day, and that means you could track your own behaviors. And also, a doctor could use the information to support a decision.

Ruidong Zhang, Study Principal Investigator, SciFi Lab

The study was financially supported by the Cornell Department of Information Science.

Journal Reference:

Chen, T., et al. (2021) NeckFace: Continuously Tracking Full Facial Expressions on Neck-mounted Wearables. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. https://doi.org/10.1145/3463511

Source: https://www.cornell.edu/
