
University of Tokyo, zSpace Partner to Offer High-Speed 3D Hand Gesture Interface

The University of Tokyo and zSpace, Inc. today announced a partnership and technology integration with the zSpace immersive 3D platform. Developed by Professor Masatoshi Ishikawa at the Ishikawa Watanabe Laboratory at the University of Tokyo, the technology provides high-speed tracking and gesture recognition of both hands simultaneously as input to applications built on zSpace, an interactive platform that lets users manipulate virtual 3D objects displayed in open space.

The high-speed 3D hand gesture interface uses ultra-high-speed stereo cameras that recognize gestures and track 3D position at 500 fps with extremely low latency, so even very fast and small movements, such as shaking, are detected easily. Combining the high-speed cameras with infrared LEDs, the zSpace virtual environment lets users directly manipulate 3D objects with two hands, or with one hand and a stylus. The system is not only fast but also accurately aligned with the real world, making the interaction highly intuitive. A demonstration application shows an animated beating heart floating in open space: the user can pick up the heart with one hand and, with the other, dissect it using the stylus with pinpoint accuracy.
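For readers curious how a 500 fps stereo-tracking pipeline is typically structured, the sketch below outlines one possible per-frame loop: grab a stereo image pair, locate the hand (or an IR marker) in each image, triangulate its 3D position from the disparity, and pass the result to the application. This is an illustrative Python sketch only; the camera parameters, the `grab_stereo_pair` and `emit_position` callbacks, and the brightest-pixel detector are assumptions for the example, not details of the Ishikawa Watanabe or zSpace implementation.

```python
import time
import numpy as np

# Illustrative camera parameters (hypothetical values, not from the announcement)
FOCAL_PX = 700.0      # focal length in pixels
BASELINE_M = 0.10     # stereo baseline in meters
TARGET_FPS = 500      # frame rate cited in the announcement


def detect_hand_centroid(frame):
    """Placeholder detector: return (u, v) pixel coordinates of the brightest spot.

    A real system would run a tuned IR blob/contour detector here; this stub
    simply picks the brightest pixel as an illustration.
    """
    v, u = np.unravel_index(np.argmax(frame), frame.shape)
    return float(u), float(v)


def triangulate(left_uv, right_uv, cx, cy):
    """Recover a 3D point (meters) from a rectified stereo pair via disparity."""
    disparity = left_uv[0] - right_uv[0]
    if disparity <= 0:
        return None
    z = FOCAL_PX * BASELINE_M / disparity
    x = (left_uv[0] - cx) * z / FOCAL_PX
    y = (left_uv[1] - cy) * z / FOCAL_PX
    return np.array([x, y, z])


def tracking_loop(grab_stereo_pair, emit_position):
    """Run the per-frame pipeline at the target rate.

    `grab_stereo_pair` and `emit_position` are hypothetical callbacks for
    camera capture and application output, respectively.
    """
    period = 1.0 / TARGET_FPS
    while True:
        start = time.perf_counter()
        left, right = grab_stereo_pair()          # two grayscale numpy images
        cy, cx = left.shape[0] / 2, left.shape[1] / 2
        point = triangulate(detect_hand_centroid(left),
                            detect_hand_centroid(right), cx, cy)
        if point is not None:
            emit_position(point)                  # hand position in meters
        # Sleep off any remaining time budget to hold the frame rate
        time.sleep(max(0.0, period - (time.perf_counter() - start)))
```

The key design point the sketch illustrates is the latency budget: at 500 fps each frame has only about 2 ms for capture, detection, and triangulation, which is why lightweight per-frame processing matters in this kind of interface.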

"Our laboratory conducts research on exploring parallel, high-speed, and real-time operations for sensory information processing," said Professor Masatoshi Ishikawa, founder of the Ishikawa Watanabe Laboratory at University of Tokyo. "With the use of zSpace and the technology we have developed, we are creating a new sensation-enhancing technology that is meaningful to people. The future of next generation information environments and human interfaces based on various high-speed technologies is finally here."

As a result of the partnership, plans are also being developed to commercialize the technology through Exvision, a spin-out venture of the Ishikawa Watanabe Laboratory. "With this partnership, we are excited to find new industrial markets and strongly promote the transfer of our technology and the University of Tokyo's research in diverse ways, including collaborative research and commercialization," said Paul Kellenberger, CEO of zSpace, Inc.

About Ishikawa Watanabe Laboratory

The Ishikawa Watanabe Laboratory conducts research in four areas: 1) sensor fusion: high-speed robots and visual feedback; 2) dynamic image control: 3D image tracking and adaptive optics; 3) vision architecture: high-speed vision and its applications; and 4) meta perception: interactive human interfaces and media control. The laboratory is part of the Department of Information Physics and Computing and the Department of Creative Informatics in the Graduate School of Information Science and Technology at The University of Tokyo.

About zSpace

zSpace is a leading-edge technology provider that delivers a new way of learning with its flagship product, zSpace®. Focused on the learning market, specifically science, technology, engineering and math (STEM) education, medical instruction, corporate training and research, zSpace inspires and accelerates understanding through immersive exploration. zSpace is located in Sunnyvale, CA, and has filed more than 30 patents for its innovative technologies.

Source: http://zspace.com/
