‘Mixed Reality’ Enhances Maintenance of Buildings

A new technology that combines virtual reality with real-world sensor data could make it easier for maintenance staff to locate and fix problems in operational commercial buildings. The system was developed by computer scientists from Carnegie Mellon University and the University of California San Diego.


An example VR scene for an office with building hardware mapped as mixed reality objects that can be interacted with. Image Credit: University of California San Diego

The system, called BRICK, comprises a portable device with several sensors to track airflow, temperature, and CO2 levels. Additionally, it has a virtual reality environment that is linked to the building’s electronic control system and has access to the sensor data and metadata within that particular building.
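The pairing of handheld readings with the building's metadata might be sketched roughly as follows. This is an illustrative data-model sketch only: the room identifiers, sensor names, and the `contextualize` helper are invented for this example and are not the authors' actual API.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SensorReading:
    """One measurement from the portable device."""
    kind: str        # "airflow", "temperature", or "co2"
    value: float
    unit: str
    timestamp: datetime
    room_id: str     # identifier from the building's metadata model

# Minimal stand-in for the metadata a building management system exposes,
# keyed by room: which fixed sensors and actuators live where.
BUILDING_METADATA = {
    "room-2154": {
        "fixed_sensors": ["zone-temp-2154", "co2-2154"],
        "actuators": ["vav-damper-2154"],
    },
}

def contextualize(reading: SensorReading) -> dict:
    """Attach the room's metadata to a handheld reading so both can be
    rendered together in one mixed reality view."""
    meta = BUILDING_METADATA.get(reading.room_id, {})
    return {
        "reading": reading,
        "fixed_sensors": meta.get("fixed_sensors", []),
        "actuators": meta.get("actuators", []),
    }

r = SensorReading("co2", 780.0, "ppm",
                  datetime.now(timezone.utc), "room-2154")
ctx = contextualize(r)
print(ctx["actuators"])  # ['vav-damper-2154']
```

The point of the merge is that a manager standing in the room sees both the live handheld measurement and the fixed equipment the building management system already knows about, in a single overlay.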

When a problem is reported in a specific area, building managers can quickly scan the space with their smartphone’s lidar sensor, creating a virtual reality representation of the room while on-site. The scan can also be done ahead of time.

When building managers launch this mixed reality reproduction of the area on a laptop or smartphone, they can discover sensors superimposed over the mixed reality environment together with the data collected from the portable device.

The intention is to enable building managers to acquire and log pertinent data, evaluate hardware, and rapidly detect concerns.

Modern buildings are complex arrangements of multiple systems, from climate control, lighting and security to occupant management. BRICK enables their efficient operation, much like a modern computer system.

Rajesh K. Gupta, Study Senior Author and Professor, Department of Computer Science and Engineering, University of California San Diego

Currently, after receiving a complaint, building managers must first look up the relevant location in the building management database. However, the database does not tell them precisely where the hardware and sensors in that area are located.

To determine what the problem is, managers must visit the location, collect more data using heavy sensors, and then compare that data with the details in the building management system. Accurately recording the data collected at different geographical locations is another challenge.

With BRICK, on the other hand, the building manager can go straight to the location with a laptop or smartphone in addition to a portable device. They will have instant access to all of the data from the building management system, the sensors’ locations, and the data from the portable device, all of which will overlay in a single mixed reality environment.

Using this method, operators can also find malfunctions in building equipment, such as poorly functioning air-handling systems or jammed air-control valves.

In the future, researchers expect to identify CO2, temperature, and airflow sensors that can link directly to a smartphone, allowing occupants to participate in regulating local surroundings while also simplifying building operations.

The handheld device was designed by a team from Carnegie Mellon. Xiaohan Fu, a computer science PhD student in Rajesh Gupta’s research group at the Halicioglu Data Science Institute, created the backend and VR components, building on the group’s previous work on the BRICK metadata standard, which has been used by several commercial vendors.

Ensuring that the location used in the VR environment was accurate was a significant challenge. GPS accuracy is limited to around a meter, but this system must be precise to within a few inches. The researchers’ solution was to place a few AprilTags (similar to QR codes) in each room; the portable device’s camera scans these tags to recalibrate the system to the correct position.
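The recalibration idea can be illustrated with a toy 2-D correction. This is a simplified sketch with invented coordinates and a hypothetical `recalibrate` helper; the real system presumably corrects a full 3-D camera pose from the tag detection, not just a planar offset.

```python
# Toy 2-D tag-based recalibration. A tag's true position in the room's
# floor plan is surveyed in advance; when the camera sees the tag, the
# device's position estimate is shifted by the difference between where
# the tag should be and where the current estimate places it.

def recalibrate(device_estimate, tag_observed, tag_known):
    """Shift the device's position estimate so the observed tag lines up
    with its surveyed location. All points are (x, y) in meters."""
    dx = tag_known[0] - tag_observed[0]
    dy = tag_known[1] - tag_observed[1]
    return (device_estimate[0] + dx, device_estimate[1] + dy)

# The device thinks it is at (4.0, 2.0); in that frame the tag appears
# at (5.25, 3.5), but its surveyed position is (5.0, 3.25), so the whole
# frame is off by (-0.25, -0.25).
corrected = recalibrate((4.0, 2.0), (5.25, 3.5), (5.0, 3.25))
print(corrected)  # (3.75, 1.75)
```

Because the tags have known positions, each scan pins the meter-scale GPS estimate back down to the inch-scale accuracy the overlay needs.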

It is an intricate system. The mixed reality itself is not easy to build. From a software standpoint, connecting the building management system, where hardware, sensors and actuators are controlled, was a complex task that requires safety and security guarantees in a commercial environment. Our system architecture enables us to do it in an interactive and programmable way.

Xiaohan Fu, Ph.D. Student, Halicioglu Data Science Institute

The team presented their findings at the BuildSys ’23 conference in Istanbul, Turkey, on November 15th and 16th, 2023.

The study was supported by the CONIX Research Center, one of six centers of JUMP, a Semiconductor Research Corporation program funded by DARPA.

Journal Reference

Fu, X., et al. (2024) Debugging Buildings with Mixed Reality. BuildSys ’23: Proceedings of the 10th ACM International Conference on Systems for Energy-Efficient Buildings, Cities, and Transportation. doi:10.1145/3600100.3626258.

Source: https://today.ucsd.edu/
