
Image Sensors With Nanostructured Components Enhance Machine Vision

Image sensors measure light intensity, but substantial progress in machine vision requires extracting other properties of light as well, such as angle and spectrum.

The schematics of (a) a conventional sensor that can detect only light intensity and (b) a nanostructured multimodal sensor, which can detect various qualities of light through the light-matter interactions at subwavelength scale. Image Credit: Yurui Qu and Soongyu Yi

In Applied Physics Letters, published by AIP Publishing, researchers from the University of Wisconsin-Madison, Washington University in St. Louis, and OmniVision Technologies review the latest nanostructured components integrated on image sensor chips that are most likely to have the greatest impact on multimodal imaging.

The advancements could enable telescopes to see through interstellar dust, biomedical imaging to detect anomalies at different tissue depths, and autonomous vehicles to see around corners instead of only along a straight path.

Image sensors will gradually undergo a transition to become the ideal artificial eyes of machines. An evolution leveraging the remarkable achievement of existing imaging sensors is likely to generate more immediate impacts.

Yurui Qu, Study Co-Author, University of Wisconsin-Madison

Image sensors consist of millions of pixels on a single chip that convert light into electrical signals. A key challenge, however, is finding a way to combine multifunctional components compactly within the sensor.

In their study, the scientists described a promising method for creating an on-chip spectrometer that detects multiple spectral bands. Silicon-based photonic crystal filters were placed directly on top of the pixels to produce intricate interactions between incident light and the sensor.

The light energy distribution is captured by the pixels underneath the films, and from this data, light spectral information can be deduced. The device, which is less than a tenth of an inch square in size, can be programmed to work with practically any spectral regime from visible to infrared, as well as different dynamic ranges and resolution levels.
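The deduction step described above amounts to inverting a linear encoding: each filter has its own transmission curve, so the pixel readings are a mixture of the spectral bins, and the spectrum can be recovered by solving that system. The sketch below illustrates the idea with a hypothetical random transmission matrix and a least-squares inversion; it is a toy model, not the paper's actual reconstruction algorithm or filter design.

```python
import numpy as np

# Hypothetical illustration: each photonic-crystal filter i has a distinct
# transmission curve T[i, k] over wavelength bins k. The pixel readings
# y = T @ s encode the incident spectrum s, which can be recovered by
# inverting the linear system (here via least squares).

rng = np.random.default_rng(0)
n_filters, n_bands = 12, 8          # more filters than spectral bins

T = rng.uniform(0.1, 1.0, size=(n_filters, n_bands))  # transmission matrix
true_spectrum = np.array([0.0, 0.2, 0.9, 1.0, 0.5, 0.1, 0.0, 0.0])

readings = T @ true_spectrum        # what the pixels under the filters record

# Recover the spectrum from the filtered readings
recovered, *_ = np.linalg.lstsq(T, readings, rcond=None)

print(np.allclose(recovered, true_spectrum))  # True (noise-free toy case)
```

With more filters than bands and a well-conditioned transmission matrix, the noise-free system recovers the spectrum exactly; a real device would add noise and regularization.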

The scientists also created a component that detects angular information, which can be used to estimate depth and reconstruct 3D structures at subcellular scales. Their research was inspired by the directional hearing of animals such as geckos, whose heads are too small to discern the direction of sound the way humans and other large animals can.

Instead, these animals measure the direction of sound using coupled eardrums housed in a region orders of magnitude smaller than the corresponding acoustic wavelength.

Pairs of silicon nanowires were constructed as coupled resonators. The amount of optical energy each resonator stores depends on the incident angle; the wire closer to the incoming light generates the larger photocurrent. Comparing the strongest and weakest currents from the two wires reveals the angle of the incoming light waves.
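The current-comparison principle can be sketched in a few lines. The model below is a hypothetical stand-in (a linear angle-to-imbalance response, not the paper's measured resonator behavior): the two wires split the absorbed power unevenly as the angle changes, and the normalized current contrast is inverted to recover the angle.

```python
# Toy sketch of angle sensing by comparing two resonator photocurrents.
# The linear response model and its sensitivity constant are assumptions
# for illustration, not the device's actual characteristic.

def resonator_currents(angle_deg, sensitivity=0.02):
    """Toy model: photocurrents of the two nanowires at a given incidence angle."""
    imbalance = sensitivity * angle_deg   # assumed linear stand-in response
    i_near = 1.0 + imbalance              # wire nearer the incoming light
    i_far = 1.0 - imbalance
    return i_near, i_far

def estimate_angle(i_near, i_far, sensitivity=0.02):
    """Invert the toy model: recover the angle from the current contrast."""
    contrast = (i_near - i_far) / (i_near + i_far)
    return contrast / sensitivity

i1, i2 = resonator_currents(15.0)
print(estimate_angle(i1, i2))  # 15.0
```

The key design point survives the simplification: taking the ratio of the current difference to the current sum cancels the overall light intensity, so the readout depends only on angle.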

A 1 mm² chip can accommodate millions of these nanowires. The results of this study could contribute to improvements in robotic vision, augmented reality, and lensless cameras.

Journal Reference:

Qu, Y., et al. (2022) Multimodal light-sensing pixel arrays. Applied Physics Letters. doi:10.1063/5.0090138.
