Australian researchers have developed a new type of sensor to measure and correct the distortion of starlight caused by viewing through the Earth's atmosphere, which should make it easier to search for signs of life on distant planets.
Optical scientists from the University of Sydney have developed a sensor that uses artificial intelligence and machine learning to neutralize a star's "twinkle" caused by thermal variations in the Earth's atmosphere. The sensor should simplify the discovery and study of planets in distant solar systems using optical telescopes on Earth.
“The main way we identify planets orbiting distant stars is by measuring regular dips in starlight caused by planets blocking out bits of their sun,” said lead author Dr. Barnaby Norris, who holds a joint position as a Research Fellow in the University of Sydney Astrophotonic Instrumentation Laboratory and the University of Sydney node of Australian Astronomical Optics in the School of Physics.
This is really difficult from the ground, so we needed to develop a new way of looking up at the stars. We also wanted to find a way to directly observe these planets from Earth.
Dr Barnaby Norris, Research Fellow, Sydney Astrophotonic Instrumentation Laboratory, School of Physics, University of Sydney
This invention will now be installed in one of the world's largest optical telescopes, the 8.2-m Subaru Telescope in Hawaii, operated by the National Astronomical Observatory of Japan.
"It is really hard to separate a star's 'twinkle' from the light dips caused by planets when observing from Earth," Dr. Norris said. "Most observations of exoplanets have come from orbiting telescopes, such as NASA's Kepler. With our invention, we hope to launch a renaissance in exoplanet observation from the ground."
The study was published recently in Nature Communications.
With the new “photonic wavefront sensor,” astronomers will be able to directly image exoplanets surrounding faraway stars from Earth.
In the last 20 years, thousands of planets beyond the solar system have been detected, but only a handful have been directly imaged from Earth. This severely restricts scientific study of these exoplanets.
Creating an image of a planet yields far more information than indirect detection methods, such as measuring starlight dips. However, Earth-like planets can appear a billion times fainter than their host star, and observing the planet separately from its star is comparable to viewing a 10-cent coin held in Sydney, as seen from Melbourne.
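The coin analogy can be made concrete with the small-angle approximation. As a rough sketch (the coin diameter and Sydney-Melbourne distance below are approximate assumptions, not figures from the study):

```python
# Rough worked version of the article's coin analogy.
# Both input values are approximate assumptions.
coin_diameter_m = 0.023        # an Australian 10-cent coin is about 23 mm across
sydney_melbourne_m = 713e3     # straight-line distance, roughly 713 km

# Small-angle approximation: angle (radians) = size / distance
angle_rad = coin_diameter_m / sydney_melbourne_m
angle_mas = angle_rad * 206_265 * 1000  # radians -> milliarcseconds

print(f"Apparent size: {angle_mas:.1f} milliarcseconds")
```

The result is on the order of a few milliarcseconds, which conveys why resolving a planet next to its star demands such precise control of atmospheric distortion.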
To solve this issue, the researchers in the School of Physics built a "photonic wavefront sensor," a new way to measure the exact distortion caused by the atmosphere, so it can then be corrected by the telescope's adaptive optics systems thousands of times per second.
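The measure-then-correct cycle can be sketched as a simple closed loop. The toy model below is illustrative only (a static aberration, a made-up loop gain, and a simple integrator controller; real systems such as SCExAO are far more sophisticated and run at kilohertz rates):

```python
import numpy as np

rng = np.random.default_rng(0)
n_modes = 8                                # number of wavefront modes corrected
atmosphere = rng.normal(0, 1.0, n_modes)   # static aberration, for simplicity
gain = 0.5                                 # hypothetical loop gain
correction = np.zeros(n_modes)             # current deformable-mirror shape

for frame in range(50):                    # one iteration per sensor frame
    residual = atmosphere - correction                      # wavefront after correction
    measurement = residual + rng.normal(0, 0.01, n_modes)   # noisy sensor reading
    correction += gain * measurement                        # integrator controller

final_rms = float(np.sqrt(np.mean((atmosphere - correction) ** 2)))
print(f"Residual wavefront RMS after closing the loop: {final_rms:.4f}")
```

Each pass drives the residual distortion toward the sensor's noise floor, which is the essential behavior of an adaptive optics loop regardless of how the wavefront is sensed.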
“This new sensor merges advanced photonic devices with deep learning and neural networks techniques to achieve an unprecedented type of wavefront sensor for large telescopes,” Dr Norris noted.
“Unlike conventional wavefront sensors, it can be placed at the same location in the optical instrument where the image is formed. This means it is sensitive to types of distortions invisible to other wavefront sensors currently used today in large observatories,” he added.
This is no doubt a very innovative approach and very different to all existing methods. It could potentially resolve several major limitations of the current technology. We are currently working in collaboration with the University of Sydney team towards testing this concept at Subaru in conjunction with SCExAO, which is one of the most advanced adaptive optics systems in the world.
Olivier Guyon, Professor, Subaru Telescope and University of Arizona
He is one of the world’s leading specialists in adaptive optics.
Application Beyond Astronomy
The team achieved this result by developing a new way to measure (and correct) the wavefront of light that passes through atmospheric turbulence, directly at the focal plane of an imaging system. This is done using an advanced light converter, called a photonic lantern, coupled to a neural network inference process.
"This is a radically different approach to existing methods and resolves several major limitations of current approaches," stated Jin (Fiona) Wei, co-author of the study and a postgraduate student at the Sydney Astrophotonic Instrumentation Laboratory.
While we came to this work to solve a problem in astronomy, the proposed technique is relevant to a much wider range of fields. It could be applied in optical communications, remote sensing, in-vivo imaging, and any other field that involves the reception or transmission of accurate wavefronts through a turbulent or turbid medium, such as water, blood, or air.
Sergio Leon-Saval, Associate Professor and Director, Sydney Astrophotonic Instrumentation Laboratory, School of Physics, University of Sydney
Norris, B. R. M., et al. (2020) An all-photonic focal-plane wavefront sensor. Nature Communications. doi.org/10.1038/s41467-020-19117-w.