Facial Recognition Tech To Be Used To Unlock Dark Matter Secrets In The Universe

Could the same technology that is used to unlock people’s smartphones also help unlock the potential secrets of the universe? It may sound unlikely, but that’s exactly what researchers from Switzerland’s science and technology-focused university ETH Zurich are working to achieve.

Using a variation of the type of artificial intelligence neural network that is behind today’s facial recognition technology, they have developed new A.I. tools which could prove a game-changer in the discovery of so-called “dark matter.” Physicists believe that understanding this mysterious substance is necessary to explain fundamental questions about the underlying structure of the universe.

“The algorithm we [use] is very close to what is commonly used in facial recognition,” Janis Fluri, a Ph.D. student who works in an ETH Zurich lab focused on applying neural networks to cosmological problems, told Digital Trends. “The beauty of A.I. is that it can learn from basically any data. In facial recognition, it learns to recognize eyes, mouths, and noses, while we are looking for structures that give us hints about dark matter. This pattern recognition is essentially the core of the algorithm. Ultimately, we only adapted it to infer the underlying cosmological parameters.”

Dark matter matters

But what exactly is it that the researchers are looking for? Right now, it’s not entirely known. But as United States Supreme Court Justice Potter Stewart memorably stated about obscenity, “I know it when I see it.” Or, rather, we won’t, because it can’t be seen. But scientists will know once they’ve found it. Welcome to the weird world of dark matter.

The existence of dark matter in some form has been hypothesized for well over a century. It is thought to account for around 27% of the universe, outweighing visible matter by a ratio of approximately six to one. Everything in the universe that we can detect — all the atomic matter that makes up galaxies, stars, planets, life on Earth, the device that you’re reading this article on — is just a tiny, tiny fraction of all the matter that exists. The overwhelming bulk of it cannot be tracked directly. It is invisible and able to pass straight through regular visible matter.

Instead, its existence is inferred from our observations of the way the universe works; like a housemate you never see but are sure exists because their half of the bills gets paid and someone is occasionally using the shower when you want it. Only in this case, it’s because scientists have worked out that galaxies rotate fast enough that they could not be held together by the gravity generated by observable matter alone. Dark matter is therefore theorized to be the secret ingredient that gives these galaxies the extra mass they require to not tear themselves apart like a suicidal paper bag. It is what drives normal matter in the form of dust and gas to collect and assemble into stars and galaxies.

Weak gravitational lensing to the rescue

Looking for something that can’t be looked at sounds difficult. It is. But there is a way that scientists are able to pinpoint where they think dark matter is most likely to be situated. They do this by looking at the subtle ways that the gravity of big galaxy clusters bends and distorts the light from more distant galaxies. This is called weak gravitational lensing.
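To get a feel for the scale of the effect, here is a back-of-the-envelope sketch (illustrative only, not part of the researchers’ pipeline) of the general-relativistic deflection formula for a point mass, checked against the classic result for light grazing the Sun:

```python
# Illustrative: the basic light-bending relation underlying gravitational
# lensing. A mass M deflects light passing at impact parameter b by
#     alpha = 4 * G * M / (c**2 * b)   (radians)
# Weak lensing surveys measure the far subtler, statistical version of
# this bending across millions of background galaxies.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
R_SUN = 6.957e8      # solar radius, m

def deflection_angle(mass_kg, impact_parameter_m):
    """General-relativistic deflection angle (radians) for a point mass."""
    return 4 * G * mass_kg / (c**2 * impact_parameter_m)

# Light grazing the Sun: the famous ~1.75 arcsecond result that
# confirmed general relativity in 1919.
alpha = deflection_angle(M_SUN, R_SUN)
arcsec = alpha * 206265  # radians -> arcseconds
print(f"{arcsec:.2f} arcseconds")  # ~1.75
```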

Observing the areas around massive clusters of galaxies lets astronomers identify background galaxies which appear warped. By reverse-engineering these distortions they can then isolate where they believe the densest concentrations of matter, both visible and invisible, can be found. Think of it like the mirage effect which causes far-off images to be blurred and shimmery on a hot day — only a whole lot further away.

“Previously one would study weak lensing mass maps by manually selecting the relevant features,” Janis Fluri explained. “This is a very complicated task and there are no guarantees that the selected features contain all of the relevant information. We solve this problem with the A.I. approach. The convolutional neural networks used in our work excel at pattern recognition.”

A convolutional neural network is a type of brain-inspired artificial intelligence that is frequently used for image classification tasks. While its neurons still have the learnable weights and biases of conventional neural networks (i.e. the things that allow it to learn), its explicit assumption that its inputs are images allows its creators to reduce the number of parameters in the network. This makes it more efficient.
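As a rough illustration of that parameter saving (a hypothetical toy, not the paper’s actual architecture), the snippet below compares the parameter count of a fully connected layer with that of a single shared 3×3 convolution filter, and shows such a filter responding to structure in a tiny map:

```python
import numpy as np

# Illustrative sketch: a convolutional layer slides one small shared
# filter over the whole map, so its parameter count is independent of
# the image size, unlike a fully connected (dense) layer.

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation with a single filter."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

map_size = 128                               # a 128x128 mass map
dense_params = map_size**2 * map_size**2     # dense layer mapping map -> map
conv_params = 3 * 3                          # one shared 3x3 filter

# A hand-made edge-detecting filter responds to structure, much as the
# network's *learned* filters respond to lensing patterns.
sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
demo = np.zeros((5, 5))
demo[:, 3:] = 1.0                            # a vertical edge
response = conv2d(demo, sobel_x)             # peaks along the edge
print(dense_params, conv_params)             # 268435456 vs 9
```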

“Roughly speaking, [it works by us providing the networks] with a large quantity of data, [from which] they automatically create a set of complex filters to extract the relevant information [from] the maps,” Dr. Tomasz Kacprzak, another co-author of the project, told Digital Trends. “Then it tries to optimally combine these filters to give as precise an answer as possible.”

Extracting the cosmological parameters

The researchers trained their neural network by feeding it computer-generated data simulating the universe. This allowed it to repeatedly analyze dark matter maps so as to learn to extract “cosmological parameters” from real images of the night sky. The results showed a 30% improvement over traditional methods based on human-made statistical analysis.

“The A.I. algorithm needs a lot of data to learn in the training phase,” Fluri continued. “It is very important that this training data, in our case simulations, are as accurate as possible. Otherwise, it will learn features that are not present in real data. To do this, we had to generate a lot of large and accurate simulations, which was very challenging. Afterward, we had to tweak the algorithm to achieve peak performance. This was done by testing multiple network architectures to optimize the performance.”
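The overall recipe, train on simulations with known parameters and then infer the parameter for data the model has never seen, can be sketched in a deliberately tiny form. This is a hypothetical stand-in using a hand-picked summary statistic and linear regression rather than a CNN, but the train-on-simulations logic is the same:

```python
import numpy as np

# Toy version of the train-on-simulations idea (far simpler than the
# paper's method): generate mock "mass maps" whose statistics are
# controlled by a hidden parameter, fit a regressor on simulated
# (map, parameter) pairs, then recover the parameter for an unseen map.

rng = np.random.default_rng(0)

def simulate_map(param, size=32):
    """Mock map: a Gaussian field whose amplitude plays the role of the
    hidden cosmological parameter."""
    return rng.normal(0.0, param, size=(size, size))

def summary(m):
    """A hand-picked summary statistic: the map's standard deviation."""
    return m.std()

# Training set of simulations with known parameters.
true_params = rng.uniform(0.5, 1.2, size=200)
features = np.array([summary(simulate_map(p)) for p in true_params])

# Fit parameter as a linear function of the summary (least squares).
A = np.vstack([features, np.ones_like(features)]).T
coef, intercept = np.linalg.lstsq(A, true_params, rcond=None)[0]

# An "observation" whose parameter the model never saw during training.
obs = simulate_map(0.8)
estimate = coef * summary(obs) + intercept
print(round(estimate, 2))  # close to 0.8
```

The real work replaces the hand-picked summary statistic with filters the CNN learns for itself, which is precisely the “manually selecting the relevant features” bottleneck Fluri describes.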

They then used their fully trained neural network to analyze actual dark matter maps. These came from the so-called KiDS-450 dataset, made using the VLT Survey Telescope (VST) in Chile. The dataset covers a total area of some 2,200 times the size of the full moon. It contains records of around 15 million galaxies.
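As a quick sanity check on those numbers (illustrative arithmetic only): taking the full moon’s angular radius to be about 0.26 degrees, 2,200 moon-areas lands near the roughly 450 square degrees of sky that give KiDS-450 its name:

```python
import math

# Back-of-the-envelope: area of the full moon on the sky, times 2,200.
moon_radius_deg = 0.26                       # ~half the moon's 0.52 deg diameter
moon_area = math.pi * moon_radius_deg**2     # ~0.21 square degrees
survey_area = 2200 * moon_area
print(round(survey_area), "square degrees")  # in the neighborhood of 450
```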

Because of this extraordinarily large amount of data, the researchers needed a supercomputer to put their artificial intelligence into action. They ultimately ran their A.I. on a computer at the Swiss National Supercomputing Center (CSCS) in Lugano, a city in southern Switzerland that borders Italy. The supercomputers at CSCS are available to all Swiss universities and research institutions. Its machines are so powerful that, to stop them from overheating, water from the nearby Lake Lugano is pumped in for cooling at a rate of 460 liters per second.

A cosmological A.I.

“This was the first application of A.I. for real cosmological data, including all practical aspects that come with it,” Fluri said. “We could show that our method produces consistent results on a relatively small data set. We hope to use the same method on larger observations, but also measuring more cosmological parameters to probe other aspects of cosmological physics. Finally, we hope to learn new insights about [the] dark sector of the universe.”

According to Fluri, the team has now moved beyond the KiDS-450 dataset, “as there are newer and better datasets now.” One in particular is the Dark Energy Survey, a massive-scale visible and near-infrared survey carried out by research institutions and universities from the U.S., Brazil, the United Kingdom, Germany, Spain, and Switzerland.

“Before we can analyze new datasets, however, we need to adapt our method such that it can handle the increased data volume,” Fluri said. “We are currently experimenting with some methods to achieve that. After that, we will discuss the next dataset we want to analyze. I cannot give you a timescale yet, as it depends on the chosen dataset and the requirements of the simulations.”
