Using the dark matter distribution to test the standard cosmological model

It feels like the classic paradox: How do you see the invisible? But for modern astronomers, it’s a very real challenge: How do you measure dark matter, which by definition does not emit light?

Credit: Stephanie N. Reif, Princeton University


The answer: you look at how it affects the things you can see. In the case of dark matter, astronomers observe how light from distant galaxies bends around it.

An international team of astrophysicists and cosmologists has spent the past year uncovering the secrets of this elusive material, using state-of-the-art computer simulations and observations from one of the world’s most powerful astronomical cameras, the Hyper Suprime-Cam (HSC). The team, led by astronomers from Princeton University and from the Japanese and Taiwanese astronomical communities, used data from the first three years of the HSC sky survey, a wide-field imaging survey conducted with the 8.2-meter Subaru Telescope atop Maunakea in Hawaii. Subaru is operated by the National Astronomical Observatory of Japan; its name is the Japanese word for the star cluster we call the Pleiades.

The team presented their findings at a webinar attended by over 200 people, and they will share their work at the “Future Science with CMB x LSS” conference in Japan.

“Our overall goal is to measure some of the most fundamental properties of our universe,” said Roohi Dalal, an astrophysics graduate student at Princeton. “We know that dark energy and dark matter make up 95% of our universe, but we understand very little about what they really are and how they have evolved over the history of the universe. Clumps of dark matter distort the light of distant galaxies through weak gravitational lensing, a phenomenon predicted by Einstein’s General Theory of Relativity. This distortion is a very small effect; the shape of any single galaxy is distorted by an imperceptible amount. But when we measured 25 million galaxies, we were able to measure the distortion with fairly high precision.”

To jump to the punchline: the team measured a value for the universe’s dark matter “clumpiness” (known to cosmologists as “S8”) of 0.776, which agrees with the values found by other gravitational lensing surveys that look at the relatively recent universe. It does not agree, however, with the value of 0.83 derived from the Cosmic Microwave Background, which dates back to the origins of the universe.

The gap between these two values is small, but as more and more studies confirm each of them, it looks less and less like an accident. The other possibilities are that there is some as-yet-unrecognized error in one of the two measurements, or that the standard cosmological model is incomplete in some interesting way.

“We’re still being careful here,” said Michael Strauss, chair of Princeton’s Department of Astrophysical Sciences and co-leader of the HSC team. “We’re not saying that we’ve just discovered that modern cosmology is all wrong, because, as Roohi emphasizes, the effect we measured is very subtle. That said, we think we’ve got our measurements right. And the statistics show that there’s only a one-in-20 chance that it’s simply due to chance, which is intriguing but not fully certain. But as we in the astronomy community arrive at the same conclusion across multiple experiments, as we continue to make these measurements, perhaps we’ll find that it’s real.”
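As a rough illustration of that “one in 20” figure: under standard statistics, a discrepancy at about the two-sigma level of a Gaussian bell curve has roughly a 5% probability of arising by luck alone. A minimal sketch of that conversion (the team’s actual significance calculation is more involved than this):

```python
import math

def two_sided_p(n_sigma):
    # Probability that a pure-chance Gaussian fluctuation lands at least
    # n_sigma away from zero, in either direction
    return math.erfc(n_sigma / math.sqrt(2.0))

# A discrepancy at roughly the 2-sigma level:
p = two_sided_p(2.0)
print(f"{p:.4f}")  # ~0.0455, i.e. about a 1-in-20 chance
```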

Hiding the data from themselves

The idea that the standard cosmological model needs some revision, that there are fundamental pieces of cosmology yet to be discovered, is tantalizing to some scientists.

“We are humans, and we have preferences. That’s why we do what we call a ‘blind’ analysis,” said Strauss. “Scientists have become self-aware enough to know that we will bias ourselves, no matter how careful we are, unless we carry out the analysis without letting ourselves know the results until the very end. For me, I would love to really discover something fundamentally new. That would be so much fun. But because I’m biased in that direction, we have to be very careful that it doesn’t influence any analysis that we do.”

To protect their work from their own biases, they literally hid their results from themselves and their colleagues, month after month.

“I worked on this analysis for a year and didn’t get to see the values as they emerged,” Dalal said.

The team even added an extra layer of obfuscation: they ran their analysis on three different galaxy catalogs, one real and two with numerical values offset by random amounts.

“We didn’t know which one was the original, so even if someone accidentally saw a value, we wouldn’t know whether the result was based on the real catalog or not,” he said.
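The blinding scheme described here can be sketched in a few lines. The catalog format and offset scale below are hypothetical stand-ins for illustration, not the collaboration’s actual pipeline:

```python
import random

def make_blinded_catalogs(real_catalog, n_fake=2, max_offset=0.05):
    # Keep the real catalog alongside copies whose values are shifted by a
    # random, hidden offset, then shuffle so nobody knows which is which.
    # (Offset scale and catalog contents here are purely illustrative.)
    catalogs = [list(real_catalog)]
    for _ in range(n_fake):
        shift = random.uniform(-max_offset, max_offset)
        catalogs.append([value + shift for value in real_catalog])
    random.shuffle(catalogs)
    return catalogs

# One real catalog of (made-up) measured values, plus two offset decoys:
blinded = make_blinded_catalogs([0.31, 0.28, 0.35])
print(len(blinded))  # 3 catalogs, in an unknown order
```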

On February 16, the international team gathered on Zoom (nighttime in Princeton, morning in Japan and Taiwan) to “unblind” the data.

“It felt like a ceremony, a ritual, that we went through,” said Strauss. “We unveiled the data and ran our plots, and immediately we saw it was great. Everybody said, ‘Oh!’ and everyone was very happy.”

Dalal and his roommates brought out a bottle of champagne that night.

A large survey with the world’s largest telescope camera

The HSC is the largest camera on a telescope of its size in the world, a mantle it will hold until the Vera C. Rubin Observatory, currently under construction in the Chilean Andes, begins its Legacy Survey of Space and Time (LSST) in late 2024. In fact, the raw data from the HSC are processed with software designed for the LSST. “It was very exciting to see that our software pipeline could handle such large amounts of data well before the LSST,” said Andrés Plazas, a research associate at Princeton.

The survey the research team used covers about 420 square degrees of sky, roughly the equivalent of 2,000 full moons. It is not one contiguous chunk of sky but is split into six distinct pieces, each about the size your outstretched fist would cover at arm’s length. The 25 million galaxies they surveyed are so far away that, instead of seeing the galaxies as they are now, the HSC recorded them as they were billions of years ago.
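The “2,000 full moons” comparison is easy to check: the full Moon spans about half a degree on the sky, so its disk covers roughly 0.2 square degrees. A back-of-the-envelope sketch, assuming that 0.5-degree diameter:

```python
import math

# The full Moon's apparent diameter is about 0.5 degrees (assumed here).
moon_diameter_deg = 0.5
moon_area_sq_deg = math.pi * (moon_diameter_deg / 2) ** 2  # ~0.196 sq. deg.

survey_area_sq_deg = 420.0
print(round(survey_area_sq_deg / moon_area_sq_deg))  # roughly 2,100 moons
```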

Each of these galaxies glows with the fire of tens of billions of suns, but because of their great distance they are incredibly dim, as much as 25 million times fainter than the faintest stars we can see with the naked eye.
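That faintness can be translated into the magnitude scale astronomers use, where a brightness ratio r corresponds to a difference of 2.5 × log10(r) magnitudes (larger magnitude means fainter). A sketch, assuming a typical dark-sky naked-eye limit of magnitude 6:

```python
import math

flux_ratio = 25e6  # "25 million times fainter"
delta_mag = 2.5 * math.log10(flux_ratio)

naked_eye_limit = 6.0  # assumed naked-eye limiting magnitude
print(round(delta_mag, 1))                    # ~18.5 magnitudes fainter
print(round(naked_eye_limit + delta_mag, 1))  # galaxies near magnitude 24.5
```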

“It’s great to see the results of this HSC collaboration, especially as these data are the closest to what we expect from the Rubin Observatory, which the community is building toward together,” said cosmologist Alexandra Amon, a Senior Kavli Fellow at the University of Cambridge and a senior research fellow at Trinity College, who was not involved in this study. “Their deep survey yields beautiful data. To me, it is interesting that the HSC, like other independent weak-lensing surveys, prefers a low value of S8. That is an important validation, and it’s intriguing that these tensions and trends force us to pause and think about what the data are telling us about our universe!”

The standard cosmological model

The standard cosmological model is “very simple” in several respects, explains Andrina Nicola of the University of Bonn, who advised Dalal on the project while she was a postdoctoral fellow at Princeton. The model posits that the universe is composed of only four basic constituents: ordinary matter (atoms, mostly hydrogen and helium), dark matter, dark energy, and photons.

According to the standard model, the universe has been expanding since the Big Bang 13.8 billion years ago: it began nearly perfectly smooth, but the pull of gravity on subtle fluctuations has caused structures — galaxies enveloped in clumps of dark matter — to form. In the current universe, the relative contributions of ordinary matter, dark matter, and dark energy are about 5%, 25%, and 70%, respectively, plus a small contribution from photons.

The standard model is determined by only a handful of numbers: the expansion rate of the universe; a measure of how clumpy dark matter is (S8); the relative contributions of the universe’s constituents (the 5%, 25%, and 70% above); the overall density of the universe; and a technical quantity describing how the clumpiness of the universe on large scales relates to that on small scales.
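That handful of numbers might be collected, very roughly, as follows. The values below are textbook ballpark figures for illustration only, not the HSC best fit:

```python
# Illustrative, approximate values for the parameters described above.
lcdm = {
    "H0": 70.0,        # expansion rate today, in km/s/Mpc
    "Omega_b": 0.05,   # ordinary matter (atoms)
    "Omega_dm": 0.25,  # dark matter
    "Omega_de": 0.70,  # dark energy
    "S8": 0.8,         # amplitude of matter clumpiness
    "n_s": 0.96,       # tilt relating large- and small-scale clumpiness
}

# The density fractions should sum to (essentially) 1:
total = lcdm["Omega_b"] + lcdm["Omega_dm"] + lcdm["Omega_de"]
print(round(total, 2))  # 1.0
```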

“And that’s basically it!” said Strauss. “We, the cosmological community, have converged on this model, which has been in place since the early 2000s.”

Cosmologists are keen to test this model by constraining these numbers in various ways, such as by observing the fluctuations in the Cosmic Microwave Background (essentially a picture of the infant universe, capturing its appearance after its first 400,000 years), modeling the expansion history of the universe, measuring the clumpiness of the universe in the relatively recent past, and so on.

“We’re confirming a growing sense in the community that there is a real discrepancy between measurements of clumpiness in the early universe (measured from the CMB) and those from the era of galaxies, ‘only’ 9 billion years ago,” said Arun Kannawadi, an associate research scholar at Princeton who was involved in the analysis.

Five lines of attack

Dalal’s work was what is called a Fourier-space analysis; a parallel real-space analysis was led by Xiangchong Li of Carnegie Mellon University, who worked closely with Rachel Mandelbaum, who completed her A.B. in physics in 2000 and her Ph.D. in 2006, both at Princeton. A third analysis, called the 3×2-point analysis, takes a different approach, measuring the gravitational lensing signal around individual galaxies in order to calibrate the amount of dark matter associated with each galaxy. That analysis was led by Sunao Sugiyama of the University of Tokyo, Hironao Miyatake (a former Princeton postdoctoral fellow) of Nagoya University, and Surhud More of the Inter-University Centre for Astronomy and Astrophysics in Pune, India.

These five sets of analyses each used HSC data to arrive at the same conclusion about S8.

Doing both a real-space analysis and a Fourier-space analysis “is kind of a sanity check,” says Dalal. He and Li coordinated their analyses closely, using blinded data. Any discrepancy between the two would suggest that the researchers’ methodology was wrong. “It would tell us less about astrophysics and more about how we might have screwed up,” Dalal said.

“We didn’t know until we unblinded that the two results were identical,” he said. “It felt magical.”

Sunao added: “Our 3×2-point analysis combines the weak-lensing analysis with galaxy clustering. Only after unblinding did we learn that our results closely matched Roohi’s and Xiangchong’s. The fact that all of these analyses give the same answer gives us confidence that we’re doing something right!”

The research will be presented at “Future Science with CMB x LSS,” a conference taking place April 10-14 at the Yukawa Institute for Theoretical Physics, Kyoto University. This research was supported by the National Science Foundation Graduate Research Fellowship Program (DGE-2039656); the National Astronomical Observatory of Japan; the Kavli Institute for the Physics and Mathematics of the Universe; the University of Tokyo; the High Energy Accelerator Research Organization (KEK); the Academia Sinica Institute of Astronomy and Astrophysics in Taiwan; Princeton University; the FIRST program from the Cabinet Office of Japan; the Ministry of Education, Culture, Sports, Science and Technology (MEXT); the Japan Society for the Promotion of Science; the Japan Science and Technology Agency; the Toray Science Foundation; and the Vera C. Rubin Observatory.
