
Einstein’s predictions for gravity have been tested on the largest possible scale

The universe is governed by four fundamental forces: electromagnetism, the weak nuclear force, the strong nuclear force, and gravity. While the first three are described by quantum mechanics (via the Standard Model of particle physics), gravity is described by Einstein’s general theory of relativity. Surprisingly, gravity poses the biggest challenge for physicists: while the theory accurately describes how gravity works for planets, stars, galaxies, and clusters, it doesn’t apply perfectly at all scales.

While general relativity has been repeatedly validated over the past century (beginning with the Eddington Eclipse Experiment in 1919), gaps still appear as scientists attempt to apply it at the quantum level and to the universe as a whole. According to a new study led by Simon Fraser University, an international team of researchers has tested general relativity at the largest scale and concluded that a tweak or two may be needed. This method could help scientists solve some of the biggest mysteries facing astrophysicists and cosmologists today.

The team included researchers from Simon Fraser, the Institute of Cosmology and Gravitation at the University of Portsmouth, the Center for Particle Cosmology at the University of Pennsylvania, the Osservatorio Astronomico di Roma, the UAM-CSIC Institute of Theoretical Physics, the Lorentz Institute at Leiden University, and the Chinese Academy of Sciences (CAS). Their findings appeared in an article titled “Imprints of cosmological tensions in reconstructed gravity,” recently published in Nature Astronomy.


Albert Einstein delivers the 11th Josiah Willard Gibbs Lecture at the Carnegie Institute of Technology on December 28, 1934. Credit: AP Photo

According to Einstein’s field equations for GR, the universe could not be static: it had to be expanding (otherwise gravity would cause it to contract). Einstein initially resisted this idea and added a term to his equations (the “cosmological constant”) to keep the universe in balance, but Edwin Hubble’s observations in the 1920s showed that the universe was indeed expanding. Quantum theory also predicts that the vacuum of space is filled with energy that goes unnoticed because conventional methods can only measure changes in energy (not its total amount).

In the 1990s, new observatories like the Hubble Space Telescope (HST) pushed the boundaries of astronomy and cosmology. Thanks to surveys like the Hubble Deep Fields (HDF), astronomers were able to see objects as they appeared over 13 billion years ago (less than a billion years after the Big Bang). To their surprise, they found that the rate of expansion has been accelerating for the past four billion years or so. This led to what is known as the “old cosmological constant problem”: either gravity is weaker on cosmological scales, or some mysterious force is driving cosmic expansion.

Lead author Levon Pogosian (Professor of Physics, Simon Fraser University) and co-author Kazuya Koyama (Professor of Cosmology, University of Portsmouth) summarized the topic in a recent article in The Conversation. As they explained, the problem of the cosmological constant boils down to a single question with drastic implications:

“[W]hether the vacuum energy actually gravitates – exerting a gravitational force and changing the expansion of the universe. If so, why is its gravity so much weaker than predicted? If the vacuum doesn’t gravitate at all, then what is causing the cosmic acceleration? We don’t know what dark energy is, but we have to assume it exists to explain the expansion of the universe. Similarly, to explain how galaxies and clusters evolved in the way we observe them today, we must also assume that there is a type of invisible matter called dark matter.”

The cosmological LCDM model combines dark energy (Λ) with Cold Dark Matter (CDM). Source: Alex Mittelmann/Wikimedia

The existence of dark energy is part of the standard cosmological model known as the Lambda Cold Dark Matter (LCDM) model – where lambda represents the cosmological constant/dark energy. According to this model, the mass-energy density of the universe consists of 70% dark energy, 25% dark matter, and 5% normal (visible or “luminous”) matter. While this model has successfully reconciled observations collected by cosmologists over the past 20 years, it assumes that most of the universe is made up of components we cannot directly detect.

Therefore, some physicists have ventured that GR might need some modifications to explain the universe as a whole. In addition, a few years ago, astronomers noticed that measuring the rate of cosmic expansion in different ways gave different values. This discrepancy, Pogosian and Koyama explained, is known as the Hubble tension:

“The disagreement, or tension, exists between two values of the Hubble constant. One is the number predicted by the LCDM cosmological model, which was developed to match the light left over from the Big Bang (the cosmic microwave background radiation). The other is the expansion rate, which is measured by observing exploding stars, known as supernovae, in distant galaxies.”
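To make the tension concrete: the Planck CMB analysis under LCDM gives a Hubble constant of about 67.4 ± 0.5 km/s/Mpc, while local supernova-based measurements give about 73.0 ± 1.0 km/s/Mpc. A quick back-of-the-envelope check of how significant that disagreement is (the rounded values here are illustrative, not taken from the paper):

```python
import math

# Illustrative Hubble tension estimate (rounded values, not from the study).
h0_cmb, sig_cmb = 67.4, 0.5  # km/s/Mpc, predicted from CMB data + LCDM
h0_sn, sig_sn = 73.0, 1.0    # km/s/Mpc, measured from local supernovae

# Significance of the disagreement, assuming independent Gaussian errors
tension_sigma = abs(h0_sn - h0_cmb) / math.hypot(sig_cmb, sig_sn)
print(f"Tension: {tension_sigma:.1f} sigma")  # prints "Tension: 5.0 sigma"
```

A roughly 5-sigma disagreement is far too large to dismiss as a statistical fluke, which is why the tension has become a central puzzle in cosmology.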

Many theoretical ideas have been proposed to modify the LCDM model to explain the Hubble tension. Among them are alternative theories of gravity such as Modified Newtonian Dynamics (MOND), a modified version of Newton’s law of universal gravitation that eliminates the need for dark matter. For over a century, astronomers have been testing GR by observing how the curvature of spacetime changes in the presence of gravitational fields. These tests have become particularly extreme in recent decades, including observations of how supermassive black holes (SMBHs) affect orbiting stars, or how gravitational lensing amplifies and bends the paths of light.

An illustration of cosmic expansion. Image credit: NASA’s GSFC Conceptual Image Lab

For their study, Pogosian and his colleagues used Bayesian inference, a statistical method for updating the probability of a hypothesis as more data is introduced. From there, the team modeled the cosmic expansion based on three inputs: CMB data from ESA’s Planck satellite, supernova and galaxy catalogs such as the Sloan Digital Sky Survey (SDSS) and Dark Energy Survey (DES), and the predictions of the LCDM model.
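The core idea of Bayesian inference is to multiply a prior probability by the likelihood of the observed data and renormalize. A minimal grid-based sketch for a single Hubble-like parameter, using made-up mock measurements (the team’s actual pipeline combines full CMB, supernova, and galaxy-survey likelihoods and is far more involved):

```python
import math

# Toy Bayesian inference on a grid: infer a Hubble-like parameter H0
# from mock measurements. Purely illustrative, not the study's pipeline.
def posterior(grid, data, sigma, prior=None):
    """Return the normalized posterior P(H0 | data) over `grid`."""
    if prior is None:
        prior = [1.0 / len(grid)] * len(grid)  # flat prior
    post = []
    for h, p in zip(grid, prior):
        # Gaussian log-likelihood of each measurement given H0 = h
        loglike = sum(-0.5 * ((d - h) / sigma) ** 2 for d in data)
        post.append(p * math.exp(loglike))
    norm = sum(post)
    return [x / norm for x in post]

grid = [60 + 0.1 * i for i in range(200)]  # H0 candidates, 60-80 km/s/Mpc
data = [72.1, 73.5, 72.8, 74.0, 73.2]      # mock local measurements
post = posterior(grid, data, sigma=1.5)
best = grid[post.index(max(post))]         # posterior mode, near 73 here
```

As more measurements are added to `data`, the posterior narrows around the best-supported value; with conflicting datasets (as in the Hubble tension), no single value fits everything well, which is the signature the team probed.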

“Together with a team of cosmologists, we put the basic laws of general relativity to the test,” said Pogosian and Koyama. “We also investigated whether a modification of Einstein’s theory could help solve some of the open problems in cosmology, such as the Hubble tension. To determine if GR is correct at the largest scale, we set out, for the first time, to examine three aspects of it simultaneously. These were the expansion of the universe, the effects of gravity on light, and the effects of gravity on matter.”

Their results showed some inconsistencies with Einstein’s predictions, though with rather low statistical significance. They also found that the Hubble tension is difficult to resolve simply by modifying the theory of gravity, suggesting that an additional force might be required or that there are errors in the data. If the former is true, Pogosian and Koyama said, this force may have been present in the early universe (about 370,000 years after the Big Bang), when protons and electrons first combined to form hydrogen.

Several possibilities have been proposed in recent years, ranging from a special form of dark matter to an early form of dark energy or primordial magnetic fields. In any case, this latest study indicates that further research is needed and could ultimately lead to a revision of the most widely accepted cosmological model. Said Pogosian and Koyama:

“[O]ur study has shown that it is possible to test the validity of general relativity over cosmological distances using observational data. Although we haven’t solved the Hubble tension yet, in a few years we will have a lot more data from new probes. This means that we can use these statistical methods to further refine general relativity, explore the limits of modifications, and pave the way to solving some open challenges in cosmology.”

Further reading: The Conversation, Nature Astronomy

