This is how astronomers will solve the expansion of the Universe dispute



Imagine you were a scientist trying to measure one of the Universe's properties. If you're curious about how anything works, you not only need to figure out what is happening, but by how much. This is the difficult part: you want not just a qualitative answer to the question of what is happening, but a quantitative one: "How much?"

In cosmology, one major challenge is measuring the expansion of the Universe. We've known since the 1920s that the Universe is expanding, but it has been a generations-long quest to determine exactly "by how much?" Today, a multitude of different techniques are in use by a number of different groups. The answers they get fall into one of two camps, and the two are incompatible with each other. Here's how we'll resolve this dispute.
For generations, astronomers, astrophysicists, and cosmologists have tried to refine our measurements of the expansion rate of the Universe: the Hubble constant. It's why we designed and built the Hubble Space Telescope. Its key project was to make this very measurement, and it succeeded spectacularly: it found a rate of 72 km/s/Mpc, with an uncertainty of only about 10%. This result, published in 2001, resolved a dispute as old as Hubble's law itself.

But in 2019, a new controversy has emerged. One camp, using relic signals from the early stages of the Big Bang, obtains values of ~67 km/s/Mpc, with a claimed uncertainty of 1-2%. A second camp, using measurements from the relatively nearby Universe, obtains ~73 km/s/Mpc, with an uncertainty of only 2-3%. Those error bars are so small that they no longer overlap. Something is wrong somewhere, and we cannot figure out where.
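The size of that mismatch can be made concrete with a little arithmetic. The sketch below uses the central values and percent-level uncertainties quoted above; treating the two errors as independent Gaussians is an illustrative assumption for this calculation, not part of either team's actual analysis.

```python
import math

# Central values and rough fractional uncertainties quoted in the text
h0_early, err_early = 67.0, 67.0 * 0.015   # "early relics" camp, ~1.5% error
h0_late,  err_late  = 73.0, 73.0 * 0.025   # distance-ladder camp, ~2.5% error

# Tension in standard deviations, assuming independent Gaussian errors
tension = abs(h0_late - h0_early) / math.hypot(err_early, err_late)
print(f"tension = {tension:.1f} sigma")
```

With these inputs, the two results disagree at roughly the 3-sigma level, which is why the non-overlapping error bars are taken so seriously.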


The Universe was smaller, hotter, and denser in the past. For light to arrive at our eyes, it has to travel through the expanding Universe from wherever in space it was emitted. Ideally, when we measure that arriving light, we can assign a distance to the signals we detect, and infer how the Universe must have expanded while the signal was en route to us.

The two classes of methods we use, however, give incompatible results. There are three possibilities:


  1. The "early relics" group is wrong. There is a fundamental error in their approach to the problem, biasing their results toward improperly low values.
  2. The "distance ladder" group is wrong. There is a systematic error somewhere in their approach, biasing their results toward spuriously high values.
  3. Both groups are correct, and some sort of new physics is responsible for the two groups obtaining different results.
Of course, everyone thinks they're right and the other teams are wrong. But science doesn't advance by ridicule; it advances by finding the crucial, conclusive evidence that tips the scales. Here's how astronomers are going to resolve the biggest dispute in cosmology and learn how the Universe is actually expanding.

1.) Is the early relics group mistaken? 


Before we had the Planck satellite, we had COBE and WMAP. While Planck has given us a map of the leftover glow from the Big Bang down to angular scales of 0.07°, COBE could only get down to about 7°, and WMAP, though much better, only took us down to about 0.5°. In that earlier data there was a degeneracy between three different parameters: the matter density, the expansion rate, and the scalar spectral index. In the WMAP era, the data were genuinely consistent with a large range of values, although ~71 km/s/Mpc was favored.
It wasn't until Planck took us down to those small angular scales that the degeneracy was broken, and we found that the expansion rate had to be low. The reason is that those small angular scales encode information about the scalar spectral index (n_s, in the diagram below), which rules out large values of the expansion rate (and, correspondingly, small values for the matter density). From this, we learn that the expansion rate must be close to 67 km/s/Mpc, with a very small uncertainty.

However, it's possible there is something wrong or biased about our analysis of those small angular scales. That would affect not only Planck but the other, independent CMB experiments as well. And even if you threw out the CMB entirely, you would still find that early relic methods yield a much lower expansion rate than the distance ladder's signal.

Although we don't consider this likely, and the independent early relic technique of baryon acoustic oscillations (the "inverse distance ladder") yields consistent results, it's important to note that a small error we haven't properly accounted for could shift our conclusions dramatically.

2.) Is the distance ladder group mistaken?




It's hard to say. There are many different techniques for measuring distances in the expanding Universe, but most of them share the same basic approach:

They start by directly (e.g., geometrically) measuring the distances to well-understood, easily seen objects in our own galaxy,
then we look at the same types of objects in other galaxies, so we can infer the distances to those galaxies from the known properties of those objects,
and some of those galaxies host bright astronomical events, which let us use those now-calibrated galaxies as calibration points for even greater distances.
Although, historically, there have been more than a dozen different distance indicators, the fastest and simplest route out to great cosmic distances now involves just three steps: parallax to Cepheid variable stars in our own galaxy; Cepheids in other galaxies, some of which host Type Ia supernovae; and then Type Ia supernovae throughout the Universe.
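The three rungs just described can be sketched quantitatively via distance moduli. The formulas below are real in form (parallax distance, a Cepheid period-luminosity relation, a standard-candle distance), but the coefficients and magnitudes are rough illustrative values chosen for this sketch, not any team's actual calibration.

```python
import math

def parallax_distance_pc(parallax_arcsec):
    # Rung 1: geometric parallax; distance in parsecs is 1/parallax
    return 1.0 / parallax_arcsec

def cepheid_distance_pc(period_days, apparent_mag):
    # Rung 2: a simplified Cepheid period-luminosity relation gives the
    # absolute magnitude; the distance modulus then gives the distance.
    # (Coefficients are rough illustrative values, not a real calibration.)
    abs_mag = -2.43 * (math.log10(period_days) - 1.0) - 4.05
    return 10 ** ((apparent_mag - abs_mag + 5.0) / 5.0)

def supernova_distance_pc(apparent_mag, abs_mag=-19.3):
    # Rung 3: Type Ia supernovae as standardizable candles, with an
    # assumed peak absolute magnitude of about -19.3
    return 10 ** ((apparent_mag - abs_mag + 5.0) / 5.0)

# A 10-day Cepheid seen at magnitude 10 sits a few kiloparsecs away;
# a supernova at magnitude 19.3 sits hundreds of megaparsecs away.
print(cepheid_distance_pc(10.0, 10.0))
print(supernova_distance_pc(19.3))
```

Each rung calibrates the next: parallaxes anchor the Cepheid relation, and Cepheid-calibrated galaxies anchor the supernova absolute magnitude.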

Using this method, we obtain an expansion rate of 73 km/s/Mpc, with an uncertainty of about 2-3%. This is clearly inconsistent with the early relics group's results. Understandably, many worry about possible sources of error, and the teams working on the distance ladder are much smaller than the teams working on the early relics method.

Even so, there are many reasons for the distance ladder teams to be confident in their results. Their errors are as well-quantified as one could hope for; beyond parallax, there are independent cross-checks on the Cepheid calibration; and the only plausible flaw is an "unknown unknown," the kind that can plague any sub-field of astronomy at any time. Still, there are plans to do even better. Here are several ways astronomers will check whether the cosmic distance ladder really is giving a reliable measure of the Universe's expansion rate.

Can we develop a pipeline for the distance ladder like the one the early relics teams use? Right now, there are many software packages that can either take a set of cosmological parameters and give you the expected cosmic microwave background, or take the observed cosmic microwave background and give you the cosmological parameters.

You can see how changes in your data shift parameters such as the matter density or dark energy's equation of state, along with their error bars.

The distance ladder teams are being urged to develop a similar pipeline; none exists yet. When one does, we should be able to pin down their systematics far more accurately than we can today. We'll be able to see how sensitive both the value of the expansion rate and its uncertainty are to which data points or data sets are included or excluded. (Notably, a 2016 supernova analysis considered more than 100 models, and the differences between them could not account for the full discrepancy.)
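That include/exclude sensitivity test is, in spirit, a jackknife analysis. Here is a minimal sketch; the measurement values below are made up for illustration, not any team's real data.

```python
import statistics

# Hypothetical H0 estimates (km/s/Mpc) from different data subsets
h0_measurements = [72.1, 73.5, 74.0, 72.8, 73.2, 71.9]

full_mean = statistics.mean(h0_measurements)

# Leave out each point in turn and see how much the mean shifts;
# a large shift flags a data point the result is overly sensitive to.
for i in range(len(h0_measurements)):
    subset = h0_measurements[:i] + h0_measurements[i + 1:]
    shift = statistics.mean(subset) - full_mean
    print(f"excluding point {i}: mean shifts by {shift:+.2f} km/s/Mpc")
```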
One possible error source is that there are two classes of Type Ia supernovae: those from accreting white dwarfs and those from merging white dwarfs. Older stars exist everywhere, which means we should find merging white dwarfs everywhere. But only in regions where new stars are forming or have recently formed (known as HII regions) can we catch the accreting white dwarfs. The interesting thing is that Cepheid variable stars, which are also part of the distance ladder, are found only in regions that have recently formed new stars.

When we look in Cepheid-rich regions, we cannot disentangle which class of supernova we're seeing. But if we look somewhere with no young stars, we can be confident we're watching supernovae from merging white dwarfs. There are good reasons to believe this systematic is smaller than the overall discrepancy, but not everyone is convinced. Using a different intermediate distance indicator, such as evolved stars at the tip of the asymptotic giant branch found in the outskirts of galaxies, would eliminate this potential systematic error. There are currently about a dozen such measurements from various distance ladder teams, showing good agreement with the Cepheids, but more work is still needed.
Finally, there is the ultimate sanity check: measuring the expansion rate with a completely independent method, one that involves no distance ladder at all. If you can measure distance indicators at a variety of locations throughout the Universe, you can hope for an answer that settles the issue once and for all. However, any new method will at first be hampered by low statistics and by systematic errors of its own.

Still, there are two such methods scientists are pursuing right now. The first is standard sirens, the gravitational-wave signals you get from inspiraling and merging neutron stars, though these preferentially probe relatively recent cosmic times. (We've only seen one so far, of course, but LIGO/Virgo expects many more in the coming decades.) The second is the time delays between the multiple images of gravitationally lensed quasars. The first such data sets are coming in now, with four known lenses showing agreement with the distance ladder teams, but there is still a long way to go.

If these independent checks hold up (and some already appear to), it will mean we have to take the third, and most troubling, option seriously.

3.) Both groups are correct

It's possible that how we measure the expansion of the Universe is fundamentally important to the value we obtain. If we measure objects that are nearby, cosmically speaking, and look outward, we get results of approximately 73 km/s/Mpc. If we measure the expansion rate on the largest cosmic scales, we get results of 67 km/s/Mpc. There are many exciting potential explanations for this, including:

  • Our local region of the Universe has unusual properties compared to the cosmic average (though this is already disfavored),
  • Dark energy is changing in an unexpected way over time,
  • Gravity behaves differently than expected on cosmic scales,
  • Or there is some new type of field or force permeating the Universe.
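To see concretely what the two discrepant numbers mean, recall Hubble's law, v = H0 × d: the same galaxy's inferred recession speed differs by nearly 10% depending on which camp's value you adopt. A minimal illustration (the 100 Mpc distance below is an arbitrary example):

```python
def recession_velocity_km_s(distance_mpc, h0_km_s_mpc):
    # Hubble's law: recession velocity is proportional to distance
    return h0_km_s_mpc * distance_mpc

d = 100.0  # an example galaxy 100 megaparsecs away
v_early = recession_velocity_km_s(d, 67.0)  # early relics value
v_late = recession_velocity_km_s(d, 73.0)   # distance ladder value
print(v_early, v_late)  # 6700.0 vs 7300.0 km/s
```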
But before we resort to such exotic scenarios, we have to make sure neither group has made a mistake. Despite many independent checks, even a small unaccounted-for bias could explain the entirety of the current dispute. Our understanding of the Universe we inhabit is at stake. The importance of performing every due diligence, to make sure we've gotten it right, cannot be overstated.

Astrophysicist and author Ethan Siegel is the founder and primary writer of Starts With A Bang! His books, Treknology and Beyond The Galaxy, are available wherever books are sold.

Reviewed by Know It All on February 01, 2019.
