The Biggest Problem of Them All

From Physics World, something unexpected. The “problem” is, er, the Universe. Problems don’t come much bigger! To put this in perspective, the major theory used to describe it is Einstein’s General Relativity. There was just one problem with that theory: everything should collapse into a singular “point”, and it doesn’t. To get around that problem, Einstein introduced something he called “the Cosmological Constant”, which was effectively pulled out of thin air in an ad hoc way to accommodate the obvious fact that the Universe remained large. Worse, this static universe could not be extrapolated to the infinite past; somewhere along the line it had to be different from now. It was then that Georges Lemaître proposed that space was expanding, to which Einstein replied that his mathematics was correct, but his physics abominable. However, Einstein had to backtrack because Hubble showed it was expanding. (Actually, Lemaître had provided evidence, but Hubble’s was better.) The idea was that all matter started from a point, and space expanded to let the energy condense into matter. Fred Hoyle jokingly referred to this as “The Big Bang”.

All was well. Space was expanding uniformly, so on a very large scale the galaxies were moving away from each other uniformly, with localized variation in between. Why space was expanding was left unanswered. Thus on a very large scale the gravitational interaction between distant galaxies was diminishing, which appears to violate the law of conservation of energy. (Energy does not have to be conserved, though; energy is a tricky topic in general relativity.) Hoyle was keen on a steady-state universe, and while he accepted the Universe was expanding, he proposed that if everything was moving apart, the loss of gravitational energy was made up for by the creation of new matter. This was not generally accepted, and the question of the energy imbalance remained.

There is worse. To maintain a constant expansion rate, general relativity requires a constant energy density, and how can that arise when space is expanding? The only way would seem to be that this energy density is a property of the vacuum. The vacuum is, therefore, not nothing. There is nothing like an opportunity in physics to get speculation going, and “not nothing” is such an opportunity. Quantum field theory holds that the vacuum is full of extraordinarily tiny harmonic oscillators, which convey a zero-point energy to space. We have our energy accounted for. All was good, until it wasn’t. The obvious next step was to take the quantum field theory prediction and see if it fitted observations of galaxies. It did not. In fact the error was a factor of ten multiplied by itself 120 times (Adler, R. J., Casey, B., Jacob, O. C. 1995. Vacuum catastrophe: an elementary exposition of the cosmological constant problem. Am. J. Phys. 63: 620–626). This was the most horrendous disagreement between calculation and observation ever.
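To see where a number like that comes from, here is a rough back-of-the-envelope sketch. The assumptions are mine: the quantum field theory estimate is cut off at the Planck scale (the usual naive choice), and the observed value is taken as the measured dark-energy density of roughly 6 × 10⁻¹⁰ J/m³. Depending on the cutoff chosen, the mismatch lands anywhere from about 120 to 123 orders of magnitude, which is why the famous figure is usually quoted as "10^120".

```python
import math

# Back-of-the-envelope sketch of the "vacuum catastrophe" mismatch.
# Assumption: the QFT zero-point energy is cut off at the Planck scale,
# giving an energy density of order c^7 / (hbar * G^2).
hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2

rho_qft = c**7 / (hbar * G**2)   # naive QFT estimate, J/m^3
rho_obs = 6e-10                  # observed dark-energy density, J/m^3 (approx.)

orders = math.log10(rho_qft / rho_obs)
print(f"QFT estimate : {rho_qft:.2e} J/m^3")
print(f"Observed     : {rho_obs:.1e} J/m^3")
print(f"Mismatch     : about 10^{orders:.0f}")
```

Running this gives a mismatch of roughly 10¹²³, comfortably in the neighbourhood of the "120 times" figure cited above.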

Then came a shock: not only were the galaxies moving apart, but the motion was accelerating. The cause was labelled “dark energy”. It should be noted that at this point dark energy is merely a placeholder term, a way of acknowledging that something must be causing the acceleration. So, what is it? The good news is that whatever it is can also account for the general expansion. All we need is something that gets bigger as the universe expands.

Now we have a proposition, called “cosmological coupling”. The concept starts with the observation that the masses of black holes at the heart of distant galaxies have been growing about ten times faster than simply accreting mass or merging with other black holes would allow. The coupling means the growth of the black holes matches the accelerating expansion of the universe. The idea seems to be that the singularity of black holes is replaced by additional “vacuum energy”. The coupling means that if the volume of the universe doubles, so does the mass of each black hole, while the number of black holes remains constant. The logic is that “something” must give rise to the expansion of the universe, and since no other object exhibits similar behaviour, black holes must be that “something”. Is that valid? There is a problem for me with that explanation, apart from the fact that correlation does not mean causation. The evidence is that the Universe underwent massive expansion initially, then calmed down, and then the expansion accelerated again. During the initial expansion there would be few if any black holes. Then, suddenly, there was massive growth in them, but that happened when the universe’s expansion was at its slowest; by the time the final acceleration started, the black holes had settled down to minimal growth. The required correlation for the hypothesis seems to be, if anything, an anti-correlation.
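The claimed scaling can be sketched in a few lines. The hypothesis is that a coupled black hole’s mass grows with the cosmic scale factor a as M ∝ a^k; with k = 3, mass tracks volume (which goes as a³), so doubling the volume doubles each black hole’s mass. The function name and the solar-mass units are my own illustrative choices.

```python
# Sketch of the "cosmological coupling" scaling: M ~ a^k.
# With k = 3, black-hole mass tracks the volume of the universe (V ~ a^3):
# double the volume and each black hole's mass doubles, while the
# number of black holes stays fixed.

def coupled_mass(m0: float, a0: float, a: float, k: float = 3.0) -> float:
    """Mass of a cosmologically coupled black hole after the scale
    factor grows from a0 to a (masses in whatever units m0 uses)."""
    return m0 * (a / a0) ** k

a0 = 1.0
a = 2.0 ** (1.0 / 3.0)          # scale-factor growth that doubles the volume
m = coupled_mass(1.0, a0, a)    # start with one solar mass
print(f"volume grew by {(a / a0) ** 3:.1f}x, mass grew by {m:.1f}x")
```

The point of the sketch is only the bookkeeping: with k = 3 the volume factor and the mass factor come out identical, which is exactly the "volume doubles, mass doubles" claim.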


Warp Drives

“Warp drives” originated in the science fiction show “Star Trek” in the 1960s, but in 1994 the Mexican physicist Miguel Alcubierre published a paper arguing that under certain conditions exceeding light speed was not forbidden by Einstein’s General Relativity. Alcubierre reached his solution by assuming it was possible, then working backwards to see what was required, while rejecting the awkward points that arose. The concept is that the ship sits in a bubble, and spacetime in front of the ship is contracted while that behind it is expanded. In terms of geometry, that means the distance to your destination has got smaller while the distance from where you started has got longer, i.e. you have moved relative to both the starting point and the destination. One of the oddities of being in such a bubble is that you would not sense you were moving. There would be no accelerating forces because technically you are not moving; it is the space around you that is moving. Captain Kirk on the Enterprise is not squashed to a film by the acceleration! Since then there have been a number of proposals. General relativity is a gold mine for academics wanting to publish papers because it is so difficult mathematically.

There is one small drawback to these proposals: you need negative energy. Now we run into definitions. Before you point out that the gravitational field has negative energy: yes, but it is generated by positive mass, and it contracts the distance between you and the target, i.e. you fall towards it. If you like, that can be the front of your drive. The real problem is at the other end – you need a repulsive field that sends you further from where you started, and if you think gravitationally, that is the opposite field, presumably generated by negative mass.

One objection often heard to negative energy is that if quantum field theory were correct, the vacuum would collapse to negative energy, which would lead to the Universe collapsing in on itself. My view is: not necessarily. The negative potential energy of the gravitational field causes mass to collapse onto itself, and while we do get black holes in accord with this, the Universe is actually expanding. Since quantum field theory assumes a vacuum energy density, calculations of the relativistic gravitational field arising from it are in error by ten multiplied by itself 120 times, so just maybe it is not a good guideline here. It predicts the Universe should long since have collapsed, but here we are.

The only repulsive stuff we think might be there is dark energy, but we have no idea how to lay hands on it, let alone package it, or even whether it exists. However, all may not be lost. I recently saw an article in Physics World which stated that a physicist, Erik Lentz, had claimed there was no need for negative energy. The concept is that energy could be capable of arranging the structure of space-time as a soliton. (A soliton is a wave packet that travels more like a bubble; it does not disperse or spread out, but otherwise behaves like a wave.) There is a minor problem. You may have heard that the biggest problem with rockets is the mass of fuel they have to carry before you get started. Well, don’t book a space flight yet. As Lentz has calculated it, a 100 m radius spacecraft would require the energy equivalent of hundreds of times the mass of Jupiter.
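To get a feel for that scale, here is a quick conversion of "hundreds of Jupiter masses" into energy via E = mc². The count of 250 and the figure for annual world energy consumption are my own illustrative assumptions, not Lentz's numbers.

```python
# Rough scale of the warp-drive energy requirement: the rest-mass
# energy of "hundreds of Jupiters", via E = m c^2.
c = 2.99792458e8          # speed of light, m/s
m_jupiter = 1.898e27      # mass of Jupiter, kg
n_jupiters = 250          # illustrative stand-in for "hundreds"

energy = n_jupiters * m_jupiter * c**2    # joules
world_annual = 6e20                       # J, rough current world annual energy use

print(f"Required energy : {energy:.2e} J")
print(f"That is roughly {energy / world_annual:.1e} years of world energy consumption")
```

The answer comes out around 4 × 10⁴⁶ J, some 10²⁵ years of humanity’s current energy output, which is why "don’t book a space flight yet" is putting it mildly.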

There will be other problems. It is one thing to have opposite energy densities on different sides of your bubble; you still have to convert those into motion in exactly the direction you wish. If you cannot steer as you go, or worse, you do not even know exactly where you and the target are, is there a point? Finally, in my science fiction novels I have steered away from warp drives. The only times my characters travelled interstellar distances I limited myself to a little under light speed. Some say that lacks imagination, but stop and think. You set out to do something, but where you are going will have aged 300 years before you get there. Come back, and your one-time associates have been dead for 600 years. That raises some very awkward problems that make a story different from the usual “space westerns”.

The Universe is Shrinking

Dark energy is one of the mysteries of modern science. It is supposed to amount to about 68% of the Universe, yet we have no idea what it is. Its discovery led to Nobel prizes, yet it is now considered possible that it does not even exist. To add or subtract 68% of the Universe seems a little excessive.

One of the early papers (Perlmutter et al. 1999. Astrophys. J. 517: 565–586) supported the concept. The assumption was that Type Ia supernovae always give out the same light, so by measuring the intensity of that light and comparing it with the red shift, which indicates how fast the source is receding, they could assess whether the rate of expansion of the universe had been constant over time. The standard theory at the time was that it had, expanding at a rate given by the Hubble constant (named after Edwin Hubble, who first proposed it). They examined 42 Type Ia supernovae with red shifts between 0.18 and 0.83, and compared their results on a graph with the line drawn using the Hubble constant, which is what you expect with zero acceleration, i.e. uniform expansion. Their results at a distance were uniformly above the line, and while there were significant error bars, because instruments were being operated at their extremes, the result looked unambiguous. The far distant supernovae were going away faster than expected from the nearer ones, and that could only arise if the rate of expansion were accelerating.

For me, there was one fly in the ointment, so to speak. The value of the Hubble constant they used was 63 km/s/Mpc. The modern value is more like 68 or 72; there are two values, depending on how you measure them, but both are somewhat larger than 63. It follows that if you have the speed wrong when you predict how far away a supernova should be, then the further away it is, the bigger the error, which makes it look as though the expansion has speeded up.
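The arithmetic behind that worry can be sketched with the naive linear Hubble law, d = cz/H₀ (only an approximation at these redshifts, but enough to show the shape of the effect). A low H₀ inflates every inferred distance by the same fraction, so the absolute error grows with distance; the function name is mine.

```python
# How an underestimated Hubble constant mimics acceleration:
# with d = c z / H0, a low H0 inflates every distance by the same
# fraction, so the absolute excess grows with redshift.
c_kms = 2.99792458e5   # speed of light, km/s

def hubble_distance_mpc(z: float, h0: float) -> float:
    """Naive linear-Hubble-law distance in Mpc (rough at large z)."""
    return c_kms * z / h0

for z in (0.18, 0.5, 0.83):   # spanning the redshift range of the 1999 sample
    d_low = hubble_distance_mpc(z, 63.0)   # the H0 value used in the early paper
    d_mod = hubble_distance_mpc(z, 70.0)   # a modern-ish value
    print(f"z={z:.2f}: d(H0=63)={d_low:6.0f} Mpc, d(H0=70)={d_mod:6.0f} Mpc, "
          f"excess={d_low - d_mod:5.0f} Mpc")
```

The excess at z = 0.83 is several times larger than at z = 0.18, which is exactly the pattern that, read naively, looks like the far supernovae have "speeded up".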

Over the last few years there have been questions as to exactly how accurate this determination of acceleration really is. There is a suggestion (arXiv:1912.04903) that the luminosity of these supernovae has evolved as the Universe ages, which would mean that measuring distance this way leads to overestimating it. Separate work (Milne et al. 2015. Astrophys. J. 803: 20) showed that there are at least two classes of Type Ia supernovae, blue and red, with different ejecta velocities, and if the usual techniques are used the light intensity of the red ones will be underestimated, which makes them seem further away than they are.

My personal view is there could be a further problem. A Type Ia supernova occurs when a white dwarf comes close to another star and begins stripping it of its mass until the dwarf gets big enough to ignite the explosion. That is why these supernovae are believed to have the same brightness: they ignite at the same mass, so the same conditions apply, so the brightness should be the same. However, this is not necessarily the case, because the outer layer, which generates the light we see, comes from the non-exploding star, and will absorb and re-emit energy from the explosion. Hydrogen and helium are poor radiators, but they will absorb energy. The brightest light might be expected to come from the heavier elements, and the amount of these increases as the Universe ages and atoms are recycled. That too might make the more distant supernovae look further away than expected, which in turn suggests the Universe is accelerating its expansion when it isn’t.

Now, to throw a further spanner into the works, Subir Sarkar has added his voice. He is unusual in that he is both an experimentalist and a theoretician, and he has noted that the Type Ia supernovae, while taken to be “standard candles”, do not all emit the same amount of light: according to Sarkar, they vary by up to a factor of ten. Further, the fundamental data was previously unavailable, but in 2015 it became public. He did a statistical analysis and found that the data supported a cosmic acceleration, but only with a statistical significance of three standard deviations, which, according to him, “is not worth getting out of bed for”.

There is a further problem. Apparently the Milky Way is heading off in some direction at 600 km/s, and this rather peculiar flow extends out to about a billion light years; unfortunately, most of the supernovae studied so far are in this region. This drops the statistical significance for cosmic acceleration to two standard deviations. Sarkar then accuses the previous supporters of cosmic acceleration of confirmation bias: the initial workers chose an unfortunate direction to examine, and the subsequent ones “looked under the same lamppost”.
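For readers unused to "standard deviations" as a currency, here is what those significance levels mean as chance probabilities, using the standard two-sided Gaussian p-value, p = erfc(n/√2). (The physics community conventionally demands five standard deviations before claiming a discovery.)

```python
import math

# Two-sided p-value of a Gaussian fluctuation at n standard deviations:
# the probability of seeing a result at least that extreme by pure chance.
def p_value(n_sigma: float) -> float:
    return math.erfc(n_sigma / math.sqrt(2.0))

for n in (2.0, 3.0, 5.0):
    print(f"{n:.0f} sigma -> p = {p_value(n):.2e}")
```

Three sigma corresponds to odds of roughly 1 in 370, and two sigma to about 1 in 22; against the 5-sigma discovery convention (about 1 in 1.7 million), one can see why Sarkar is unimpressed.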

So, a little under 70% of what some claim is out there might not be. That is ugly. Worse, about 27% is supposed to be dark matter. Suppose that does not exist either, and the only reason we think it is there is that our understanding of gravity is wrong on a large scale? The Universe then shrinks to about 5% of what it was. That must be something of a record for the size of a loss.