Warp Drives

The “warp drive” originated in the 1960s science fiction show “Star Trek”, but in 1994 the Mexican physicist Miguel Alcubierre published a paper arguing that under certain conditions exceeding light speed was not forbidden by Einstein’s General Relativity. Alcubierre reached his solution by assuming faster-than-light travel was possible, then working backwards to see what was required, discarding the awkward points that arose along the way. The concept is that the ship sits in a bubble: spacetime in front of the ship is contracted, while that behind the ship is expanded. In terms of geometry, the distance to your destination has become smaller while the distance from where you started has become longer, i.e. you have moved relative to both the starting point and the destination. One of the oddities of being in such a bubble is that you would not sense you were moving. There would be no accelerating forces because technically you are not moving; it is the space around you that moves. Captain Kirk on the Enterprise is not squashed to a film by the acceleration! Since then there have been a number of further proposals. General relativity is a gold mine for academics wanting to publish papers because it is so difficult mathematically.

There is one small drawback to these proposals: you need negative energy. Now we run into definitions. Before you point out that the gravitational field has negative energy, note that it is generated by positive mass, and it contracts the distance between you and your target, i.e. you fall towards it. If you like, that can be at the front of your drive. The real problem is at the other end – you need a repulsive field that pushes you further from where you started, and if you think gravitationally, that is the opposite field, presumably generated from negative mass.

One objection often heard to negative energy is that if quantum field theory were correct, the vacuum would collapse to negative energy, which would lead to the Universe collapsing on itself. My view is: not necessarily. The negative potential energy of the gravitational field causes mass to collapse onto itself, and while we do get black holes in accord with this, the Universe as a whole is actually expanding. Since quantum field theory assumes a vacuum energy density, calculations of the relativistic gravitational field arising from it are in error by a factor of ten multiplied by itself 120 times, so just maybe it is not a good guideline here. It predicts the Universe should long since have collapsed, yet here we are.

The only repulsive stuff we think might be out there is dark energy, but we have no idea how to lay hands on it, let alone package it, or even whether it exists. However, all may not be lost. I recently saw an article in Physics World stating that a physicist, Erik Lentz, had claimed there is no need for negative energy. The concept is that ordinary positive energy could arrange the structure of space-time as a soliton. (A soliton is a wave packet that travels more like a bubble: it does not disperse or spread out, but otherwise behaves like a wave.) There is a minor problem. You may have heard that the biggest problem with rockets is the mass of fuel they have to carry before you get started. Well, don’t book a space flight yet. As Lentz has calculated it, a spacecraft with a 100 m radius bubble would require energy equivalent to hundreds of times the mass of Jupiter.
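To get a sense of scale, here is a back-of-the-envelope sketch of what “hundreds of Jupiter masses” of energy means. The Jupiter-mass figure is Lentz’s estimate as reported; taking 100 Jupiter masses as a conservative lower bound, and the comparison to world energy use, are my own illustrative assumptions.

```python
# Back-of-the-envelope: rest-mass energy of 100 Jupiter masses via E = m * c^2.
# Taking 100 as a conservative lower bound for "hundreds" is an assumption.
M_JUPITER_KG = 1.898e27      # mass of Jupiter, kg
C_M_PER_S = 2.998e8          # speed of light, m/s

def rest_mass_energy(mass_kg: float) -> float:
    """Energy equivalent of a given mass via E = m c^2, in joules."""
    return mass_kg * C_M_PER_S ** 2

energy_j = rest_mass_energy(100 * M_JUPITER_KG)
print(f"{energy_j:.2e} J")   # on the order of 1e46 joules

# For comparison, annual world energy consumption is very roughly 6e20 J,
# so this is about 25 orders of magnitude beyond it.
```

Even the lower bound of that range dwarfs anything civilisation has ever produced, which is rather the author’s point.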

There will be other problems. It is one thing to have opposite energy densities on different sides of your bubble; you still have to convert those into motion, and in exactly the direction you wish. If you cannot steer as you go, or worse, if you do not even know for sure exactly where you and the target are, is there a point? Finally, in my science fiction novels I have steered away from warp drives. The only times my characters went interstellar distances I limited them to a little under light speed. Some say that lacks imagination, but stop and think. You set out to do something, but where you are going will have aged 300 years before you get there. Come back, and your former associates have been dead for 600 years. That raises some very awkward problems, and they make a story rather different from the usual “space westerns”.

The Universe is Shrinking

Dark energy is one of the mysteries of modern science. It is supposed to amount to about 68% of the Universe, yet we have no idea what it is. Its discovery led to Nobel prizes, yet it is now considered possible that it does not even exist. To add or subtract 68% of the Universe seems a little excessive.

One of the early papers (Perlmutter et al., Astrophys. J. 517, 565–586) supported the concept. The assumption was that Type Ia supernovae always emit the same amount of light, so by measuring the intensity of that light, which indicates distance, and comparing it with the redshift of the light, which indicates how fast the source is receding, they could assess whether the rate of expansion of the Universe had been even over time. The standard theory at the time was that it had been, expanding at a rate given by the Hubble constant (named after Edwin Hubble, who first proposed it). They examined 42 Type Ia supernovae with redshifts between 0.18 and 0.83, and compared their results on a graph with the line drawn using the Hubble constant, which is what you expect with zero acceleration, i.e. uniform expansion. Their results at a distance lay uniformly above the line, and while there were significant error bars, because instruments were being operated at their extremes, the result looked unambiguous. The far distant supernovae were receding faster than expected from the nearer ones, and that could only arise if the rate of expansion were accelerating.
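The comparison described above can be sketched crudely: assume uniform expansion (Hubble’s law, v = H0 × d), convert a redshift into a predicted distance, and from that a predicted apparent brightness (the distance modulus) for a standard candle. A supernova fainter than this prediction appears further away than uniform expansion allows. This is only a minimal low-redshift approximation, not the full relativistic treatment the authors used; the function names and the example redshift are mine.

```python
import math

C_KM_S = 299792.458  # speed of light, km/s

def hubble_distance_mpc(z: float, h0: float) -> float:
    """Distance (Mpc) from redshift under uniform expansion: d = c * z / H0.
    A low-redshift approximation; the real analysis uses full cosmology."""
    return C_KM_S * z / h0

def distance_modulus(d_mpc: float) -> float:
    """Distance modulus mu = 5 * log10(d / 10 pc): how faint a standard
    candle of known intrinsic brightness should look at distance d."""
    d_pc = d_mpc * 1.0e6
    return 5.0 * math.log10(d_pc / 10.0)

# Expected faintness of a standard candle at z = 0.5, with H0 = 63 km/s/Mpc
# (the value used at the time). Observed supernovae fainter than this line
# would look further away than uniform expansion predicts.
mu = distance_modulus(hubble_distance_mpc(0.5, 63.0))
print(f"mu = {mu:.2f}")
```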

For me, there was one fly in the ointment, so to speak. The value of the Hubble constant they used was 63 km/s/Mpc. The modern value is more like 68 or 72; there are two values, depending on how you measure them, but both are somewhat larger than this. If you have the speed wrong when you predict how far the light has travelled, then the further away the source is, the bigger the error, which can make it look as if the expansion has sped up.
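This point can be illustrated numerically: if the true expansion rate were, say, 70 km/s/Mpc but predictions were drawn using 63, the shortfall between predicted and actual recession velocity grows linearly with distance, which could be misread as distant objects receding faster than expected. A minimal sketch, where the 70 km/s/Mpc “true” value is chosen purely for illustration:

```python
def recession_velocity(d_mpc: float, h0: float) -> float:
    """Hubble's law: recession velocity (km/s) at distance d (Mpc)."""
    return h0 * d_mpc

H0_USED, H0_TRUE = 63.0, 70.0  # km/s/Mpc; 70 is an illustrative "true" value

for d in (100.0, 500.0, 1000.0):
    gap = recession_velocity(d, H0_TRUE) - recession_velocity(d, H0_USED)
    print(f"d = {d:6.0f} Mpc: velocity shortfall = {gap:6.0f} km/s")
# The shortfall scales with distance (700, 3500, 7000 km/s here), so a constant
# error in H0 looks bigger the further out you measure.
```

A constant fractional error in H0 thus mimics exactly the pattern the distant supernovae showed: the gap above the uniform-expansion line widens with distance.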

Over the last few years there have been questions as to exactly how accurate this determination of acceleration really is. It has been suggested (arXiv:1912.04903) that the luminosity of these supernovae has evolved as the Universe ages, which has the effect that measuring distance this way leads to its overestimation. Different work (Milne et al., 2015, Astrophys. J. 803, 20) showed that there are at least two classes of Type Ia supernovae, blue and red, with different ejecta velocities, and if the usual techniques are used the light intensity of the red ones will be underestimated, which makes them seem further away than they are.

My personal view is there could be a further problem. A Type Ia supernova occurs when a white dwarf strips mass from a close companion star until it reaches a critical mass and ignites the explosion. That is why they are believed to have the same brightness: they ignite at the same mass, so the conditions are the same, so the brightness should be the same. However, this is not necessarily so, because the outer layer, which generates the light we see, comes from the non-exploding star, and will absorb and re-emit energy from the explosion. Hydrogen and helium are poor radiators, but they will absorb energy. The brightest light might therefore be expected to come from the heavier elements, and the amount of these increases as the Universe ages and atoms are recycled. That too might create the appearance that the more distant supernovae are further away than expected, which in turn suggests the Universe is accelerating its expansion when it isn’t.

Now, to throw a spanner further into the works, Subir Sarkar has added his voice. He is unusual in being both an experimentalist and a theoretician, and he has noted that Type Ia supernovae, while taken to be “standard candles”, do not all emit the same amount of light; according to Sarkar, they vary by up to a factor of ten. Further, the fundamental data were not previously available, but they have since been made public. He did a statistical analysis and found that the data supported a cosmic acceleration, but only with a statistical significance of three standard deviations, which, according to him, “is not worth getting out of bed for”.
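For readers unused to thinking in “standard deviations”, these convert to probabilities via a standard two-sided Gaussian calculation; nothing here is specific to Sarkar’s analysis.

```python
import math

def two_sided_p_value(sigma: float) -> float:
    """Chance of a deviation at least this large arising from noise alone,
    assuming Gaussian statistics: p = erfc(sigma / sqrt(2))."""
    return math.erfc(sigma / math.sqrt(2.0))

# Three sigma: roughly a 1-in-370 chance of being a fluke -- suggestive,
# but well short of the five-sigma standard particle physicists demand
# before claiming a discovery.
print(f"3 sigma: p = {two_sided_p_value(3.0):.4f}")
print(f"2 sigma: p = {two_sided_p_value(2.0):.4f}")
```

At two standard deviations the fluke probability rises to about 1 in 22, which puts Sarkar’s “not worth getting out of bed for” remark in context.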

There is a further problem. Apparently the Milky Way is heading off in some direction at 600 km/s, this rather peculiar flow extends out to about a billion light years, and unfortunately most of the supernovae studied so far lie within this region. That drops the statistical significance for cosmic acceleration to two standard deviations. He then accuses the previous supporters of cosmic acceleration of confirmation bias: the initial workers chose an unfortunate direction to examine, and the subsequent ones “looked under the same lamppost”.

So, a little under 70% of what some claim is out there might not be. That is ugly. Worse, about 27% is supposed to be dark matter; suppose that did not exist either, and the only reason we think it is there is that our understanding of gravity is wrong on the large scale? The Universe would then shrink to about 5% of what it was. That must be something of a record for the size of a loss.