Much Ado About Muons

You may or may not have heard that the Standard Model, which explains “all of particle physics”, is in trouble, and “new physics” may be around the corner. All of this arises from a troublesome result involving the muon, a particle very similar to the electron except that it is about 207 times more massive and has a mean lifetime of 2.2 microseconds. If you think that is not very important for your current lifestyle (and it isn’t), wait, there’s more. Like the electron, it has a charge of -1 and a spin of ½, which means it acts like a small magnet.

Now, if the particle is placed in a strong magnetic field, the direction of the spin wobbles (technically, precesses), and the strength of this interaction is described by a number called the g-factor. For a bare point particle, Dirac’s equation predicts exactly g = 2. Needless to say, in the real quantum world that is wrong. For the electron, the value is roughly 2.002 319 304 362, where the quoted digits stop at the point the uncertainty starts. If nothing else, this shows the remarkable precision achieved by experimental physicists. Why is it not 2? The basic reason is that the particle interacts with the vacuum, which is not quite “nothing”. Quantum electrodynamics, which is part of the Standard Model, pins this number down with extraordinary precision, and the result is considered the greatest agreement between calculation and observation ever achieved. All was well, until this wretched muon misbehaved.
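
A quick aside on bookkeeping, since both ways of quoting the number appear below: the “anomalous” part of the magnetic moment is just the g-factor repackaged, a = (g - 2)/2. Here is a minimal check in Python, using only the electron value quoted above; the same relation connects the muon g-factors and anomalous moments listed below.

```python
# The "anomalous" magnetic moment is the fractional deviation of g
# from the Dirac value of exactly 2:  a = (g - 2) / 2.

def anomaly(g: float) -> float:
    return (g - 2.0) / 2.0

g_electron = 2.002319304362                          # electron g-factor quoted above
print(f"a_electron = {anomaly(g_electron):.12f}")    # 0.001159652181
```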

Now, the Standard Model predicts that the vacuum comprises a “quantum foam” of virtual particles popping in and out of existence, and these short-lived particles affect the g-factor, causing the muon’s wobble to speed up or slow down very slightly. This in turn leads to what is called an “anomalous magnetic moment”. The Standard Model should be able to calculate these to the same level of agreement as for the electron, and the calculations give:

  • g-factor: 2.00233183620
  • anomalous magnetic moment: 0.00116591810

The experimental values, announced by Fermilab after combining its measurement with the earlier Brookhaven result, are:

  • g-factor: 2.00233184122(82)
  • anomalous magnetic moment: 0.00116592061(41)

The brackets indicate the uncertainty in the final digits. Notice a difference? Would you say it is striking? Apparently there is only about a one in 40,000 chance that the discrepancy is a statistical fluke. Nevertheless, the experiment will be kept running at Fermilab for another two years to firm the result up. That is persistence, if nothing else.
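
As a rough check on that one in 40,000 figure, take the difference between the two anomalous moments above and compare it with the error bars. One caveat: the theoretical value carries an uncertainty of its own, about 43 in the same last-digit units; that figure is a published number I am supplying here, not one quoted above. A back-of-the-envelope sketch:

```python
from math import erfc, sqrt

a_exp = 0.00116592061   # experimental anomalous moment, quoted above
a_th  = 0.00116591810   # Standard Model prediction, quoted above
sig_exp = 41e-11        # the (41) on the experimental value
sig_th  = 43e-11        # assumed theory uncertainty (not quoted above)

diff  = a_exp - a_th                   # ~2.51e-9
sigma = sqrt(sig_exp**2 + sig_th**2)   # combine the independent error bars
z = diff / sigma                       # ~4.2 standard deviations

p = erfc(z / sqrt(2))                  # two-sided chance of a fluke this large
print(f"{z:.1f} sigma, about 1 in {1 / p:,.0f}")
```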

This result is what has excited a lot of physicists, because it means the calculation of how this particle interacts with the vacuum has underestimated the actual effect for the muon. That suggests physics beyond the Standard Model, and in particular a new particle may be the cause of the additional effect. Of course, there has to be a fly in the ointment. One rather fearsome rival calculation claims to come a lot closer to the observed value. To me the real problem is this: how can the same theory come up with two different answers when neither contains an arithmetical mistake?

Anyway, if the second calculation is right, is the problem gone? Again, not necessarily. At the Large Hadron Collider they have looked at B meson decay, which can produce electrons and positrons, or muons and antimuons. According to the Standard Model, the electron and the muon are identical apart from mass, which means the rates at which the two pairs are produced should be identical, but they aren’t quite. Again, it appears we are looking at small deviations. The problem then is that hypothetical particles that might explain one experiment fail for the other. Worse, the calculations are fearsome, and can take years. The Standard Model has 19 parameters that have to be obtained from experiment, so the errors can mount up, and if you wish to give the three neutrinos mass, in come another eight parameters. If we introduce yet another particle, in comes at least one more parameter, and probably more. Which raises the question: since a new adjustable parameter can always be tuned to answer one problem, how do we know we are even on the right track?

All of which raises a bigger question: is the Standard Model, which is part of quantum field theory, itself too complicated, and perhaps not going along the right path? You might ask how I could possibly question quantum field theory, which gives such agreeable results for the electron magnetic moment, admittedly after including a long series of interactions. The answer is that it also gives the world’s worst agreement for the cosmological constant. When you sum the effects of all these virtual particles over the cosmos, the predicted expansion of the Universe is wrong by a factor of about 10^120, that is, a 1 followed by 120 zeros. Not exceptionally good agreement. To get the agreement it gets, something must be right, but as I see it, to get such a howling error, something must also be wrong. The problem is, what?

Dark Energy

Many people will have heard of dark energy, yet nobody knows what it is, apart from being something connected with the rate of expansion of the Universe. This is an interesting part of science. When Einstein formulated General Relativity, he found that if his equations were correct, the Universe should collapse due to gravity. It hasn’t so far, so to avoid that he introduced a term Λ, the so-called cosmological constant. This was a straight-out fudge, with no basis other than avoiding the obvious mistake: the Universe had not collapsed and did not look like doing so. Then, when observations showed that the Universe was actually expanding, he tore the term up. In General Relativity, Λ represents the energy density of empty space.
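
For those who like to see where Λ actually sits, it enters as an extra term on the geometry side of Einstein’s field equations (in standard notation, G with subscripts is the Einstein tensor describing curvature, g the metric, T the energy of matter and radiation, and the G on the right is Newton’s constant):

```latex
G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```

Because the Λ term multiplies only the metric, it contributes curvature even when no matter is present, which is why it can be read as an energy density of empty space.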

We think the Universe’s expansion is accelerating because when we look back in time at ancient galaxies, we can measure their velocity relative to us through the so-called redshift of light, and all the distant galaxies are going away from us, seemingly faster the further away they are. We can also work out distance from brightness: provided we know how bright a light source was to start with, its dimming gives us a measure of how far away it is. What two research groups found in 1998 is that the expansion of the Universe was accelerating, which won them the 2011 Nobel Prize in Physics.

The next question is, how accurate are these measurements, and what assumptions are inherent in them? The redshift can be measured accurately because the light contains spectral lines, and as long as the physical constants have remained constant, we know exactly their original frequencies, and consequently the shift when we measure the current frequencies. The brightness relies on what are called standard candles. We know of a class of supernovae called type Ia, which are caused by one star gobbling up the mass of another until it reaches the threshold at which it blows up. This mass is known to be fairly constant, so the energy output should be constant. Unfortunately, as often happens, the type Ia supernovae are not quite as standard as you might think. They have been separated into three classes: standard Ia, dimmer Ia, and brighter Ia. We don’t know why, and there is an inherent problem that stars from very long ago would have had a lower fraction of elements made in previous supernovae. They get very bright, then dim with time, and we cannot be certain they always dim at the same rate. Some have different colour distributions, which makes the specific luminosity difficult to measure. Accordingly, some consider the evidence inadequate, and it is possible there is no acceleration at all. There is no way for anyone outside the specialist field to resolve this. Such measurements are made at the limits of our ability, and a number of assumptions tend to be involved.
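
Both measurements rest on very simple relations: the redshift compares an observed spectral line with its known laboratory wavelength, and the standard-candle distance follows from the inverse-square law for light. A sketch in Python with made-up illustrative numbers (the observed wavelength, luminosity and flux below are hypothetical, chosen only to show the arithmetic):

```python
from math import pi, sqrt

# Redshift: z = (observed - emitted) / emitted, using a known spectral line.
lam_lab = 656.3e-9    # hydrogen-alpha laboratory wavelength, metres
lam_obs = 689.1e-9    # hypothetical observed wavelength
z = (lam_obs - lam_lab) / lam_lab
print(f"z = {z:.3f}")                        # ~0.05, i.e. receding

# Standard candle: received flux = L / (4 pi d^2), so d = sqrt(L / (4 pi F)).
L_intrinsic = 3.8e36  # hypothetical intrinsic luminosity, watts
flux = 1.0e-14        # hypothetical measured flux, W/m^2
d = sqrt(L_intrinsic / (4 * pi * flux))      # distance in metres
print(f"d = {d / 3.086e22:.0f} megaparsecs")
```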

The net result of this is that if the expansion is really accelerating, we need a value for Λ, because that is what describes whatever is pushing everything apart. That energy of the vacuum is called dark energy, and if we consider the expansion and use relativity to compare this energy with the mass of the Universe we can see, dark energy makes up about 70% of the total Universe. That is, assuming the acceleration is real. If not, 70% of the Universe just disappears! So what is it, if real?

The only real theory that can explain why the vacuum has energy at all, and that has any independent value, is quantum field theory. By independent value, I mean it explains something else as well; if you have one observation and you require one assumption to fit it, you have effectively assumed the answer. However, quantum field theory is not much help here, because if you calculate Λ with it, the result differs from observation by 120 orders of magnitude, that is, a factor of a 1 followed by 120 zeros. To put that in perspective, if you were to count all the protons, neutrons and electrons in the entire universe that we can see, the answer would need only about 83 zeros. This is the most dramatic failed prediction in all of theoretical physics, and it is so bad it tends to be put in the desk drawer and quietly forgotten.

So the short answer is, we haven’t got a clue what dark energy is, and to make matters worse, it is possible there is no need for it at all. But it most certainly is a great excuse for scientific speculation.
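
For the curious, here is the crude version of that calculation. The “prediction” assumes quantum field theory can be trusted all the way up to the Planck scale; the densities below are rough, commonly quoted figures, and the cutoff choice is exactly the assumption that may be wrong.

```python
from math import log10

hbar = 1.055e-34    # reduced Planck constant, J s
G    = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8      # speed of light, m/s

# Vacuum energy density with a Planck-scale cutoff, ~5e113 J/m^3.
rho_planck = c**7 / (hbar * G**2)

# Rough observed dark-energy density, ~6e-10 J/m^3.
rho_observed = 6e-10

# ~10^123 with these round numbers; usually quoted as "120 orders of magnitude".
print(f"mismatch: about 10^{log10(rho_planck / rho_observed):.0f}")
```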