You may or may not have heard that the Standard Model, which explains “all of particle physics”, is in trouble, and “new physics” may be around the corner. All of this arises from a troublesome result involving the muon, a particle very similar to the electron except that it is about 207 times more massive and has a mean lifetime of 2.2 microseconds. If you think that is not very important for your current lifestyle (and it isn’t), wait, there’s more. Like the electron it has a charge of −1 and a spin of ½, which means it acts like a small magnet. Now, if the particle is placed in a strong magnetic field, the direction of the spin wobbles (technically, precesses), and the strength of this interaction is described by a number called the g factor. Dirac’s equation, which ignores the vacuum, predicts g = 2 exactly. Needless to say, in the real quantum world that is wrong. For the electron it is roughly 2.002 319 304 362; the numbers here stop where the uncertainty starts. If nothing else, this shows the remarkable precision achieved by experimental physicists. Why is it not 2? The basic reason is that the particle interacts with the vacuum, which is not quite “nothing”. Quantum electrodynamics, which is part of the Standard Model, has pinned this number down very precisely, and the result is considered the most accurate theoretical calculation ever, or the greatest agreement between calculation and observation. All was well, until this wretched muon misbehaved.
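To give a feel for what the g factor controls, here is a sketch (assuming a 1 tesla field; the constants are standard CODATA values) of the spin precession frequency f = g·eB/(4πm) for an electron, which comes out near the well-known figure of about 28 GHz per tesla:

```python
import math

# Spin precession (Larmor) frequency f = g * e * B / (4 * pi * m)
# for an electron in an assumed 1 tesla field; constants are CODATA values.
g   = 2.00231930436    # electron g factor
e   = 1.602176634e-19  # elementary charge, C
m_e = 9.1093837015e-31 # electron mass, kg
B   = 1.0              # magnetic field, T

f = g * e * B / (4 * math.pi * m_e)   # precession frequency, Hz
print(f"precession frequency ~ {f/1e9:.1f} GHz")   # ~28.0 GHz
```

The same formula applies to the muon, but with the muon mass the frequency is about 207 times lower, which is precisely why the g factor can be measured by timing the wobble.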

Now, the Standard Model predicts that the vacuum comprises a “quantum foam” of virtual particles popping in and out of existence, and these short-lived particles affect the g factor, causing the muon’s wobble to speed up or slow down very slightly, which in turn leads to what is called an “anomalous magnetic moment”. The Standard Model should be able to calculate this to the same accuracy as for the electron, and the calculations give:

- g-factor: 2.00233183620
- anomalous magnetic moment: 0.00116591810

The experimental values announced by Fermilab and Brookhaven are:

- g-factor: 2.00233184122(82)
- anomalous magnetic moment: 0.00116592061(41)

The brackets indicate the uncertainty in the final digits. Notice a difference? Would you say it is striking? Apparently there is only a one in 40,000 chance that the discrepancy is a statistical fluke. Nevertheless, they will keep the experiment running at Fermilab for another two years to firm it up. That is persistence, if nothing else.
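For the record, the two quantities above are related by simple arithmetic: the anomalous magnetic moment is a = (g − 2)/2. A few lines of Python, using only the numbers quoted above, confirm that both lists are consistent and show the size of the gap:

```python
# Anomalous magnetic moment a = (g - 2) / 2, using the values quoted above.
g_theory = 2.00233183620   # Standard Model prediction
g_exp    = 2.00233184122   # Fermilab + Brookhaven combined measurement

a_theory = (g_theory - 2) / 2
a_exp    = (g_exp - 2) / 2

print(f"a (theory)     = {a_theory:.11f}")         # 0.00116591810
print(f"a (experiment) = {a_exp:.11f}")            # 0.00116592061
print(f"difference     = {a_exp - a_theory:.2e}")  # ~2.5e-09
```

A difference of about 2.5 parts in a billion on the anomalous moment sounds tiny, but it is several times the quoted experimental uncertainty, which is what all the excitement is about.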

This result is what has excited a lot of physicists, because it means the calculation of how this particle interacts with the vacuum has underestimated the actual effect for the muon. That suggests physics beyond the Standard Model, and in particular a new particle may be the cause of the additional effect. Of course, there has to be a fly in the ointment. One rather fearsome calculation (a lattice computation of the hadronic contribution) claims to come a lot closer to the observed value. To me the real problem is: how can the same theory come up with two different answers when there is no arithmetical mistake?

Anyway, if the second calculation is right, is the problem gone? Again, not necessarily. At the Large Hadron Collider they have looked at B meson decay. This can produce electrons and positrons, or muons and antimuons. According to the Standard Model these two leptons are identical apart from mass, which means the rates of production should be identical, but they aren’t quite. Again, it appears we are looking at small deviations. The problem then is that hypothetical particles that might explain one experiment fail for the other. Worse, the calculations are fearsome, and can take years. The Standard Model has 19 parameters that have to be obtained from experiment, so the errors can mount up, and if you wish to give the three neutrinos mass, in come at least seven more parameters. If we introduce yet another particle, in comes at least one more parameter, and probably more. Which raises the question: since adding a new assignable parameter will always answer one problem, how do we know we are even on the right track?

All of which raises the question: is the Standard Model, which is part of quantum field theory, itself too complicated, and maybe not going along the right path? You might say, how could I possibly question quantum field theory, which gives such agreeable results for the electron magnetic moment, admittedly after summing a long series of interactions? The answer is that it also gives the world’s worst agreement with the cosmological constant. When you sum the effects of all these virtual particles over the cosmos, the predicted expansion of the Universe is wrong by a factor of about 10^120, that is, 1 followed by 120 zeros. Not exceptionally good agreement. To get the agreement it gets, something must be right, but as I see it, to get such a howling error, something must be wrong as well. The problem is, what?
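The size of that mismatch is easy to reproduce with a back-of-envelope estimate (a sketch, not the real calculation): cut the vacuum mode sum off at the Planck scale, which gives an energy density of order the Planck density, and compare with the measured dark-energy density. With this crude cutoff the ratio comes out near 10^123; quoted values of "roughly 120 orders of magnitude" depend on exactly where the cutoff is placed:

```python
import math

# Naive vacuum-energy estimate: cutting the mode sum off at the Planck
# scale gives an energy density of order the Planck density c^7/(hbar G^2).
hbar = 1.054571817e-34   # reduced Planck constant, J s
G    = 6.67430e-11       # gravitational constant, m^3 kg^-1 s^-2
c    = 2.99792458e8      # speed of light, m/s

rho_planck = c**7 / (hbar * G**2)   # J/m^3, roughly 4.6e113

# Observed dark-energy density (~0.7 of the critical density), approximate.
rho_lambda = 5.3e-10                # J/m^3

ratio = rho_planck / rho_lambda
print(f"discrepancy ~ 10^{math.log10(ratio):.0f}")
```

The point of the sketch is only that the "howling error" is not a subtle numerical slip; it falls straight out of the dimensional analysis.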

Philosophically speaking, here is Newton:

It is inconceivable that inanimate Matter should, without the Mediation of something else, which is not material, operate upon, and affect other matter without mutual Contact…That Gravity should be innate, inherent and essential to Matter, so that one body may act upon another at a distance thro’ a Vacuum, without the Mediation of any thing else, by and through which their Action and Force may be conveyed from one to another, is to me so great an Absurdity that I believe no Man who has in philosophical Matters a competent Faculty of thinking can ever fall into it. Gravity must be caused by an Agent acting constantly according to certain laws; but whether this Agent be material or immaterial, I have left to the Consideration of my readers.

— Isaac Newton, Letters to Bentley, 1692/3

Laplace did the obvious: he assumed that gravity was a field propagating at a finite speed (18th century). Next, Poincaré realized that gravitational waves had to travel at the speed of light (1905 CE).

The argument can be made that quantum observations are actually quantum INTERACTIONS. Bohr basically said this. Then (Popper and) Einstein counter-observed that if that were true, there would be spooky instantaneous interaction at a distance.

Indeed.

As observed (by many experimentalists).

Solution: pontificate that this interaction happens at a finite speed. We already know it is much higher than the speed of light, by a factor of 10^23.

That would seriously change the Standard Model, especially cosmologically.

A finite speed Quantum Interaction is the basic axiom of Sub Quantum Physical Reality.

Out of it Dark Matter pops out effortlessly… but then it becomes an emergent property modifying the ΛCDM (Lambda Cold Dark Matter) model of the universe… which is giving different local and global expansion speeds at this point….

Ha, Patrice, you are goading me. I am sure you know that I do not accept that faster than light quantum entanglement has been proved at all.

In my opinion, nobody has demonstrated without flaws that anything interacts faster than c.

Yes, well, Ian, I confess to various shortcomings, from goading to sincerely not understanding your argument about the EPR experimental demonstration, due to my daft nature.

I wish I could understand your objection. I tried, but failed.

For me it’s very simple; the various little drawings I made on my site show it: the wave is spread out, then it singularizes as a particle. If the wave is spread over a light year, clearly something FTL is going on, as the singularization takes a very short time. Retrospective instantaneous localization was demonstrated… and it had better be true, or well-known laws would be non-“conserved”.

The UP, the Uncertainty Principle, is failing for large objects, as expected. That was just published: two different groups, and it’s in Science.

I have that in my essay just published extending the previous comment.

By the way, I tried to order your book on the pilot wave but failed, for reasons relating to Amazon refusing my location (pun not intended!).

Patrice, the simplest explanation I can think of regarding the violations of Bell’s inequalities in the Aspect experiment is this. You need three separate determinations, A, B and C, each of which can take plus or minus values. In his explanation, Bell used washing socks at, from memory, 25, 35 and 45 degrees C, and then doing some test. Now, in the Aspect experiment, the A+B− determination was one configuration, and B+C− was exactly the same configuration rotated by 22.5 degrees. From Noether’s theorem, you can’t get new results by rotating the apparatus (provided the source is rotationally invariant) any more than you can by coming back tomorrow and repeating the same configuration. Aspect simply did not have sufficient independent variables to put into the inequalities.
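For reference, the quantum prediction being tested in these experiments, with analyzer settings spaced 22.5 degrees apart, is easy to write down: for polarization-entangled photons the correlation is E(a, b) = cos 2(a − b), and the CHSH combination of four such correlations exceeds the classical bound of 2. A sketch (the angle choices are the standard textbook ones, not taken from the comment above):

```python
import math

# Standard quantum prediction for polarization-entangled photons:
# correlation E(a, b) = cos(2 * (a - b)) for analyzer angles a and b.
def E(a_deg, b_deg):
    return math.cos(math.radians(2 * (a_deg - b_deg)))

# CHSH settings spaced 22.5 degrees apart, as in Aspect-type experiments.
a1, a2, b1, b2 = 0.0, 45.0, 22.5, 67.5

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(f"S = {S:.3f}  (classical bound: |S| <= 2)")   # 2.828 = 2*sqrt(2)
```

Whether the rotated configurations count as genuinely independent settings is exactly the point under dispute in this thread; the sketch only shows what the orthodox calculation gives.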

The wave question you mention bothered Einstein, at least in terms of the Born interpretation. As he wrote to Born (if memory serves me correctly), if you fire a particle through a pinhole, diffraction of the wave spreads it out in all directions, BUT the particle is always observed at a point. My answer is that the probabilities merely reflect what could happen, not what must happen. The particle more or less follows a trajectory, as is shown in the following experiment: Kocsis, S., and 6 others, 2011. Observing the Average Trajectories of Single Photons in a Two-Slit Interferometer. Science 332: 1170–1173.

Amazon refused your location?? [I am tempted to say you obviously had no momentum]. However, for me this is disturbing. If Amazon is doing this sort of thing, I shall have to do something.