Geoengineering – to do or not to do?

Climate change remains an uncomfortable topic. Politicians continue to state it is a critically important problem, and then fail to do anything sufficient to solve it. There seems to be an idea amongst politicians that if everyone drove electric vehicles, all would be well. Leaving aside the question of whether, over the life of the vehicle, an electric vehicle actually emits less greenhouse gas (and at best that is such a close call it has little hope of addressing the problem), as noted in my post https://wordpress.com/post/ianmillerblog.wordpress.com/885 the world cannot sustain the necessary extractions to make such a switch even vaguely possible. If that is the solution, why waste the effort – we are doomed.

As an article in Nature (vol. 593, p 167, 2021) noted, we need to evaluate all possible options. As I have remarked in previous posts, it is extremely unlikely there is a silver bullet. Fusion power would come rather close, but we would still have to do a number of other things, such as work out how to power transport, and as yet we do not have fusion power. So what the Nature article said is that we should at least consider, and analyse properly, the consequences of geoengineering. The usual answer here is, horrors, we can’t go around altering the planet’s climate, but the fact is we already have. What do you think those greenhouse gases are doing?

The problem is that while the world has pledged to reduce emissions by 3 billion t of CO2 per year, even if this is achieved, and that is a big if, it remains far too little. Carbon capture will theoretically solve some of the problems, but it costs so much money for no benefit to the one paying that you should not bet the house on it. The alternative, as the Nature article suggests, is geoengineering. The concept is to raise the planet’s albedo, so that more sunlight is reflected back to space. The cooling effect is known: it happens after severe volcanic eruptions.
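To get a feel for why a modest albedo change matters, here is a minimal zero-dimensional energy-balance sketch. It is my own illustration, not something from the Nature article, and the solar constant and current albedo below are standard textbook values:

\[ T_e = \left[\frac{S(1-\alpha)}{4\sigma}\right]^{1/4}, \qquad S \approx 1361\ \mathrm{W\,m^{-2}}, \quad \sigma = 5.67\times10^{-8}\ \mathrm{W\,m^{-2}\,K^{-4}} \]

\[ \alpha = 0.30 \;\Rightarrow\; T_e \approx 255\ \mathrm{K}; \qquad \alpha = 0.31 \;\Rightarrow\; T_e \approx 254\ \mathrm{K}. \]

On this very crude picture, each extra 0.01 of planetary albedo is worth roughly a degree of cooling at the effective radiating level, which is why comparatively small amounts of reflective material high in the atmosphere can, in principle, have a noticeable effect.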

The basic concept of sending reflective stuff into the upper atmosphere is that it is short-term in nature, so if you get it wrong and there is an effect you don’t like, it does not last all that long. On the other hand, it is also a rapid fix and you get relatively quick results. That means provided you do things with some degree of care you can generate a short-term effect that is mild enough to see what happens, and if it works, you can later amplify it.

The biggest problem is the so-called ethical one: who decides how much cooling, and where do you cool? The article notes that some are vociferously opposed to it because “it could go awry in unpredictable ways”. It could be unpredictable to some degree, but the unpredictable part of the effect would most likely be too small to matter. Another listed reason to oppose it was that it would detract from efforts to reduce greenhouse emissions. The problem here is that China and many other places are busy building new coal-fired electricity generation. Exactly how do you reduce emissions when so many places are busy increasing them? Then there is the question, how do you know what the effects will be? The answer is that you carry out short-term, mild experiments so you can find out without any serious damage.

The other side of the coin is that even if we stopped emissions right now, the existing levels will continue to heat the planet, and nobody knows by how much. The models are simply insufficiently definitive. All we know is that the ice sheets are melting, and when they go, much of our prime agricultural land goes with them. Then there is the question of governance. One proposal to run small tests in Scandinavia ran into opposition from a community that protested that the experiments would be a distraction from other reduction efforts. It appears some people think that with just a little effort this problem will go away. It won’t. One of the reasons given for obstructing research is that the project will affect the whole planet. Yes, well, so does burning coal in thermal generators, but I have never heard of the rest of the planet being consulted on that.

Is it a solution? I don’t know. It most definitely is not THE solution, but it may be the only solution that acts quickly enough to compensate for a general inability to get moving, and in my opinion we badly need experiments to show what can be achieved. I understand there was once one such experiment, although not an intentional one. Following the grounding of aircraft over the US after the Twin Towers attack, I gather temperatures over the following two days went up by over a degree, because the ice particles from jet exhausts were no longer being generated. The advantage of an experiment using ice particles in the upper atmosphere is that you can measure what happens quickly, yet the effect quickly goes away, so there will be no long-term damage. So, is it possible? My guess is that technically it can be managed, but the practical issues of getting general consent to implement it will take so long it becomes more or less irrelevant. You can always find someone who opposes anything.

Ebook discount: Athene’s Prophecy

From May 20 – May 27, Athene’s Prophecy will be discounted to 99c/99p on Amazon and Amazon UK. Science fiction with some science you can try your hand at. The young Roman Gaius Claudius Scaevola receives a vision from Pallas Athene, who asks him to do three things so he can save humanity from total destruction. He must learn the art of war, he must make a steam engine, and while using only knowledge available in the first century he must show the Earth goes around the sun. Can you do it? Try your luck. I suspect you will fail, and to stop cheating, the answer is in the following ebook. 

Scaevola is in Egypt for the anti-Jewish riots, then he is sent to Syria as Tribunus laticlavius in the Fulminata. Soon he must prevent a rebellion when Caligulae orders a statue of himself to be placed in the temple of Jerusalem. You will get a different picture of Caligulae from the one you normally see, supported by a transcription of a report of the critical meeting regarding the statue by Philo of Alexandria. (Fortunately, copyright has expired.) First of a series. http://www.amazon.com/dp/B00GYL4HGW

Fighting Global Warming Naturally?

In a recent edition of Nature (593, pp 191-4) it was argued that the usual focus on removing carbon from the atmosphere to combat global warming should instead be reframed in terms of how to lower global temperatures. The authors note that the goal of limiting warming since pre-industrial times to two degrees Centigrade cannot be met solely through cuts to emissions, so carbon dioxide needs to be removed from the atmosphere. The rest of the article more or less focused on how nature could contribute to that, which was a little disappointing bearing in mind they had made the point that this was not the main objective. Anyway, they went on to claim nature-based solutions could lower the temperature by a total of 0.4 degrees by 2100. Then came the caveats. If plants are to absorb carbon dioxide, they become less effective once temperatures rise too high for them.

Some statistics: 70% of Earth’s land surface has been modified by humanity, and 20% of it has been modified since 1960. The changes since 1960 include (in millions of square kilometres):

  • urban area: +0.26
  • cropland: +1.0
  • pasture: +0.9
  • forestry: -0.8

The proposal involves three routes. The first is to protect current ecosystems, which includes stopping deforestation. This is obvious, but with the current crop of politicians, such as those in Brazil, the suggestion invites the comment, “Good luck with that one.” It might be achieved in some Western countries, but then again they have largely cut their forests down already. If we are going to rely on this, we have a problem.

The second is to restore ecosystems so they can absorb more carbon. Restoration of forest cover is an obvious place to start. However, they claim plantation forests do not usually give the same benefits as natural forest. Natural forest has very dense undergrowth, and unless there are animals to eat it, you may end up generating fuel for forest fires, which wipe out all progress. Wetlands are particularly desirable because they are great at storing carbon, and once underway, they act rather quickly. However, again there is a problem. Wetlands, once cleared and drained, are difficult to restore because the land tends to have been altered specifically to stop it reverting, so besides stopping current use, the alterations have to be removed.

Notwithstanding the difficulties, there is also a strong reason to create reed beds that grow algae as well: they take up nitrogen and phosphate from water. Taking up ammoniacal wastes is particularly important because if part of the nitrogen waste is oxidized, or nitrates have been used as fertilizer, ammonium nitrate can decompose to form nitrous oxide. That is a further greenhouse gas, and a particularly awkward one because it absorbs in quite a different part of the infrared spectrum, and it has no simple decay route or anything that will absorb it. Consequently, what we put in the air could be there for some time.
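For what it is worth, the decomposition referred to above is, in its simplest balanced form (my addition; the chemistry in real soils and waterways is messier and goes through microbial nitrification and denitrification):

\[ \mathrm{NH_4NO_3 \;\longrightarrow\; N_2O + 2\,H_2O} \]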

The third is to improve land management, for timber, crops and grazing. Thus growing forest along the borders of a stream should add carbon storage, but also reduce flooding and enhance fish life. An important point is that slowing runoff also helps prevent soil loss. All of this is obvious, except, it seems, to officials. In Chile, the government apparently gave out subsidies for pine and eucalyptus planting that led to 1.3 million hectares being planted. What actually happened was that the land so planted had previously been occupied by original forest, and it is estimated that the overall effect was to emit 0.05 million t of stored carbon rather than sequester the 5.6 million t claimed. A particular piece of “politically correct” stupidity occurred closer to home. The Department of Conservation, being green inclined, sent an electric car for its people to drive around Stewart Island, the small southern island of New Zealand. The island has too small a population to warrant a cable connecting it to the national grid, so all the electricity there is generated by burning diesel!

The paper claims it is possible to remove 10 billion tonnes of CO2 fairly quickly, and 20 billion tonnes by 2055. However, my feeling is that this is a little like dreaming, because I assume it requires stopping the Amazon burn-offs and similar. On the other hand, even if it does not work out exactly right, it still has benefits. Stopping flooding and erosion while having a more pleasant environment and better farming practices might not save the planet, but it might make your local bit of planet better to live in.

How can we exist?

One of the more annoying questions in physics is: why are we here? Bear with me for a minute, as this is a real question. The Universe is supposed to have started with what Fred Hoyle called “The Big Bang”. Fred was being derisory, but the name stuck. Anyway, what happened is that an extremely intense burst of energy began expanding, and as it did, perforce the energy became less dense. As that happened, elementary particles condensed out. On an extremely small scale, this happens in high-energy collisions, such as in the Large Hadron Collider. So we are reasonably convinced we know what happened up to this point, but there is a very big fly in the ointment. When such particles condense out we get equal amounts of matter and what we call antimatter. (In principle, we should get dark matter too, but since we do not know what that is, I shall leave it out.)

Antimatter is, as you might guess, the opposite of matter. The most obvious example is the positron, which is exactly the same as the electron except that it has positive electric charge, so a positron and an electron attract each other. In principle, if they fell onto each other as point charges they would release an infinite amount of electrostatic energy, but nature hates the infinities that come out of our equations, so when they get close enough they annihilate each other and you get two gamma-ray photons that leave in opposite directions to conserve momentum. That is more or less what happens whenever antimatter meets matter – they annihilate each other, which is why, when we make antimatter in colliders, if we want to collect it we have to do so very carefully, with magnetic traps and in a vacuum.
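For the record, the energy bookkeeping of that annihilation is simple, and the figure below is the standard textbook one rather than anything from the post itself: for a slow electron and positron, essentially all the rest-mass energy comes out in the two photons,

\[ e^+ + e^- \longrightarrow 2\gamma, \qquad E_\gamma \approx m_e c^2 \approx 511\ \mathrm{keV\ per\ photon}. \]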

So now we get to the problem of why we are here: with all that antimatter made in equal proportions to matter, why do we have so much matter? As it happens, the symmetry is violated very slightly in kaon decay, but this is probably not particularly helpful because the effect is too slight. In the previous post on muon decay I mentioned that that could be a clue that there might be physics beyond the Standard Model to be unraveled. Right now, the fact that there is so much matter in the Universe should be a far stronger clue that something is wrong with the Standard Model. 

Or is it? One observation that throws that into doubt was published in Physical Review D (103, 083016) in April this year. But before coming to that, some background. A little over ten years ago, colliding heavy ions made a small amount of antihelium-3, and a little later, antihelium-4. The antihelium has two antiprotons, and one or two antineutrons. To make it, the problem is to get enough antiprotons and antineutrons close enough together. To give some idea of the trouble, a billion collisions of gold ions, at energies of two hundred billion and sixty-two billion electron volts, produced 18 nuclei of antihelium-4, each with a mass of 3.73 billion electron volts. In such a collision, the energy requires a temperature of over 250,000 times that of the sun’s core.
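A rough check on that temperature claim, taking the sun’s core to be about 1.5 × 10^7 K (a standard value; the arithmetic is mine):

\[ T \sim 2.5\times10^{5} \times 1.5\times10^{7}\ \mathrm{K} \approx 4\times10^{12}\ \mathrm{K}, \qquad k_B T \approx 8.6\times10^{-5}\ \mathrm{eV\,K^{-1}} \times 4\times10^{12}\ \mathrm{K} \approx 0.3\ \mathrm{GeV}. \]

So each particle in the fireball carries thermal energy of a few hundred million electron volts, while a single nucleon–antinucleon pair costs nearly 2 billion electron volts, which gives some idea of why four antinucleons appearing and sticking together is such a rare event.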

Such antihelium can be detected through the gamma-ray frequencies emitted when the nuclei annihilate on striking matter, and apparently also through the Alpha Magnetic Spectrometer on the International Space Station, which tracks cosmic rays. The important point is that antihelium-4 behaves exactly the same as an alpha particle, except that, because the antiprotons have negative charge, its trajectory bends in the opposite direction to that of ordinary nuclei. These antinuclei can be made through the energies of cosmic rays hitting something; however, it has been calculated that the amount of antihelium-3 detected so far is 50 times too great to be explained by cosmic rays, and the amount of antihelium-4 is 100,000 times too much.
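The bending argument is just the usual magnetic-rigidity relation; for a nucleus of charge q and momentum p moving across a magnetic field B (a textbook formula, not anything specific to the AMS analysis),

\[ r = \frac{p}{|q|\,B}, \]

with the sense of the curvature set by the sign of q, so an antihelium nucleus (charge -2e) of the same momentum traces the mirror image of the arc followed by an ordinary helium nucleus (charge +2e).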

How can this be? The simple answer is that the antihelium is being made by antistars. Gamma-ray surveys list 5787 sources, and it has been proposed that at least fourteen of these are candidate antistars; if we look at the oldest stars near the centre of the galaxy, estimates suggest up to a fifth of the stars there could be antistars, possibly with antiplanets. If there were people on these, giving them a hug would be outright disastrous for both of you. Of course, caution is required here. It is always possible that this antihelium was made in some more mundane way that as yet we do not understand. On the other hand, if there are antistars, it automatically solves a huge problem, even if it creates a bigger one: how did the matter and antimatter separate? As is often the case in science, solving one problem creates even bigger problems. However, real antistars would alter our view of the universe, and as long as the antimatter is at a good distance, we can accept them.

Much Ado About Muons

You may or may not have heard that the Standard Model, which explains “all of particle physics”, is in trouble and “new physics” may be around the corner. All of this arises from a troublesome result from the muon, a particle that is very similar to the electron except that it is about 207 times more massive and has a mean lifetime of 2.2 microseconds. If you think that is not very important for your current lifestyle (and it isn’t), wait, there’s more. Like the electron, it has a charge of -1 and a spin of ½, which means it acts like a small magnet. Now, if the particle is placed in a strong magnetic field, the direction of the spin wobbles (technically, precesses), and the strength of this interaction is described by a number called the g-factor, which the simplest theory says should be exactly 2. Needless to say, in the real world that is not quite right. For the electron, it is roughly 2.002 319 304 362; the numbers here stop where uncertainty starts. If nothing else, this shows the remarkable precision achieved by experimental physicists. Why is it not 2? The basic reason is that the particle interacts with the vacuum, which is not quite “nothing”. You will see quantum electrodynamics has got this number down fairly precisely, and quantum electrodynamics, which is part of the Standard Model, is considered to give the most accurate theoretical calculation ever, or the greatest agreement between calculation and observation. All was well, until this wretched muon misbehaved.

Now, the standard model predicts the vacuum comprises a “quantum foam” of virtual particles popping in and out of existence, and these short-lived particles affect the g-factor, causing the muon’s wobble to speed up or slow down very slightly, which in turn leads to what is called an “anomalous magnetic moment”. The standard model should calculate these to the same agreement as with the electron, and the calculations give:

  • g-factor: 2.00233183620
  • anomalous magnetic moment: 0.00116591810
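Those two numbers are not independent: the anomalous magnetic moment is simply the fractional excess of g over the ideal value of 2, so one can be checked against the other,

\[ a_\mu = \frac{g-2}{2} = \frac{2.00233183620 - 2}{2} = 0.00116591810. \]

The same relation ties together the experimental pair below.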

The experimental values announced by Fermilab and Brookhaven are:

  • g-factor: 2.00233184122(82)
  • anomalous magnetic moment: 0.00116592061(41)

The brackets indicate the uncertainty in the last digits. Notice a difference? Would you say it is striking? Apparently there is only a one in 40,000 chance that the difference is a statistical fluke. Nevertheless, they will keep this experiment running at Fermilab for another two years to firm it up. That is persistence, if nothing else.
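That “one in 40,000” is the usual translation of the announced significance, about 4.2 standard deviations, into a probability; the 4.2σ figure is from the Fermilab announcement, and the conversion below, which is just the normal distribution, is mine:

\[ P(|Z| > 4.2) = 2\,[\,1 - \Phi(4.2)\,] \approx 2.7\times10^{-5} \approx \tfrac{1}{40\,000}. \]

The particle-physics convention for claiming a discovery is 5σ, roughly one chance in a few million of a fluke, which is part of why they want the extra running time.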

This result is what has excited a lot of physicists because it means the calculation of how this particle interacts with the vacuum has underestimated the actual effect for the muon. That suggests more physics beyond the standard model, and in particular, a new particle may be the cause of the additional effect. Of course, there has to be a fly in the ointment. One rather fearsome calculation claims to be a lot closer to the observational value. To me the real problem is how can the same theory come up with two different answers when there is no arithmetical mistake?

Anyway, if the second one is right, problem gone? Again, not necessarily. At the Large Hadron Collider they have looked at B meson decay. This can produce electrons and positrons, or muons and antimuons. According to the standard model, these two kinds of particle are identical other than in mass, which means the rates of production should be identical, but they aren’t quite. Again, it appears we are looking at small deviations. The problem then is that hypothetical particles that might explain one experiment fail for the other. Worse, the calculations are fearsome, and can take years. The standard model has 19 parameters that have to be obtained from experiment, so the errors can mount up, and if you wish to give the three neutrinos mass, in come another eight parameters. If we introduce yet another particle, in comes at least one more parameter, and probably more. Which raises the question: since adding a new assignable parameter will always answer one problem, how do we know we are even on the right track?
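The comparison at the Large Hadron Collider is usually expressed as a ratio of decay rates, which the standard model predicts to be essentially 1; schematically (my notation, purely for illustration),

\[ R_K = \frac{\Gamma(B \rightarrow K\,\mu^+\mu^-)}{\Gamma(B \rightarrow K\,e^+e^-)} \approx 1 \ \ \text{(standard model)}, \]

whereas the measured value sits a little below 1, which is the small deviation referred to above.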

All of which raises the question: is the standard model, which is part of quantum field theory, itself too complicated, and maybe not going along the right path? You might say, how could I possibly question quantum field theory, which gives such agreeable results for the electron magnetic moment, admittedly after including a series of interactions? The answer is that it also gives the world’s worst agreement with the cosmological constant. When you sum the effects of all these virtual particles over the cosmos, the predicted expansion of the Universe is wrong by a factor of about 10^120, that is, 1 followed by 120 zeros. Not exceptionally good agreement. To get the agreement it gets, something must be right, but as I see it, to get such a howling error, something must be wrong also. The problem is, what?
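The 10^120 figure comes from the standard back-of-envelope comparison of the vacuum energy density quantum field theory suggests with the one cosmology actually measures; the numbers below are the usual rough ones, not a precise calculation:

\[ \rho_{\mathrm{QFT}} \sim M_{\mathrm{Planck}}^4 \sim 10^{76}\ \mathrm{GeV}^4, \qquad \rho_{\Lambda,\mathrm{obs}} \sim (10^{-3}\ \mathrm{eV})^4 \sim 10^{-48}\ \mathrm{GeV}^4, \]

\[ \frac{\rho_{\mathrm{QFT}}}{\rho_{\Lambda,\mathrm{obs}}} \sim 10^{120}\text{–}10^{124}. \]

The exact exponent depends on where the sum over virtual modes is cut off, but the mismatch is enormous either way.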