Air strikes over Syria

This was the week of air strikes that demonstrated, at best, general incompetence, and at worst, something more sinister. Deir al-Zour is, or was, the seventh largest city in Syria, and it has had an unfortunate history because it sits on the Euphrates river, on an important trade route. Consequently it was a target for various invaders until the Mongols simply wiped it out, thus removing the misery. However, its location led it to rise again. Now it is one of the very few centres in eastern Syria still held by the government, although ISIS has been attacking it, and by surrounding it has ensured that resupply is only possible by air. That makes the local airfield important. Overlooking it, from a distance, are the al-Thardeh mountains, although I would be more inclined to call them hills.

Now, obviously these hills are strategically important, and ISIS has been trying to capture them from the Syrian army for some time. So what does the US do? It sends in warplanes from Iraq and bombs the Syrian positions, thus allowing ISIS to capture some key strategic positions. Shortly afterwards, a Syrian warplane tried to bomb ISIS troops, and was seemingly shot down by a US-made surface-to-air missile operated by ISIS soldiers. What do you make of that?

The Russians argue that the US is helping ISIS. While strictly speaking this is true, I really don’t believe it is intentional. The problem, of course, is that if you are bombed, it makes little difference to those on the ground whether the bombing was intentional. If you are dead, you are dead. The Russians also argue that the US refuses to coordinate air attacks with them, and that seems not to be in dispute. The US spokesperson countered by accusing the Russians of point scoring, which is undoubtedly what they were doing, but that is irrelevant because the point was valid. I see a real problem here. Both claim they are trying to bomb ISIS, but the US seems to have a pathological hatred of Assad. The Russians do not have this problem, so they are far more likely to be aware of Syrian army deployments and of current conditions. To make matters worse, this is the second time this year that US warplanes have attacked the Syrian army in this district. Either these are not mistakes, or the USAF is not learning from its mistakes.

The US says it did not knowingly strike the Syrian military; it confused them with ISIS fighters. For me, there are several difficulties with this statement. The first is the obvious one: the US should be prepared to talk to the Russians and accept intelligence. If you really do not want to kill members of the Syrian military, should you not take the trouble to find out where they are? What also bothers me is a parallel with the old hunting advice: do not shoot until you have positively identified your target. That simply did not happen. Given the strategic nature of these hills, and given that the US knows ISIS has a number of the surface-to-air missiles it gave to so-called moderate rebels, it knows that either the hills must be kept in Syrian hands, or the city will have trouble with resupply and will probably fall to ISIS.

So, what is the problem? In my view, a mixture of arrogance and incompetence. The question then is, do we accept that as an excuse? The US Air Force appears to be so technically superior to anything else around that might fight it that it can effectively do what it likes. Does that not impose a greater obligation to use that power responsibly?

The second air strike was against a UN relief convoy heading to Aleppo. The US has accused the Russians of being responsible. I do not believe a Russian plane did that, but the US counters by saying Russia is responsible for the Syrians because it is supporting them. Assad denies doing it, but he would. He may well be right. It is possible that some Syrian pilot decided to do this of his own volition, and it is unlikely Assad could find out. It is also unlikely he would try very hard.

Why would a Syrian do it? Because the UN is sending food and medical supplies to the rebels who are busy shooting at the Syrian Army. I suspect the average Syrian soldier assumes such supplies will largely go to the rebels, relieving them of every major supply problem other than ammunition. This is the problem with cease-fires: they solve nothing, as both sides try to strengthen their positions, and when one side cannot do much more, it is in its interest to restart as quickly as possible. The only time a cease-fire can achieve anything is if there are grounds on which the two sides can agree to end the fighting. That requires a resolution of the issue that caused it to start. The Western politicians all want this resolved, but they also want Assad to go, and they have no idea what to replace him with, having seemingly learned nothing from Iraq. Assad would have to be mad to step down now, because all and sundry would want to try him for war crimes, and such trials have only one outcome: what the victor wants. So, with no means of resolving this conflict, the cease-fire is actually counterproductive for the innocent civilians, because it merely extends the misery as the rebels get stronger. (The fact that Assad is guilty is irrelevant; he has to have some reason to step down.)

If a politician rambles on about how the various parties should end this fighting, then they should specifically state how it could be resolved, and what will happen next. Otherwise, all they are doing is point scoring, and who cares?

Dark Energy and Modern Science

Most people think the scientific endeavour is truly objective; scientists draw their conclusions from all the facts and are never swayed by fashion. Sorry, but that is not true, as I found out from my PhD work. I must post about that sometime, but the shortened version is that I entered a controversy, my results unambiguously supported one side, but the other side prevailed for two reasons. Some “big names” chose that side, and the review that settled the issue conveniently left out all reference to about sixty different sorts of observations (including mine) that falsified their position. Even worse, some of the younger scientists who were on the wrong side simply abandoned the field and tried to conveniently forget their work. But before I bore people with my own history, I thought it would be worth noting another issue, dark energy. Dark energy is supposed to make up about 70% of the Universe so it must be important, right?

Nobody knows, or can even speculate about with good reason, what dark energy is, and there is only one reason for believing it even exists: it is believed that the expansion of the Universe is accelerating. We are reasonably certain the Universe is expanding. This was originally discovered by Hubble, who noticed that the spectra of distant galaxies are shifted to lower frequencies (a red shift), and the further away the galaxies are, the bigger the red shift. This implies that the whole universe is expanding.

Let me digress and try to explain the Doppler shift. Think of someone beating a drum regularly; the number of beats per unit time is the frequency. Now suppose the drum is on the back of a truck. You hear a beat and expect the next one, say, one second later, but if the truck starts to move away, the beat comes slightly later because the sound has had further to travel. If the truck recedes at a steady speed, each beat is delayed by the same extra amount, so the beats remain evenly spaced but further apart: the frequency is lower, and that is called a red shift. The sound also becomes quieter with distance as it spreads out. Thus you can determine both how far away the drum is and how fast it is moving away. The same applies to light, and if the universe is expanding uniformly, the red shift should give you the distance. Similarly, provided you know the source intensity, the measured light intensity should give you the distance.
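For the numerically inclined, the drum analogy fits in a few lines of code. This is purely my illustrative sketch; the beat rate and truck speed are invented numbers.

```python
# A receding source emits beats at a fixed interval; each beat is emitted
# a little further away than the last, so it arrives a little later.

SOUND_SPEED = 343.0   # m/s, roughly the speed of sound in air at 20 C

def received_interval(emit_interval, recession_speed):
    """Interval between received beats from a source receding at constant speed.

    Each beat starts recession_speed * emit_interval metres further away
    than the previous one, adding that distance divided by the sound speed
    to the arrival gap.
    """
    extra_travel = recession_speed * emit_interval
    return emit_interval + extra_travel / SOUND_SPEED

# Drum beating once per second on a truck receding at 20 m/s:
interval = received_interval(1.0, 20.0)
freq_ratio = 1.0 / interval   # received frequency / emitted frequency
print(interval)    # ~1.058 s between beats
print(freq_ratio)  # ~0.945: the frequency is lowered, i.e. "red shifted"
```

The same inverse relationship between recession speed and received frequency is what turns a galaxy's red shift into a speed.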

That requires us to measure light from objects that produce a known light output, which are called standard candles. Fortunately, there is a very bright type of standard candle, or so they say, and that is the type Ia supernova. It was observed in the 1990s that the most distant of these supernovae were dimmer than their red shifts implied, which means they are further away than they should be, which means the expansion must be accelerating. For the expansion to accelerate, there must be some net force pushing everything apart. That something is called dark energy, and it is supposed to make up about two thirds of the Universe. The discoverers of this phenomenon won a Nobel Prize, and that, of course, in many people’s eyes means it must be true.
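The standard-candle logic comes down to one formula, the distance modulus m - M = 5 log10(d / 10 pc), where m is the apparent magnitude you measure and M the absolute magnitude the candle is assumed to have. Here is a small sketch; the peak absolute magnitude of about -19.3 for a type Ia is a commonly quoted figure, and the apparent magnitude is an invented example.

```python
import math

def luminosity_distance_pc(apparent_mag, absolute_mag):
    """Distance in parsecs from the distance modulus m - M = 5*log10(d/10pc)."""
    return 10.0 ** ((apparent_mag - absolute_mag) / 5.0 + 1.0)

# A type Ia supernova (assumed peak M ~ -19.3) observed at m = 24:
d_pc = luminosity_distance_pc(24.0, -19.3)
print(d_pc)  # ~4.6e9 parsecs
```

The dark-energy claim is exactly a statement about this calculation: the measured m was fainter than expected, so d came out larger than the red shift alone predicted.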

The type Ia supernova is considered to arise when a white dwarf star starts to strip gas from a neighbouring star. The dwarf gradually increases in mass. Its nuclear cycle has already burnt helium into carbon and oxygen, but because the mass of the dwarf is too low, the reactions have stopped there. As the dwarf consumes its neighbour, the mass eventually becomes great enough (the Chandrasekhar limit, about 1.4 solar masses) to ignite the carbon and oxygen; this burning is uncontrollable and the whole thing explodes. The important point is that because the ignition point is reached by the gradual addition of fresh mass, it occurs at the same mass in all such situations, so you get a standard output, or so it is assumed.

My interest in this came when I went to a talk on the topic and asked the speaker a question about the metallicity of the companion star. (Metallicity is the fraction of elements heavier than helium, which in turn measures how much of the star is made of material that has already been through supernovae.) My thought was that if you picture the supernova as a bubble of extremely energetic material, what we actually see is the light from the outer surface nearest us, and most of that surface will be material from the companion. Since the light we see results from heat and inner light promoting electrons to higher energy levels, it should depend on the composition of that outer surface. In support of that proposition, Lyman et al. (arXiv:1602.08098v1 [astro-ph.HE] 2016) have shown that calcium-rich supernovae are dimmer than iron-rich ones. Thus the type Ia supernova may not be such a standard candle: the earlier the supernova, the lower its metallicity, and low metallicity favours lighter atoms, which have fewer energy levels from which to radiate and so are less efficient at converting energy to light.

Accordingly, my question was, “Given that low metallicity leads to dimmer type Ia supernovae, and given that the most distant supernovae are the youngest and hence have the lowest metallicity, could that not be the reason the distant ones are dimmer?” The response was a crusty, “That was taken into account.” Implied: go away and learn the standard stuff. My problem with that was, how could they take into account something that was not discovered for another twenty years or so? Herein lies one of my gripes about modern science: the big names who are pledged to a position will strongly discourage anyone questioning that position if the question is a good one. Weak questions are highly desired, as the big name can deal with them satisfactorily and feel the better for it.

So, besides this issue of metallicity, how strong is the evidence for this dark energy? Maybe not as strong as everyone seems to say. A recent paper (Nielsen et al., arXiv:1506.01354v2) analysed data for a much larger number of supernovae and came to a somewhat surprising conclusion: so far, you cannot actually tell whether the expansion is accelerating or not. One interesting point in this analysis is that we do not simply relate the measured magnitude to distance. There are additional corrections for light-curve shape and for colour, each with an empirical constant attached, and each “constant” is assumed to be constant. There must also be corrections for intervening dust, and again it is a sheer assumption that the dust in the early universe was the same as now, despite space being far more compact.

If we now analyse all the observed data carefully (the initial claims actually chose a rather select few) we find that any acceptable acceleration consistent with the data does not deviate significantly from no acceleration out to red shift 1, and that the experimental errors are such that to this point we cannot distinguish between the options.

Try this. Cabbolet (Astrophys. Space Sci., DOI 10.1007/s10509-014-1791-4) argues from the Eöt-Wash experiment that if there is repulsive gravity (needed to accelerate the expansion), then quantum electrodynamics is falsified in its current formulation! Quantum electrodynamics is regarded as one of the most accurate theories ever produced. We can, of course, reject repulsive gravity, but that also rejects dark energy. So, if that argument is correct, at least one of the two has to go, and dark energy looks the more likely candidate.

Another problem is the assumption that type Ia supernovae are standard because they all form by the gradual accretion of extra matter from a companion. But Olling et al. (Nature 521: 332–335, 2015) argue that they have found three supernovae where the evidence is that the explosion occurred through one dwarf simply swallowing another. Now there is no standard mass, so the energy could be almost anything, depending on the mass of the companion.

Milne (ApJ 803: 20, doi:10.1088/0004-637X/803/1/20) has shown there are two classes of type Ia supernovae, and for one of them, the NUV-blue events, the optical luminosity is significantly underestimated, particularly in the high-redshift cosmological sample. Not accounting for this effect should produce a distance bias that increases with redshift, and could significantly bias measurements of cosmological parameters.

So why am I going on like this? I apologize to some for the details, but I think this shows that the scientific community is not always as objective as it should be. It appears nobody wishes to rock the boat holding the big names. All evidence should be subject to examination; instead, far too much is simply “referred to the experts”. Experts are as capable of being wrong as anyone else, especially when there is something they did not know when they made their decision.

Interstellar probes

I get annoyed when something a little unexpected hits the news, and immediately after, a number of prominent people come up with proposals that look great, but somehow miss what seems to be obvious. In my last post (https://wordpress.com/post/ianmillerblog.wordpress.com/603), I discussed the newly discovered exoplanet Proxima b, which is orbiting a red dwarf about 4.25 light years away. Unfortunately, that is a long way away. A light year is about 9.46 x 10^12 km, or nearly nine and a half trillion km. The Universe is not exactly small. This raises the question, what do we do with this discovery? The problem with such distances is that if we sent something at the speed of light it would take 4.25 years to get there. Apart from electromagnetic radiation, we cannot get anything up to light speed. If we got something up to a tenth of light speed, it would take 42.5 years to get there, and if it sent back messages through electromagnetic radiation, say light or radio, it would take a further 4.25 years to get back, and we would get information in 46.75 years. In other words, it is close enough that in principle, someone could get information back in his/her lifetime, provided they started young. So, how could one get even up to one tenth light speed? This is where the experts jump in.
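The arithmetic above is simple enough, but for completeness here it is as a little sketch of my own (the speeds are the ones discussed in the text):

```python
def information_return_time(distance_ly, probe_speed_c):
    """Years until data comes back: travel out at probe_speed_c (a fraction
    of light speed), then radio the result home at light speed."""
    travel_out = distance_ly / probe_speed_c
    signal_back = distance_ly  # light takes one year per light year
    return travel_out + signal_back

print(information_return_time(4.25, 0.1))   # 46.75 years, as in the text
print(information_return_time(4.25, 0.01))  # 429.25 years at 1% of light speed
```

The second line shows why a tenth of light speed, and not some easier fraction, is the target: at 1% of c nobody alive at launch sees the data.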

What they propose, with about one day’s thought at most, is to send small, almost micro, probes with a sail attached. The reason for small is that we get more acceleration for any applied force. The idea is that these miniprobes can be accelerated up to a significant fraction of light speed by directing a laser at the sail. Light has momentum. The classic demonstration is the radiometer: little paddles, one side light and the other dark, mounted on bearings so they can rotate, enclosed in a glass jar with the air evacuated. Shine light on it and the paddles rotate. (Strictly, in most such devices the rotation is driven mainly by thermal effects on the residual gas rather than by radiation pressure, but light pressure is real and measurable.) It is very easy to show from Maxwell’s electromagnetic theory why this must be so. Maxwell showed that electromagnetic radiation is emitted when charge is accelerated. The reason light must carry momentum is that if the body carrying the charge is accelerated, its momentum changes, and conservation of momentum requires the radiation to carry that momentum away. For our sail, the great advantage is that all the energy is generated on Earth. The normal problem with space travel is that so much weight has to be carried just for propulsion: the fuel, the engines, the piping, and the structure connecting the engines to the rest. Here all the power is generated somewhere else, so that weight can be left behind. All the weight needed for acceleration is the sail and its connection to the probe. Superficially this seems a great idea, but in my opinion there will be great difficulties in making it work.

The first reason is aim. If the probe has motors, they can correct for faulty aim, but if all the power comes from Earth, you have to get it right from Earth. The probe may fly by, but it has to get reasonably close. Assume you can tolerate its being anywhere on a 1000 km arc (you can put in your own number), where the arc is part of a circle centred on Earth. The length of the arc is r θ, where θ is the maximum angle by which the aim can be wrong. That means θ has to be less than 1000 km divided by the 40.2×10^12 km to the star, or about 1 part in 40 billion. Do you really think you can aim that well? Of course, you are aiming at where the planet will be on arrival. If you get the speed slightly wrong, the planet could be on the other side of the star when the fly-by happens, so the velocity control has to have similar accuracy. Besides the planet orbiting, the star is also moving, so the whole system has to be there when required.
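You can check that one-part-in-forty-billion figure yourself; this sketch just divides the tolerable arc by the distance, using the light-year conversion quoted earlier.

```python
LY_IN_KM = 9.46e12  # kilometres per light year, as given in the text

def aim_tolerance_rad(arc_km, distance_ly):
    """Maximum angular aiming error (radians) that still lands the probe
    within a given arc at the target distance, since arc = r * theta."""
    return arc_km / (distance_ly * LY_IN_KM)

theta = aim_tolerance_rad(1000.0, 4.25)
print(theta)        # ~2.5e-11 radians
print(1.0 / theta)  # ~4.0e10, i.e. one part in forty billion
```

For comparison, 2.5×10^-11 radians is about five millionths of an arcsecond, far beyond any pointing accuracy I am aware of.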

The next problem is that the laser beam must strike the sail exactly above the centre of mass of the probe. Any deviation and there is an applied torque to the probe, and the sail starts to spin. Can we be that accurate?

The final problem is that the sail has to be exactly at right angles to the beam. Suppose it deviates by φ. Now the accelerating impulse is p.cos φ, where p is the momentum available to be transferred to the probe, and there is a sideways impulse of p.sin φ. If the probe moves even slightly sideways, as noted above, it starts to spin, and it is out of control. Alternatively, if it starts to spin in any way at all, the sail gives the probe a lateral nudge. It does not take much to exceed one part in forty billion. The probe will move out of the laser beam, and it would seem extremely difficult to devise a means of correcting such errors, because you have no way of knowing where the probe is once it is a reasonable distance underway. In short, a great idea in principle, but geometry is not going to make this at all easy. What I find hard to understand is why the proponents of this scheme do not seem to have stated how they can get around this obvious problem.
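To put a number on how little tilt it takes, here is a sketch using the same simple decomposition as above (the full physics of a reflecting sail is messier, so treat this as illustrative only):

```python
import math

def impulse_components(p, tilt_deg):
    """Split an impulse p into accelerating (p*cos phi) and sideways
    (p*sin phi) parts for a sail tilted tilt_deg from perpendicular."""
    phi = math.radians(tilt_deg)
    return p * math.cos(phi), p * math.sin(phi)

# Even a thousandth of a degree of tilt:
forward, sideways = impulse_components(1.0, 0.001)
print(sideways)  # ~1.7e-5 of the total impulse goes sideways
```

A sideways fraction of ~10^-5 is about a million times larger than the one-part-in-forty-billion angular budget worked out above, which is the geometric heart of my objection.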

Proxima b

The news this week is that a planet has been found around Proxima Centauri, the nearest star to our solar system. Near, of course, is relative. It would take over four years for a radio message to get there, or over forty years to travel there at 0.1 times light speed. Since we can never get anywhere near that speed with our current technology, it is not exactly a find critical to our current society. The planet is apparently a little larger than Earth, and it is in the so-called habitable zone, where the star gives off enough heat to permit liquid water to flow, assuming the planet has sufficient atmospheric pressure of a suitable composition. That last part is important: Mars and Venus might permit water to flow if their atmospheres were different. In Venus’ case there is far too much carbon dioxide; in Mars’ case there is insufficient atmosphere, and it might also need something with a stronger greenhouse effect than carbon dioxide.

Proxima Centauri is what is called a red dwarf. Its mass is about 1/8 of the sun’s, so it gives off a lot less energy. However, Proxima b is only 0.0485 times as far from its star as Earth is from the sun, so being closer, it gets more of what little heat is available. The question then is, how like Earth is it?
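How much heat it gets is a one-line inverse-square calculation. The figure of about 0.0017 times the sun's luminosity for Proxima is a commonly quoted estimate, not something from the discovery announcement, so treat it as an assumption in this sketch.

```python
# Flux at the planet relative to what Earth receives, using luminosity in
# solar units and orbital distance in AU: flux falls off as 1/distance^2.

def relative_flux(luminosity_solar, distance_au):
    """Stellar flux at the planet, in units of Earth's insolation."""
    return luminosity_solar / distance_au ** 2

# Proxima: assumed luminosity ~0.0017 L_sun; Proxima b at 0.0485 AU.
print(relative_flux(0.0017, 0.0485))  # ~0.72 of Earth's insolation
```

So despite the feeble star, the planet plausibly receives within a few tens of percent of Earth's insolation, which is why it sits in the habitable zone.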

Being so close to the star, standard wisdom argues that the planet will be tidally locked, i.e. it always keeps the same face towards the star. Then half of the planet is unaware of the star’s presence; one side is warm, the other very cold and dark. However, maybe here standard theory does not give the correct outcome. Tidal locking is the valid expectation as long as gravitational interactions are the only ones applicable, but are they? According to Leconte et al. (Science 346: 632–635), another effect can override it. Assuming the planet does not start tidally locked, and that it has an atmosphere, there is asymmetric heating through the day, and because the highest temperature occurs at about 1500 hrs, the heated air rises there and air from the cold side flows in to replace it, which leads to retrograde rotation through conservation of angular momentum. (Prograde rotation is as if the planet went around its orbit rolling on something.) Venus is the only planet in our system that rotates retrogradely (leaving aside Uranus, which is tipped on its side), and Leconte argues it does so for this reason. The effect on Venus is small because it is quite far from the star and has a very thick atmosphere.

Currently, everyone seems to believe that because the planet is in the habitable zone then it will be a rocky planet like Earth. This raises the question, how do planets form? In previous posts I have outlined how I believe rocky planets form, (https://wordpress.com/post/ianmillerblog.wordpress.com/568 and https://wordpress.com/post/ianmillerblog.wordpress.com/576 ). These posts omit the cores of the giants, which in my theory accrete like snowballs. There are four cores leading to giants (Jupiter, Saturn, Uranus and Neptune) and there are four sets of ices to form them. (There are actually potentially more that would lead to bodies much further out.) The spacing of those four planets is very close to the projected ice points, assuming Jupiter formed where water ice would snowball.

Standard theory has it that we start with a distribution of planetesimals about the star, and these attract each other gravitationally, and larger bodies accrete until we get protoplanets, then there are massive collisions. The core of Jupiter, for example, would take about 10 My to form, then it would rapidly accrete gas.

There are, in my opinion, several things wrong with this picture. The first is that nobody has any idea how the planetesimals form. These are bodies as large as a major asteroid. The models assume a distribution of them, and this is based on the assumption that all mass is evenly distributed throughout the accretion disk, except that the density drops off in proportion to 1/r^x. The index x is a variable that has to be assumed, which is fine, and it is then assumed that, apart from particle density, all regions of space have an equal probability of forming planetesimals. The particle density is the problem. Beyond the distance of Saturn, collision probability becomes too low to form planets, although the distribution of planetesimals permits Uranus and Neptune to migrate out of the planet-forming zone.

Now, for me a major problem comes from the system LkCa 15, where there is a planet about five times bigger than Jupiter, about three times further from its star than Jupiter is from ours, around a star slightly smaller than our sun that is only 3 My old. There has simply been insufficient time to form that by collisions. In my ebook Planetary Formation and Biogenesis I proposed that bodies start accreting in the outer regions like snowballs. As the ices are swept towards the star, once they reach a certain temperature they start sticking together, and such a body can grow very quickly because, as it grows, it starts orbiting faster than the gas. Accordingly, what you get depends on the temperatures in the accretion disk. Unfortunately, for any given star we have no idea of that distribution because the disk is long gone. To a first approximation, though, the temperature depends on the heat generated less the heat lost. The heat generated at a point depends on the gravitational potential at that point, which depends on the stellar mass M, and on the rate of flow past that point, which to a very rough approximation depends on M^2. (That is from observation, and the uncertainty is over 100%.) So if we assume that all disks radiate equally (they don’t) and we neglect accretion rate variability, the position of a given type of planet depends on M^3. That gives a rough prediction of where planets might be, within a factor of about 3, which is arguably not very good.

However, if we use that relationship on a red dwarf of Proxima’s mass, the Jupiter core would be about 0.01 A.U. from the star. In short, Proxima b is at the very centre of where the Saturn equivalent should be, although I think it is far more likely to be the Jupiter core. The relationship is very rough, as the rate of accretion varies considerably from that relationship, and also, as the distances collapse, the size of the star now becomes significant, and back-heating will push the ice point further out. However, what is important here is that this “ice point” is relevant to any accretion theory. If ice is a solid at a given distance, then the ice should be accreted alongside any other solid.
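The M^3 scaling above is easy to play with. This sketch scales the giants' actual positions by the cube of the stellar mass, taking Proxima's mass as 1/8 of the sun's per the text; remember the relationship is only claimed to be good to a factor of about 3.

```python
# Semi-major axes of our giants in AU (standard values), scaled by M^3.
GIANT_POSITIONS_AU = {"Jupiter": 5.2, "Saturn": 9.6,
                      "Uranus": 19.2, "Neptune": 30.1}

def scaled_position(position_au, stellar_mass_solar):
    """Predicted position of the equivalent planet around a star of the
    given mass (solar units), using the rough M^3 relationship."""
    return position_au * stellar_mass_solar ** 3

for name, a in GIANT_POSITIONS_AU.items():
    print(name, scaled_position(a, 1.0 / 8.0))
# The Jupiter equivalent comes out at ~0.0102 AU, the "about 0.01 A.U."
# quoted above; the Saturn equivalent lands near 0.019 AU.
```

Proxima b's actual 0.0485 AU is within the stated factor-of-three of the Saturn equivalent, which is the comparison being drawn in the text.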

So my opinion is that Proxima b would be a water world, with no land on the surface at all. Further, if it is the Jovian equivalent, it probably will not have an atmosphere, or if it does, it will be mainly oxygen from photolysed water. So I am very interested in seeing the future James Webb telescope aimed at that planet, as water gives a very clear infrared spectrum.

Money and Sport

Now that the Olympics are over, we can admire what most of the athletes have achieved. Yes, they are professional, and they get paid to do it, but nevertheless those who were there have shown genuine dedication. Whether professionalism is bad depends, I guess, on your point of view. The original games were also professional, and the Greeks designed them to ensure their soldiers were fit and skilled. In those times, throwing a javelin was not for fun. No need to let good ideas moulder away; I pinched that concept in my novel “Miranda’s Demons”, where an insurrection was taking place on Mars. The rebels took time out to hold games, with contests such as cross-country running, throwing and shooting in pressure suits. The objective was to find the best way to carry out these activities when it became time to fight in the open.

In the earliest revitalized games the contests were for amateurs, but that was not quite as noble as some would have you think. It had one primary purpose: to reinforce the class structure of the time. To illustrate that point, consider the opprobrium deposited on Harold Larwood for his “bodyline” bowling against the Australian cricket team. Larwood came from the “working class” hence was designated a player; a gentleman would not do that. Of course that was quite wrong. His team captain was one of the “gentlemen” and he more or less ordered Larwood to bowl like that. Hypocrisy has always been strong in sport too.

Anyway, while such professional athletes do extremely well in the field, a question arises about the nature of sport outside the actual events. One example was the vociferous attacks on Russian doping, which I mentioned earlier. These would have been more valid if the loudest voices were not supporting replacement athletes who could only get there if spaces were made available. The concept of conflict of interest is rather weak these days.

However, more publicized was the account of some American athletes who claimed to have been held up by Rio police, a story later amended to villains dressed as Rio police. Eventually this was shown to be quite wrong: what had actually happened was that the group, who had had far too much to drink, had trashed a Rio business. (The details are still somewhat obscure, as different versions have been published.) Why would someone do that, then make up such a silly cover story? Much better to come back, apologize, and offer restitution for the damage, in which case this would probably have blown over; instead the lies continued, in a sequence of watered-down versions.

I should add that this sort of behaviour is not restricted to Americans. I know that in New Zealand, every now and again, a famous sportsman has far too much to drink and does something stupid, and hence hits the headlines, but usually there is at least some semblance of contrition when sobriety returns. The usual explanation is that when such young people are suddenly showered with money beyond the wildest dreams of ordinary members of their age group, they simply cannot control themselves. It is almost as if some of them need minders on certain occasions.

Sometimes, however, it seems that others need minders, the others being organizers and officials. One of the best-known sporting teams in New Zealand is the All Blacks, the national rugby team, who are currently world champions. (Yes, it helps that in most countries rugby is a minority sport.) The All Blacks spent all of last week in Sydney, preparing to play Australia on the Saturday. On the Friday (I think) they announced that they had found a bug in the room they used for team meetings. An initial response from the Australians was that the All Blacks must be paranoid to be looking for bugs. Hold on a minute: they found a bug. You are not mad if you have suspicions that turn out to be true. This is particularly so given a clear record of Australians having secretly video-recorded team practices prior to games, and not only against New Zealand, as the South Africans caught them out once too. Why do they do this sort of thing? Why is a game that important? The answer, of course, is money. Winners tend to get far better sponsorship deals.

So, the outcome of this debacle? Well, the All Blacks thrashed the Wallabies. There may be some poetic justice here, because after the game was over the All Blacks announced that they had actually found the bug on the Monday, but they left it there. I would like to think they managed to concoct some totally erroneous plans to teach their opponents not to do that sort of thing.

Where to with the European Union?

Is the European Union a grand concept in the making, or merely a muddled mess? I suspect the latter. The British voters have now warned the European Union that Britain will exit; however, since then nothing much has happened. One reason I saw recently is that there will be elections in the next year or so in France and Germany, and the British want to know whom they will be negotiating with before they start. At first sight that seems sensible, except consider how many countries there are in the EU: there is always going to be an election somewhere in the near future. Does that mean France and Germany are more important? (Yes, they are, but it is not politically correct to say so.) So the first problem with exiting the European Union appears to be that there is no actual centre, other than an issuer of regulations conceived essentially by bureaucrats.

Herein lies a great problem for the EU. It permits individual countries to have their own political systems, nominally subordinate to a number of regulations from Brussels (and this is a very large number), and the result is a chaotic situation in terms of economies. The Union may have started as a common market, but it has changed considerably since then, and not in any clear direction. This was made worse when the European countries decided to move towards union via a common currency, apparently hoping that the various countries would adopt common economic policies. However, in representative systems of government, where politicians have to win elections, a common policy is very difficult unless there is some overall federal government. Thus in the US, while the various states can carry out independent actions to a certain degree, Washington still has overall control. That is something the EU has seemingly rejected, even though its structures seem to be heading that way by stealth. And it may be the stealth that has annoyed the British voters.

When the Euro came into being, there were marked structural differences between the various economies, and there was never a plan to harmonize them. Going further, the concept tended to involve a right-wing economic view that markets were rational and would self-correct. Just leave it to the market. So they did, and in 2008 there was chaos with a number of casualties. The ordinary Irishman had to pay, under edict from the European Central Bank, to rescue the corrupt Irish bankers. It continues with Greece. I can see no plausible mechanism by which Greece can get out of the mess the German bankers got them into. It is true that the Greek politicians made the situation much worse, but why should the average Greek have to pay for their politicians' inability to accept policies that might lead to their being voted out?

The question then is, what can the European countries do? Superficially, the simplest "cure" would be to restructure debt and adopt countercyclical measures to fix productivity. The fact seems to be that you can have all the austerity you want, but you cannot have increasing productivity at the same time. It would also be necessary to heavily regulate the banking sector. However, none of this is going to happen. Germany wants to focus on price stability because its economy is, on the whole, doing much better than most. Further, it thinks that imposing austerity on others is the only way its big banks are going to get their money back. That is unlikely to be true, because with economies like Greece, austerity has led to a major contraction, and it is difficult to get more money out of a severely contracting economy.

I suppose there is also the question of whether economic measures can increase productivity. They can certainly destroy it, but increasing it requires more than investment, although investment is certainly critical. The point is, it has to be the right investment. A machine might do the work of ten people, but if it makes something nobody wants, it is actually destructive. So the problem reduces to finding the right people in the right place at the right time to see what the right things are to do, and then ensuring that there are adequate resources so that these things actually get done. It is here that Germany shines. The problem then is, while it is great that Germany shines, if the rest of Europe does not, then Europe as a whole performs indifferently, none of which is helped by prolonged uncertainty about the position of Britain. This would not matter if the rest of the world were vibrant, but it is not. Now that the world has become so interconnected, we cannot have various parts of it performing poorly, because it is highly likely that there will be serious difficulties at some time in the future. Economies always seem to go in cycles, and we cannot seem to get ourselves out of the 2008 downturn. We need to put that behind us before the next downturn arrives.

So where to for Europe? In my view, it has to make a clear decision. Either it should revert to a common market, or it should aim to be the United States of Europe. It has to get off the fence; it cannot succeed by being something in between.

Election Hazards

One of the curses of the republican form of government we have, in which we elect representatives to govern us, is that politicians have to make statements about what they are going to do, and much of the time these are made "on the hoof", so to speak, without proper consideration of the consequences. For most countries, this is annoying for the citizens who may suffer, but for America it is worse because everyone else suffers as well. We might hope that America, with its greater population, would produce more suitable politicians, but this may not be the case. It does not help that the American system is so prolonged, and this year, so bitter.

What sparked this post was an item in our newspaper about Syria.

The situation is, the Russian air force is assisting the Syrian army in its advance on Aleppo, and if Assad can retake it, then he controls most of the population centres. Thirty percent of the country would still be controlled by ISIS, but that is mainly desert. What started as a sort of revolution to oust Assad, helped mainly by Saudi Arabia and the US, instead turned the country upside down and into a happy feeding ground for ISIS, helped by the fact that many of the rebel groups are essentially al Qaeda offshoots. Of course al Qaeda does not particularly support ISIS, but the rebels are indirectly helping ISIS, and there is no evidence they are doing anything directly to oppose it. They may have a different version of Islamic terrorism, but al Qaeda is still a terrorist group.

The good news for Assad is that Turkey is now not so eager to help the US in its efforts to get rid of Assad, the reason being that Erdogan now considers that there was US assistance to the recent coup attempt to oust him. Given the US involvement in a number of oustings of established governments, he is hardly likely to give the US the benefit of any doubt he might have. That there may be no real evidence to support the assertion is beside the point, especially since the US and Europe have been heavily criticizing Erdogan for his purge following the coup attempt. They may well be right that most of those purged were innocent of the coup attempt, but being right does not mean anything to Erdogan, as he cannot afford plotters. From Erdogan's point of view, if the US wants to get rid of Assad and is prepared even to support al Qaeda affiliates to achieve that, then maybe the US is playing the same game with him. Can you blame him? When the facts are unclear, track record counts. The net result of this uncertainty is that the supply of weapons and other materiel from Turkey to the rebels in Aleppo is drying up.

So, where does that leave the US election process? Apparently both Trump and Clinton have stated that a major alteration of strategy is required, and I think that up to this point they are both right. The problem is, alter to what? According to the report in our newspaper, Clinton would order a “full review” of US strategy to get the “murderous regime” of Assad out, while escalating the fight against Islamic state. It quotes an advisor, Jeremy Bash, as saying Clinton has promised to establish a no-fly zone over Syria.

That, to my mind, is a potential disaster. What would Russia do? If Putin simply walks away, that would be the next best thing to a disaster for him. If he flies, do US planes shoot down his aircraft? The US air force would probably win any given combat, but that effectively triggers a major war. The problem is, where do the US planes come from? If they come from Turkey, and Russia attacks the base, then NATO is drawn in, and Russia has to do something about the missiles pointing at it from its western borders. WW III is underway. If Turkey bans them, the US has to resort to a carrier. Suppose the Russians sink the carrier; now what? At first, each side finds out how good their forces are, and the Russians may not be very happy with what they discover. On the other hand, it is unlikely to be a free shot, and there will be US casualties. The next problem is how to contain this. Is getting rid of Assad really worth the risk of triggering World War III?

Trump apparently has stated (correctly, in my view) that the US has bigger problems than Assad, and he would assist Putin in getting rid of ISIS. On this point at least, I think Trump is right, but being right does not win votes. Now Trump is accused of being a Putin plant. This is bad, because it means that anyone who tries to be reasonable with Russia is going to be accused of being . . . what? Is this a return to McCarthyism?

None of this is very encouraging.