## Saturday, October 29, 2011

### Climate alarmists as numerologists

Harold Camping calculated May 21st, 2011 as the date of the rapture (by the way, his revised schedule placed another phase in this very month). Arthur Eddington related all the important parameters of the Universe to the inverse fine structure constant – which he thought to be 136 rather than 137.036. Alejandro Rivero is one of his numerous modern followers.

Numerology is the would-be scientific study of hypothetical relationships between "important numbers" and events in the complex world or in everyday life. This very definition conveys the message that such reasoning is usually irrational; I discussed these matters 3 years ago in Excitement about numerical coincidences.

Complex quantities in the real world don't have a good reason to be described by "simple numbers" such as integers (if they're a priori real). Moreover, the circumstantial evidence that numerologists possess is usually extremely weak: they may "match" two or three digits which is just too little for a sensible person to get excited. However, when you happen to match six digits such as 196884 in the case of monstrous moonshine, there is some marginal reason to be open-minded. Indeed, in this case, the number appeared in the expansion of the $$j$$-function, a one-to-one map of the fundamental domain of the $$SL(2,Z)$$ group to the sphere,
$j(\tau) = 1/q + 744 + 196884 q + 21493760 q^2 + ...$ and in the dimension of the Griess algebra, 196884 (the smallest nontrivial representation of the Monster group is 196883-dimensional: the Griess algebra stripped of the single invariant "direction of real numbers").
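These coefficients can actually be reproduced numerically from the textbook formulas $$j = E_4^3/\Delta$$ with $$E_4 = 1 + 240\sum_n \sigma_3(n) q^n$$ and $$\Delta = q\prod_n (1-q^n)^{24}$$; a minimal sketch with truncated integer power series:

```python
# Compute the first coefficients of the j-function as a formal power series,
# using j = E4^3 / Delta (standard formulas, truncated integer arithmetic).
N = 4  # number of q-powers to keep

def sigma3(n):
    """Sum of cubes of the divisors of n."""
    return sum(d**3 for d in range(1, n + 1) if n % d == 0)

def mul(a, b):
    """Multiply two truncated power series (lists of coefficients)."""
    c = [0] * (N + 1)
    for i, ai in enumerate(a[:N + 1]):
        for j in range(min(len(b), N + 1 - i)):
            c[i + j] += ai * b[j]
    return c

# Eisenstein series E4 = 1 + 240 * sum sigma_3(n) q^n
E4 = [1] + [240 * sigma3(n) for n in range(1, N + 1)]
E4cubed = mul(mul(E4, E4), E4)

# Delta/q = prod_{n>=1} (1 - q^n)^24, truncated at q^N
D = [1] + [0] * N
for n in range(1, N + 1):
    onemq = [0] * (N + 1)
    onemq[0], onemq[n] = 1, -1
    for _ in range(24):
        D = mul(D, onemq)

# q*j = E4^3 / (Delta/q): series division (D[0] == 1, so all coefficients stay integers)
qj = [0] * (N + 1)
for k in range(N + 1):
    qj[k] = E4cubed[k] - sum(qj[i] * D[k - i] for i in range(k))

print(qj)  # [1, 744, 196884, 21493760, 864299970]
```

The printed list is the expansion $$1/q + 744 + 196884 q + 21493760 q^2 + \dots$$ shifted by one power of $$q$$.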

Yes, several digits matched and one could actually find other, more complicated numbers that appeared in both of these seemingly distinct and unrelated contexts. It was later understood that this was no coincidence: string theory allowed the mathematicians to "connect" the monster algebra, a huge animal from the research of huge discrete symmetries, to the $$j$$-function which seems to be just some topic in complex analysis. Needless to say, those successes are exceptional and I don't want to talk about them here. Instead, I want to look at some proclamations by the climate alarmists.

We will look at an article by Judith Curry. She says some wise things: instead of fighting the "deniers", the alarmists should have listened to the warnings and ideas of the skeptics and should have avoided overconfident claims that we will "very likely" see continuing and relentless warming because it's caused by humans – claims that look really bizarre after a decade without warming (such as the most recent one). When I say that her comments are wise (not only when addressed to climate alarmists), I must also add a disclaimer which became necessary after our experience with her Berkeley Earth colleague, Richard Muller: I am not saying that she is a climate skeptic and my endorsement of this comment of hers doesn't mean that I endorse all of her future comments and attitudes.

But it is a quote attributed to Kevin Trenberth, a climate alarmist, that Curry mentioned and I would like to discuss in much more detail.

At the June 2011 conference on forecasting, I probably surprised Scott Armstrong and Kesten Green by a kind of defense of Kevin Trenberth. I would say that he is a scientist who has also made some legitimate contributions to atmospheric physics besides his climate panic. Well, I must add today that he is a lunatic, too. He is irritated by the observation that one decade has shown no warming trend – one of the numerous inconvenient truths – and he says:

17: In any case, one decade is not long enough to say anything about human effects on climate; as one forthcoming paper lays out, 17 years is required.
Note the number 17 – which is really a prominent actor in the insane paper Trenberth mentions. Is that a celebration of the Great October Socialist Revolution in 1917? This number is extremely far from being the only arbitrary figure that is attributed a nearly religious importance by the climate crackpots. Let me mention just two more major examples (there are many more):
2.0: The world has to avoid the warming by more than 2 °C (the benchmark with respect to which we are comparing is never defined) because we will otherwise enter Hell on Earth.

350: We have to return the CO2 concentration from 392 ppm to 350 ppm because only at or beneath 350 ppm of CO2, our Earth has a chance to survive.
These are just three numbers, 17, 2.0, and 350, that play a prominent role in the hysterical new religion. Arbitrary plans to reduce CO2 emissions by 20, 30, 50, or 80 percent may be added to the list, too. Needless to say, all of them are completely meaningless random numbers, nothing special may be occurring when the corresponding quantities reach these values, and the qualitative statements are absurd with any numbers, anyway.

I am flabbergasted when someone fails to see that claims with similarly precise numbers are products of dysfunctional brains. However, the truth is even worse than that: those specific numbers are being used by those people exactly because they think that if they add particular precise numbers, their statements will look more convincing, more serious, and more scientific!

Well, they may only look more convincing to listeners who have no idea about rational reasoning whatsoever. When we say that something occurs when a quantity reaches the value $$X$$, we should realize that in the real world, the value of $$X$$ is a priori continuous and therefore never quite precise. We may at most say that it's approximately $$X$$; more quantitatively, we may determine a probability distribution saying how likely it is that the quantity has one value or another. The most important features of a probability distribution are its mean value $$X_0$$ and its standard deviation $$\sigma$$. Let's use the term "error margin" for the latter.

None of the three statements – or dozens of analogous would-be quantitative statements that the climate alarmists offer – is accompanied by an error margin. Well, that's not too surprising because in the case of 2.0, even the meaning of the mean value is ill-defined: the "initial moment for comparisons" isn't described unequivocally. However, even in the other examples, it's clear that we can determine neither the error margin nor even the mean value as long as the statements are formulated in such a vague way.

Just to be sure, let's go through the three examples one-by-one.

Seventeen years

Kevin Trenberth says that one needs exactly seventeen years, and not just ten, of absent warming to disprove the theory of man-made global warming. This absolute and seemingly authoritative statement misses the fact that there are no sharp numbers here at all. First of all, humans surely have a nonzero effect on the temperatures. The previous sentence would remain valid even if the human contribution to the temperature changes were 1,000 times smaller than the largest natural driver. So we can never prove that the human effect is "exactly zero" because that is clearly an invalid statement.

So you have to clarify the statement. You have to say that the theory you want to disprove is e.g. that "the human contribution to temperature changes in $$X$$ years is larger than 50 percent." Note that I have mentioned the time scale – those $$X$$ years – because at different time scales, the percentage contributed by the humans is different. However, it depends not only on the length of the time interval; the percentage coming from the human activity depends on which exact $$X$$-year period we consider because at least the natural drivers are variable and quasi-periodic. Sometimes their contribution is higher, sometimes it's lower.

To deal with this problem, you would either have to talk about a particular – unrepeatable – period in human history or you would have to carefully describe how you're averaging over many periods. There are many subtleties in this business that may change the quantitative answers by dozens or hundreds of percent. There is no "big question" here that can be given a "big yet accurate answer" resembling a commandment from the Bible. This is just not possible in science. We are dealing with a very complex system involving thousands of not-too-important but still non-negligible processes and the devil is in the details.

I think that this very general statement about the devil – a very self-evident one – is already hugely inconvenient for the alarmists because the invention of nonsensical simple-minded slogans that should be parroted by the brainwashed masses is one of the main reasons for the alarmists' existence. This is what they really want their "climate science" to be based upon. This is what they want to do at work; this is what they ideally want to be paid for.

Fine. Imagine that you exactly define what you mean by the "large man-made contribution" hypothesis and you want to determine the timescale over which a warming trend is allowed to be absent. How many years do you have to wait in order to decide whether the man-made contribution is persistent and dominant? It's $$X$$ years. How much is $$X$$?

Well, even at this level, there can't be any sharp answer. Even if the man-made contribution were dominant, there's always a nonzero probability that the other climate drivers (due to their quasi-randomness) will compensate or defeat the man-made contributions in a whole $$X$$-year period, regardless of the magnitude of $$X$$. If there is an "underlying" warming trend, the probability will decrease with $$X$$ but it will never be strictly zero. When Trenberth et al. chose $$X=17$$, they implicitly say that there must be something special about $$X=17$$ as opposed to a rounder number such as $$X=20$$. However, if the probability that the man-made contribution is beaten by the natural drivers is 10 percent for $$X=17$$, it may be still over 5 percent for $$X=20$$. So even if you could crisply and accurately calculate the confidence levels for different values of $$X$$, there is still "no preferred confidence level". A 90% probability of a warming or a 95% probability of a warming are still very far from a certainty. Moreover, the calculation of the confidence level as a function of $$X$$ is surely less accurate than what would be needed to distinguish 17 from 20 years.
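The smooth dependence of that probability on $$X$$ can be seen in a toy Monte Carlo simulation. The numbers below (a 0.015 °C/yr underlying trend, AR(1) "natural" noise with my own illustrative parameters) are assumptions for the sketch, not anyone's measured values; the point is only that the probability of a non-positive $$X$$-year trend decays smoothly with $$X$$, with nothing special happening at $$X=17$$:

```python
import random

random.seed(1)

TREND = 0.015   # assumed underlying man-made trend, °C per year (illustrative)
PHI = 0.7       # AR(1) autocorrelation of the "natural" noise (illustrative)
SIGMA = 0.1     # innovation standard deviation, °C (illustrative)
TRIALS = 4000

def ols_slope(y):
    """Ordinary least-squares slope of y against 0..n-1."""
    n = len(y)
    tbar = (n - 1) / 2
    ybar = sum(y) / n
    num = sum((t - tbar) * (yt - ybar) for t, yt in enumerate(y))
    den = sum((t - tbar) ** 2 for t in range(n))
    return num / den

def p_no_warming(years):
    """Fraction of simulated periods whose fitted trend is <= 0."""
    hits = 0
    for _ in range(TRIALS):
        noise, y = 0.0, []
        for t in range(years):
            noise = PHI * noise + random.gauss(0, SIGMA)
            y.append(TREND * t + noise)
        if ols_slope(y) <= 0:
            hits += 1
    return hits / TRIALS

for x in (10, 17, 20, 30):
    print(x, round(p_no_warming(x), 3))
```

The printed probabilities fall gradually as the window lengthens; there is no value of $$X$$ at which "noise wins" flips to "trend proven".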

So all numbers such as "17 years needed for a proof of a warming trend" are nonsensical. By the way, by this very criterion, man-made global warming has already been excluded because there was no warming trend between the mid 1940s and the mid 1970s – 30 years, which is longer than 17 years. You may say that the CO2 emissions were somewhat lower in the 1950s and 1960s than they are today; fine, but they may also be higher in the future than they are today. So the figure of 17 years also depends on the year in the middle of the interval or, which is implicitly the same dependence, on the typical annual CO2 emissions during this period.

To summarize, Trenberth's mentioning of "17 years" without any error margins is meant to impress true morons who think that the appearance of one number makes things more scientific. But they don't actually understand that the right numbers depend on dozens of factors and these numbers should be described with a confidence level and/or an error margin, anyway.

Two degrees

Analogously, some climate alarmists have brainwashed a couple of idiotic politicians and journalists into saying that "our goal is to keep the temperature change within 2 °C". Here, even the mean value is completely undefined because we're comparing the global mean temperature sometime in the future with... well, we are not even told with what. Those people clearly mean something like a "pre-industrial temperature" except that there has never been any single or unique "pre-industrial temperature". During the glaciation cycles, the temperatures change within an interval whose width is comparable to 10 °C. It's very likely that the typical change of the global mean temperature during a 500-year period often exceeded 1 °C even in the pre-industrial era. It makes a big difference whether you compare the future temperature with the year 1300, 1650, or 1850.

Insert your favorite discussion of the Medieval Warm Period and the Little Ice Age and more ancient warm and cool periods that are supported by historical sources.

So the initial moment for comparisons isn't specified. But another thing that isn't specified is the convention with which we define and measure the global mean temperature – e.g. the altitude (or depth in the sea) at which the contribution of each square meter is measured, and dozens of other important subtleties. Also, the surface weather stations produce warming trends that are something like 50% higher than those measured by the satellites.

Which of them are you supposed to use when you decide about the important "preservation of the global mean temperature in the right interval" (not to mention the fact that the error margin of the centennial temperature change at a particular place of the globe exceeds 2 °C, because of the spatial dependence of the trend, so those holy 2 °C may be viewed just as noise for all local purposes, anyway)?

Because the accumulated difference between the datasets (e.g. UAH AMSU vs GISS) easily reaches 1 °C in a century, the precise date of the "apocalypse" may depend on the dataset and shift by as much as one century if you use a different methodology. These differences are not just due to errors of the methodologies, something that could be avoided. If you define the global mean temperature using the natural definitions boiling down to weather stations, you just get a differently behaving quantity than if you do the natural things with the satellite data. These global mean temperatures are genuinely distinct quantities. There is no "one, single, and unique" global mean temperature – another basic inconvenient truth for all those who want to place this unimportant and ill-defined quantity on a religious pedestal.
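A back-of-the-envelope check of how strongly the "date of the apocalypse" depends on the dataset: take two hypothetical datasets whose trends differ by 50%. The trend values below are illustrative round numbers of mine, not the actual UAH or GISS figures; even these modest numbers shift the crossing of the holy threshold by roughly half a century, and smaller trends widen the gap toward a full century:

```python
THRESHOLD = 2.0   # the "holy" warming limit, °C

# Illustrative trends in °C per year; the surface-station trend is taken
# 50% higher than the satellite one, as discussed in the text.
satellite_trend = 0.014
surface_trend = 1.5 * satellite_trend   # 0.021

# Years until each dataset crosses the 2 °C threshold at a constant trend
years_satellite = THRESHOLD / satellite_trend   # ~143 years
years_surface = THRESHOLD / surface_trend       # ~95 years

print(round(years_satellite - years_surface))   # the "apocalypse" moves by ~48 years
```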

Imagine some people in the future – today's children successfully brainwashed by the global warming propaganda – living in a world where all people with IQ above 70 have been executed as deniers, so only alarmists like them are left. They were told that when the temperature increase surpasses 2 °C, the Armageddon arrives. Imagine that they were taught to read the number from the thermometers. But when will the doomsday arrive, those alarmists still capable of speaking will ask? Depending on how they measure it, their predictions differ by a century. Of course, the right answer is that there isn't any Armageddon but you won't be able to explain this simple thing to the hopeless morons who survived the "denier hunt of 2055". ;-)

Because the quantity called "the increase of the global mean temperature" is so ill-defined, we can't even criticize the people for their failure to mention the error margin. Obviously, the error margin is comparable to 100% of their mean value because the precision with which they define what they mean by their "mean value" is so lousy that it makes no sense to talk about this utter idiocy at all. Well, that doesn't prevent some people from promoting policies to throw trillions of dollars into the toilet because of this idiocy.

350 ppm of carbon dioxide

But the most insane example involves those 350 ppm. Recall that the CO2 concentration used to be over 4,000 ppm half a billion years ago when life began to flourish and diversify. Then it was dropping, and in recent epochs (the last few million years), it was around 180 ppm during ice ages. Plants start to go extinct if the figure drops below 150 ppm: plants were trained by Nature to survive starving concentrations that may be as low as 180 ppm but not much lower than that.

During the interglacials – the warm "non-ice ages" in the glaciation cycles – the concentration was close to 280 ppm; it's higher than during ice ages because the heat encourages the CO2 (and many other gases) to escape the ocean. These interglacials included the pre-industrial era; we're still in such an interglacial, waiting for another ice age (hopefully in thousands of years or more). However, we also added CO2 in the recent two centuries so the warming and cooling oceans were no longer the only major driver of the CO2 in the atmosphere and the current concentration is 392 ppm and keeps on growing by almost 2 ppm a year. It will reach 560 ppm, the "doubled concentration", sometime between 2070 and 2090. The peak we reach (due to the finiteness of fossil fuels or their replacement by something else in a century or two) will probably be between 600 ppm and 1000 ppm. At those high levels, Nature will absorb the excess CO2 much more quickly so even if our CO2 emissions increase well above the current ones, it will be hard for the CO2 to get much higher than that.
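A quick sanity check of those dates – assuming, as a pure illustration, that the current ~2 ppm/yr increment either stays constant or grows by 1% a year:

```python
def year_of_doubling(start_year=2011, start_ppm=392.0,
                     increment=2.0, growth=0.0, target=560.0):
    """Year when the CO2 concentration first reaches `target` ppm,
    if the annual increment grows by the factor (1 + growth) each year."""
    year, ppm = start_year, start_ppm
    while ppm < target:
        ppm += increment
        increment *= 1.0 + growth
        year += 1
    return year

print(year_of_doubling(growth=0.00))  # constant 2 ppm/yr: 2095
print(year_of_doubling(growth=0.01))  # mildly accelerating: 2073
```

So a constant increment overshoots the 2070-2090 window a bit while even a mild 1% annual acceleration lands inside it; the date depends sensitively on assumptions about future emissions.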

Plants like higher CO2 levels. Each doubling of the CO2 concentration warms the Earth by 1 °C or so which means that the temperature at the peak – in 2200 or so – could be 2 °C higher than in 1750. Sensitive people start to feel uncomfortable at 5,000 ppm and the rest join them somewhere in the interval 5,000-50,000 ppm. Above the upper limit I mentioned, CO2 starts to be lethal within hours, beating oxygen when you try to breathe. 1,000,000 ppm means 100 percent, just to be sure (it's close to the ratio you see both on Mars and Venus: the total pressure is much higher on Venus than here, and much lower on Mars than here), and you can't go above that, haha.
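The arithmetic behind that estimate is just the logarithmic CO2 response, $$\Delta T = S \log_2(C/C_0)$$, with the sensitivity $$S$$ of roughly 1 °C per doubling used above; a minimal sketch (the 280 ppm reference and peak values are taken from the text):

```python
from math import log2

def warming(c_now, c_ref=280.0, s_per_doubling=1.0):
    """Temperature change from the logarithmic CO2 response:
    Delta T = S * log2(C / C_ref)."""
    return s_per_doubling * log2(c_now / c_ref)

print(round(warming(560.0), 2))    # one doubling: 1.0 °C
print(round(warming(600.0), 2))    # low end of the peak range: ~1.1 °C
print(round(warming(1000.0), 2))   # high end: ~1.84 °C
```

Note that with $$S = 1$$ °C per doubling, even a 1000 ppm peak stays slightly below 2 °C; reaching 2 °C requires a sensitivity a bit above 1 °C.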

A paper by James Hansen et al. mentioned the meaningless religious figure of 350 ppm, a level we last saw about 25 years ago. We have to return to that point, for some pseudoscientific reasons they mention. Bill McKibben, a prominent hardcore climate crackpot, has seized this figure and established a whole "movement" called 350.org whose goal is to revert the 25 years of CO2 emissions and save the world. These sick people are painting the figure "350" across the globe. When you see "350" written in your nearest public restroom, chances are that Bill McKibben has visited the facility before you. Enhanced rules of hygiene are recommended in this case because those people have stopped using the toilet paper.

Needless to say, there can't be anything special about 350 ppm. It's a random number that is "somewhere" above the natural concentration during the recent interglacials but below the current concentration and well below the concentrations dangerous for health or those that the Earth experienced when life of the kind we are familiar with was really exploding for the first time. If you omit the "current concentration" from the list, there are many more concentrations that satisfy the same conditions and that you could use instead of 350 ppm – such as 1917 ppm, to please the climate alarmists with a favorite number of theirs. Much like 350 ppm, 1917 ppm is also higher than the concentrations in the recent hundreds of thousands of years but much lower than concentrations that are dangerous for human (and animal) health.

Of course, any special importance attributed to the figure 350 ppm is nonsensical. A person could find out that this is the case if she tried to calculate the error margin with which the exceptional figure of 350 ppm is known. During such a calculation, she would be forced to discover that the arguments make no sense and that the error margin is at least thousands of ppm (in the upper direction). This is of course inconvenient for the climate communists. They want an "action" similar to the 1917 revolution and they want it "right now". They believe that random numbers elevate the apparent "urgency" of the climate claims. However, the main thing that these numbers achieve is to prove that the climate alarmists are hopeless mental cripples in the eyes of the observers with IQ above 70.

And that's the memo.

This picture looks like parody but it is actually an official picture of the Green politicians who promote the coming Durban conference on means of transportation for politicians to cool the climate and save the Milky Way.

Czechia celebrates the National Holiday today; Czechoslovakia was established on October 28th, 1918. Woodrow Wilson made the greatest contribution to its birth among the foreign politicians. Prague recently restored his statue near the main train station. It's been 93 years since that moment and people like Sean Carroll could criticize me for breaking a law of physics, the symmetry between the future and the past, by only including the anthem of the country (well, 2/3 of it) that was created after October 28th, 1918. To repair this discrimination, I also include a link to the anthem of Austria-Hungary which existed before October 28th, 1918. It sounds much like the anthem of current Germany or the Nazi Germany, for that matter, but the lyrics are different! ;-)

#### snail feedback (3) :

Just in case some modern reader does not get why I am a follower of Eddington (btw, I am not exactly; I prefer a fine structure constant from Hans de Vries), let me point to the numerology thread I created in PhysicsForums. I am happy that this thread at least helped to create awareness of the Koide formula, which rather mysteriously links the lepton masses between themselves and with the quark constituent mass (albeit the constituent mass is also produced from the electron mass straight via one third of the Lenz formula, independently discovered by more recent authors)

Another use of the word numerology, dating at least from 1974, is the attempt to fit representations of Lie groups into multiplets in an arbitrary form. In this way, I am still at it :-)

Complex quantities in the real world don't have a good reason to be described by "simple numbers" such as integers (if they're a priori real).

There's a reason why eigenvalues of Hermitian operators are real.

Hansen ought to be horse whipped 350 times for spreading that 350 disease. That is by far the most puddin'-headed, imbecilic, sickening invention of human fantasy. I can't even write the integer 350 without feeling sick.

Eddington, by the way, is frequently misunderstood. He was trying to relate the coefficients of a quadratic equation he obtained to represent the curvature of spacetime – which he believed to be a constant. A thing such as an electron, he reasoned, doesn't know how big it "has" to be – so the charge-to-mass ratio of the electron, thus the fine structure constant, thus the curvature of spacetime, thus the Hubble "constant" are all related in some way.