## Sunday, July 06, 2008 ... //

A few days ago, Tommaso Dorigo wrote an essay about numerical coincidences and fine-tuning. It doesn't seem to me that he explained the relationship between these two concepts, and his main point is not clear to me either, but he wrote many propositions I agree with. So let me write something about these matters, too.

As the readers of Common Crackpots' Errors know, I completely agree with Tommaso's thesis that numerology is one of the defining features of crackpots. And I also agree with him that numerical coincidences that seem unlikely deserve an explanation and they may often be legitimate sources of scientists' inspiration. My main point is that:

Quantitative tools exist that may determine whether a coincidence should really shock you. In many cases, it shouldn't: a "coincidence" is often very inaccurate, contrived, and unimpressive. However, whenever it should shock you, it is legitimate to search for an explanation.

### Coincidences in mathematics

Let me give you an example. The first mathematical example we will analyze was written by Tommaso: 123456789 and 987654321 seem to be pretty "simple" numbers. Most people remember them more easily than random 9-digit integers. You might think that they're pretty random numbers, anyway, because their "easiness" might look purely psychological to you. However, their ratio turns out to be 8.00000007 or so. The probability that a random ratio has 7 zeroes following the decimal point is about 10^{-7}. Yes, you should be somewhat surprised.

If you decided to declare the fact above to be a coincidence, you should honestly predict that similar things won't occur repeatedly. But what if I tell you that a ratio of two hexadecimal numbers, namely FEDCBA987654321 over 123456789ABCDEF, is equal to the decimal number 14.0000000000000002 or so. The probability that the decimal point is followed by 15 zeroes is roughly 10^{-15}. The probability that a similar miracle takes place "by chance" in these two examples is approximately 10^{-22}. Especially once you realize that similar results occur for any base, your decision should be clear: Yes, you should be stunned and you should definitely demand an explanation of these miracles! ;-)
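If you want to check these two ratios without any rounding, a few lines of Python with exact rational arithmetic will do (the helper `decimal_digits` is just an ad hoc way to extract the leading decimal digits of the fractional part):

```python
from fractions import Fraction

def decimal_digits(r, digits=30):
    """First `digits` decimal digits of the fractional part of the exact rational r."""
    frac = r - int(r)
    return str(frac.numerator * 10**digits // frac.denominator).rjust(digits, "0")

# Decimal case: 987654321 / 123456789
r10 = Fraction(987654321, 123456789)

# Hexadecimal case: FEDCBA987654321 / 123456789ABCDEF, both parsed in base 16
r16 = Fraction(int("FEDCBA987654321", 16), int("123456789ABCDEF", 16))

for r in (r10, r16):
    d = decimal_digits(r)
    zeros = len(d) - len(d.lstrip("0"))
    print(int(r), d, zeros)   # 7 leading zeros for r10, 15 for r16
```

The deviations are tiny exact rationals; for instance, r10 - 8 equals 9/123456789.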

Unlike Tommaso, I will give you one. :-) The ratio 987654321/123456789 is the same thing as (999999999-012345678)/123456789. The advantage of writing 987654321 as a difference is that the remaining numbers that occur in the expression can be written as "almost infinite" sums. First, rewrite this ratio as (9.99999999-0.12345678)/1.23456789, by canceling the factors of 10^8. Now, all the numbers can be approximated, once you realize that the following Taylor expansion identity holds:
1 + 2x + 3x^2 + 4x^3 + ... = (1-x)^{-2}
You can prove it by differentiating the more familiar "geometric series" identity, 1 + x + x^2 + x^3 + ... = (1-x)^{-1}, which has the prefactors 1,1,1,1,... instead of 1,2,3,4,... and the power "-1" instead of "-2".

We therefore have
(9.99999999 - 0.12345678) / 1.23456789 ≈
≈ (10 - 0.1 (1-0.1)^{-2}) / (1-0.1)^{-2} =
= (10 - 0.1 x 0.9^{-2}) / 0.9^{-2} =
= (100 x 0.9^2 - 1) / 10 =
= (81 - 1) / 10 = 8
It is easy to generalize the proof to a general base b - with x = 1/b in the identity above, the ratio is always close to the base minus two - and to estimate the errors. At any rate, the problem is solved. The mystery is no longer mysterious.
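Here is a minimal sketch of that generalization in Python - build the "ascending" and "descending" numbers in any base and watch the exact ratio approach (base - 2), with an error that shrinks rapidly as the base grows:

```python
from fractions import Fraction

def ascending(b):
    """The integer whose base-b digits are 1, 2, ..., b-1."""
    n = 0
    for digit in range(1, b):
        n = n * b + digit
    return n

def descending(b):
    """The integer whose base-b digits are b-1, b-2, ..., 1."""
    n = 0
    for digit in range(b - 1, 0, -1):
        n = n * b + digit
    return n

for b in range(3, 17):
    r = Fraction(descending(b), ascending(b))
    # exact rational ratio; it gets extremely close to b - 2 as b grows
    print(b, float(r))
```

For base 10 this reproduces 987654321/123456789 ≈ 8.0000000729; for base 16, the deviation from 14 is already below double precision.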

Needless to say, mathematics often reveals more non-trivial and more surprising regularities. For example, start with Euler's favorite identity,
exp(π √(-1)) = -1
Well, the square root of -1 equals "i" and the complex exponentials can be rewritten in terms of sines and cosines while π is exactly what it should be so that the result of the exponentiation equals -1. Fine. But what if you replace -1 by a different (positive) integer and try to compute
exp(π √(+67)) = ?
The argument of the exponential is now a real number, but it looks like gibberish, and if you exponentiate it, you might believe in the principle that gibberish comes in, gibberish goes out. The result should be a random real number, right? However, the result is 147197952743.9999987..., again very close to an integer (now a positive one).

The probability that there are five digits 9 after the decimal point is 10^{-5}, so as a conservative skeptic, you could still believe it was a coincidence. However, as a patient numerologist, I won't give up! ;-) And as an educated numerologist, I know it is important for scientific evidence to be "reproducible". We need an independent piece of evidence, i.e. slightly new numbers. So the similar example I offer you is
exp(π √(+163)) = ?
Before you find your accurate calculator, let me give you the result. It is 262537412640768743.9999999999993 or so. Check it out.
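You don't even need a special calculator - Python's standard `decimal` module can do both checks with 50 significant digits. The hard-coded digits of π are the only external input here:

```python
from decimal import Decimal, getcontext

getcontext().prec = 50   # 50 significant digits
PI = Decimal("3.141592653589793238462643383279502884197169399375105820974944")

for d in (67, 163):
    x = (PI * Decimal(d).sqrt()).exp()   # exp(pi * sqrt(d))
    distance = abs(x - x.to_integral_value())
    print(d, x, distance)
# d = 67  lands about 1.3e-6 away from an integer
# d = 163 lands about 7.5e-13 away from an integer
```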

The probability that there are 12 digits "9" after the decimal point is 10^{-12}. The information I "inserted" seems to be much too small for such a coincidence to occur by chance and it is not the first miracle of this kind. I won't be able to create 10^{12+5} = 10^{17} "simple" and "natural" expressions similar to the exponentials above that would allow me to call the particular results above coincidences. Again, a rational person should demand an explanation.

The fact that I could find several surprising phenomena of the same type has helped us to transform them into a "pattern" that deserves an explanation.

Of course, the explanation of these two miracles is known, too. The so-called j-invariant, an important function of a complex variable that appears as a factor in string-theoretical one-loop partition sums, can be expanded in various ways. One formula for j allows you to see that
j((1+√(-d))/2)
is exactly an integer if "d" is a Heegner number (click!), namely a member of the list 1,2,3,7,11,19,43,67,163. On the other hand, another expansion of the j-invariant, one that can be used to estimate the high-level degeneracy of the (very massive) string spectrum and that holds because of the "modular invariance" of string theory, allows you to prove that
exp(π √(d)) ≈ -j((1+√(-d))/2) + 744
so it is close to an integer, too!
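A numerical sketch of this explanation: for d = 19, 43, 67, 163, the value of -j((1+√(-d))/2) happens to be a perfect cube (96^3, 960^3, 5280^3, 640320^3 - these are standard values), and the first few terms of the q-expansion already reproduce exp(π√d) with a tiny positive residual coming from the next term:

```python
from decimal import Decimal, getcontext

getcontext().prec = 60
PI = Decimal("3.141592653589793238462643383279502884197169399375105820974944")

# -j((1+sqrt(-d))/2) = c^3 for these Heegner numbers
cube_root = {19: 96, 43: 960, 67: 5280, 163: 640320}

for d, c in cube_root.items():
    x = (PI * Decimal(d).sqrt()).exp()                 # exp(pi*sqrt(d))
    # q-expansion: exp(pi*sqrt(d)) = c^3 + 744 - 196884*exp(-pi*sqrt(d)) + ...
    approx = Decimal(c) ** 3 + 744 - Decimal(196884) / x
    print(d, x - approx)   # residuals shrink rapidly as d grows
```

The residual for d = 163 is of order 10^{-28}, which is why exp(π√163) misses the integer 640320^3 + 744 only by the 196884/x term, about 7.5 x 10^{-13}.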

The statement that the list of Heegner numbers we mentioned is complete was conjectured by Gauss, but the full proof was only found by Kurt Heegner in 1952. A layman could think that there are a lot of random lucky numbers around - such as 744 or the list of the Heegner numbers - and that they cannot play any "privileged" role in the scheme of the world.

Except that they often do play such roles. The additive shift in the identity for the j-invariant is always 744. The Heegner numbers are always 1,2,3,7,11,19,43,67,163 and, if you care, the related lucky numbers of Euler are always 2,3,5,11,17,41. There is no alternative world where the values would be different. There is no egalitarianism among the integers. ;-)

Once you study number theory, small sets of primes and other integers tend to belong to various classes and mafias. Many of them play extremely special roles in well-defined contexts. I believe that it may be counterintuitive for a self-described rational layman but it is true. It is also true in physics in general and in string theory in particular. For example, the critical spacetime dimension of perturbative superstring theory is always 10 and the dimension of the 10-dimensional N=1 gauge group is always 496. There may exist a world where creatures have 20 fingers but there exists no world where the critical dimensionality is 20 or where the other number is 486. ;-)

I could continue with hundreds of similar coincidences in mathematics. For example, a crazy privileged list of primes appears at two seemingly very unrelated places: this coincidence is known as monstrous moonshine (because it initially looked as crazy as the things you often see after a bottle of whiskey) but today, it can be (largely) explained by tools of string theory, too.

Mathematics - especially number theory, algebra, and holomorphic calculus - is full of similar miracles. Many of the most shocking ones are very close to field theory and especially string theory. If you only treat string theory as a mathematical structure, it is full of similar purely mathematical miracles.

Many of them have been rigorously understood, others can be understood if you are allowed to assume the correct (but not rigorously proven) proposition that a "magnificent, consistent, and unique mathematical structure called string theory exists". And there always exist some cutting-edge coincidences that haven't yet been proven even if you are allowed to be a "string theory believer". ;-)

### Coincidences in physics

So far, I have tried to be careful and discuss pure mathematics only. It should be possible to disconnect all the miracles and coincidences above from the real world and to prove them without any reference to it. However, related miracles occur in physics as an experimental science, too.

Tommaso's cute example of a numerical coincidence that turned out to be deep involves the Balmer series. Johann Jakob Balmer noticed that the frequencies of the hydrogen spectrum were (almost precisely) proportional to (1/m^2 - 1/n^2) where m,n were integers. That was a fascinating observation that partially inspired Bohr's old model of the atom and that was eventually confirmed by Schrödinger's exact calculation of the non-relativistic quantum Kepler/Coulomb problem.
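For the record, the Balmer (m=2) series in code - with the hydrogen Rydberg constant as the only input (the value below is the standard one, about 1.0967758 x 10^7 per meter), it reproduces the measured visible lines such as Hα near 656 nm:

```python
R_H = 1.0967758e7   # Rydberg constant for hydrogen, in 1/m

def balmer_wavelength_nm(n):
    """Wavelength of the n -> 2 transition (the Balmer series), in nanometers."""
    inverse_wavelength = R_H * (1.0 / 2**2 - 1.0 / n**2)
    return 1e9 / inverse_wavelength

for n in range(3, 8):
    print(n, round(balmer_wavelength_nm(n), 1))
# n = 3 gives roughly 656.5 nm (H-alpha), n = 4 roughly 486.3 nm (H-beta), ...
```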

Tommaso claims that Balmer was no crackpot. Well, I would be less certain. Balmer was certainly not a physicist (maybe an honorary physicist). He was a mathematician, except that he never did any work in mathematics that would deserve to be mentioned on this blog either. If you allow me to say what I think, I think that Balmer was a classic numerological crackpot but a very lucky one! Or maybe he had much better intuition than other numerological crackpots. It's hard to say.

But you should admit that it could be a coincidence that the discoverer of one of the few numerological observations that have turned out to be really deep was called Johann Balmer and not e.g. Tony Smith, Jack Sarfatti, or Lee Smolin who have never found anything of any value.

The following example involves black hole quasinormal modes. In 1998, Shahar Hod noticed that a limiting frequency of black hole ringing modes - a number that was only known numerically as 0.0437123 - can be written as ln(3)/8π.
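The identification itself is a one-line check:

```python
import math

measured = 0.0437123                 # the asymptotic frequency, known numerically
hod = math.log(3) / (8 * math.pi)    # Hod's closed form, ln(3)/(8*pi)
print(hod, abs(hod - measured))      # they agree to better than 1e-6
```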

Now, this observation due to Hod - later proved analytically by your humble correspondent and by a more flexible "monodromy method" paper by Motl and Neitzke - probably required more non-trivial reasoning than Balmer's discovery.

Hod has also sketched a whole "physical" picture why the number had to be what it was (and Olaf Dreyer has even "connected" it with loop quantum gravity in 2002), a picture in which black holes have a brutally discrete spectrum of many quantities etc. Every detail of Hod's (and Dreyer's) explanation was naive, childish, and arbitrary physical nonsense, showing that Hod (and Dreyer) didn't quite understand how black holes really work according to modern (and not so modern) theories. But what is equally important for this discussion is that some of the intuition was leading to the right Ansatz.

There exists a more rational explanation why the limiting frequency could have the form it has. You can obtain this rough intuition by simplifying our papers about the issue - by focusing on its qualitative features and by ignoring various details. Because the quasinormal frequencies are "equally spaced" in the imaginary direction, they will include a denominator similar to 2π because the equation
exp(2π x) = -K,
one that is likely to approximate the exact equation in the highly-damped limit (the simplest complex equation with equally spaced solutions in the complex plane), is solved by "x" being a half-integer times "i" (equally spaced spectrum with the spacing chosen to match the conventions) plus some real part.

The real part of "x" can be seen to be "ln(K)/2π", and the extra factor of "4" that turns 2π into 8π depends on conventions, units, and other details. It happens that K=3, for no good reason. But it's not shocking that various equations simplify so that numbers such as 3 appear in them in various limits.
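The structure of the solutions is easy to verify with a few lines of Python - plugging x = ln(K)/(2π) + i(n + 1/2) into exp(2πx) returns -K for every integer n, i.e. an equally spaced imaginary part on top of a universal real part:

```python
import cmath
import math

K = 3
for n in range(4):
    x = math.log(K) / (2 * math.pi) + 1j * (n + 0.5)
    value = cmath.exp(2 * math.pi * x)
    print(n, value)   # always -3 (up to float rounding)
```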

If you rationally try to prove the observed coincidences only, you can easily see that the factor of 1/8π or the presence of the logarithm of a simple integer has nothing to do with Bohr-like heuristics or with a hypothetical discrete spectrum of the event horizon area.

Hod has added many irrational layers of pseudoscience but what is important for our discussion is that this pseudoscience led to the same guess about a very rough form of the asymptotic Schwarzschild quasinormal spectrum as a correct derivation. Was it a coincidence that Hod interpreted this number correctly? Sorry but I guess it was: for example, all his additional predictions about other black holes, other dimensions, and other details have been falsified. Only one number, namely ln(3), characterizing one solution, the Schwarzschild solution, has worked for him.

On the other hand, for Edward Witten, it has been calculated that the probability that his achievements were a matter of "chance" rather than a testimony of his powerful mind is comparable to 10^{-200}. ;-) This tiny number is obtained because Witten has written more than 300 papers with similar (and more serious) "surprises". It's probably not a coincidence.

I don't know for sure whether Witten would be lucky or creative enough to guess Balmer's series or Hod's interpretation of 0.0437123 if he had been the first man in the world to face those puzzles. No doubt, Witten has made many ingenious guesses of this kind, and he is far too modest about them. But more importantly, I believe that scientists shouldn't be monkeys who randomly guess and type numbers, formulae, and theories, hoping to become lucky revolutionaries - the kind of "science" promoted by the likes of Lee Smolin.

Good scientists must surely find valuable results "more systematically" and if they succeed in one situation, it shouldn't be because they were lucky! If they have only succeeded because they were lucky, the universities and governments should employ monkeys instead. They are much cheaper than Lee Smolin.

I think that I have made it somewhat clearer which people were just lucky and which people really knew (or know) what they were doing.

### Fine-tuning

The known hierarchies and perceived fine-tuning - usually the smallness of the observed dimensionless parameters relative to one - may be viewed as contemporary physics' version of numerical coincidences. For example, the Higgs mass is indirectly known to be much (by 15 orders of magnitude) smaller than the Planck mass, which may look very strange from the viewpoint of a fundamental theory. How seriously should we take these surprising matters?

Well, if we believe that the value of a parameter is pretty much a random number of order one, we imagine that the number has a nearly uniform probability distribution between "0" and "2" or something of this sort. If the number is observed to be smaller than 10^{-15}, it is surely strange. The "Bayesian" probability that a number of order 1 is smaller than 10^{-15} is comparable to 10^{-15} itself. It shouldn't happen in a normal world.

Once you obtain a more accurate idea about the physical system - a vague but more substantial preliminary theory - your ideas about the statistical distributions encoding your expectations may change. And as soon as you find the full correct theory, the probability distribution may become a delta function supported at the correct answer. The correct answer is often one of the "boring" numbers that were expected from the very beginning. But sometimes it is not. When it's not, it's usually because the original expectation was too naive. Less frequently, it may be a matter of chance.

There are analogies in mathematics that we have already seen. Once you qualitatively understand the j-function, the Heegner numbers above, and the structure of a few identities, you will be more educated and you will be able to expect that some of those seemingly random exponentials can be "unnaturally" close to an integer. If you really "swallow" the essence of all these arguments, you might even say that it is "natural" for these numbers to be close to integers.

In the case of the hierarchy problem(s) in physics, we generally don't have a universally acceptable explanation of the "miracles". Such an explanation is guaranteed to have a mathematical portion that is remotely analogous to the Heegner number patterns but it is also guaranteed to have a physical portion because this portion is needed to connect the new mathematical insights with the established physical theories and/or observations. The cosmological constant problem is probably the most serious "miracle" or the most shocking "generalized hierarchy problem" we know today.

The "monovacuists", and I tend to count myself as one, usually assume that there exist mathematical subtleties that may eventually change our estimate of the cosmological constant from a quantity comparable to the Planck scale to a quantity that is "naturally" smaller by something like 123 orders of magnitude or "naturally" smaller than the (quartic) supersymmetry breaking scale by 60 orders of magnitude because of a reason that hasn't been previously appreciated. Again, there can exist more detailed alternative calculations that prove that the cosmological constant is not such a "random" number, after all.

And maybe, they don't exist. The "anthropic" people tend to assume (and rely on incomplete, slightly demagogic, and somewhat circular "proofs") that the cosmological constants of ensembles of vacua always behave as "gibberish" and that the vacua with very small values of the vacuum energy are not "preferred" according to any physically meaningful criterion. In their opinion, all these small values are a matter of "chance" and the only bad thing we can say about the "unlucky", high values is that they are incompatible with life that is remotely similar to ours.

We don't know who is right. Of course, the anthropic people don't have a proof that nothing like a "cosmological constant q-expansion" exists, if you want me to use the Heegner analogy. This question is important but we unfortunately have many more philosophical and religious expectations and prejudices (and wishful thinking?) than fully rational arguments and calculations relevant for these issues.

### Many numerological observations are worthless

Finally, I want to say that crackpots and not-quite-serious thinkers tend to be excited about many "secret formulae" for the constants of Nature that are actually completely unimpressive.

For example, at some moment when he was no longer a serious physicist, Arthur Eddington thought that the fine-structure constant had to be exactly equal to 1/136. It was natural for him to assume that the denominator was an integer, and Eddington thought that he had evidence that it was this integer. He also invented dozens of much crazier guesses based on the number 136.

These days, we know that he was wrong on both (or all) counts. The inverted constant is not an integer - in fact, it is 137.036, closer to 137 than to 136 - and there is no good reason why it should be close to an integer. And it is not "spectacularly" close to an integer anyway, so there is nothing to worry about. The fine-structure constant is a pretty complicated function of other constants that we consider more fundamental these days - the electroweak couplings (that you should furthermore evaluate at the unification scale, not at the very low energies where the number 1/137.036 is measured) - and there is no known reason why such a complicated function should have some simple property, e.g. being an integer.

Many crackpots like to "design" formulae that are approximately equal to some constants of Nature and they seem to get very excited. The only impressive formula of this kind I have seen was one that I rediscovered 25 years ago, namely that the proton-electron mass ratio is very close to 6π^5. When someone tells you something he is excited about, you should rationally calculate the odds that the "coincidence" works as accurately as it does in a given example.
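For the record, the arithmetic behind this one (the mass ratio below is the modern measured value, approximately):

```python
import math

mass_ratio = 1836.15267     # measured proton/electron mass ratio (approximate CODATA value)
guess = 6 * math.pi ** 5
relative_error = abs(guess / mass_ratio - 1)
print(guess, relative_error)   # about 1836.118, relative error about 1.9e-5
```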

The probability that a new identity relating two a priori comparable numbers holds with the relative accuracy of "p" is comparable to "p" which can often be small. But the (artificially engineered) left hand side is often very contrived. So you shouldn't ask what is the probability that a particular contrived expression gives you the desired result, within the error margin. Instead, you should ask what is the probability that any expression from the same universality class - one that uses the same "tools" (symbols and operations) and that has a comparable "length" - gives you the desired result.

Because the contrived expressions usually come in very many a priori possible "flavors", the latter probability is usually pretty close to one or, at least, it is not crazily small relative to one. In that case, you should be left unimpressed. A simple rule: if the relative error is worse (larger) than 1/N, where N is the number of similar expressions that the proponent could have used (N is often comparable to a power, namely the number of symbols in your "alphabet" raised to the number of symbols that have been used to write the expression), you should ignore her or, usually, his "miracle".
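A toy version of this counting in Python: take the "universality class" of expressions of the form c·π^k with small integers c and k - a few hundred candidates - and see how many of them hit the proton-electron mass ratio within a relative accuracy of 10^{-4}. In this particular class, only 6π^5 makes the cut, and its actual error, about 2 x 10^{-5}, is far better than the 1/N threshold of the rule above, which is why that formula is more impressive than most:

```python
import math

target = 1836.15267      # proton/electron mass ratio (approximate CODATA value)
tolerance = 1e-4         # relative accuracy we would call "impressive"

matches = []
candidates = 0
for c in range(1, 31):           # integer prefactors 1..30
    for k in range(-6, 7):       # powers of pi from -6 to 6
        candidates += 1
        value = c * math.pi ** k
        if abs(value / target - 1) < tolerance:
            matches.append((c, k))

print(candidates, matches)   # 390 candidates, and the only match is (6, 5)
```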

And even if the latter probability is much smaller than one, don't forget that statistics happens. There are many "constants" in the world and it is guaranteed that some of them will be equal to an expression that will look like a small "miracle". You should try to have realistic expectations how many such "miracles" you should expect and how "strong" these miracles should be. Weird observations are sometimes important but more often, they are not.

And that's the memo.

#### snail feedback (5) :

Well Lubos, thanks for the link and I must say, excellent post.

Regarding Balmer, I would hold that if it all boils down to luck as the thing which discriminates a true physicist and a crackpot, then the whole existence of the second category is rather useless.

Cheers,
T.

In addition to coincidence analysis, I always suggest a GIGO (Garbage Input / Garbage Output) analysis: most crackpots work "according to thermodynamics": they put more info in than the info they get out.

Now let me remind the general public that from 2005 to 2007 we ran a thread "All the lepton masses" in physicsforums, doing some enforcement of the precision of the "G.Out" side; I think that all the other crackpots on the internet were more silent than usual during this period!

Last, there are some coincidences we have actually reported to hep-ph because they could constitute some Balmer-like hint. There are integer or combinatorial coincidences (the need for 3 generations in the standard model - I am proud of that one) and there are purely decimal ones, such as the Weinberg angle, the Z0 decay or, why not, Koide's formula. But it is hard to see how such observations could constitute a useful hint for theorists. The infamous observation O(y_top)=1, for instance, did a good job motivating the susy higgs, but now that it is more accurate, y_top=1, there is no mechanism to explain it.

Speaking of m_p = 6π^5 m_e = 938.25 MeV, there is also a slightly worse one, not involving pi:

m_p = 3 (avg(sqrt(m_e)))^2 = 941.57 MeV

where m_e now runs over the three generations: electron, muon, tau.

One feels tempted to rewrite them as
m_q = 2π^5 m_e and
m_q = (avg(sqrt(m_e)))^2

The exponential you mentioned is $\exp(\pi\sqrt{58}) \approx 396^4-104.000000177$