Saturday, January 07, 2012

What's up with that fine-tuning?

When I opened Anthony Watts' blog, WattsUpWithThat.com, a few minutes ago, I got the following.


The counter on the right side shows 99,999,962. That's a pity because the person who manages to obtain a screenshot with the number 100,000,000 gets a $100 million coffee mug. That's a lot of money. As far as I remember, I haven't held $100 million in my hands for years if not decades. It's even hard to imagine how a mug can be this expensive.

It's not just the money. The British aristocracy promised to celebrate the 100,000,000th visitor to Watts' blog throughout the British Commonwealth during the January 7th Watts Up Day. So one had a chance to obtain a giant platinum coffee cup and share the niceties with the British royal family. At any rate, a fast shift-reload didn't give me any better number (it stayed the same; the next value I got, a long time later, was 100,004,397) and someone else was the lucky one.

You could still be impressed that I got pretty close. Suspiciously special values such as 99,999,962 are examples of what is referred to as fine-tuning. How is it possible?

One explanation is publication bias. People, including myself, have also gotten more boring numbers over the years. In fact, they were so boring that I didn't publish a TRF blog entry about them. So the numbers appearing in blog entries can't be viewed as representative: they don't follow the same statistical distribution as the actual values the counter shows.

These comments may sound like jokes but, because of their universal validity, they are important in serious situations, too. You simply can't view the literature – not even the scientific literature – as representative of the actual importance of various problems or the actual likelihood that some figures are right (although the scientific literature should still be closer to the truth than a random guess by a layman: but even this sometimes fails to be the case). Some values may receive many more papers not because they're likely or important but because they would be more interesting if they were true. Or because of pure ignorance. Or due to chance.
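The selection effect is easy to simulate. Here is a toy Python sketch – with invented numbers, not real traffic data – in which readers glance at a uniformly random counter reading but only "publish" the striking ones:

```python
import random

random.seed(42)  # reproducible toy run

N = 1_000_000
# Invented model: each glance at the counter is uniform over 0..10^8 - 1.
glances = [random.randrange(100_000_000) for _ in range(N)]

def striking(v):
    """A reading is 'blogworthy' if its last five digits are all 9s or all 0s."""
    return v % 100_000 in (99_999, 0)

# Publication bias: only striking readings turn into blog posts.
published = [v for v in glances if striking(v)]

print(f"{len(published)} striking readings published out of {N:,} glances")
# Every published number looks fine-tuned, although almost no glance was.
```

The published sample is 100% "fine-tuned" by construction, even though only about one glance in 50,000 was striking – which is the sense in which numbers quoted in blog entries (or papers) don't follow the distribution of the numbers people actually saw.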

The anthropic explanation is that TRF's obtaining a number similar to 99,999,962 is needed for the intelligent life on Earth to exist. It's a good try but you see that it doesn't really work. One can conceive of other civilizations where TRF doesn't produce this number. The same failure applies to the strong CP problem: the \(\theta\)-angle of QCD that weights the instanton contributions to the action (which cause additional CP violation) is very small according to the observations. But it's apparently not needed for life, either. Some other parameters may have to be more accurately fine-tuned but it still doesn't mean that there doesn't exist a much more precise, non-anthropic calculation of these values as well.

In some cases, there exist totally rational explanations of such coincidences. For example,
\[ \exp(\pi \sqrt{163}) \approx 262,537,412,640,768,743.99999999999925 \] That's painfully close to an integer but it isn't one. The probability that you get 12 copies of "9" right after the decimal point is just \(10^{-12}\), one part in a trillion. You shouldn't be so lucky with simple formulae such as one containing the simple number 163: there are far fewer "simple formulae" of this complexity than a trillion. So statistically speaking, there has to exist an explanation. And there does. It's based on the \(j\)-function, a function that naturally parameterizes the integration region for one-loop (toroidal) diagrams in string theory. You may rewrite the particular number above using another expansion whose first term is an integer and whose following term is already manifestly tiny.
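You can check the near-integer yourself in a few lines of Python using only the standard library's `decimal` module; since `decimal` has no built-in \(\pi\), its value is hard-wired to 50 digits:

```python
from decimal import Decimal, getcontext

getcontext().prec = 50  # work with 50 significant digits

# pi to 50 digits (hard-wired; the decimal module has no built-in pi)
PI = Decimal("3.14159265358979323846264338327950288419716939937510")

x = (PI * Decimal(163).sqrt()).exp()  # exp(pi * sqrt(163))
print(x)

# The nearby integer predicted by the j-function: for tau = (1+sqrt(-163))/2,
# j(tau) = -640320^3, and the q-expansion j = 1/q + 744 + 196884 q + ...
# gives exp(pi*sqrt(163)) = 640320^3 + 744 - (manifestly tiny corrections).
nearby = 640320**3 + 744
print(nearby - x)  # a difference of roughly 7.5e-13
```

The second printed value is the "manifestly tiny" following term of the expansion; its leading piece is \(196{,}884\,e^{-\pi\sqrt{163}}\), which is why the agreement with an integer is so absurdly good.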

Let me just assure a reader that his or her "surprising observations", including ones involving the neutron mass or the proton mass, don't have any such explanation; they're coincidences, indeed. If he calculates how likely it is for the coincidences to work at the given accuracy, he gets a probability vastly less tiny than \(10^{-12}\). So the statistical argument is much weaker. Also, only dimensionless parameters – those independent of our choice of units, which is just a part of the "messy cultural baggage" – may be expected to have any rational explanations.

From the viewpoint of the Standard Model, the Higgs boson mass, which is probably close to 125 GeV as we learned last year – and will probably learn for sure in 2012 – is extremely tiny. Quantum corrections would naturally drive it (both the vev and the mass) towards the fundamental scale, which is 15 orders of magnitude higher. This puzzle is known as the hierarchy problem and supersymmetry remains the most viable solution according to the evidence we have as of today.

Congratulations to Anthony


