Jon Butterworth of ATLAS asked the same question on his Guardian blog. Many other physicists will be asking the question more and more frequently, especially if (and as long as) the 125 GeV God particle remains the only new particle that the LHC discovers.
Things in Nature may be perfectly natural even if they don't look natural to some of us. Click the picture for more.
Can the Standard Model be the whole story?
As we've known since mid-December 2011, the God particle has a mass of approximately 125 GeV. However, we have known for decades that its field also has a vacuum expectation value. It's the main property that allows the BEH mechanism to operate.
The vacuum expectation value of the God doublet is\[
\bra 0 h(x,y,z,t) \ket 0 = \pmatrix { v \\ 0},\qquad v=246~\GeV
\] At each point of the spacetime, the God doublet has a nonvanishing mean value. More precisely, only the upper component, the one that is electrically neutral, has a nonzero vev. That's why the photon remains massless and the force it mediates, electromagnetism, remains a long-range force (one that only vanishes as a power law at infinity).
The W-bosons and Z-bosons, corresponding to the generators of the gauge group (charges) that are carried by the God vev to a nonvanishing extent, become massive because of their interactions with the God condensate (the spirit penetrating the space, first promoted by Isaac Newton). That's why the force they mediate, the weak force, becomes a short-range force whose impact decreases exponentially at very long distances.
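The contrast between the power-law and the exponentially suppressed behavior is easy to see numerically. The following sketch (my illustration, not from the text, with rounded constants) uses the W-boson mass to set the range of a Yukawa potential and compares it to a Coulomb-like \(1/r\):

```python
import math

HBARC_MEV_FM = 197.327   # hbar*c in MeV*fm (rounded)
M_W_MEV = 80_400         # W-boson mass in MeV (rounded)

r0 = HBARC_MEV_FM / M_W_MEV   # range of the weak force, ~2.5e-3 fm

def coulomb(r_fm):
    """Long-range potential: a pure power law, arbitrary units."""
    return 1.0 / r_fm

def yukawa(r_fm):
    """Short-range potential: the same power law times an exponential cutoff."""
    return math.exp(-r_fm / r0) / r_fm

# Already at one proton radius the exponential suppression is enormous:
r = 0.84  # fm
suppression = yukawa(r) / coulomb(r)   # equals exp(-r/r0), astronomically small
print(f"r0 = {r0:.2e} fm, suppression at {r} fm: {suppression:.2e}")
```

The ratio comes out around \(10^{-149}\), which is the quantitative meaning of "the weak force is short-range".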
The God field has to have spin zero so that its vev in the vacuum only breaks the electroweak symmetry, not the Lorentz symmetry. Fine. So is the God particle mass natural?
Reconstructing the potential
In perturbative quantum field theory, the potential energy for the God field has to be at most quartic for the theory to be renormalizable (for it to allow us to cancel the infinities without introducing an infinite amount of ignorance). The potential has to be symmetric under the electroweak \(SU(2)\times U(1)\) symmetry because the potential is a part of the laws of physics that have to be symmetric; only the vacuum state is allowed to break the symmetry. The symmetry and the renormalizability determine the potential to take the form\[
V(h) = \frac{\lambda}{2}(h^\dagger h - v^2)^2 + V_0, \quad v = 246~\GeV
\] Note that the expression was constructed out of \(h^\dagger h\) which is symmetric under the electroweak symmetry as well as the Lorentz symmetry. The expression above is manifestly minimized at \(h^\dagger h = v^2\). The absolute length of the vev is known (and has been known since it's linked to the W-boson and Z-boson masses that have been known for half a century); the direction of the complex two-component God field condensate is not known and it is, frankly speaking, totally unphysical.
The absolute term \(V_0\) contributes to the vacuum energy density, also known as the cosmological constant. However, because it's constant, you can't really say "which particle" or "which subset of physical phenomena" are responsible for producing this vacuum energy density. All of them may contribute something and only the sum is relevant. Moreover, the total vacuum energy density is very small, \(10^{-123}\) in the Planck units i.e. \(10^{-123}~m_{\rm Pl}^4\), and was deduced from the rate of accelerated expansion of our Universe. Its smallness is staggering and related to the main topic of this article but it's not directly related to the God particle problems and I won't discuss it here.
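As a quick sanity check on the \(10^{-123}\) figure (a rough conversion with rounded constants, nothing more), one may translate it into particle-physics units; the fourth root lands near the often-quoted milli-eV dark-energy scale:

```python
# Rough unit conversion, rounded constants only (not a precision statement).
M_PLANCK_GEV = 1.22e19          # Planck mass in GeV
rho_planck_units = 1e-123       # vacuum energy density in units of m_Pl^4

rho_gev4 = rho_planck_units * M_PLANCK_GEV ** 4   # in GeV^4, ~2e-47
scale_gev = rho_gev4 ** 0.25                      # characteristic energy scale
print(f"rho ~ {rho_gev4:.1e} GeV^4, fourth root ~ {scale_gev * 1e12:.1f} meV")
```

The fourth root comes out at roughly 2 meV, i.e. the observed vacuum energy density corresponds to an energy scale some 30 orders of magnitude below the Planck mass.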
The vev was known before the God particle was glimpsed in 2011. However, the overall normalization of the first term in the potential, the quartic and quadratic one, wasn't known. We only knew that the coefficient \(\lambda\), which is also proportional to the quartic coupling, the coefficient of \(h^4\), had to be positive for the energy to be bounded from below. People also knew that if the constant \(\lambda\) exceeds a particular value of order one, the perturbative expansion of the quantum field theory breaks down. Quite certainly, it's the whole quantum field theory that breaks down, whether you analyze it perturbatively or not. Too high a value of \(\lambda\) combined with the "running" makes the value of \(\lambda\) diverge at slightly higher energy scales, reaching the "Landau pole", which is probably a lethal inconsistency for similar self-interacting scalar theories.
The potential above may also be rewritten as\[
V(h) = \frac\lambda 2 (h^\dagger h)^2 - \lambda v^2\cdot h^\dagger h + (V_0 + \frac{\lambda v^4}{2})
\] In this form, optimized for expansions around \(h=0\) instead of \(h=(v,0)\), you see that the vacuum energy density was shifted by \(\lambda v^4/2\). The middle term is a "mass term" for the excitations of the God field around \(h=0\). The coefficient is negative so the God field behaves as a tachyon around this point, causing an instability. The first term is the quartic coupling we discussed before. Also, if you expand the potential around \(h=(v,0)\), writing it in the form \((v+H,0)\), setting the first component to a real number and the second complex component to zero, by using the gauge symmetry, you will get\[
\eq{
V(h) &= \frac\lambda 2 (v+H)^4 - \lambda v^2 (v+H)^2 + V'_0=\dots \\
\dots &= \frac\lambda 2 H^4 + 2\lambda vH^3 + (3-1)\lambda v^2 H^2+V''_0
}
\] I hope you know how to expand fourth and second powers of a sum. Wow, the subscript 0 isn't below the primes in \(V''_0\), it's ugly. I am convinced that \(\rm\LaTeX\) would put these subscripts right below the primes.
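If you don't trust the algebra, the two forms of the potential can be compared numerically; this little check (with arbitrary sample values, not anything measured) confirms that the linear terms cancel and the quadratic coefficient is \(2\lambda v^2\):

```python
lam, v = 0.065, 246.0   # arbitrary sample values for the check

def V_exact(H):
    """(lambda/2) * (h^dag h - v^2)^2 evaluated at h = (v + H, 0), dropping V_0."""
    return 0.5 * lam * ((v + H) ** 2 - v ** 2) ** 2

def V_expanded(H):
    """Expanded form: quartic + cubic + quadratic; the linear terms cancel."""
    return 0.5 * lam * H ** 4 + 2 * lam * v * H ** 3 + 2 * lam * v ** 2 * H ** 2

for H in (-10.0, 0.5, 37.0):
    rel_err = abs(V_exact(H) - V_expanded(H)) / max(1.0, abs(V_exact(H)))
    assert rel_err < 1e-12, (H, rel_err)
```

The assertion passes for any \(H\), which is just the statement that the expansion around the minimum has no linear term and a \(2\lambda v^2 H^2\) mass term.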
At any rate, the linear terms in \(H\) around the minimum cancel and the absolute terms are irrelevant. You see that the mass term, one proportional to \(H^2\), has the coefficient\[
\frac{m_H^2}{2} = 2 \lambda v^2
\] so you see that \[
m_H = 2v\sqrt\lambda
\] in my conventions. Because we know that \(m_H=125~\GeV\) and \(v=246~\GeV\), you see that the quartic coupling\[
\lambda = \zav{\frac{125}{2\times 246}}^2 \sim 0.065
\] is not far from \((1/4)^2\sim 1/16\). It's of order one but kind of "safely lower" than one. The value of this coupling is a bit different (larger) near the fundamental (Planck) scale but it is comparable to one and you may imagine it jumps out of the equations. The real problem is that the constant \(v\) or, equivalently, \(m_H\) is of order\[
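The arithmetic behind \(\lambda\approx 0.065\) takes two lines of Python (tree-level only, and only in the conventions of this text where \(m_H = 2v\sqrt\lambda\)):

```python
import math

# Tree-level relation from the text: m_H = 2 * v * sqrt(lambda),
# i.e. lambda = (m_H / (2*v))^2 in these conventions.
m_H = 125.0   # God particle mass in GeV
v = 246.0     # electroweak vev in GeV

lam = (m_H / (2 * v)) ** 2
print(f"lambda = {lam:.4f}")   # ~0.0646, not far from 1/16 = 0.0625

# Round trip back to the mass:
assert abs(2 * v * math.sqrt(lam) - m_H) < 1e-9
```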
v\approx 246~\GeV \lll \dots \lll 10^{19}~\GeV\approx m_\text{Planck}.
\] The God particle mass is about 15 orders of magnitude smaller than the "natural mass" of unified theories, the Planck mass. The previous sentence actually understates the gap: the actual parameters in the Lagrangian include the squared masses of the God particle or the squared vevs which are 30 orders of magnitude smaller than the squared Planck mass.
If you imagine a beautiful unique theory describing our Universe, it must manage to "spit out" a number of order \(10^{-30}\) which is rather small. It's pretty much guaranteed that such a theory predicts some particles that are lighter than the Planck mass. But if you imagine that the distribution of the squared masses is kind of uniform, the probability that the squared mass (a coefficient in the Lagrangian) will be equal to or smaller than \(10^{-30}\) times the "natural estimate" is comparable to \(10^{-30}\). The probability is very small.
That's the reason why we should be shocked by such "unnaturally small" numbers. It's pure Bayesian reasoning: the probability of its being so close to a special value is just small if the parameter is picked by chance. And sufficiently unpredictable fundamental equations should be pretty much choosing the values "by chance".
Natural vs natural
By simple etymological arguments, the adjective "natural" should label anything that Nature can produce "smoothly" or "easily", without much human intervention, without humans and their pushy attitudes, ideologies, and contrived and awkward adjustments, without affirmative action and social engineering. The idea of naturalness is that the laws of physics should be laissez-faire. Analogously, a theory of climate change is either natural – if it respects the natural mechanisms that actually operate in Nature – or Mann-made – which means that it's constructed by artificial overfitting, fraudulent adjustments, incoherent splicing of graphs from different sources, occultist algorithms designed to mask this splicing, and cherry-picking motivated by a hardcore preconceived Marxist agenda.
This is the ideal meaning of the word "natural" but there's a problem with this definition: we don't really know every detail about how Nature operates. So a thing that looks unnatural to us may look natural to Nature and vice versa. That's a problem. Because of our incomplete knowledge about the inner workings of Nature, we have to invent "proxy" definitions of the word "natural". Such a definition may be mathematically very well-defined but it may still be different from what Nature would consider "natural".
As we're learning things, we're adjusting our definition and usage of the adjective "natural", too. A typical example of this evolution is the term "natural supersymmetry" that started to be used a few years ago. A decade ago, people would say that it was "natural" for all superpartners to have roughly the same mass. However, what we actually need for "naturalness", the avoidance of a priori unlikely (which usually means unreasonably small) values of parameters in the Lagrangian, is something different than uniform masses of all superpartners. We need the stop squark and perhaps some neutralinos and gauginos to be comparably light to the God particle.
That brings me to this picture that Jon Butterworth added to his article. What does it have to do with the problem of naturalness in particle physics? When I saw the picture, I thought: "They built a replica of Notre Dame near Harvard's Holyoke Center." Then I thought a little bit more carefully and realized that it wasn't Notre Dame. It actually had to be something in London because there was also a Big Ben next to the Notre Dame. So I opened Google Earth to find out that the Notre Dame replica is actually called the Westminster Palace (correction: it's really Westminster Abbey) which is also a Gothic structure. You must forgive me: Heathrow is the only place in London I know kind of intimately but it's been much better with Paris, thanks to the Bogdanoff brothers. ;)
Fine. So it's London. And there may even be a circular map of an ATLAS collision on the glassy ground floor of the Holyoke Center replica on the right side. But even after all these additional insights, I still think that Jon Butterworth posted the picture because of the "bus STOP" caption on the road which may have something to do with naturalness.
What will we think about naturalness if nothing is found beyond the God particle?
In supersymmetric model building, the lightness of the stop (scalar top quark) is the most important property to guarantee the naturalness of the unbearable lightness of the God particle's being. The God particle mass near \(125~\GeV\) is somewhat (ten percent or so) higher than the value in the simplest and most minimal supersymmetric models so various modifications or non-minimalities are needed. The possibilities have been discussed several times on TRF; I don't want to go into that.
Just imagine that no explicit new mechanism responsible for the lightness of the God particle is going to be found in 2012 – or in any year up to 2020. How will it affect our understanding of naturalness?
Well, I would personally not think that "something is really crazy" in our Universe. If the stop squark mass is \(2~\TeV\), it's plausible that the LHC will never find it. But many theories with a stop squark mass of this size only need to adjust some terms with a relative accuracy of \(0.01\). That's not too small a number. The probability of getting a number smaller than \(0.01\) if it's a priori uniformly distributed between \(0\) and \(1\) is just \(0.01\) or so (Sheldon: exactly \(0.01\)) which is equivalent to something like 2-sigma evidence that something is strange.
I've never taken the "little hierarchy problems" and similar concepts too seriously. Being amazed by the fact that some parameter in the Lagrangian is a bit smaller than \(0.01\) is fully analogous to being impressed by 2-sigma bumps. A mature physicist shouldn't build all his or her reactions on this kind of emotions. One always has to look for more solid clues. A parameter's being fine-tuned to this accuracy is just not amazing.
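The translation between a probability of \(0.01\) and "something like 2 sigma" can be checked with Python's standard library (my illustration; one-sided convention):

```python
from statistics import NormalDist

# How many standard deviations of a normal distribution correspond to
# a tail probability of 0.01 (one-sided convention)?
p = 0.01
z_one_sided = NormalDist().inv_cdf(1 - p)
print(f"p = {p} is about {z_one_sided:.2f} sigma (one-sided)")   # ~2.33
```

So a \(0.01\) coincidence is formally a 2.3-sigma effect, well below the 5-sigma standard that particle physicists require before claiming a discovery.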
QCD: a role model that produces a big gap between scales
But just for the sake of a thought experiment, let's imagine that we may run colliders up to the Planck scale, or very high scales, to say the least, and we won't find any new physics that would explain why the parameters in the Lagrangian are so much smaller than the fundamental scale where quantum gravity starts to matter and where forces are forced to unify, the Planck scale. Should we be shocked? What does it mean?
I think that in that case, we should be shocked. As far as I can say, there are only two plausible explanations in such a situation: either there is an old-fashioned calculation that simply produces a particular small number by some exponentiation or something similar out of much more natural numbers; or some anthropic selection is needed to explain the smallness. Let me discuss both possibilities.
We will begin with the first one, the old-fashioned possibility. Is there any example of a system that is able to produce scales that are much smaller than the fundamental scale even though we only substitute plausible numbers that are between \(0.01\) and \(1\) for the parameters, i.e. if we substitute natural values of these parameters?
Yes, there is an example. It's called Quantum Chromodynamics, QCD. It describes the "colorful force" between quarks and gluons. It generalizes Quantum Electrodynamics, QED. In QED, the gauge group is \(U(1)\); it's extended to \(SU(3)\) in QCD. Because the latter is non-Abelian and for other reasons, QCD is confining. The charged (colorful) objects such as individual quarks can't be isolated; only uncharged (colorless) objects such as protons and neutrons may exist in isolation.
How does QCD produce scales? Well, its coupling constant \(g\) isn't really constant. Due to quantum processes, its size has a slow, logarithmic dependence on the energy scale \(E\). The running looks like this:\[
\frac{1}{g^2(E)} = \frac{1}{g^2(E_0)} - C\cdot \beta\cdot \ln\zav{\frac{E}{E_0}}
\] You may imagine that \(E_0\equiv m_\text{Planck}\) and \(C\), a qualitatively unimportant coefficient, is equal to one. The \(\beta\)function (well, in my notation, \(\beta\) is a constant) is negative. So if you consider energies \(E\lt E_0\), the logarithm is also negative and the last term on the right hand side is negative (three minus signs) although it depends on the energy scale \(E\) very slowly. At some low enough value of \(E\), it may fully compensate the positive term \(1/g^2(E_0)\) on the right hand side. When it's so, the right hand side goes to zero which means that \(g^2(E)\to\infty\). The coupling constant diverges at some low enough energy scale!
Of course, this low enough energy scale \(E=E_{QCD}\) is nothing else than the QCD scale, not too far from the mass of the proton. If you invert the equation above, you may see something like\[
E_{QCD} \sim E_0\cdot \exp\zav{\frac{1}{g^2(E_0)\cdot C\cdot \beta}}
Now, both \(\beta\) and \(C\) are of order one and \(\beta\) is negative. So if \(g^2(E_0)\), the value of the squared QCD coupling at the Planck scale, is reasonable but clearly smaller than one, we get a number much greater than one when we invert it. The argument of the exponential is therefore a very large negative number. And exponentials are able to produce really tiny numbers out of reasonable arguments. That's how you may derive that the mass of the proton is actually exponentially lower than the Planck scale. Start with a reasonable value \(g^2/4\pi\sim 1/25\) at the Planck scale or something like that and you get a much lighter proton.
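Just to see the exponential at work, here is a one-loop sketch with illustrative inputs of my choosing: \(\alpha_s=1/25\) at the Planck scale, \(n_f=6\), \(C=1\), and the standard one-loop coefficient; quark thresholds and higher loops are ignored, so only the order of magnitude should be taken seriously.

```python
import math

# Dimensional transmutation, one-loop sketch:
#   E_QCD ~ E_0 * exp(1 / (g^2(E_0) * C * beta))
# Illustrative inputs only -- thresholds and higher-loop running are ignored.
E0 = 1.22e19                                      # Planck scale in GeV
nf = 6                                            # number of quark flavors
beta = -(11 - 2 * nf / 3) / (16 * math.pi ** 2)   # negative one-loop coefficient
C = 1.0
g2 = 4 * math.pi / 25                             # g^2 at E0, i.e. alpha_s(E0) = 1/25

E_QCD = E0 * math.exp(1.0 / (g2 * C * beta))
print(f"E_QCD ~ {E_QCD:.2f} GeV")   # lands in the ballpark of the proton mass
```

A coupling of a perfectly reasonable size at the Planck scale gets exponentiated into a scale about 19 orders of magnitude lower: that is the whole trick.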
However, most of the mass of the proton really comes from the "glue" and its impact on the quarks, not from the quark rest masses. The three quark rest masses add up to something close to \(10~\MeV\) only, just one percent of the proton mass. Those 99% of the proton mass that occupy Wall Street as well as the proton's interior are made out of the QCD glue.
It's still such a success that you may want to borrow the ideas from QCD and explain the lightness of the God particle (and therefore all the leptons and quarks, not to mention Wbosons and Zbosons) as well. However, you will find out that there doesn't seem to be any "required" proximity between the electroweak scale and the QCD scale.
If you want to borrow the QCD trick, you have to start from scratch. In some sense, you must produce your own God particle out of new and artificial quarks, the techniquarks, which are charged under a new type of charge emulating the color, namely the technicolor. When you do all these things, there will be a new techni-QCD scale which will be naturally smaller than the Planck scale and may coincide with the electroweak scale.
Everything is natural except that all these natural things are made out of cyborgs, robocops, schwarzeneggers, and all the scenes are colored by technicolor. In other words, things don't look too natural. This was of course just a pun; if we use a cyborg terminology, it doesn't mean that Nature finds the underlying theory unnatural. But in this case, it actually seems that the terminology correctly captures the fact that technicolor theories are contrived. Technicolor theories seem to predict too many things – new particles, bound states, flavor-changing neutral currents etc. – that are not seen. These "unwanted predictions" are even more "robust" properties of these theories than similarly ambitious predictions of supersymmetry.
Moreover, the observed mass of the God particle, \(125~\GeV\), just looks way too low for the oldfashioned technicolor theories and way too high for some newer theories of composite God particles etc. The observed value \(125~\GeV\) is, on the other hand, smoothly compatible with supersymmetry although it prefers some nonminimal versions of supersymmetry.
At any rate, technicolor and supersymmetry are two representatives of the old-fashioned explanations of the lightness of the God particle. It seems inevitable that all these old-fashioned explanations imply some totally new phenomena beyond the Standard Model – whether they're superpartners (in supersymmetry) or new bound states similar to the composite God particle (in the case of technicolor or compositeness). If we don't find any new physics several orders of magnitude above the electroweak scale, it seems to mean that old-fashioned explanations of the God particle's lightness are wrong. The precise moment when a physicist says that he or she gives up is subjective but any physicist who understands the argument behind naturalness has a point where he or she gives up. For most of them, the transition would occur already during the experiments at the LHC i.e. before 2020.
Anthropic explanations
This brings me to the second possibility, namely that there is no old-fashioned explanation why the God particle is so much lighter than the Planck mass. We won't find an explanation of the mass ratio in terms of some exponential of a reasonable number. We won't find anything like that. We will just have to swallow that the laws of physics contain a parameter \(v^2\) which is \(10^{30}\) times smaller than the squared mass of the lightest black hole worth the name (or the squared Planck mass which may be described in many other ways).
After all, we know that this particular hierarchy – the lightness of the God particle – is needed for us to exist. We need light enough quarks etc. because they still contribute to the mass of the proton although these bare quark masses are linked to the electroweak scale, not the QCD scale. And we need protons that are much lighter than the characteristic mass scale of quantum gravity, the Planck scale, for reasonably large stars not to collapse to black holes. And we need large enough stars to receive enough energy for billions of years in order to run evolution, the LHC, and other experiments.
If we accept the existence of a multiverse, we also assume that there exist other universes in which the God particle is too heavy i.e. too close to the Planck mass. Those universes won't produce any long-lived stars and intelligent observers that depend on them. So we're bound to find ourselves in one of the universes where the God particle is very light – even though these universes are very rare in the multiverse.
You see that the explanation is "consistent with the rough observed data" or "compatible with what we see" but much like in the man-made climate cataclysm theory's case, this "compatibility" isn't enough to actually establish that the explanation is right. This compatibility is a very weak "test" and it was a priori too easy for the anthropic theory to pass it. In fact, the anthropic explanation's basic property is that it may "explain" everything and anything and make it look less mysterious (if you believe its underlying logic); a problem is that it could "explain" wrong claims about the Universe just as easily.
On the other hand, the fact that we can't find "more challenging tests" that the anthropic theory would be able to pass doesn't mean that it's wrong, either. It just means that we should keep on trying. However, if the LHC continues to show that there is no new physics related to the God particle's lightness and raises the scale below which there's no new physics, it will be getting less sensible to believe that there exists an old-fashioned explanation and more sensible to accept an anthropic explanation.
I also want to say that the anthropic reasoning is analogous to the "egalitarianism of outcomes". In proper democracies, people usually enjoy equal rights i.e. equality of opportunity. However, depending on the situation, skills, hard work and decisions, they will still be led to different outcomes. Analogously, in proper scientific theories, one has to pick some reasonable assumptions (or values of parameters) from a limited list of possibilities that look comparably likely a priori and derive the consequences. The consequences may be very different but if a scientific explanation requires the assumptions to be a priori highly unlikely, it also means that we consider this scientific explanation to be unlikely to be valid. That's why the Standard Model with a very tiny God particle mass looks fine-tuned and therefore unlikely.
However, the anthropic reasoning postulates the equality of outcomes. We don't give the weights (probabilities) to different values of parameters according to some a priori distributions. We may actually look at what kind of world emerges from one set of values or another and if such a world contains lots of observers, we may give the parameters that produced such a world a higher weight, a higher "prior probability".
I dislike this type of reasoning for numerous reasons, of course. It's similar to the equality of outcomes, something I consider a classical example of a pathological leftwing thinking. It seems acausal because the prior probabilities (of various initial conditions of the Universe) actually depend on the future development, but because the future still depends on the past, we obtain closed timelike curves. The anthropic people pretend that this problem doesn't exist because our Universe really has an infinite past in an eternally inflating multiverse. But I believe it is impossible as well because one can't define any uniform normalizable probability distributions on infinite sets.
The last objection is a sketch of my objection against the anthropic reasoning and I am convinced that most if not all of the existing papers about the anthropic measures that exist in the literature may be shown inconsistent with basic maths and properties of all measures. So I am not worried that there already exists a large class of anthropic papers that is right. The anthropic literature is rubbish.
However, I still can't exclude the anthropic reasoning in its most general form and yes, I am worried that it could be right. That would mean that many things that look perfectly accurate, essential, and universal across our Universe are just coincidences that won't ever be given a good or quantitative explanation. It would suck but it's possible. It's a rather welldefined scientific statement and science may accumulate evidence against it or in favor of it.
The hypothetically continuing absence of any new experimentally discovered physics that is responsible for the lightness of the God particle will make some form of the anthropic reasoning increasingly likely.
Third way
You could still propose that there exists a "third way", an explanation that predicts no new physics but one that doesn't require a multiverse, either. Well, maybe. I just want to say that if you construct a theory in which the values of some parameters such as the God particle mass are selected by God and you tell everyone that they shouldn't ask why the value is what it is, you haven't made any progress beyond the multiverse. You just denied its existence and overlooked it but you still implicitly assumed it exists. Whenever there are other possibilities that are equally consistent with some rules, they exist at least in the "space of possibilities" and you may assign probabilities to them even if the other possibilities are not considered "real".
And if you say that you don't care that there seem to be many other possibilities and our world is just one of them – something that is a priori very unlikely – it's OK but you are giving up the struggle for an explanation; using your logic, one could believe in any miracles and abandon any part of well-established science in favor of "things are what they are and don't ask any questions". This is not just about your satisfaction with ignorance; in many cases, ignorance is clearly a worse explanation than some more specific theories that work.
But there could still be an explanation of the lightness of the God particle that avoids all these problems. The God particle could be elementary up to very high energy scales – but its mass could still be guaranteed to be much lighter than the Planck scale by some alternative calculation of its value that makes the smallness manifest. Let me offer you my favorite numerological example of what I mean. Consider the number\[
v = \sin \zav{\pi\cdot\exp{\zav{\pi\cdot \sqrt{163}}}}
I've used the same name for the variable as if it were the God field's vev. Now, how large is \(v\)? It's the sine of a multiple of pi. A very large multiple. A non-integer multiple. A seemingly totally random multiple. Moreover, the argument doesn't seem to be excessively fine-tuned. It's made out of a few symbols.
So you expect that it is the sine of a random number so it is a random number between \(-1\) and \(+1\). The numbers near \(\pm 1\) are actually more likely because that's where the sine spends a lot of time. However, you may calculate that\[
v \approx 0.000000000002355966,
\] about two trillionths. This is really close to the God particle's vev in the GUT mass units. And there exists a perfectly sensible mathematical explanation, one related to the \(j\)-function that is helpful for one-loop string theory partition sums, why the result is so incredibly tiny.
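You may verify the tininess yourself with the standard library's `decimal` module (my sketch; the digits of \(\pi\) are hardcoded, and the last step uses \(\sin(\pi x)\approx\pi\,|N-x|\) for \(x\) extremely close to an integer \(N\)):

```python
from decimal import Decimal, getcontext

# High-precision check that sin(pi * exp(pi*sqrt(163))) is tiny:
# exp(pi*sqrt(163)) sits just below an integer, so |sin(pi*x)| ~ pi*|N - x|.
getcontext().prec = 60   # work with 60 significant digits
PI = Decimal("3.14159265358979323846264338327950288419716939937510582097494459")

x = (PI * Decimal(163).sqrt()).exp()   # exp(pi*sqrt(163)), famously near-integer
N = x.to_integral_value()              # the nearby integer it almost equals
eps = abs(N - x)                       # ~7.5e-13
v = float(PI * eps)                    # |sin(pi*x)| for such a tiny deviation
print(f"v ~ {v:.6e}")                  # ~2.356e-12, matching the text
```

Standard double-precision floats would be hopeless here: \(\exp(\pi\sqrt{163})\approx 2.6\times 10^{17}\) and the interesting information sits some 30 digits deep, which is why the arbitrary-precision `decimal` arithmetic is needed.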
There may exist other ways to "visualize" the number \(\exp(\pi\sqrt{163})\) but these other ways may completely fail to explain why this number is so incredibly close to an integer. Analogously, we may be fooling ourselves into thinking that our methods to calculate the mass of the Higgs particle (yes, at least one Higgs instead of God!) or the cosmological constant – which naturally end up with lots of numbers that never seem to cancel accurately – are the only ways to estimate the magnitude of these constants. But there may exist another calculation that, for an important or canonical vacuum or class of vacua in string theory, easily explains why these parameters are so small.
Summary: science is addressing deep questions
The possible explanations why the God particle mass or the cosmological constant are so tiny relative to the simplest estimates from dimensional analysis are brutally different when it comes to their philosophy. The conflict between the proponents of the different types of explanations may look like a war between diametrically different religious sects. But those are just emotions. In the end, we are comparing several explanations of the same observed pattern and science is in principle able to do this job although it may be a hard and long journey.
Science is not over and the fact that we must address these "seemingly religious" questions to make further progress only shows how far we have gotten. Science is no longer just a technical tool that advises us what is the better place for a screw in a steam engine. Science is eager to confront much more farreaching questions related to the very logic that organizes the world. Some nasty people feel very uncomfortable about this very fact because they still want to treat science as a small irrelevant slut that just solves some very small problems of theirs. However, there's nothing wrong about the fact that science is confronting farreaching, fundamental questions. That's where many great minds wanted to see science.
Another question is whether we will actually answer the big questions. The answer is No if we don't even try. If we do try, the answer may be either No or Yes; we don't know.
And that's the memo.