## Sunday, February 03, 2019 ... /////

### Naturalness, watermelons, populism, intuition, and intelligence

A technical comment at the beginning: The updated Disqus widget allows the commenters to press new buttons and apply basic and not so basic HTML formatting on their comments – bold face, italics, underline, strikethrough, links, spoilers (!), codes, and quotes. Feel free to play, children. ;-)

Like her famous compatriot, Sabine Hossenfelder believes that a lie repeated many times becomes the truth, which is why she has brought the 347th rant against naturalness to her brainwashed, moronic readers.

In one way or another, naturalness is a principle saying that the dimensionless parameters in our physical theories should be of order one – and that we should seriously ask "why" and look for an explanation otherwise. Philosopher Porter Williams has just written a 32-page paper, "Two Notions of Naturalness", whose main point is to distinguish two different ideas that are labeled "naturalness". That's fine, but the number of flavors of naturalness is actually greater than two – and yet all of them are flavors of the same underlying idea.

At the beginning, she posts the image above [source] with the caption: "Square watermelons. Natural?"

Too bad she hasn't even tried to answer her own question because it's an excellent metaphorical example of naturalness – which can teach you a lot about the actual physicists' motivations when they assume naturalness in one way or another.

OK, excellent – if or when you see the square watermelons for the first time, don't you think "that's strange"? I surely did. And I think every damn curious human being on Earth does, too. Why? Because the watermelons we know in Nature tend to be rather round. In fact, most fruits are quasi-round.

The fruits are quasi-round because there's a lot of water or mass in them, some shell or skin has to confine that water or mass, and the plant minimizes the amount of material for the skin or shell. In effect, the optimum shape is determined by mechanisms similar to those that shape a soap bubble, and it ends up close to round. If you have really never heard of the square watermelons, don't click on the hyperlink and don't read the next paragraph. Instead, try to answer the question: Are these square watermelons made of plastic matter in factories, do they grow in wild Nature, or something in between?

The answer is "something in between". They are really naturally grown fruits – but in an environment that was engineered by humans. These watermelons are grown in boxes! The Japanese like to do it (clearly, it is a similar habit of cleverly bastardizing Mother Nature as their bonsai trees) and charge $100 for one watermelon. As you can see, the surprising shape isn't "quite" natural. It's been helped by humans and their seemingly artificial business tricks.

We could discuss whether square fruits like that exist somewhere in Nature or whether they could. It could be an interesting discussion and someone could bring some surprising data. I don't want to claim that it's completely impossible for cubic fruits to emerge naturally. But I surely want to claim that a curious, intelligent person is expected to be surprised and to ask what the origin of these unusual shapes is. Because the shapes are unusual for fruits, the answer has to be a bit unusual, too!

This is really the general point of naturalness. When the mechanism – for the growth of fruits or the generation of a low-energy effective field theory out of a deeper theory – is dull and straightforward, the dimensionless constants one generates are usually of order one. And if they're not of order one, there has to be something special – something that is very interesting even if we understand it just approximately. For example, the tool you may buy for $31.12, a somewhat unnatural sequence of digits 1-2-3. ;-)

Note that I could map the square watermelon example to the case of parameters in quantum field theory a bit more closely. On the watermelon, we may calculate the ratio $P=R_{\rm min}/R_{\rm avg}$ of the minimum curvature radius and the average curvature radius evaluated over the smooth green surface. For round watermelons, you may get a ratio of order $P\approx 0.5$. For square watermelons, you could get $P\approx 0.01$ or so if the square watermelon's edges end up sharp enough. The square watermelon looks intuitively unnatural to us because $P\ll 1$. And it should.
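Since the ratio is only described qualitatively above, here is a toy numerical check – my own construction, not anything in the text – using a two-dimensional superellipse $x^n+y^n=1$ as a stand-in for the watermelon's cross section: $n=2$ is a circle (round melon), large $n$ is a rounded square.

```python
import numpy as np

def curvature_radius_ratio(n, num=20001):
    """P = R_min / R_average for the superellipse x**n + y**n = 1.

    n = 2 is a circle; large n is a rounded square.  R_average is
    defined here as 1/<kappa> with an arc-length-weighted mean of
    the curvature (one of several reasonable conventions).
    """
    x = np.linspace(0.02, 0.98, num)              # first quadrant, endpoints trimmed
    y = (1.0 - x**n) ** (1.0 / n)
    dx = x[1] - x[0]
    yp = np.gradient(y, dx)                       # y'
    ypp = np.gradient(yp, dx)                     # y''
    kappa = np.abs(ypp) / (1.0 + yp**2) ** 1.5    # curvature of the graph y(x)
    ds = np.sqrt(1.0 + yp**2) * dx                # arc-length element
    kappa_avg = np.sum(kappa * ds) / np.sum(ds)
    return kappa_avg / kappa.max()                # = R_min / R_average

print(curvature_radius_ratio(2))   # circle: P close to 1
print(curvature_radius_ratio(8))   # "square" melon: P well below 1
```

The exact numbers depend on the averaging convention, but the qualitative lesson matches the point above: sharpening the edges drives $P$ far below one, which is exactly what our intuition flags as unnatural.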

People who don't ask "why?" when they see a square watermelon simply lack scientific curiosity. There is nothing "great" about that intellectual deficiency! And indeed, if you concluded that a "playful human hand" had to be involved in the creation of the square watermelons, your intuition was totally right. This is an example of generalized naturalness reasoning.

Can we rigorously prove naturalness – that the parameters are never much smaller than one? We can't prove it rigorously. We can't even make the statement rigorous. For example, we would have to decide "which numbers are small enough to be called much smaller than one". There is clearly no sharp bound. And if we picked a bound arbitrarily, there's no reason to expect that a sharp theorem starts to hold from that bound, e.g. for all $P\lt 0.01$.

But we can justify naturalness by some statistical reasoning. A deeper level of the laws of Nature allows the values of $P$ to be calculated in some way – think of some detailed theory about the growth of watermelons. And if these theories don't depend on a pre-existing dimensionless parameter that would play the role of $P$ or its functions, it's sane to expect the calculable $P$ to be of order one.

In other words, there may be a probability distribution $\rho(P)$ for various values of $P$. We don't know what that distribution should be, either. But again, it should be natural, which means that its main support is probably located at values of $P$ that are of order one (let's assume that $P$ is a positive number here). The probability distribution has to be normalizable, so it can't allow arbitrarily small or large values of $P$ to be equally likely. The most likely values of $P$ could be comparable to trillions – but then the distribution $\rho(P)$ itself would be unnatural.

If the distribution $\rho(P)$ is uniform and supported on the interval $(0,1)$, then the probability that $P\lt 0.01$ will end up being $0.01$ itself. Extremely small values of parameters translate to extremely small probabilities. And if they're extreme enough, the precise way of the "translation" doesn't matter much. If there's some fine-tuning, reasonable translations can't change that verdict. Judge Potter Stewart once defined "hard-core porn" by saying that he couldn't define it but he knew it when he saw it. If that organ's hardness in some location is some very large number $X$, then it really no longer matters much whether it's $X$ or $2X$ – it is simply "hard porn". It's similar with naturalness: in a huge fraction of situations, we can rather safely say "it's natural" or "it's not" because the borderline cases (where the details of the definition would matter) are relatively rare.
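The uniform-prior arithmetic in the previous paragraph can be checked with a trivial Monte Carlo (the sample size and seed are my arbitrary choices):

```python
import random

random.seed(42)  # reproducibility
N = 1_000_000
# Draw P uniformly from (0, 1) and count the fine-tuned draws with P < 0.01.
hits = sum(random.random() < 0.01 for _ in range(N))
frac = hits / N
print(frac)  # close to 0.01, as the uniform prior dictates
```

Of course, the point of the argument is that any reasonably natural prior, not just the uniform one, assigns a probability of roughly this tiny size to such a tiny window.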

As you can see, my argumentation is strictly speaking circular. But it doesn't mean that it's wrong, unimportant, or that it can be ignored. If you reject naturalness of any flavor and completely, well, I can't change your mind. That's what Ms Hossenfelder is doing, of course. But even in the absence of a proof of any flavor of the naturalness principle, there's still an empirical fact and it's the following:
Smart theoretical physicists simply do care about naturalness.
Sabine Hossenfelder encourages her readers to elaborate upon their conspiracy theories about groupthink etc. It must surely be a matter of groupthink that physicists care about naturalness! Well, if you can buy this sort of populist stuff, you may praise your peers – communities of commenters such as those at Backreaction are clearly among the best examples of groupthink in the Universe, so the complaint is really cute – but you can't change the fact that your brain is a rotten chunk of junk. There is absolutely nothing sociological about the physicists' tendency to be sensitive about fine-tuning vs. naturalness, and nothing mob-like about the tendency to care about naturalness.

Bragging isn't the main purpose of this paragraph, but I have realized that naturalness was extremely important since I was 5 years old or so. There are various reasons why. First of all, by that time, I had seen enough round watermelons and similar fruits to instinctively understand which shapes of fruits and other things are natural. (Clearly, fruits are neither the main application of naturalness nor my expertise. We are really talking about the form of the equations describing the physical laws.) Of course near-round shapes are more natural. More importantly for a theorist, I had done a sufficient number of straightforward calculations of dimensionless numbers that ended up with values that may be considered "of order one". And when some results were "very far from order one", I could give you a quick explanation of why that was so.

I don't know how widespread this realization is. As far as I can say, even when I was 5, I could have been the only person in the city of Pilsen who actively realized – and could articulate, in a somewhat childish physics speak – that numbers of order one are natural and those that are much larger are not. The idea – held by an overwhelming, dull majority that doesn't care or doesn't even know what I am talking about – that my lonely realization is an example of groupthink is an amazing case of chutzpah. At any rate, the basic realization was never shaped by anyone else, let alone by "groupthink", which requires an actual "group" of people.

People who keep growing as theoretical physicists become increasingly appreciative of naturalness. The importance of naturalness had its manifestations in elementary school, high school, college... so let me jump to graduate school. We had to prepare for the qualifiers – a Russian friend kept scaring me that I could really fail, which wasn't realistic, as I saw afterwards, but it was still fun to achieve some historically great scores. But the point is that the oral exams – and perhaps some problems in the written exam – contain a huge number of problems of the type "quickly make an order-of-magnitude estimate of some quantity".

The number of such problems – sometimes extraordinarily practical problems, at least from a fundamentally theoretical physicist's viewpoint – that a soon-to-be physics PhD can solve is huge, and they really cover all important things in the Universe, if I can borrow a clarifying phrase from Sheldon Cooper. Almost none of them were related to particle physics. Just a totally trivial example: estimate the drift velocity of electrons in a wire of some thickness carrying some normal current. Great. The student has to divide the current density (current over the cross section) by the charge density of the carriers (the number density times the electron charge) and gets a result below 1 millimeter per second or so. So much smaller than the Fermi speed above 1 kilometer per second! The individual electrons are flying very quickly but their "center of mass" is moving rather slowly when the currents are realistic.
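For the record, the estimate can be sketched in a few lines – the current, wire radius, and copper's free-electron density below are illustrative textbook-style numbers of my choosing, not values from any exam:

```python
import math

# Order-of-magnitude estimate of the electron drift velocity in a copper wire.
I = 1.0        # current in amperes (a "normal" current)
r = 1.0e-3     # wire radius in meters
n = 8.5e28     # free-electron density of copper, per m^3
e = 1.602e-19  # elementary charge in coulombs

A = math.pi * r**2      # cross-sectional area, m^2
j = I / A               # current density, A/m^2
v_drift = j / (n * e)   # drift velocity = j / (n e), m/s

print(f"{v_drift * 1000:.3f} mm/s")  # a few hundredths of a millimeter per second
```

The result, some $10^{-5}$ m/s, is indeed far below 1 mm/s and many orders of magnitude below the Fermi speed – exactly the kind of cheap yet informative number these exam problems are after.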

I am talking about order-of-magnitude estimates – and parametric estimates in general – because they undoubtedly belong to the toolkit that every good theoretical and experimental physicist must have mastered. Not only as a philosophical thesis that she can mindlessly parrot, but as something that he can use in thousands of examples (I've used both sexes so that the feminists can't whine about discrimination).

A big part of the physicists' general, instinctive understanding of physics is about these parametric estimates in which the numerical prefactors and other "details" (such as subleading corrections in some expansion) are simply ignored. Perhaps one-half of the physicists' efficient understanding of the phenomena in the Universe would be impossible without this methodology – dimensional analysis etc.

Now, why does it make sense to ignore the (dimensionless) numerical prefactors as details? Why don't they totally change the result? Well, because they're assumed to be of order one. They're assumed to differ from $1$ by at most one or two or three orders of magnitude – while the other known (mostly dimensionful) parameters a priori belong to huge intervals that may span dozens (a larger number than the previous one) of orders of magnitude! This assumption may be more justified in some cases and less justified in others. But it's so extremely useful and informative in such a huge fraction of the examples in physics (and other natural sciences, not to mention some everyday problems and even social sciences) that a physicist surely cannot throw away this whole industry of order-of-magnitude estimates.
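A minimal worked example of the method – the pendulum is my choice of illustration, not the text's. Dimensional analysis fixes the answer up to a dimensionless prefactor, and that prefactor indeed turns out to be of order one:

```python
import math

# Out of a pendulum's length L and the gravitational acceleration g,
# the only combination with units of time is sqrt(L/g).  The exact
# small-oscillation result is T = 2*pi*sqrt(L/g): the dimensional
# estimate is off only by a prefactor "of order one".
L = 1.0    # length in meters (illustrative value)
g = 9.81   # gravitational acceleration, m/s^2

estimate = math.sqrt(L / g)       # the dimensional-analysis guess, ~0.32 s
exact = 2 * math.pi * estimate    # the textbook period, ~2.0 s
prefactor = exact / estimate      # 2*pi ~ 6.3: within one order of magnitude of 1
print(estimate, exact, prefactor)
```

If the prefactor had come out as $10^{16}$ instead of $2\pi$, the whole estimating industry would collapse – which is precisely why the order-one assumption and the legitimacy of such estimates are two sides of the same coin.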

Every competent physics graduate student surely understands that the principle that "we may treat the numerical factors as details in our estimates" and the principle that "the dimensionless parameters are assumed to be of order one" are closely related, if not almost equivalent. So when Sabine Hossenfelder writes
Williams offers one argument that I had not heard before, which is that you need naturalness to get reliable order-of-magnitude estimates. But this argument assumes [...],
you can immediately see that she is simply not a real theoretical physicist and has never been one. Otherwise she would have heard about the relationship – well, she would really be capable of figuring it out herself, and she would have actually done so decades ago. She couldn't have meritocratically passed tests like the PhD qualifying exams. Her PhD is demonstrably fraudulent. She has never understood these basic things – even though, before one gets a physics PhD, one should not only understand them but be capable of applying them in basically any physical context.

New physics beyond the Higgs that could have been seen at the LHC may be said to be a "failed prediction of naturalness". As I said, it was a very soft prediction and I have never thought that the odds were too different from 50-50. Of course even a "big desert" is plausible – and, from a certain point of view, beautiful (and no, I will not erase this word from my vocabulary) – given some mechanisms that enable it. "New physics at the LHC" was always partly a matter of phenomenologists' (and other excitement-loving physicists') wishful thinking, not a hard prediction of anything. But naturalness isn't a high-precision formula. Naturalness is a grand principle, a manifestation of basic Bayesian inference applied to parameter spaces, and a strategy to get closer to realistic answers in physics and other natural sciences. You can't really refute or falsify it – just like you cannot "falsify" mathematics as a discipline.

Hossenfelder, her readers, and this whole organized crackpot movement are obsessed with the idea that they will falsify all of theoretical physics, or all of physics, or all logical thinking. You just can't. You live in a fantasyland, cranks. These cranks love to define some "rules of the game" and then decide that an outcome classified as a "failure" may be interpreted as a "falsification of naturalness or theoretical physics" or whatever else they like. But all these conclusions are absolutely irrational and, more generally, cranks simply don't define the rules of the game. Nature and the laws of mathematics do. The scientists' job is to increasingly understand them, not to beat them with their opinions.

At the end, Hossenfelder writes:
The LHC data has shown that the naturalness arguments that particle physicists relied on did not work. But instead of changing their methods of theory-development, they adjust their criteria of naturalness to accommodate the data. This will not lead to better predictions.
Even if that outcome were a "failure", it was one failure of a probabilistic strategy, not a falsification of anything, and "another failure" doesn't follow from the first one at all.

Of course the adjustment of the people's understanding of naturalness is the first sensible thing – and quite possibly, the only sensible thing – that the theoretical physicists should do after the strategy was seen as unsuccessful due to the null results at the LHC – and the direct dark matter searches, for example.

No one knows – and, despite her self-confident suggestions to the contrary, even Ms Hossenfelder cannot know – whether the adjusted notions of naturalness will lead to better predictions. But we will never know if we don't try. Of course it's the professional – and moral – duty of the physicists to play with naturalness, adjust it, combine it, recombine it, and try to shape a more accurate cutting-edge picture of the laws of physics. Physicists are obliged to do exactly what she claims they're not allowed to do.

Maybe someone invents a new principle that will de facto eliminate naturalness from physics – because it will be replaced by something completely different. But this scenario is a pure speculation at this moment. We can't pretend that it has taken place because it hasn't taken place. So naturalness and its variations will remain a tool of the physicists whether a crackpot likes it or not. Physicists will keep on playing with this methodology.

That process may lead to an additional bunch of soft predictions that will be confirmed or refuted by experiments. But that is a part of science. The alternative – to throw away basic physicists' tools and principles such as the principle of naturalness in its most generous form, without any replacement – is a road to hell, a method to abolish science as we have known it for centuries. Physicists will surely try to find and apply as many novel ways as possible to guess what may happen in future experiments – but in the relative absence of alternatives, they will of course still need to exploit some sort of naturalness.

As long as theoretical physicists are being picked according to their ability to predict or retrodict phenomena or explain and quantify the phenomena in Nature (and I sincerely hope that graduate students won't be given physics PhDs for their vigor while licking the aßes of the likes of Ms Hossenfelder), these physicists will be vastly more appreciative of naturalness than the average laymen – because the thinking in terms of naturalness and fine-tuning is almost inseparably connected with the scientific curiosity and quantitative instincts of the physicists' type. You can't do anything about it, Ms Hossenfelder and her equally worthless sycophants.

Incidentally, Pavel Nadolsky tried to explain to her that there are extra problems in the absence of naturalness – without it, the predictive or explanatory power of different theories can't even be properly compared. The previous half-sentence is already too complex for her poultry brain, so she said Nadolsky was only talking about individual theories' explanatory power – that's the maximum complexity her brain can contain. But he was talking about comparisons of the explanatory power of several theories – way too difficult for her populist, stupid knee-jerk reactions. So of course Mr Nadolsky couldn't teach her anything. She is unable and unwilling to learn anything and, frankly speaking, it's way too late for her to learn these basic ideas about physics.

She's already living a different epoch – it is Ihr Kampf to persuade as many morons as possible that her fundamental problems with some basic insights and methods of physics are just fine. They are not.

And that's the memo.

Oops, it isn't quite the end yet. She also added a bunch of quick verdicts in a footnote at the very bottom – and these two short sentences remind me why I simply couldn't stand that fudged-up pretentious pseudointellectual for a millisecond:
The strong CP-problem (that’s the thing with the theta-parameter) is usually assumed to be solved by the Pecci-Quinn mechanism, never mind that we still haven’t seen axions. The cosmological constant has something to do with gravity, and therefore particle physicists think it’s none of their business.
It's so arrogant yet so stupid. We haven't seen an axion yet but we're allowed to talk about the Peccei-Quinn [note the spelling] mechanism because this is a damn theoretical prediction made by a theory. The theory is ahead of the experimental confirmation, which fully explains why the experimental discovery hasn't taken place yet, and there's nothing wrong with that. This is how the discovery process works in about 50% of the cases in the history of physics.

In the remaining 50%, the experimenters are the first ones to discover something new and theorists create a theory explaining that new phenomenon afterwards. Needless to say, Ms Hossenfelder doesn't like this ordering, either – she doesn't want to build colliders because theorists don't have "firm predictions" for the new collider in advance. As you can see, she hates both scenarios, the one in which the theorists start and the one in which the experimenters start – she hates 100% of the progress in physics.

As of February 2019, it is unknown whether there are axions in Nature and whether one of them is responsible for solving the strong CP-problem. Her claim that the answer is known to her is just a pile of šajze.

Similarly, she writes that "it's not particle physicists' business" to discuss the cosmological constant because the cosmological constant has "something to do with gravity". Holy cow. Some particle physicists sensibly neglect gravity simply because it's very weak in most particle physics experiments on which they're focusing, but other particle physicists, especially those closer to formal theory, do discuss the fundamental laws of physics and they do include – and have to include – (quantum) gravity. Moreover, everything that has ever been observed has "something to do with gravity" – because all mass/energy/momentum is gravitationally coupled. Particle physics and the fundamental questions of cosmology are really inseparable.

What she actually wants to implicitly push down your throat is that the cosmological constant phenomena (the accelerated expansion of the Universe) are consequences of some modifications of general relativity which is why relativists, and not particle physicists, should discuss it. But that's just an unsupported hypothesis of hers. An equally plausible – if not much more plausible – hypothesis is that the tiny cosmologically observed vacuum energy is simply what we see and it simply has to correspond to the energy density terms in a theory of particle physics.

Both possibilities are viable and researchers must preserve their freedom to do research of both.

Some particle physicists ignore the problem of the tiny cosmological constant because they're focusing on quantum field theory, which isn't compatible with quantum gravity (at least because gravity is non-renormalizable as a field theory), so their QFT assumptions and principles may be assumed to fail there. But others, like string theorists who study the cosmological constant, simply cannot ignore this effect and the tiny magnitude of the constant. They're shocked by the tiny constant much like physicists are shocked by any unnatural constant in physics – and this one is tiny, indeed. They spend their energy looking for anthropic, naturalness-based, or other explanations of the puzzle because that defines their work, at least the part of their work linked to the cosmological constant.

The fact that she can successfully pump some easy answers and solutions or non-solutions to all these difficult questions down the scumbags' throats doesn't mean that this indoctrination has something to do with proper scientific research.