The Reference Frame – Supersymmetric world from a conservative viewpoint – by Luboš Motl<br /><br /><b>Heckman, Vafa: QG bounds the number of hierarchy-like problems</b> (2019-05-17)<br /><br />Every competent physicist knows that fine-tuning is a kind of problem for a theory claimed to be a sufficiently fundamental description of Nature.<br /><br />Fundamental physicists have wrestled with the cosmological constant problem, the Higgs hierarchy problem,... and perhaps other problems of this kind. Fine-tuning is a problem because, assuming that the fundamental "theory of everything" works like a quantum field theory and produces the couplings of the low-energy effective field theories via renormalization group flows, the observed hierarchies between the scales etc. seem extremely unlikely to emerge.<br /><br />In principle, there could be arbitrarily many couplings, and even fine-tuned couplings, which could cause an infinite headache to every theorist. In a new paper, Cumrun Vafa – the father of F-theory and of the Swampland Program (to which this paper belongs) – and Jonathan Heckman, a top young researcher in both topics, present optimistic evidence that in string/M-theory and/or quantum gravity, the infinite fine-tuning worries are probably unjustified:<br /><blockquote><a href="https://arxiv.org/abs/1905.06342">Fine Tuning, Sequestering, and the Swampland</a> (just 7 pages, try to read it all)<br /></blockquote>What's going on? Effective field theories outside quantum gravity may be built by "engineers". You may apparently always add new fields and new sectors, and they allow you to tune or fine-tune many new couplings.
There doesn't seem to be a limit.<br /><br />String/M-theory is more predictive, and chances are that even if there were another consistent theory of quantum gravity, it would be more predictive, too. In particular, as they say, the number of couplings that can be independently fine-tuned to unnatural values is finite.<br /><br />I have a feeling that they count the moduli among the couplings that can be "fine-tuned", even if they correspond to physical fields. But that doesn't invalidate their statement because they say that the number of moduli is bounded, too.<br /><br />Moreover, the bound is a fixed finite number for every choice of the number of large dimensions and the number of supercharges. Fine, what's the evidence?<br /><br />First, the number of Minkowski, flat-spacetime solutions in string/M-theory seems to be finite. Also, the number of Calabi-Yau topologies seems to be finite. The latter statement hasn't quite been proven but propositions that come very close have been. For example, if you restrict the manifolds to be elliptically fibered with a toric base, it has been proven that the Calabi-Yau three-fold topologies form a finite set.
It seems very likely that the manifolds that cannot be represented like that are a "minority", so even the number of all Calabi-Yau topologies should be finite.<br /><br />Their first full-blown discussion is of 6D field theories. Conformal field theories have either \((1,0)\) or \((2,0)\) supersymmetry; \((1,1)\) cannot be conformal. Infinitely many classes of such theories with lots of deformations exist as CFTs. But if you want to couple them to gravity, you see restrictions. The cancellation of anomalies requires the total number of tensor multiplets to be 21, a particular finite number. In fact, all stringy 6D CFTs only allow deformations that result from operators that exist in the theory. In this 6D case, their new principle largely reduces to the anomaly cancellation.<br /><br />In another related example, the total rank of some gauge group is 22. Perturbative string theory obviously restricts these ranks by the central charge – the rank cannot be too high for the same reason why the spacetime dimension cannot be arbitrarily high. Well, the central charge is also a gravitational anomaly – on the world sheet.<br /><br />They discuss a few more rather specific examples – so their paper has many more equations and inequalities than is actually needed for their main claims. But the overall new swampland principle has ramifications. In particular, if you imagine many sequestered or hidden sectors in artificially engineered apartheid-style models of particle physics, all their couplings seem to be independent, and could therefore admit independent fine-tuning.<br /><br />According to Heckman and Vafa, if the number of such sectors is too high, quantum gravity actually implies some correlations between the fine-tunings. At the level of effective field theory without gravity, many parameters \(g_i\) could be independently adjusted and very small.
But if you require that the theory may be coupled to quantum gravity, it already follows that there are equations that correlate almost all these constants \(g_i\), up to a finite (pre-determined) number of exceptions.<br /><br />Sometimes people express their doubts about the reasoning involving naturalness and the disfavoring of fine-tuned theories. Indeed, the thinking based on quantum field theories is ultimately imprecise and incomplete and has to be adjusted. But "just ignore all the fine-tuning problems" isn't a scientifically valid response to the problem. The problems cannot be completely ignored because they're implied to be problems by a rather specific, successful framework of physics that we use all the time – quantum field theory – combined with the probability calculus. To ignore the problem would mean to cherry-pick what we like about the framework – quantum field theory – and discard what we don't.<br /><br />Instead, the adjustment of the fine-tuning rules must have the form of "quantum field theory isn't an exact description of Nature and the correct framework differs in respects A, B, C, and these differences also imply different predictions concerning the fine-tuning". This new Heckman-Vafa swampland principle may be counted as an actual <em>scientific</em> way to go beyond the existing rules about naturalness and fine-tuning in effective field theories. The paper tells us how string/M-theory <em>actually</em> modifies the semi-rigorously proven intuition or lore about the fine-tuning in our effective field theories.<br /><br />The modification primarily says that the couplings are automatically more constrained than naively indicated by the low-energy effective field theory analysis. In other words, string/M-theory is – in a new specific sense – more predictive than quantum field theory.
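<br /><br />To make the 6D anomaly constraint mentioned above a bit more concrete – this is a standard supergravity fact I am adding as an illustration, not a formula quoted from the new paper – the cancellation of the irreducible gravitational anomaly in 6D \((1,0)\) supergravity forces a linear relation among the numbers of hypermultiplets \(n_H\), vector multiplets \(n_V\), and tensor multiplets \(n_T\):\[n_H - n_V + 29\,n_T = 273.\]A model builder coupling new sectors to gravity cannot adjust these multiplet numbers independently; every added sector must keep this combination intact, which is the simplest example of how gravity removes would-be independent knobs.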
It shouldn't be surprising because quantum gravity needs to reconcile the low-energy behavior with the high-energy behavior – where the particle spectrum must gradually merge with the black hole microstates whose entropy is again dictated by a low-energy effective field theory (including Einstein's gravity). When you're playing with the low-energy couplings, quantum gravity actually tells you that you have to aim at and hit several targets for the trans-Planckian behavior of the theory to remain consistent (with gravity).<br /><br /><hr /><br /><b>EVs vs ICEs, NOx, critics of science as thought police, Ponzi scheme, Soph</b> (2019-05-14)<br /><br />There are too many terrible events happening in the world right now – every day, both famous and unknown people are getting fired and hunted for telling the truth or for not being far-left extremists; scientifically illiterate snake oil salesmen are receiving Hawking Prizes; the media are bombarding us with lies against science and the Western civilization.<br /><br /><a href="https://www.volkskrant.nl/wetenschap/parallelle-universums-tijdmachines-zijn-theoretisch-fysici-de-weg-kwijt~b18fa264/" rel="nofollow">A major Dutch publication</a> has written a text on the topic "is physics a Ponzi scheme?". My one-time co-author Robbert Dijkgraaf and Juan Maldacena are the only voices there who actually and calmly explain the current state of theoretical physics. They're overwhelmed by critics who don't understand the field at the technical level at all and who are presented as if they were their equals – Maldacena is the top theoretical physicist of his generation and Dijkgraaf is, among other things, the director of the IAS in Princeton where Einstein used to work.<br /><br />Those special attributes don't seem to matter to the journalists anymore.
Random angry activists and hecklers who are allies of the journalists are often made more visible.<br /><br />The critics say it's very important that they're receiving supportive e-mails from other scientifically illiterate laymen, and the journalist implicitly agrees with that. Meanwhile, Robbert correctly points out that research works when it's not constrained by thought police. These witch hunts against physics are obviously just another part of the thought police that is gaining strength in our society – and theoretical physics is naturally another expected target of the far-left movement, as something evil because it has been overwhelmingly built by white males. People who don't have the ability to do meaningful science are being happily hired by the fake news media as inquisitors who are presented as equal to the top physicists.<br /><br />This anti-meritocratic distortion of life, the Universe, and everything in the media affects all fields and all age groups.
A hysterical (an adjective chosen by Czech president Zeman), inarticulate, brainwashed, psychologically troubled 16-year-old Swedish girl who believes that the Earth is burning – probably much like an <a href="https://youtu.be/sUFugN6wo7Q?t=41">impolite Bill Nye with a flamethrower</a> – is presented as a celebrity. (Sorry, the IQ of the people who are impressed by these stunts of Bill Nye's has to be so low that I refuse to count them as full-blown members of the homo sapiens species.) Readers are supposed to be interested in her book deal – she can obviously only write a worthless, incoherent rant because she is not an intelligent girl, and this rant will be written by some older but almost equally unremarkable environmentalist activists, anyway.<br /><br /><b>Meanwhile, today's teenagers are way more conservative and sensible than the millennial generation. Everyone who cares about the future of mankind must make sure that this generation grows into a pro-Western, sane, mostly right-wing bunch. And it's possible. We must only start to care about education!</b><br /><br />OK, there's a wonderful comparison of Greta Thunberg with someone on the other side. If you don't know her, look at the videos by <a href="https://www.youtube.com/channel/UCT7BLBDnD-wEXeqZSg24aJw/videos" rel="nofollow">Soph</a>. Soph is a 14-year-old girl (born 9/23/2004 as Sophia Totterman) – two years younger than Greta Thunberg – whose YouTube videos (don't be afraid and try the latest, <a href="https://www.youtube.com/watch?v=OdaUDeAGIck" rel="nofollow">Be Not Afraid</a>) get three hundred thousand views per video on average. (Update: hours after this blog post was posted, this particular latest excellent video by Soph was removed by some nasty YouTube aßholes as "hate speech". It was far from the only one. Here you have a <a href="https://www.bitchute.com/video/OdaUDeAGIck/" rel="nofollow">backup</a>.)
And she is discussing rather adult topics, indeed (starting with the co-existence of cultures and high school students' life). Judging by the viewer counts, this girl is a self-made millionaire. (OK, I still believe that there are some adults helping her with her videos – she says her older brother is key but they say she's more radical than he is – but the result looks more truthful, more impressive, more entertaining, and more authentic than Thunberg's.) Do the media celebrate an actually brilliant girl who has achieved something by herself, without the media machinery?<br /><br />Not at all. In fact, the answer "not at all" is far too optimistic. Yesterday, Joseph Bernstein wrote a disgusting hit piece against the 14-year-old girl at <a href="https://www.buzzfeednews.com/article/josephbernstein/youtubes-newest-far-right-foul-mouthed-red-pilling-star-is">BuzzFeed News</a>. Using a giant media machinery to attack teenage girls is how your far-left movement defines a gentleman today, isn't it?<br /><br />Mr Bernstein, it hurts when someone who is 14 years old is more sophisticated and smarter than you and all your far-left comrades combined, doesn't it? She makes you realize where (in the political sense) the people who have some talent are – and which remainder of mankind is just a field of weeds that fails to achieve anything remarkable despite its use of all the immoral and illegitimate tools and weapons we may think of. And you dislike the truth, don't you? Soph's sentence that follows your "or how about, simply" is spot on.<br /><br />In two or three decades, if the likes of Soph happen to be outnumbered by the brainwashed sheep of her generation, Soph et al. will have the duty to fully appreciate that she's equivalent to 100 or so sheep, and to adjust the rules of democracy accordingly. It will be your world, Soph, and you can't allow the sheep to overtake it.<hr><br />But I want to talk about a relatively lighter topic, electric vehicles (EVs).
OK, so Gene and I have exchanged some e-mails about the advantages and disadvantages of EVs and cars with internal combustion engines (ICEs). I won't cite the precise sentences but I needed to mention the e-mail conversation for you to understand why I was surprised by Gene's comments posted a few hours ago which indicated that I should celebrate Brussels for encouraging Audi to produce EVs.<br /><br />What? Surely you have understood by now, Gene, that I am absolutely against this push to spread EVs. And indeed, this push is largely empowered by the European Union. It's another example of the criminal behavior of that international organization, a reason why most of the powers that this organization has acquired must be taken back, another reason to disband the EU in its current form.<br /><br />Just days ago, I translated <a href="https://motls.blogspot.com/2019/05/neff-when-reason-is-pushed.html?m=1">Ondřej Neff's essay</a> which clearly stated that the announcements by the Volkswagen Group that they only want to produce EVs in 2030 or something like that are terrifying, sick, ideologically driven, and directly threatening at least a quarter of the Czech economy. You won't get any support from me for the EVs from Audi. They may produce some, the products may have some good characteristics, they will probably lose money on them, but the idea that this should be supported – and maybe even by the likes of me – is absolutely insane.<br /><br />Gene pretends to be more open-minded and less ideological than the rest of Northern California and maybe he is. But I still find his PC virtue signaling unbearable far too often. He must have understood that I am generally against the expansion of EVs at the moment because the disadvantages clearly outweigh the advantages. Have I been unclear about this elementary point? I don't believe it's possible. So why would Gene assume that I am going to praise the EU for Audi's EVs?
Let me tell you why.<br /><br />He doesn't really believe it but he's one of the promoters of this ideology – and a part of the strategy of such people is to create an atmosphere in which it is "believed" that all people, perhaps including your humble correspondent, support the transition to EVs. He likes to strengthen the perception that the preference for ICEs is an unthinkable heresy, a thought crime – and he personally helps to nurture this atmosphere of non-freedom. I don't support the transition to EVs. Do you need this simple sentence to be translated into many languages? Sensible people who have thought about the issue know that the ICEs are superior at this moment and the EVs are inferior, and if someone is telling you something else, he is not telling the truth.<br /><br />The price that the actual buyer pays for an EV – when the vehicle is bought in the first place – is about twice as high as for an otherwise comparable ICE right now. This is the primary difference, which is enough to conclude that the EVs are simply not competitive with the ICEs now. But even if the progress were much faster in EVs than in ICEs – there's no reason to believe so – and EVs became as cheap as comparable ICEs, ICEs would still have other, secondary but very important, advantages.<br /><br />These advantages of the ICEs, if I include the lower price, are e.g.:<br /><ol><li>the lower price of the vehicle in the first place</li><li>much shorter refuelling times than the charging times of EVs</li><li>an existing dense network of gas stations vs. a minimal number of superchargers</li><li>environmental disadvantages of EVs: toxic elements</li><li>safety issues involving some special processes, e.g. the self-ignition of EVs</li><li>a century of experience with the ICEs showing that there's no time bomb waiting for us</li></ol>This list is far from complete but it's quite a list. The price of the car is clearly a primary variable and the ICEs win 2-to-1 over EVs. The charging times are incomparable.
You spend a few minutes refuelling with petrol or diesel but you need 20-40 minutes to recharge 50-80 percent of a Tesla battery. This difference is huge; I will discuss it later.<br /><br />Now, you can only recharge an EV quickly if you're lucky and there's a supercharger nearby. Are these networks comparable? Czechia gives us a shocking example. We have over 7,000 gas stations and <a href="https://www.tesla.com/en_GB/findus/list/superchargers/Czech%20Republic?redirect=no">3 Tesla superchargers</a> – in Prague, Humpolec, and Olomouc. That's where you recharge the car as "quickly" as in 30 minutes. Outside these places, you find at best overnight chargers where you need to be connected... you know, overnight.<br /><br />Now, will the network of superchargers grow? It will. Will it be fast? Are there good reasons for the growth? There aren't, because the number of EVs is small. So it's clearly too bad an investment to build too many chargers for too few EVs. This is a vicious circle. A century ago, a similar "vicious circle" arguably slowed down the growth of the normal gas stations. But there was a difference. A century ago, ICEs were competing against <em>horses</em>, and cars are more convenient than horses even <em>despite</em> a sparse network of gas stations.<br /><br />Now, the EVs are competing against the ICEs, which are really comparable – it's not a difference similar to the difference between a horse and a car. So the construction of a dense network of superchargers is clearly an investment that will create a financial loss for quite some time. The belief that it's worth doing is just a belief. And it is clearly a belief that is driven by an ideology right now.<br /><br />I mentioned that there are 7,000+ gas stations and 3 Tesla superchargers in my country. The ratio looks huge. But what about the ratio of the cars? In 2018, Czechs bought some 250,000+ new cars, about 30% of them diesel, a drop from 37% in the previous year.
Aside from petrol and diesel, all the other cars are <a href="http://www.hybrid.cz/rok-2018-v-cesku-diesel-se-propada-rekordni-prodeje-elektromobilu-skvele-si-vedou-i-hybridy" rel="nofollow">negligible</a>: 5,000 hybrids, 2,000 CNGs, 1,000 LNGs, and 1,000 purely electric vehicles, including 85 Teslas. In 2018, 0.03% of the cars sold in Czechia were Teslas. The 3 superchargers are about 0.04% of the 7,000 gas stations – so within a factor of two, it's a fair ratio.<br /><br />There is absolutely no reason to think that the EVs will naturally beat the ICEs anytime soon. In particular, the market obviously wants to keep the petrol/diesel gas stations up to 2030 because in 2030, there will still be lots of recently purchased cars on the road – it's normal for many people to keep the same car for a decade.<br /><br />Now, the environmental advantages of ICEs. They produce just H2O (water vapor) and CO2, harmless and beneficial gases. There's some NOx, nitrogen oxides, in the diesel case. This must be compared to the noxious elements that are used in the production of the batteries for EVs, that occasionally burn when a car self-ignites (Hong Kong saw another self-igniting Tesla yesterday) or when a whole EV factory burns (which seems to be a frequent event, too). People don't really know whether it's possible to safely deal with worn-out old lithium batteries.<br /><br />Gene admits that the real pollution from ICEs is much smaller than it used to be – a drop by 97%, using his numbers. But even the world with the high pollution was tolerable. When the pollution drops to 3% of what it used to be, should we still consider the situation unacceptable? I don't think so. That opinion is nothing else than an extremist ideology. Look at the death rates.<br /><br />Every year, some 1.3 million people die in the world as a result of a car accident – some mechanical damage to the body. It's estimated that the NOx emissions may be blamed for 10,000 deaths in the EU per year.
The total for the world is probably below 100,000. Now, is it too high? It's clearly not too high. The deaths blamed on the <em>fuel</em> are less than 10%, and maybe around 5%, of the deaths caused by the vehicles in total. In what sense could we claim that it's too much?<br /><br />Every year, some 55 million people die globally. Those 50,000-100,000 deaths from NOx are between 0.1% and 0.2% of the total. If you eliminated petrol and especially diesel cars, you would reduce the deaths by 0.1%-0.2% or so. Great. Temporarily, of course. After some time, the population would adjust to a higher life expectancy and the same number of people would be dying, just at a higher age than without the reduction of NOx.<br /><br />But imagine that the fraction of deaths translates to a comparable relative reduction of the life expectancy – it's not quite so but it's a good order-of-magnitude estimate. So the NOx emissions from cars may be shortening people's lives by 0.1% or 0.2%. Great. What about the waiting times in front of the superchargers? If you recharge every other day, you waste 30 minutes per 2 days (48 hours) in front of the supercharger. That's about 1% of your time! In effect, this shortens your useful life. And 1% is 5-10 times larger than 0.1% or 0.2%.<br /><br />The result is that the superchargers are robbing you of a greater portion of your life than the NOx car pollution does, on average!<br /><br />Even if CO2 emissions were a problem, and they're not, one may show that in the present real-world conditions, the total CO2 emissions connected with the production and usage of an EV actually exceed those of a diesel car.<br /><br />It's similar with all such comparisons. If you actually compare the variables on both sides fairly, you may see that the ICEs are superior to the EVs. It may change in some time – as the technologies evolve – but the difference is so significant that it's unlikely to change for many years.
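<br /><br />The back-of-the-envelope comparisons above are easy to check with a few lines of arithmetic – a sketch using the rough estimates quoted in this post, not precise statistics:

```python
# Rough estimates quoted in the post above (not precise statistics).
global_deaths_per_year = 55e6      # total human deaths worldwide per year
nox_deaths_per_year = 75_000       # mid-range of the 50,000-100,000 NOx estimate

# Fraction of all deaths attributed to NOx from cars
nox_fraction = nox_deaths_per_year / global_deaths_per_year

# Fraction of your time spent at a supercharger: 30 minutes every 2 days
charging_fraction = 30 / (2 * 24 * 60)

print(f"NOx share of global deaths: {nox_fraction:.2%}")       # ~0.14%
print(f"Time lost to supercharging: {charging_fraction:.2%}")  # ~1.04%
print(f"Charging/NOx ratio:         {charging_fraction / nox_fraction:.1f}x")

# The Czech supercharger-vs-gas-station comparison from the text
gas_stations, superchargers = 7_000, 3
cars_sold_2018, teslas_sold_2018 = 250_000, 85
print(f"Supercharger share of stations: {superchargers / gas_stations:.2%}")
print(f"Tesla share of new cars sold:   {teslas_sold_2018 / cars_sold_2018:.2%}")
```

With these inputs, the time lost at superchargers exceeds the NOx death fraction by roughly the factor of 5-10 claimed in the text.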
But this discussion has been largely hijacked by dishonest ideologues who are close to the environmentalist movement and the deceptive "mainstream" media of the present. Because they have decided to stick to this particular EV agenda mindlessly, they only push memes about the advantages of EVs and the disadvantages of ICEs down their viewers' and readers' throats. Virtually all of this is garbage. People intuitively know it – they subconsciously perform many of these calculations, which makes them keep their ICEs and avoid EVs. But the massaging by the media and their allied ideologues is unbelievable. The percentage of EVs in a given country or state may be considered a very good measure of "how much the population of that territory likes to be brainwashed".<br /><br /><b>Now, advocates of EVs also say that the EVs are simpler, and therefore less likely to break.</b><br /><br />This is another totally demagogical sleight-of-hand. EVs have fewer mechanically moving parts but they have a greater number of "transistors" and other electronic parts. Can those break? You bet. Electric cars depend on lots of software and it can break – and cripple your car – too. It's happening. Functionalities of cars are often broken after a software update. It's completely analogous to the mechanical breaking of an ICE. More importantly, the probability that an engine breaks isn't a simple increasing function of the "number of parts". It depends on which parts, how well they're made, how robust the material is, and other things.<br /><br />In practice, the breaking of the ICEs is not such a problem. Many problems may be fixed. It's been business-as-usual for a century. And we don't really want to assume that the cars serve for more than 20 years or something like that. Cars that are this old look obsolete. They have other disadvantages.
People usually prefer to buy a new car after a shorter time – perhaps 5 years in such cases – and carmakers obviously want this "refreshment" to take place sufficiently often, too. So the "simplicity advantage" of the EVs only exists under assumptions that are utterly unrealistic.<br /><br />Even more conceptually, simplicity is heavily overrated. I have also often said that I preferred things to be simple. But I saw others saying similar things – and saw that their reasons for saying such things are totally bad. In most cases, people say "they prefer simple things" because they're lazy or intellectually limited. They want "things" to be simpler because harder things mean extra work for them and they don't like it! It's that simple. My explanation is actually <em>simple</em>, which is why you should appreciate it! It's also true. That's why schoolkids prefer a simple homework, for example. There may exist legitimate justifications of "simplicity" but they're rare.<br /><br />But does it mean that "simple" is "superior" in general? Not at all. The schoolkids and their adult counterparts are doing some work. And if the work is "simple", it probably means that they didn't do too much work, and that's "bad" for the client or buyer. The buyer has a completely different perspective than the producer. If something is simple, it should often be expected to be cheap and unremarkable because not much work has been done! There is almost nothing inside the Tesla Model 3's interior, which is why it should be an extremely cheap car. An extensive essay should be written about simplicity in fundamental physics – which is a sufficiently different topic from simplicity in engineering. We prefer things as simple as possible, but not simpler, as Einstein wisely said. Again, the laymen usually want things to be simpler than possible and that's too bad.<br /><br />This "simplicity" has been added to the preferred buzzwords of the Luddite movement, too.
"Simple" things are supposed to be preferred. That may include organic food. But much of this "simple" stuff is the same as the "cheap stuff before the technological advances reshaped the industry". So "simplicity" often directly contradicts "technological progress"! It's not a shame for an engineer to design complex engines. If someone denies that this is really the bulk of the work of every engineer, then this someone is a Luddite who fights against technological progress in general. And even complex engines may be made more reliable, more resilient, and more lasting. The "complexity of an engine" isn't an insurmountable, lethal flaw.<br /><br />An ICE has a lot of parts, especially if it has various gadgets to reduce the emissions of various compounds or particles. But that doesn't mean that it's bad. Complex engines are the standard product of engineering. Engineering also wants to keep things simple <em>if all other things are equal</em>. But the "if" condition isn't obeyed here – it is almost never obeyed. Things aren't equal. You can't compare things that aren't commensurable. And the numbers of parts in an EV and an ICE are not commensurable. To achieve certain things, a certain degree of complexity is often needed – EVs and ICEs mainly have a different <em>kind</em> of complexity, not a different amount. So we just don't want things to be too simple. In the end, a car or a phone should have "many functionalities" and some complexity is necessary for that. In the 1980s, I was surely happy that the watch I received from my Australian uncle had a calculator, a stopwatch, and many other functions. Whoever thinks that a small number of functions is a universal advantage is simply a Luddite.<br /><br />Now, Gene and others say that "the market will decide". But sadly, that's not what is happening today.
Liars and charlatans in the media and unelected EU officials who are actually controlled by brain-dead members of various NGOs and other pig farms owned by the likes of George Soros are determining whether companies – perhaps, by 2030, including Škoda Auto – will be allowed to produce proper cars that the consumers actually want at all, and whether the buyers will be "allowed" to buy the cars of the kind they prefer. It's too bad.<br /><br />In 2019, EVs are a niche market and every argument building on the assumption that the EVs are as important as or more important than the ICEs is just self-evidently fraudulent. If it is allowed to speak, the market will speak, but to some extent, it has already spoken, too. Both EVs and ICEs have been around for more than a century but ICEs became and remained dominant. Given the political atmosphere and the amount of lies and illegitimate pressure that we see everywhere around, it seems very likely that a hypothetical suppression of ICEs and proliferation of EVs would be explained by the emerging totalitarianism, not by natural and legitimate market forces.<br /><br /><hr /><br /><b>Pheno papers on \(96\GeV\) Higgs, trilepton excess, and \(60\GeV\) dark matter</b> (2019-05-10)<br /><br />I want to mention two new hep-ph papers about supersymmetry-like anomalies seen by the accelerators. In the paper<br /><blockquote><a href="https://arxiv.org/abs/1905.03280">An N2HDM Solution for the possible \(96\GeV\) Excess</a>,<br /></blockquote>B+C+Heinemeyer discuss some detailed models for the apparent weak signals indicating a <a href="https://motls.blogspot.com/2017/09/cms-locally-28-sigma-diphoton-excess-at.html?m=1">new Higgs boson of mass around \(96\GeV\)</a>.
Recall that the only well-established Higgs boson has the mass of \(125\GeV\).<br /><br />Concerning the \(96\GeV\) little brother, CMS has seen an excess in the diphoton channel; and decades ago, LEP saw an excess in the bottom quark pair channel. Heinemeyer and friends say that these excesses may be explained by a two-Higgs model with an extra Higgs singlet. Is that surprising at all? There seems to be a lot of freedom to accommodate two independent excesses, right?<br /><br />At any rate, concerning supersymmetric models, the NMSSM – next-to-minimal supersymmetric standard model – and its extension, the µνSSM, seem like aesthetically pleasing completions of the two-Higgs-plus-a-singlet models. In the model with the two Greek letters, the singlet is interpreted as a right-handed neutrino superfield and the seesaw mechanism is incorporated. These models look OK for the excesses – there are other reasons to prefer the NMSSM over the MSSM. But they're also less constrained and predictive than the MSSM, so I don't think the good news amounts to a remarkable victory.<a name='more'></a><br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:block; text-align:center;" data-ad-layout="in-article" data-ad-format="fluid" data-ad-client="ca-pub-8768832575723394" data-ad-slot="4218709518"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />Another paper on the excesses is<br /><blockquote><a href="https://arxiv.org/abs/1905.03768">The Return of the WIMP: Missing Energy Signals and the Galactic Center Excess</a><br /></blockquote>by Carena+Osborne+Shah+Wagner. 
They promote a model with dark matter of mass \(m_\chi = 60\GeV\), justified by several anomalies that exist out there.<br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:inline-block;width:336px;height:280px" data-ad-client="ca-pub-8768832575723394" data-ad-slot="0363397257"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />The dark matter of that mass would be the lightest neutralino. It could naturally agree with the 3-sigma <a href="https://arxiv.org/abs/1806.02293">trilepton ATLAS excess</a> (and a confirmation by GAMBIT), the gamma <a href="https://motls.blogspot.com/2013/03/bubbles-support-10-gev-or-50-gev-dark.html?m=1">ray excess at the center of our galaxy</a> seen by Fermi-LAT, as well as the <a href="https://motls.blogspot.com/2016/12/sam-ting-claims-that-1-tev-wimp-is-only.html?m=1">antiproton excess observed by AMS-02</a>.<br /><br />In their model, the LSP is a bino-like neutralino and another, wino-like neutralino should exist with a mass of \(160\GeV\). \(\tan\beta\) should be greater than ten. This paper may be viewed as a counter-argument against the recent efforts to claim that the central galactic gamma-ray excess was "due to some boring pulsars" only.<br /><br />At any rate, dark matter of mass \(60\GeV\) within supersymmetry is still plausible and somewhat recommended by some observations, much like the NMSSM-like new Higgs of mass \(96\GeV\). 
I can't tell you the probability that these particles exist – it depends on lots of priors and methodology – but I am sure that it is just wrong and prejudiced to behave as if these probabilities were zero.Luboš Motlhttp://www.blogger.com/profile/17487263983247488359noreply@blogger.com0tag:blogger.com,1999:blog-8666091.post-26722179952270949442019-05-07T06:52:00.001+02:002019-05-07T08:19:13.542+02:00Carroll's interview with SusskindOn his Mindscape Podcast (<a href="http://seancarroll.libsyn.com/rss" rel="nofollow">RSS subscribe URL</a>), Sean Carroll published an unusually good 74-minute-long interview with Leonard Susskind:<br /><blockquote><a href="https://www.preposterousuniverse.com/podcast/2019/05/06/episode-45-leonard-susskind-on-quantum-information-quantum-gravity-and-holography/">Episode 45: Leonard Susskind on Quantum Information, Quantum Gravity, and Holography</a> (audio)<br /><br /><audio class="wp-audio-shortcode" id="audio-2327-1" preload="none" style="width: 100%;" controls="controls"><source type="audio/mpeg" src="https://chtbl.com/track/15E2/traffic.libsyn.com/seancarroll/leonard-susskind2.mp3?_=1" /><a href="https://chtbl.com/track/15E2/traffic.libsyn.com/seancarroll/leonard-susskind2.mp3">https://chtbl.com/track/15E2/traffic.libsyn.com/seancarroll/leonard-susskind2.mp3</a></audio><br /><br /></blockquote>Both men are very good speakers and in this case, especially because he has avoided words like "many worlds" (he preferred "agnostic"), "Donald", and others, I could have subscribed to nearly 100% of Susskind's statements. <br /><br /><img src="https://upload.wikimedia.org/wikipedia/commons/e/e0/Leonard_Susskind_at_Stanford.jpg" width=407><br /><br />Susskind was introduced as a visionary, storyteller, mentor, a co-father of string theory who has done a lot in QFT, a popularizer etc. 
He prefers to call himself "a theoretical physicist" rather than a "string theorist" because it gives him more freedom to jump around, to be researching anything he wants, and to be bullšiting about anything he wants (the B-verb actually <em>is</em> Susskind's favorite word, but you can't know it if you don't know him in person and if you're not a TRF reader).<a name='more'></a><br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:block; text-align:center;" data-ad-layout="in-article" data-ad-format="fluid" data-ad-client="ca-pub-8768832575723394" data-ad-slot="4218709518"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />Concerning the value of string theory, Susskind primarily said that it has given us extremely accurate models which contain quantum mechanics, gravity, electromagnetism, particles, bosons, fermions etc. – so we know for sure that these properties of the Universe are compatible with each other, which hadn't been clear at the beginning, and it's a big deal even if we're not sure it's the theory of the real Universe around us.<br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:inline-block;width:336px;height:280px" data-ad-client="ca-pub-8768832575723394" data-ad-slot="0363397257"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />Susskind is also asked to sketch the information loss paradox. The information is preserved, quantum mechanics won. Carroll asked Susskind about his love for an alternative picture – the time becomes a rope that chokes itself inside the black hole, which causes an abortion and the aborted baby universe makes everything fine or something like that – and Susskind said No. But this is a great alternative theory by philosopher Tim Maudlin, don't you know this genius? 
Good for Lenny, he doesn't know Maudlin. You would know dozens of such wannabe physics revolutionaries if you spent more time with the Internet, Lenny! <br /><br /><iframe align="left" scrolling="no" frameborder="0" style="width:140px;height:245px;" marginheight="0" src="//ws-na.amazon-adsystem.com/widgets/q?ServiceVersion=20070822&OneJS=1&Operation=GetAdHtml&MarketPlace=US&source=ac&ref=tf_til&ad_type=product_link&tracking_id=lubosmotlsref-20&marketplace=amazon®ion=US&placement=0000000000&asins=0465093345&show_border=false&link_opens_in_new_window=false&price_color=BBBBBB&title_color=FFAA44&bg_color=002211" marginwidth="0"/></iframe>The black hole evaporates at the end so every "natural" time slicing "after" the evaporation agrees that nothing is left from the black hole whatsoever, except for ordinary particles from the Hawking radiation. The baby time inside the black hole... it's just some random talk that could have been widespread in the 1970s but there is absolutely no mathematical backing for these philosophical words. Science went elsewhere, Carroll's apparent defense of Maudlin's nonsensical story notwithstanding.<br /><br />Susskind discussed the no-cloning theorem in quantum mechanics: quantum information cannot be duplicated. He had thought he discovered the no-xerox theorem but it turned out that he only discovered the new corporate branding term for the previously known no-cloning theorem. The duplication is a problem, conflicts with linearity of QM (duplication is a quadratic map, I add), and there was a discussion whether the duplication "as such" has to be banned or whether any <em>detection of duplication</em> has to be banned.<br /><br />In this discussion, like in most others, Susskind was the wiser one and pointed out that it's the latter. A potential paradox may make you nervous but it only becomes a real paradox if it can actually be measured. 
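By the way, the clash between duplication and linearity may be made explicit in one line – this is just the standard textbook argument, not anything specific to Susskind's no-xerox discussion. A cloning machine would have to be a unitary \(U\) obeying \(U|\psi\rangle|0\rangle=|\psi\rangle|\psi\rangle\) for <em>every</em> state \(|\psi\rangle\). Linearity applied to a superposition gives\[ U\left(a|0\rangle+b|1\rangle\right)|0\rangle = a|00\rangle+b|11\rangle \]while a genuine copy would have to be\[ \left(a|0\rangle+b|1\rangle\right)\otimes\left(a|0\rangle+b|1\rangle\right)=a^2|00\rangle+ab|01\rangle+ab|10\rangle+b^2|11\rangle. \]The two expressions agree only if \(ab=0\): the copy depends quadratically on the amplitudes while the evolution must be linear, so no \(U\) can clone all states.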
Needless to say, Carroll – the "realist" who still thinks that classical physics is right (not that Susskind may always be said to avoid this basic mistake, but he does avoid the anti-quantum zeal in his positive stories here) – disliked all these comments. Carroll also made some totally wrong statements about Heisenberg's uncertainty principle such as "neither the position nor the momentum of a particle exist, only the wave function does". No, it's the other way around. Physically, only observables – such as the position and momentum – really exist because they're "observable", that's why they are called in this way, while the wave function is <em>not</em> an observable, and it is not observable without "an", either.<br /><br /><iframe align="left" scrolling="no" frameborder="0" style="width:140px;height:245px;" marginheight="0" src="//ws-na.amazon-adsystem.com/widgets/q?ServiceVersion=20070822&OneJS=1&Operation=GetAdHtml&MarketPlace=US&source=ac&ref=tf_til&ad_type=product_link&tracking_id=lubosmotlsref-20&marketplace=amazon®ion=US&placement=0000000000&asins=0465062903&show_border=false&link_opens_in_new_window=false&price_color=BBBBBB&title_color=FFAA44&bg_color=002211" marginwidth="0"/></iframe>OK, Lenny's careful formulation is that any attempt to observe both \(x,p\) – and similarly any attempt to observe the duplication – will always get frustrated. Exactly. If you can measure what's inside, you can't measure what's outside, and vice versa. Well, it's not just analogous to the uncertainty principle for \(x,p\), it's really a special case. 
The relevant operators inside and outside refuse to commute exactly, contrary to what locality would naively predict.<br /><br />For example, if you collect enough information from the Hawking radiation, and you want to fall into the black hole and see the same information inside once again, you will fail because the "collection" requires a certain time and a hard calculation shows that, somewhat surprisingly, after that time, the original matter is already destroyed at the singularity. So the physical dynamics doesn't necessarily avoid the paradox immediately, by some urgent, immediate, huge prevention policies, but the physical dynamics always prevents the paradox, sometimes "barely" or "at the last moment". This is enough and quantum gravity or string theory <em>loves</em> to save the consistency "at the last moment".<br /><br /><iframe align="left" scrolling="no" frameborder="0" style="width:140px;height:245px;" marginheight="0" src="//ws-na.amazon-adsystem.com/widgets/q?ServiceVersion=20070822&OneJS=1&Operation=GetAdHtml&MarketPlace=US&source=ac&ref=tf_til&ad_type=product_link&tracking_id=lubosmotlsref-20&marketplace=amazon&region=US&placement=0000000000&asins=0465075681&show_border=false&link_opens_in_new_window=false&price_color=BBBBBB&title_color=FFAA44&bg_color=002211" marginwidth="0"/></iframe>This time, thankfully, Susskind calls himself an agnostic on interpretations of quantum mechanics – "uncertain whether there is a problem", a phrase he copied from his friend Feynman. QM will always work. But he can't get rid of some confusion about the relationship between QM and reality. Quantum gravity/cosmology will probably affect the foundations, he thinks. Possible, indeed. It won't happen in Susskind's lifetime, he thinks. 
Carroll promoted some "work" of his on the anti-quantum zeal; Susskind politely avoided commenting on that even though some of Susskind's papers could be claimed to be in nearly the same category.<br /><br />Susskind is asked why someone would think something as crazy as the holographic principle (independently found by Susskind and 't Hooft). There are entropy bounds. Volume and information are proportional for a while but if you put too many memory chips in a volume, they collapse into a black hole that is larger than the originally reserved volume. Some insights from the 1970s are recalled. Another level above the entropy bounds is a hypothetical theory that lives on the boundary. 't Hooft's paper looked worth ignoring to most physicists because he had used the term "dimensional reduction" incorrectly, Susskind recalls. Susskind chose a better word in the title, "hologram", and physicists started to understand stuff. Witten liked the principle but Juan Maldacena finally turned holography into an almost clearcut rigorous construction and a tool. Susskind claims that Maldacena had not known Susskind's and 't Hooft's papers on holography. Note that for Maldacena, AdS/CFT was a ramification of his work on the Strominger-Vafa-like black hole entropy in string theory. The acronym AdS/CFT is expanded with some history about de Sitter, which Susskind clearly doesn't know too well, and neither do I. AdS has negative energy density. 
Witten, familiar with Susskind's and Maldacena's papers, introduced the term "holography" to AdS/CFT (along with a whole machinery to compute the correlation functions but Carroll and Susskind skip such details LOL).<br /><br /><iframe align="left" scrolling="no" frameborder="0" style="width:140px;height:245px;" marginheight="0" src="//ws-na.amazon-adsystem.com/widgets/q?ServiceVersion=20070822&OneJS=1&Operation=GetAdHtml&MarketPlace=US&source=ac&ref=tf_til&ad_type=product_link&tracking_id=lubosmotlsref-20&marketplace=amazon&region=US&placement=0000000000&asins=0316016411&show_border=false&link_opens_in_new_window=false&price_color=BBBBBB&title_color=FFAA44&bg_color=002211" marginwidth="0"/></iframe>Susskind admitted we still don't have an idea how to build a holographic theory for a generic "finite region". I find it plausible that no such nice theory exists. In particular, I think one can almost prove that such a theory on a generic finite surface couldn't be local. In AdS/CFT, the boundary CFTs are local because of the infinite warp factor – it seems like a necessary condition for locality.<br /><br />"What do we learn from the weird construction?" Carroll asks. He may be playing Devil's advocate but it looks like he is serious. "What do we ever learn?" Susskind replies. Susskind then clarifies that theories aren't always obtained by quantizing a classical starting point, and this "quantization of classical" is dysfunctional for gravity. Maybe the difficulty is because gravity and QM are worlds apart. Susskind thinks that the resolution is the opposite one: gravity and QM are too close and the problems in "quantizing GR" come from the efforts to <em>separate</em> these almost identical things. Susskind described a nice experiment showing the inseparability of gravity – it could be done as a quantum computer simulation. He talked about general simulations of superconductors on quantum computers and other things. 
And about quantum error correction, something that wasn't needed in classical computers due to the resiliency of classical bits. Cutely, black hole research has led to advances in the real practical error correction industry. His only example is ER-EPR.<br /><br />Now, Carroll repeated some weird doubts about holography. How much is holography relevant for the tables around us, beyond just "crossing our fingers"? I think it's hard to answer such vague questions whose only comprehensible content is hostility. Susskind didn't lose his nerve, as I could have ;-), and repeated the basic point. We have found a theory that has QM, GR, particles, all the basic qualitative things, and within its AdS spacetimes, AdS/CFT is true. Does it prove that holography is always true, especially in the real world? Strictly speaking, no. But it excludes the assumption that one could have – and Carroll probably has it – that holography is ludicrous or impossible. There's no question now that holography may happen, especially in AdS spaces. In dS spaces, we're less confident.<br /><br />The interview was recorded in the Google X buildings because they have been consultants. Susskind discusses how he learned some complexity theory in the computer science sense – the absolute possible minimum number of steps from A to B. Susskind celebrates the genuine collaboration between computation guys and theoretical physicists. Good ideas often penetrate in many directions and that's how you know that there's something strong about them. Examples involving condensed matter physics, fluids, and black holes follow. The SYK model was invented by Sachdev, a Harvard CM physicist. Great but Subir is also a de facto string theorist now so this example of multiculturalism isn't "too" multicultural. OK, at any rate, SYK stands for a condensed matter physicist, an information physicist... 
and it's used in black hole research.<br /><br /><a href="https://lh5.ggpht.com/lubos.motl/SLzWAIwIlpI/AAAAAAAAA5I/l6c7FCXx4Jk/dienes-lennek-9-landscapes.jpg?imgmax=1600" rel="nofollow"><img src="https://lh5.ggpht.com/lubos.motl/SLzWAIwIlpI/AAAAAAAAA5I/l6c7FCXx4Jk/dienes-lennek-9-landscapes.jpg?imgmax=400"></a><br /><br />Ten minutes before the end, Susskind says that the multiverse (a package of many patches, like jungle, savanna etc.) is still the best picture for the biggest questions and for fine-tuning in cosmology. He always asks for better things and the answers come out empty. It may turn out to be wrong but Susskind doesn't see how. Also, we know that the Universe is bigger than the visible portion – just like we know it for the Earth. Both are very flat. CNN may write about the Stanford professor who believes in a very "Flat Earth". ;-)<br /><br /><iframe align="left" scrolling="no" frameborder="0" style="width:140px;height:245px;" marginheight="0" src="//ws-na.amazon-adsystem.com/widgets/q?ServiceVersion=20070822&OneJS=1&Operation=GetAdHtml&MarketPlace=US&source=ac&ref=tf_til&ad_type=product_link&tracking_id=lubosmotlsref-20&marketplace=amazon&region=US&placement=0000000000&asins=B000SEOB2Q&show_border=false&link_opens_in_new_window=false&price_color=BBBBBB&title_color=FFAA44&bg_color=002211" marginwidth="0"/></iframe>Susskind doesn't want to predict the next revolution in physics but thinks that strings, qubits etc. will have to address the cosmological questions. The proper relationship is unknown.<br /><br />Carroll asks why Susskind did the popularization, such as the popular books. It was done for Susskind's father who was a plumber with a simple education – but with other plumbers in the Bronx, these rough men were intellectuals. They talked about everything, history, science – a nice mixture of intellectualism and crackpotism. And Susskind obviously believes that the crackpotism was due to the lack of access to science literature. 
Susskind believes that he has taught some science to his father. Well, I don't believe these things much. People's crackpotism is <em>not primarily</em> due to bad access to scientific research. OK, some people in Palo Alto were already annoyed by Susskind's SciAm-level talk, and wanted real physics with equations, so he did some. That's where the Theoretical Minimum came from.<br /><br />Right, it's important to explain science at a decent level. But I think that there's something more fundamental than some detailed equations that decides people's scientific approach. It's some understanding of why and when the equations should be taken seriously at all. I think that most people who are exposed to equations don't connect them with reality because they misunderstand something more conceptual about how science works – so they end up lost in mathematics. Some hardwired irrational disbelief that "equations may not or should not really be decisive" is the main stumbling block that keeps otherwise capable people's thinking about the physical world unscientific.Luboš Motlhttp://www.blogger.com/profile/17487263983247488359noreply@blogger.com0tag:blogger.com,1999:blog-8666091.post-41124065833615378482019-05-06T17:56:00.001+02:002019-05-06T18:25:55.320+02:00Axion weak gravity conjecture passes an extreme Calabi-Yau testThe first hep-th paper today was posted 1 second after the new business day at arXiv.org started, indicating that Grimm and van de Heisteeg (Utrecht) really think that people should read their paper:<br /><blockquote><a href="https://arxiv.org/abs/1905.00901">Infinite Distances and the Axion Weak Gravity Conjecture</a><br /></blockquote>The first thing I needed to clarify was "what is the exact form of the 'axion weak gravity conjecture'" that they are using. There must surely be a standalone paper that formulates this variation of our conjecture. And oops, the relevant paper was <a href="https://arxiv.org/abs/hep-th/0601001">[4] AMNV</a>. 
I have already heard the M-name somewhere.<br /><br /><iframe width="407" height="277" src="https://www.youtube.com/embed/FfHyVVrBjWU" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe><br /><br />Yes, of course I knew the main point we wrote about the "axion weak gravity conjecture". That point – discussed in a paper by <a href="https://arxiv.org/abs/hep-th/0303252">Banks, Dine, Fox, and Gorbatov</a> (and in some lore I could have heard from Tom many years earlier, unless I told him) – had largely stimulated the research into the "normal" weak gravity conjecture itself.<br /><br />The conjecture says that the decay constant of an axion shouldn't be too high – in fact, its product with the action of the relevant instanton is smaller than one in Planck units. This is a generalization of the "normal" weak gravity conjecture because the instanton is a lower-dimensional generalization of the charged massive point-like particles (higher-dimensional ones exist as well) and its action is a generalization of the mass/tension of the objects.<a name='more'></a><br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:block; text-align:center;" data-ad-layout="in-article" data-ad-format="fluid" data-ad-client="ca-pub-8768832575723394" data-ad-slot="4218709518"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />Our claim implies – much like the previous formulation by Banks et al. – that either the decay constant or the instanton action or both have to be small. 
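Stated as a formula – with the usual \(O(1)\) sloppiness of such conjectures – for an axion with decay constant \(f\) whose potential is generated by an instanton of action \(S\), the condition reads\[ f\cdot S \lesssim M_{Pl}. \]So in Planck units, a trans-Planckian decay constant \(f\gg 1\) forces \(S\lesssim 1/f\ll 1\), which means that the instanton corrections to the potential, scaling like \(e^{-S}\cos(\phi/f)\), cannot be parametrically suppressed at the same time.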
And this condition has a nice implication: quantum gravity doesn't want to allow you to emulate flat potentials too closely, unless they're exactly flat, so the axion "wants" to be visible either because its decay constant is low or because the instanton corrections to its potential are sufficiently wiggly.<br /><br />This is one of the particular insights that indicate that string theory's predictivity always remains nonzero – string theory doesn't want you to approximate the effective field theory of one vacuum by another vacuum too closely.<br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:inline-block;width:336px;height:280px" data-ad-client="ca-pub-8768832575723394" data-ad-slot="0363397257"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />In the older Banks et al. formulation, the "axion weak gravity conjecture" was considered bad news because it indicated that some natural attempts to construct natural inflation were actually forbidden in quantum gravity.<br /><br />Fine, now the two Dutchpersons look at a sufficiently wide and rich class of string compactifications to test the "axion weak gravity conjecture" – at type IIA string theory vacua on Calabi-Yau compactifications. Note that type IIB has the "point-like in spacetime" instanton, the D(-1)-instanton, and similarly all the other odd ones. The Dutch paper looks at type IIA so they need to look at the even D-brane instantons.<br /><br /><img src="https://lh3.ggpht.com/lubos.motl/SNzHOSzB_JI/AAAAAAAABCo/hQHOSzwUfcY/animated-quartic.gif" width=407><br /><br />OK, the "generic" Calabi-Yau has everything of order one. To make the decay constants and instanton actions parametrically large or small, so that you may study whether some inequalities are parametrically obeyed or violated, they need to study extreme shapes of Calabi-Yaus. 
They look at extreme corners of the complex structure moduli space. The analysis of these "extreme directions" is somewhat analogous to my and Banks' <a href="https://motls.blogspot.com/2009/02/dualities-vs-singularities.html?m=1">dualities vs singularities</a>.<br /><br />And indeed, for every extreme direction in the Calabi-Yau complex structure moduli space, they find a tower of D2-brane instantons that is predicted by the "axion weak gravity conjecture" – with the parametrically correct actions. That's quite a nice test of the conjecture. Curiously enough, to argue that the instantons exist, they need to use another swampland conjecture, the "swampland distance conjecture". Because the weak gravity conjectures should be counted as "swampland conjectures", they use one swampland conjecture to complete the partial proof of another one. I guess that a "swampland skeptic" could remain skeptical and call the proof circular.<br /><br /><iframe width="407" height="277" src="https://www.youtube.com/embed/MXXRHpVed3M" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe><br /><br /><em>OK, Vengaboys are Dutch, too.</em><br /><br />At any rate, the "axion weak gravity conjecture" has passed a test (at least assuming that other conjectures hold) and it looks like a nontrivial test because the limits in the space of shapes of a Calabi-Yau aren't quite simple. The authors of the weak gravity conjectures arguably weren't idiots, it seems once again. The situation is really provocative because the weak gravity conjectures may be motivated and formulated rather easily and have a "philosophical beauty, naturalness, and coherence" which are very important in theoretical physics. 
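For readers who don't remember it, the "swampland distance conjecture" roughly says – again with undetermined \(O(1)\) constants – that if you move by a geodesic distance \(d\) in the moduli space, a tower of states becomes exponentially light,\[ m(d)\sim m_0\, e^{-\alpha d},\qquad \alpha\sim O(1), \]in Planck units. In the instanton context, it is the tower of instanton actions that is analogously controlled by the distance towards the extreme corner of the moduli space – which is why the conjecture may guarantee the existence of the instanton towers they need.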
<br /><br />On the other hand, the proofs are partial, context-dependent, and very technical.<br /><br />Can't there be a universal proof of the "weak gravity conjecture(s)" that really unifies and clarifies all the partial proofs and that is as straightforward as the proof of Heisenberg's inequality\[ \Delta x \cdot \Delta p \geq \frac{\hbar}{2} \] or the generalized uncertainty principle inequalities? And don't these weak gravity conjectures have some direct far-reaching philosophical consequences for quantum gravity – much like the uncertainty principle basically implies that probabilities must be predicted relative to an observer and from complex amplitudes?<br /><br />Well, let me give you another, more detailed hint about what you need to do to make a breakthrough analogous to the quantum mechanical one. In quantum mechanics, you first needed to realize that \(x,p\) from the inequality should be replaced with Hermitian operators. Here, we are talking about the values of parameters in <em>effective actions</em> of quantum gravity. So these parameters that enter the WGC-like conjectures must correspond to <em>some objects</em>, let's call them <em>prdelators</em> because they're like operators but probably not quite, constructed within the full theory of quantum gravity or string/M-theory (which is more abstract than just an effective field theory). Your main task is to figure out what a "prdelator" is and why it has the property analogous to noncommutativity that is responsible for the swampland inequalities. 
And Czech readers must be warned that their partial understanding could be illusory.Luboš Motlhttp://www.blogger.com/profile/17487263983247488359noreply@blogger.com0tag:blogger.com,1999:blog-8666091.post-75611575873680162822019-05-04T10:14:00.000+02:002019-05-04T16:50:17.312+02:00Farmelo's interview with WittenLast year, physicists' (especially Dirac's) biographer <a href="https://grahamfarmelo.com/the-universe-speaks-in-numbers-interview-5/">Graham Farmelo interviewed Edward Witten</a>. (Hat tip: John Preskill, Twitter.) If you have 27 spare minutes, it's here.<br /><br /><audio class="wp-audio-shortcode" id="audio-2327-1" preload="none" style="width: 100%;" controls="controls"><source type="audio/mpeg" src="https://grahamfarmelo.com/wp-content/uploads/2019/04/USIN-Pod_Ep5_EdwardWitten_Final.mp3?_=1" /><a href="https://grahamfarmelo.com/wp-content/uploads/2019/04/USIN-Pod_Ep5_EdwardWitten_Final.mp3">https://grahamfarmelo.com/wp-content/uploads/2019/04/USIN-Pod_Ep5_EdwardWitten_Final.mp3</a></audio><br /><br />Farmelo speaks like an excellent host – the framing, background music, and intonation seem professional for someone who is mostly known as a writer. OK, Witten was relaxed and said he was interested in astronomy as a kid. Many kids were – there were astronauts and other things at that time. Witten mastered calculus at the age of ten or eleven (depending on the type IIA coupling constant – and yes, he is an M-theory guy with a high coupling LOL), it's a bit later than your humble correspondent, but OK. 
He couldn't quite hide that his mathematician father had something to do with this mathematical exposure.<br /><br /><img src="https://www.nsf.gov/news/special_reports/medalofscience50/images/witten_1.jpg"><br /><br />He was interested in other things, worked on a failed Democrat Party candidate's presidential campaign (the victorious president above brought more smiles to both men!), and realized physics was his cup of tea after the age of 20.<a name='more'></a><br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:block; text-align:center;" data-ad-layout="in-article" data-ad-format="fluid" data-ad-client="ca-pub-8768832575723394" data-ad-slot="4218709518"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />I was a bit surprised to hear that his first professional-level exposure was to physics, not mathematics. Intuitively, Witten just looks like a man whose formative years were shaped by mathematics, not physics, but it's apparently an illusion. He was only dragged into deep mathematical aspects and mathematicians' mathematics later.<br /><br />(OK, I verified with Wikipedia and I insist that I am not completely wrong. After a stint in economics graduate school, Witten joined as an applied mathematics graduate student, and switched to physics only later, earning his PhD under Gross in 1976. Applied mathematics grad school, father-mathematician... I just don't buy that it's completely wrong to say that Witten the kid was shaped as a mathematician.)<br /><br />Witten talked about the fast pace of experimental discoveries a few decades ago. When he became a graduate student, the pace slowed down considerably. So I guess he rationally figured out that he had to work more theoretically to optimize the output. 
<br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:inline-block;width:336px;height:280px" data-ad-client="ca-pub-8768832575723394" data-ad-slot="0363397257"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />Witten mentioned some interactions with Steven Weinberg – who had some fun while explaining current algebras to the current algebra infidels that included Witten. Sidney Coleman was important for Witten – the only guy who was really interested in strongly coupled QFTs (and instantons etc.). OK, Witten was probably the second guy. Michael Atiyah, Saša Polyakov, and Albert Schwartz are mentioned in connection with Witten's interest in instantons. (Georgi and Glashow are mentioned as weak-scale phenomenologists.)<br /><br />That's where Witten probably got his mathematical edge – he learned sheaf cohomology groups and stuff like that. "Sheaf cohomology group" sounds like more hardcore mathematics than your favorite hardcore server but for Witten, it was apparently a matter of "learning about the physical topic of instantons just a bit more deeply than the average physicist". Index theorems got into the game, Witten did some influential mathematics work there. Some of Polyakov's instanton program hasn't worked for physics as expected but it turned out interesting in other, more mathematical contexts.<br /><br />Witten finally learned some Morse Theory while in a swimming pool in Aspen, Colorado. Again, Witten does his best to deny that he had any sort of mathematician's thinking. He was like other physicists who heard "Morse Theory" (or what it's good for) for the first time etc. He crisply explains the theory – as the counting of maxima, minima, and saddle points whose combination is determined by a topological invariant of the domain.<br /><br />Some basics of string theory and the 1984 First Superstring Revolution are discussed. 
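To spell out the counting that Witten crisply explained: for a generic ("Morse") function on a compact manifold \(M\), the numbers \(C_k\) of critical points with \(k\) unstable directions obey the Morse inequalities \(C_k\geq b_k(M)\) and the exact constraint\[ \sum_k (-1)^k C_k = \chi(M), \]where \(b_k\) are the Betti numbers and \(\chi\) is the Euler characteristic – the topological invariant of the domain in question. On a two-torus, for example, where \(\chi=0\) and \(b_k=1,2,1\), any such function must have at least one minimum, two saddle points, and one maximum.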
The anomaly cancellation needed some mathematics that was known to mathematicians (and to a limited extent, to Penrose and perhaps a few others) but not to physics graduate students. String theory has obviously made difficult mathematics more important.<br /><br />The harmony between mathematics and physics is a "fact of life". He doesn't know what an "explanation of this harmony" would sound like, and neither do I. ;-) The very distinction between mathematics and physics depends on some slightly contrived definitions which introduce a conceptual boundary – and the relationship between them really means that you may remove the boundary that you just artificially added. What a big deal. ;-)<br /><br />At 17:00, Farmelo uses the term "string framework". For years, the classification as "framework" has actually been favored over "theory" by some people such as David Gross. The string framework is an "alternative" to the quantum field theory framework. Nice but there is still a difference: the QFT framework is composed of many theories/models while string theory really <em>is</em> a single theory. So the two "frameworks" aren't analogous in most key respects. <br /><br />Like your humble correspondent, Witten says that string/M-theory is the <em>only interesting</em> direction to go beyond the well-established framework of quantum field theory. A year ago or two, an article in The Quanta Magazine forced Witten to recant and an anti-string activist was able to force a deceitful edit of Witten's phrase to say that "string theory is one among many". But "it is revolving, anyway", Witten somewhat bravely reiterated, and string theory is indeed the only game in town in the absence of activists in the IAS dining hall or on the sofa in Witten's office (he was sitting at the point (0,0,0,0)).<br /><br />So Witten got some small positive points for bravery on this point. 
But if it remains in isolation, and if he's not facing those who find "it's the only game in town" politically incorrect, the points are really small. You have to try harder, Prof Witten, to get rid of the relatively cowardly image. OK, here he corrected Farmelo's statement about "other routes". "There aren't any other routes," Witten informed Farmelo. And "loop quantum gravity"? It's just words. Exactly. String theory is the only interesting way to go forward. Loop quantum gravity is a triplet of words used for some spin networks – and these spin networks rather clearly have nothing to do with quantum or any gravity, which is why the triplet of words is misleading.<br /><br />(Similarly, "asymptotically safe gravity" is just words. They could mean something but they surely don't mean a theory – not even a sketch of a theory – that has some well-defined rules and that has, according to some available evidence, a significant potential to solve some actual problems counted as "quantum gravity". The phrase "asymptotically safe gravity" represents some scale-invariant local QFT that is gravitational at the same time. Because of holography and the black hole spectrum etc., it's unlikely that quantum gravity could be equivalent to a local or even scale-invariant quantum field theory in the bulk spacetime. I am pretty sure that Witten agrees with all these claims of mine about all the "alleged alternatives" to string theory.)<br /><br />Concerning the absence of post-Higgs LHC discoveries, Witten recalls the widespread belief that the Higgs physics would have come together with new physics that fixes the Higgs scale – as a very low scale in comparison with the Planck scale. The absence is "extremely shocking" to him. Well, I would surely use less dramatic words – the fine-tuning may still be 1-in-100 and I don't find events with a 1% probability "extremely shocking". For him, a huge surprise was also the positive (but tiny) cosmological constant. 
Witten used to feel a big discomfort with the landscape or multiverse but he is less upset about the multiverse now than a decade ago – the Universe's purpose is not to make him feel comfortable. Well said. Once the daily wars about the multiverse are over, I totally calmly accept this general picture as a clear possibility.<br /><br />Some down-to-Earth questions may be solved in the physics of the near future. The biggest new ideas may be completely unknown to everybody now. Among the known routes, the entanglement-geometry (spun as "it from bit") link seems most interesting to him now. Sensible on all counts. So the next upheaval could come from the entanglement-glue duality. Witten, usually too shy to forecast visions for the future (perhaps after some wrong predictions), regained some (totally reasonable amount of) composure and argued that his closet prophetic bones could have predicted such upheavals "a few years in advance" two or three decades ago – not bad. ;-) He should allow his prophetic bones to speak a little bit more often.<br /><br />Witten says that people with various opposing opinions about "new physics" were all a little bit wrong. Expectations of lots of new (experimentally found) physics were wrong but so were the ideas that this would mean that the progress would stop. 
The progress since his graduate school was cool enough – and it has enriched mathematics, condensed matter physics, and more.<br /><br />Farmelo praised Witten for the precision of his speech and his avoidance of philosophically sounding verbal gibberish. At the end, aside from the promotion of his books, Farmelo admits that a purpose of the interview was to fight against the myth that Witten is a mathematician self-framed as a physicist. I surely believe in this myth less than the actual believers (Witten is an amazing physicist – and a universal top phenomenologist; he learned much of SUSY phenomenology from Gordon Kane etc.) but I still believe it's somewhat true and the interview hasn't substantially reduced this partial belief of mine, sorry. ;-)<br /><br /><iframe width="407" height="277" src="https://www.youtube.com/embed/nMjXz5hKAR4" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe><br /><br />Totally off-topic. I had wanted to embed this animated history of Czechia's map (with Smetana's Vyšehrad/Upper Castle, a part of My Country) sometime in 2019 – and it's 2019 now. <br /><br />If you find the years 1019-1041, you will see that Czechia's shape was almost identical to the Czech Republic of today. 
In 1019 – although it could have been 1029 as well, no one is sure – Duke Oldřich of Bohemia finally conquered Moravia, the eastern 40% of Czechia, which was organized as a margravate (not a kingdom) throughout the feudal history.<br /><br />So, to within miles on all sides, Czechia has had the same shape for 1,000 years as of this year. The territory firmly controlled by Prague has been larger at some moments – and we used to have access to both the Baltic and the Adriatic Sea – but it used to be smaller, too, like nothing LOL. We're probably the world's most territorially stable country of the last 1,000 years now, congratulations to us.<br /><br />Luboš Motl<br /><br /><b>How string theory irreversibly changed our understanding of the physical laws</b> (May 2, 2019)<br /><br />In the previous text, I tried to focus on the <a href="https://motls.blogspot.com/2019/05/first-stringy-steps-how-young-fieldist.html?m=1">differences in the treatment of QFT</a> (quantum field theory) that may discourage too naive students of "mundane QFT" when they are trying to switch to modern advanced QFT and string theory in particular.<br /><br />This text is somewhat similar but it focuses on the "later differences" – what string theory actually tells us about the world and the physical laws that we didn't know when we were confined in the mundane QFT paradigm – or that we couldn't even imagine. 
There's some overlap with texts such as <a href="https://motls.blogspot.com/2006/06/top-twelve-results-of-string-theory.html?m=1">top 12 achievements of string theory</a> – Joe Polchinski had added the last two – but here I am looking at the issue from a different perspective – less marketing, more heureka.<br /><br />So what do I see differently than when I was in the mundane QFT phase?<a name='more'></a><br /><br />Note that all these insights, and some others, show that it's childish when someone wants to "return" physics to a pre-stringy era. We – I really mean the body of the best theoretical physicists – have simply learned some things that cannot be "unlearned".<br /><br /><b>Theories that seemingly look different may actually be the same.</b><br /><br />I chose this as the #1 point not because I am sure it's the deepest one but because it's arguably the most self-evident and well-defined class of insights among the sufficiently important ones.<br /><br />In physics jargon, <em>dualities exist</em>. Dualities have become omnipresent in string theory – and in QFT. Only when physicists had several examples were they pushed to qualitatively change their perspective. 
Before the discovery of dualities, physicists thought that if two theories differ in some technical aspects and if there's no "sufficiently obvious" field redefinition or a map that shows their equivalence (usually some kind of drudgery that brings a Lagrangian from one form to another), the theories have to be physically inequivalent.<br /><br />Before string theory, people knew various equivalences. Different formulae for a function – e.g. the Riemann zeta function – were equal to each other. Great, but in some sense "straightforward". Schrödinger's definition of quantum mechanics is equivalent to Heisenberg's or Dirac's, when properly interpreted. That's also great but it boils down to a rather simple unitary transformation.<br /><br />But physicists used to assume that these equivalences only apply to "objects of limited size". When you describe the <em>laws of physics for a whole Universe and everything in it</em>, you have specified so much information that it just can't be equivalent to a completely different set of physical laws. But the equivalences actually exist everywhere.<br /><br />So we have the equivalence of the sine-Gordon and massive Thirring models; fermionization and bosonization. S-duality, T-duality, U-duality, string-string duality, the M-theory – type IIA duality, mirror symmetry. And many more that I will discuss separately. Most of these dualities were first found in string theory and/or in QFTs that are close to string theory, or at least in QFTs studied by string theorists, i.e. physicists who have the ability to do research on string theory. 
<br /><br />We know that similar dualities, like S-duality of Yang-Mills, exist even in local QFTs. The lesson may be more general. Maybe if you find some other theories different from QFTs or string theory, they will also exhibit dualities. But string theory has been the playground where we actually learned this general lesson. Intelligent enough theories of Nature exhibit dualities.<br /><br />You shouldn't underestimate the philosophical importance of dualities. It really means that an "apparently different collection of basic building blocks and interactions" may be completely equivalent to a "seemingly completely different one". It means that our description of a theory – our way of thinking about the allowed objects and their evolution, their shape in the spacetime, and even the dimension of the spacetime etc. – "overdetermine" the actual physical identity of the theory. Our language – even when it's mathematical language – is too talkative. The physical beef of the Universe is just the language modulo some powerful equivalences and redundancies.<br /><br />Dualities are clearly important, even at the fundamental philosophical level, and those who keep on assuming that dualities don't exist are just wrong. They assume something that could have been natural to assume centuries ago. But we've learned that it's wrong – just like the Flat Earth is wrong – and the newer picture we have learned is actually more exciting. 
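To mention the simplest quantitative example of such an equivalence: a closed string on a circle of radius \(R\) carries momentum modes \(n\) and winding modes \(w\), and its mass spectrum contains the terms \[ m^2 = \frac{n^2}{R^2} + \frac{w^2 R^2}{\alpha'^2} + \dots \] This spectrum – and, in fact, the full interacting theory – is invariant under the T-duality map \(n\leftrightarrow w\), \(R\to \alpha'/R\). A string on a huge circle is physically identical to a string on a tiny one, even though no obvious field redefinition of the point-particle kind relates the two descriptions.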
"The perfectly precise physical equivalence" between "two worlds" whose basic laws look qualitatively different – and aren't related by any obvious enough field redefinition – could have looked "infinitely unlikely" before string theory but we know that such things are omnipresent so they just can't be assumed to be infinitely unlikely!<br /><br />Some dualities are considered a great playground by pure mathematicians, such as mirror symmetry, but this blog post isn't about "achievements" but about "paradigm shifts in physics", so I won't dedicate special sections to the "ways how string theorists have impressed pure mathematicians" here.<br /><br /><b>In quantum gravity, the maximum information doesn't scale with the volume.</b><br /><br />It scales with the surface. The black hole entropy is proportional to the event horizon's surface. You just can't compress too much information at a fixed information density to excessively large regions. The black hole entropy was known to be proportional to the surface in the 1970s, before it was derived from string theory. But string theory has confirmed the formula and has added more explicit pictures, especially the AdS/CFT correspondence, that make it clear that at least in some contexts and descriptions, the information about the whole region in quantum gravity "lives on the surface" and looks rather local on the surface.<br /><br />To make it brief, string theory has been rather essential to realize – and make explicit – all the ideas that we call the holography of quantum gravity.<br /><br /><b>There's no qualitative difference between elementary particles and black hole microstates</b><br /><br />Black holes look like qualitatively different, large "beasts" that differ from the elementary particles. 
But string/M-theory has shown us that the black hole microstates – there are many microstates because the black hole entropy is large for a large black hole – are nothing other than the "very massive" counterparts of elementary particle species.<br /><br />The qualitative difference between an electron and a black hole could have looked – and arguably did look to most people – "obvious" but we already know it's wrong. Even if one found a theory of quantum gravity that is totally different from M-theory or type I/IIA/IIB or heterotic string theories, it would almost certainly be true that elementary particles, including the graviton, are just light siblings of the exponentially many black hole microstates describing heavy black holes.<br /><br />We have learned a lesson here. We know how to interpolate between particles and black holes in various examples. We will never go back. When we were mundane QFT theorists, we thought that a realistic theory of quantum gravity required us to make two kinds of objects – black holes and elementary particles – peacefully co-exist. We know it was wrong: there is no qualitative difference between the two and a promising theory produces both kinds of objects simultaneously, from the same underlying material and laws.<br /><br /><b>All dimensionless parameters are ultimately determined in quantum gravity, unless there are exactly massless scalar fields.</b><br /><br />The Standard Model has some 30 parameters, the MSSM has about 105. We got used to the parameters in mundane QFT. It looked like we could pick the spectrum and then we could also adjust the masses, mixing angles, and the renormalizable couplings, among a few others. And we also had the tendency to disfavor effective QFTs with many parameters – an Occam's razor instinct. But string/M-theory shows that it's wrong. 
The couplings are ultimately all determined in quantum gravity, and if they're not, there has to be a modulus, an exactly massless field that causes a new long-range "fifth force" and that violates the equivalence principle (so this possibility is heavily disfavored experimentally).<br /><br />So the "freedom" to adjust the parameters – which looked like the final answer in mundane QFT – is actually an illusion in quantum gravity. Perhaps because quantum gravity, like string/M-theory, must negotiate the peaceful co-existence between the black holes and the light elementary particles, it imposes new constraints and those imply that the allowed values of the dimensionless parameters are <em>discrete</em>.<br /><br />This change of thinking also means that it is utterly irrational to disfavor effective QFTs with a higher number of couplings. The number of couplings in a low-energy effective QFT is whatever it is predicted to be (and you should always allow all couplings that keep the symmetry, consistency, and degree of renormalizability of the theory!). 30 or 105 may look like a lot but it doesn't measure any "sickness" or "contrivedness" of the theory because these collections of parameters are derivable from a more fundamental viewpoint that in principle has no adjustable continuous parameters.<br /><br />Again, it's a paradigm shift that is almost certainly correct – moving us from a naive, wrong answer to the mature, correct one – and it's a paradigm shift that largely took place thanks to string theory. Even if you imagined that string theory would be superseded by a different theory, it's very likely that the new one would in principle determine all the parameters, up to a discrete set of choices. 
Such string vacua or similar theories clearly do imply low-energy predictions, including the number of "seemingly free but not really free" parameters, and that's why all the people who are "repelled" by a higher number of parameters, or by one number or another, just don't understand the non-fundamental character of effective QFTs.<br /><br /><b>Topology of the spacetime manifold isn't a good observable.</b><br /><br />Before some string theory advances, people already knew that the spacetime was able to get curved – like in Einstein's general relativity. They did believe that in quantum gravity, it was right to imagine the spacetime as a "quantum foam" where the geometry oscillates and changes the topology. So they knew many things might be hard. But they only talked the talk – they didn't walk the walk.<br /><br />Whenever they considered how gravity interacts with itself and other fields, they were actually <em>completely ignoring</em> these warnings that "topology in quantum gravity may be hard" etc. In particular, people were assuming that whatever quantum physics you discuss, you first need to determine some background spacetime's topology, and that gave you some subsectors of the Hilbert space. And in each Hilbert space, you could discuss various states that differed by the fields – and continuous deformations – on top of the fixed topological spacetime background.<br /><br />We know that this can't work. The "total Hilbert space" simply isn't neatly split into these "superselection sectors" separated according to the spacetime topology. In the Calabi-Yau topological transitions, we know that excitations of one topology may be said to be equivalent to other excitations of a "nearby topology". In the ER-EPR correspondence, a wormhole is equivalent to an entangled black hole pair. 
The two sides have different topologies but they correspond to the same states.<br /><br />So even the spacetime topology is a part of the description, a "way of thinking about some physical states", but if two such ways of thinking differ from one another, it doesn't mean that the states aren't the same! So one can't uniquely associate topologies with a basis of the Hilbert space. One can't say what the probability is that a generic, chosen state of the Hilbert space has the spacetime topology X or Y. There are many possible answers to such a question. There's no general way to "measure" the spacetime topology, at least not for microscopic (Planckian) objects or highly entangled states.<br /><br />Also, the spacetime topology may "continuously" change by physical processes, in flop transitions and other critical transitions... A whole discussion of the "emergent character of spacetime" could be added here but I don't want to focus on that important point in this blog post. Again, the emergent nature of the spacetime has been a "lore" for some time but people didn't know how to deal with it mathematically. In string/M-theory, we have increasingly known how to convert the "emergent character of the space" into equations.<br /><br /><b>Supersymmetry is a rare fermionic symmetry allowed in physics</b><br /><br />In Russia, supersymmetry was basically discovered by "mathematicians" who studied some advanced "group theory". In the West, almost simultaneously, supersymmetry (the world sheet supersymmetry) was first discovered by Pierre Ramond when he worked to add fermions to the 2D string world sheet. After Ramond, simple 4D SUSY theories were built by Wess and Zumino. Supergravity theories etc. were added in the late 1970s; the MSSM has been studied since 1980 or so.<br /><br />Supersymmetry is a new kind of symmetry whose generators are Grassmannian. It is a "moral loophole" in the Coleman-Mandula theorem. 
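The loophole may be stated explicitly. In four dimensions, the minimal supersymmetry algebra contains \[ \{Q_\alpha, \bar Q_{\dot\beta}\} = 2\sigma^\mu_{\alpha\dot\beta} P_\mu \] so it is the <em>anticommutator</em> of two Grassmannian generators that closes on the translations \(P_\mu\). The Coleman-Mandula theorem only constrained Lie algebras of bosonic generators built from commutators, which is why the supersymmetric extension of the Poincaré algebra evades it.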
Almost all string theorists' preferred models of the real Universe require supersymmetry. The alliance between string theory and supersymmetry is obvious, and so is the importance of string theory for the discovery of supersymmetry (at least in the West).<br /><br />Also, supersymmetry restricts the maximum dimension of the spacetime – basically because the dimension of the spinor-like representations, needed for fermions, grows exponentially with the spacetime dimension, roughly as \(2^{\lfloor d/2\rfloor}\), but it still has to match the degeneracy of the bosons. M-theory's 11 dimensions, and the more subtle and debatable "12 dimensions" of F-theory, are about the maximum. I think that S-theory in 13 dimensions etc. is already extremely problematic and you just shouldn't assume that it's as physical and decompactified a theory as M-theory. Even F-theory is already problematic, but F-theory's usage for the construction of stringy vacua is a hard science that works (though two dimensions out of the 12 are simply not quite decompactified in fully physical vacua – they are a torus). S-theory is not, so far.<br /><br /><b>Near the Planck scale, the idea of finitely many local fields is not OK.</b><br /><br />People sort of knew it from the beginning – if one studies quantum gravity, something prevents you from localizing objects and particles with a better precision than one Planck length or so. String theory makes these guesses quantitative in various ways. Particles can't be quite point-like; they are typically objects such as vibrating strings whose size cannot be smaller than the Planck length. Their internal fluctuations make it unavoidable that their limbs may fluctuate at least one Planck length away.<br /><br />Also, if you study all particle species at the Planck length resolution, you will find infinitely many, like infinitely many excited string modes. 
At a higher coupling, they're not quite independent because of ER-EPR and other things.<br /><br /><b>QFTs are ultimately not man-made, and they're connected within a natural theory and its "landscape".</b><br /><br />As I mentioned, in mundane QFTs, one thought that physics is an inventors' game. You pick your building blocks – particle species or fields – and their interactions. These are like different car models, separated from each other.<br /><br />String theory makes it clear that physicists are ultimately discovering, not inventing or constructing, effective QFTs. QFTs compatible with quantum gravity form a particular set that looked "unlimited" when people were in the naive mundane QFT stage. But now they have analyzed a lot of physics and we arguably know "a big chunk of the allowed effective QFTs". <br /><br />Their particle spectrum and the qualitative characteristics of the interactions aren't something you can really choose freely. For some choices, there may exist no vacuum of quantum gravity that allows the particle spectrum or some forms of the potentials and other interactions. Such QFTs forbidden within quantum gravity are referred to as the swampland.<br /><br />The allowed theories are actually created by Nature – they are solutions to some fundamental equations. They are long-distance limits of string/M-theoretical vacua. A theory with certain properties at low energies may exist or it may refuse to exist. The answer isn't up to you. There are deeper laws and the "man-made construction" of a QFT is just a "guess" how a limit could look like.<br /><br />In the mundane QFT phase, people were always "bottom-up model builders", assuming that they had the freedom to build the theory in any way. They actually <em>knew</em> that the low-energy laws were just <em>derived</em> from more fundamental ones – by taking the limit or by the RG flows. But as in other cases, they talked the talk but didn't walk the walk. 
Now we know that we have to walk the walk. We are really forbidden from considering some types of effective QFTs in quantum gravity.<br /><br />And some pairs or sets of QFTs may have looked equivalent at low energies – but they may still be limits of string vacua that differ at higher energies. The similarity or equality of two low-energy QFTs is therefore "an illusion"; the underlying string vacua may be "very far from each other" according to some relevant measure at Planckian energies. People always knew that the long-distance laws were "derived" but they often thought as if the low-energy QFTs were "fundamental", anyway. We already know it is wrong to do so.<br /><br /><b>The choices of the QFT spectrum result from a geometric picture that should be looked for, that doesn't have to be unique, but the properties of that geometric picture may clarify special properties of the QFT.</b><br /><br />In mundane QFT, the particle spectrum was an arbitrary man-made input. In string theory, the spectrum is derived from the modes of strings and other building blocks that propagate and co-exist (with branes, fluxes etc.) in some higher-dimensional geometry.<br /><br />Even if we don't know what the right stringy geometrization of our favorite QFT is, like for the Standard Model, we know that such pictures exist and they're almost certainly more fundamental. Also, they explain some special "accidents" in a QFT. For example, the decoupling of two sectors in a QFT may be due to the geometric separation of the excitations in the direction of extra dimensions, e.g. in a braneworld where the sectors arise from non-interacting brane stacks.<br /><br />You can still imagine that the choice of the fields and interactions is a "man-made process based on the human freedom" but you're simply not at the cutting edge of theoretical physics if it is so. 
If you keep on making this "man-made" assumption, you're like Kepler, who identified the planetary orbits with nested Platonic solids, assuming that such guesses could be right. But in our new stringy pictures, we really realize that random guesses like that have no a priori reasons to be right. Either you have some experimental evidence or it's just a silly, unlikely guess. The likely guesses are those that may arise from a natural – complete or incomplete – UV-completion of the QFT, from a string compactification.<br /><br /><b>The choice of gauge symmetries isn't fundamental, either: gauge symmetries come and go and they're not real symmetries.</b><br /><br />In particular, the Standard Model starts with a choice of the \(SU(3)\times SU(2)\times U(1)\) gauge group. Like the spacetime topology, it's a "first choice" and everything else has to adapt to it. In the new stringy picture, we know it's not the case. The gauge group is derived from the geometric properties of the compactification as well. And the choice of the gauge group isn't fundamental. After all, the gauge group isn't a real symmetry because physical states have to be invariant singlets.<br /><br />Heterotic string theory allows one to interpolate between \(E_8\times E_8\times U(1)^2\) and \(SO(32)\times U(1)^2\), to mention a cool example. The groups have the same dimension and the same rank but they are clearly different. Just like we can gradually change the spacetime topology, we can deform the compact spacetime dimensions so that the relevant low-energy gauge group changes from one to another. Also, we know that the Yang-Mills gauge groups may be continuously connected to the diffeomorphism symmetry of GR – like in the Kaluza-Klein construction. But string theory gives us lots of new constructions that show the "sibling status" of Yang-Mills symmetries and general covariance. 
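The dimension-and-rank coincidence behind the heterotic pair is a one-line arithmetic check – the following snippet is just my illustrative sketch of the standard Lie-group data, not any library call:

```python
# E8 x E8 and SO(32): same dimension (496) and same rank (16),
# a necessary condition for the heterotic interpolation to make sense.
dim_E8, rank_E8 = 248, 8          # standard Lie-theoretic data of E8
n = 32
dim_SO = n * (n - 1) // 2         # dim SO(n) = n(n-1)/2
rank_SO = n // 2                  # rank SO(2k) = k

print(2 * dim_E8, dim_SO)         # 496 496
print(2 * rank_E8, rank_SO)       # 16 16
```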
We actually know that these things, again thought to be rather separate choices in the past, are connected aspects of the same underlying substance.<br /><br /><b>Second quantization of fields isn't the only way to describe multi-particle states</b><br /><br />Relativistic quantum mechanics requires quantum fields, whose creation and annihilation operators automatically allow antiparticles and multi-particle states. That was a great insight around 1930. In the mundane QFT stage, people thought it was the only way to get or describe theories with multi-particle states.<br /><br />But we know it's not the only one now. In particular, matrix models allow the description of composite systems in terms of "block-diagonal matrices" in some theories whose degrees of freedom are large-size matrices. The BFSS paper is the simplest example of an equivalence between such a non-gravitational matrix model and something that should look like an effective QFT at long distances – namely 11D supergravity. The BFSS matrix model was the first complete definition of M-theory at all energies.<br /><br />People could have always said that "the fundamental theory of our Universe doesn't have to be given by a strict QFT" but they didn't know how to reproduce all the realism and advantages of a QFT by a non-QFT, or something that explicitly avoids the "man-made construction of multiparticle states" by simply combining creation operators in a chain. Matrix models allow N-particle states to co-exist for many values of N.<br /><br />So we really know that the QFT apparatus isn't even needed to get multiparticle states in a relativistic theory – we have other descriptions that achieve the same goal without explicit chains of creation operators. 
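A toy sketch of the block-diagonal idea (my own illustrative code, not anything from the BFSS paper; the function name is made up): one matrix-valued coordinate describes several well-separated objects at once when it is block-diagonal, with each block's eigenvalues clustering around one object's position – no chain of creation operators is assembled by hand.

```python
import numpy as np

def clusters_configuration(positions, sizes):
    """Build one block-diagonal matrix coordinate X whose blocks sit at the
    given positions; each block of size n_i represents a bound cluster of
    n_i 'partons' (e.g. D0-branes) forming one composite object."""
    total = sum(sizes)
    X = np.zeros((total, total))
    start = 0
    for x, n in zip(positions, sizes):
        X[start:start + n, start:start + n] = x * np.eye(n)
        start += n
    return X

# A "2-particle state": a cluster of 2 partons at -3.0 and one of 3 at +5.0.
X = clusters_configuration([-3.0, 5.0], [2, 3])
print(np.linalg.eigvalsh(X))   # eigenvalues cluster at -3 and 5
```

The multi-particle character is encoded in the block structure of a single matrix, rather than in a product of creation operators acting on a vacuum.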
The inevitability of QFT as a framework for the multi-particle states has decreased or disappeared.<br /><br /><b>Summary</b><br /><br />This list is surely not complete and some important entries are missing – and I will realize some of them an hour from now. But there simply are entries like that which show that our assumptions about how we should proceed in QFTs, what is natural in QFTs, and whether QFTs are necessary at all were simply wrong – once you try to construct complete theories that incorporate quantum gravity. String theory has shown that lots of things previously considered impossible or extremely unlikely are actually possible if not omnipresent. Some unnatural things became possible. Other, previously possible things are banned.<br /><br />So we know that to stick to the old picture means to be attached to something analogous to the medieval prejudices about science. There were vague reasons to believe those prejudices in the mundane QFT stage. But the research into string/M-theory has simply falsified them – much like the Flat Earth has been falsified. Our modern stringy proofs that settle these questions are much more reliable than the vague guesses that have led to the old answers – many of which are believed to be incorrect now. So if you're a competent theoretical high-energy physicist as of 2019, you simply need to know the modern answers obtained with some very explicit and indisputable evidence – and you have to abandon the prejudices that used to be justified by sloppy evidence and that have been proven wrong.<br /><br />The idea that physicists will "return" to an epoch in which string theory and its lessons may be ignored is as childish as the idea of a "return" to the Flat Earth. Science just doesn't work like that. 
Even in the absence of some "characteristically stringy empirical evidence", string theory has brought us proofs of many important, even philosophically game-changing statements that have preemptively falsified some incorrect hypotheses.<br /><br />The falsification of those old expectations – and this falsification had the form of a nearly rigorous mathematical "disproof" – cannot be undone. Falsification can never be undone. And the fact that no actual new experiments were needed for the advances changes nothing about the irreversibility of the disproofs whatsoever. So unless you undergo lobotomy or burn all books and web pages that carry the knowledge about string theory, it's just impossible to "unlearn" string theory. Everyone who suggests that top theoretical high-energy physicists of 2025 could work on something that denies the whole history and lessons of string theory is completely detached from any kind of rational thinking about science and you should never assume that such people are "equivalent" to actual physicists because they are not.<br /><br />And that's the memo.Luboš Motlhttp://www.blogger.com/profile/17487263983247488359noreply@blogger.com0tag:blogger.com,1999:blog-8666091.post-73627374296053368262019-05-01T09:19:00.001+02:002019-05-01T13:30:59.719+02:00First stringy steps: how a young fieldist expands her mind to become a string theorist<b>And yes, "she" is probably but not necessarily a young man</b><br /><br /><a href="https://motls.blogspot.com/2019/04/string-theorists-approach-status-of.html?m=1">Three days ago</a>, I mentioned that a "string theorist" is a description of expertise that includes most of "quantum field theory" but it goes beyond it, too. 
Seeing the world in the stringy way opens new perspectives, new ways to look at everything, and unleashes new powerful tools to theoretically wrestle with all the world's scholarly problems.<br /><br /><img src="https://d35c7d8c.web.cern.ch/sites/d35c7d8c.web.cern.ch/files/bg_13.gif" width=407><br /><br />In practice, string theory isn't some philosophical superconstruction on top of quantum field theory (QFT) that is very different from the QFT foundations. Instead, string theory calculations are almost entirely identical to QFT calculations – but QFT calculations with new interpretations and previously neglected effects. Most of the fundamental insights of string theory are irreversible, nearly mathematically rigorous insights about <em>previously neglected properties and abilities of QFTs</em> and especially previously overlooked properties of some special QFTs.<br /><br />What are the limitations of a QFT student that prevent her from seeing physics through the new, stringy eyes? Let me look at these matters a little bit technically.<a name='more'></a><br /><br />OK, let's first review QFT. The Standard Model is the most "practical yet comprehensive" QFT relevant for the experiments that are actually being made. All the details are technical and only roughly 100,000 people in the world understand them well enough. But the "verbal summary" is rather concise.<br /><br />QFT is a special kind of quantum mechanics (QM). So we calculate probabilities of possible outcomes of observations. 
These probabilities are computed as the squared absolute values of the complex probability amplitudes – some matrix elements of linear operators on a complex Hilbert space.<br /><br />In practice, in QFTs, those are computed as sums of the Feynman diagrams, such as the one at the top. The internal lines are "propagators", linked to the two-point functions of quantum fields (and to the bilinear terms in the Lagrangian) and representing "virtual particles" that are seen neither in the initial state nor in the final one. The vertices come from higher-than-quadratic terms in the Lagrangian and they are needed for all interactions.<br /><br />These Feynman rules – probability amplitudes are sums of Feynman diagrams – are derived either from some Dyson-like operator approach or from the Feynman sum over histories, the path integral. Each Feynman diagram translates to an integral – over locations of the vertices in the spacetime or over momenta of the propagators.<br /><br />In the Standard Model or any particular QFT, there is a spectrum of possible propagators. They correspond to spin 0 or 1/2 or 1 particles in the Lagrangian. Some of them are gauge fields, you learn about the gauge symmetry, and if you're a bit advanced, you really master the renormalization, renormalization group, and non-perturbative effects such as instantons, among a few other things. I wanted to be really concise – so that's it. You only need to understand that these several simple paragraphs translate to some 1,000 pages from textbooks if you really want to understand what my words mean – so that you can use the QFT apparatus! 
;-)<br /><br /><b>Now, what are the new objects or treatments that string theory adds? How do you upgrade yourself from "one of the 100,000" QFT experts to "one of the 2,000" more or less string theorists?</b><br /><br />Open a basic textbook on string theory such as Polchinski's book. I could only <a href="https://books.google.com/books?id=jbM3t_usmX0C&printsec=frontcover&hl=cs&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false" rel="nofollow">open Volume I</a> of Polchinski because Nima Arkani-Hamed has borrowed my Volume II, I think, and he hasn't returned it yet. ;-) Already the initial chapters and sections of the textbook bombard the reader with great new insights that are spiritually "beyond" the mundane QFT apparatus sketched above – an apparatus optimized for the scattering amplitudes in the Standard Model. But I want to present the novelties independently.<br /><br />The first novelty is that there are scale-invariant, conformal field theories (CFTs) and they have some special characteristics and allow new constructions and objects.<br /><br />In the primitive QFT as sketched above, it doesn't matter much whether a particle is massive. A propagator may contain an extra \(-m^2\) or not. It's not a big deal. Gauge bosons and gravitons really have to be massless at the fundamental level – well, gauge bosons may get masses through the Higgs mechanism – but the calculational framework isn't affected much. 
At most, the massless particles are a pain in the buttocks because they may add long-distance, infrared divergences and related problems.<br /><br />In CFTs, massless particles aren't a liability. They are a virtue if not a necessity. Well, CFTs don't really allow massive elementary particles because those carry a special mass scale \(m\) and the corresponding special distance scale \(1/m\) which would destroy the scale invariance of the theory. Theories with massless particles are the beef of any CFT research. And CFTs bring new spacetime symmetries beyond the Poincaré group of translations, rotations, and boosts: the scaling and conformal symmetries.<br /><br /><img src="https://inspirehep.net/record/939745/files/cylinder.png" width=407><br /><br />In the \(d\)-dimensional spacetime where one of the dimensions is time, the Lorentz group is \(SO(d-1,1)\), isn't it? The conformal group is \(SO(d,2)\) – I added one temporal and one spatial dimension. Relative to the smaller Lorentz group, we have the extra \(J_{+-}\) that generates the scaling, \(J_{+i}\) that generates the usual translations, and \(J_{-i}\) that generates special conformal transformations. In two spacetime dimensions, there is an exception: the conformal group is infinite-dimensional, at least locally: any holomorphic function of the complex variable \(z\) preserves the angle at every point of the plane (we are talking about the Euclideanized spacetime or world sheet – the relationship to the Minkowski-signature ones is obtained by a Wick rotation or a similar analytical continuation). This is what you learn in complex analysis – a mathematics course – as an undergraduate. Now, the action is invariant under these conformal transformations.<br /><br />For example, an infinite cylinder is equivalent to the plane with the origin removed – the exponential map \(z=\exp(-iw)\) is how you do it; the situation is clarified by the picture. 
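Two tiny numerical sanity checks of the last two paragraphs – a sketch with my own helper names, assuming only the statements above: the generators of \(SO(d,2)\) must decompose into the Lorentz generators plus \(d\) translations, \(d\) special conformal transformations, and one scaling; and the map \(z=\exp(-iw)\) must send the constant-"time" circles of the cylinder to circles around \(z=0\), squeezing the infinite past into the origin.

```python
import cmath
import math

# Check 1: generator counting.  dim so(n) = n(n-1)/2, and the signature
# (how many minus signs in the metric) doesn't change the dimension.
def dim_so(n):
    return n * (n - 1) // 2

for d in range(3, 12):
    # so(d,2) = Lorentz so(d-1,1) + d translations + d special conformal + 1 scaling
    assert dim_so(d + 2) == dim_so(d) + d + d + 1

# Check 2: the exponential map from the cylinder (w = sigma + i*tau) to the plane.
def cyl_to_plane(w):
    return cmath.exp(-1j * w)

tau = -2.0  # one "time" slice of the cylinder...
for sigma in (0.0, 1.0, 2.5):
    # ...maps to a circle of radius e^tau centered at the origin
    assert math.isclose(abs(cyl_to_plane(sigma + 1j * tau)), math.exp(tau))

# The infinite past, w -> -i*infinity (tau -> -infinity), is squeezed into the
# single point z = 0 where the operator is inserted.
assert abs(cyl_to_plane(0.5 - 50j)) < 1e-20
```

The second check is the geometric core of the state-operator correspondence discussed below: everything about the far past of the cylinder must be encodable at the single point \(z=0\).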
So the insertion of some operator at the point \(z=0\) is equivalent to the information about the state fed into the evolution on the infinite cylinder at \(w\to -i\infty\). Quite generally, in CFT, you do want to study operators inserted at particular points, including very complicated operators, and the behavior of the theory when two or many such operators are inserted somewhere.<br /><br />This is new relative to the mundane QFT at the top. The mundane QFT really tells you that you'd better not insert too many operators at several points, especially not nearby points, because that's a way to get ultraviolet (UV) divergences, i.e. short-distance divergences, and those are a liability. But in CFTs which don't care about scales, there's nothing wrong with short distances and the UV (just like there's nothing wrong with the IR) because all distances are physically equivalent by the scaling symmetry. So in fact, you do want to play with correlators of operators that are very close to each other. These correlators encode – in a new way – all the physical information about the interactions at the "finite" distances.<br /><br />CFTs are generally important in QFT – they're the "fixed points" of the renormalization group, and therefore an essential starting point to understand the set of all QFTs according to the renormalization group paradigm. But CFTs are also vitally important in string theory. While the mundane QFT doesn't tell you anything about CFTs, as an upgraded QFTist or string theorist you must be ready to probe special properties of QFTs with massless particles and scale invariance and new constructions that are only possible when the conformal symmetry works.<br /><br />In string theory, CFTs are important in the AdS/CFT realization of holography – CFTs on boundaries of the anti de Sitter space are equivalent to the full quantum gravitational (string/M) theory in the AdS bulk. 
But 2D CFTs are also the defining theories of any perturbative string theory – whose predictions are always calculated from the appropriate world sheet CFT.<br /><br />You need to recall what you should have learned when you studied holomorphic maps. How do you write down a complex holomorphic function that maps one region of the complex plane to another? You may need many of these things, especially the most elementary ones such as the exponential, logarithm, and the rational function \(z'=(az+b)/(cz+d)\).<br /><br /><b>State-operator correspondence</b><br /><br />You should understand why the spectrum of "states on a closed string" is the same as the spectrum of "operators inserted at \(z=0\)". It has to be so because the states or the operators are needed to clarify what's happening at \(z=0\) i.e. \(w\to -i\infty\) and the rest of the 2D spacetime, the plane or the cylinder, is equivalent through the conformal transformation.<br /><br />This correspondence, SOC, is the only good thing in the Universe that starts with "soc", the rest is some social, societal, and socialist junk.<br /><br /><b>New important spacetime symmetries</b><br /><br />You need to learn the basic mathematics of the conformal symmetries. Why do the angle-preserving transformations form a group isomorphic to the Lorentz group of a higher-dimensional spacetime? How do these maps work? What about the spherical inversion? Why is the CFT invariant under the spherical inversion?<br /><br /><b>Shocking new equivalences: bosonization and fermionization</b><br /><br />Especially when the masses are zero, and you deal with CFTs, there are some new equivalences between theories that would sound impossible from the mundane QFT viewpoint. One of them is the equivalence of bosons and fermions. In QFT, you think that the Fock space built from a bosonic field is totally different from the fermionic Fock space. 
It's different from a pair of fermionic Fock spaces, too (OK, by the pair, I really meant the tensor product of two fermionic Fock spaces, sorry). If the occupation numbers are any non-negative integers, it must be a totally different spectrum from the spectrum of a theory where the occupation numbers are either zero or one, right?<br /><br />In CFTs, this "obvious" conclusion is wrong. In fact, a free boson is equivalent to two free fermions. Some generalizations of this statement exist for interacting bosons and fermions, too. A boson with a sine self-interaction is equivalent to fermions with a quartic interaction in \(d=2\) CFTs. How is that possible?<br /><br />I believe that it's a good idea for a "mundane QFTist who is just upgrading herself to a string theorist" to verify this equivalence to the extent that convinces her that something really works here – or perhaps more rigorously than that. One check is to count the degeneracies of excited states on an open or closed string. Two fermions may lead to the same degeneracies at each level as a free boson, assuming the corresponding matching choice of boundary conditions in both theories.<br /><br />Another one – which is equivalent to the counting of the states above – is through operators. The fermions may be defined as exponentials:\[<br /><br />\psi = \exp(i\phi), \quad \bar\psi = \exp(-i\phi)<br /><br />\] Well, there should also be some "ordering" sign, \(:\exp(\dots):\), which you need to master once you study these things really seriously. The exponential mapping between operators may sound very strange from a mundane QFT viewpoint but it's natural in CFTs. The bosonic fields \(\phi\) may be viewed as "generators of some operations" so if you exponentiate them, you may get a finite operation which may be equivalent to the insertion or destruction of a fermion. 
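The degeneracy counting suggested above can be done level by level with truncated power series – a sketch in my own notation, assuming the standard weights: each fermionic oscillator at level \(n-\frac12\) contributes a factor \(1+q^{n-1/2}\) (two fermions give the square), bosonic oscillators give \(\prod_n (1-q^n)^{-1}\), and the momentum operators \(\exp(ik\phi)\) have dimension \(k^2/2\). Working in \(x=q^{1/2}\) to keep all exponents integer:

```python
def mul(a, b, N):
    """Multiply two truncated power series (coefficient lists up to x^N)."""
    c = [0] * (N + 1)
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                if bj and i + j <= N:
                    c[i + j] += ai * bj
    return c

N = 40  # truncation order in x = q^(1/2)

# Two free (antiperiodic) fermions: prod_{n>=1} (1 + q^(n-1/2))^2
ferm = [1] + [0] * N
for n in range(1, N + 1):
    if 2 * n - 1 <= N:
        f = [0] * (N + 1)
        f[0], f[2 * n - 1] = 1, 1
        ferm = mul(ferm, mul(f, f, N), N)

# One free boson: oscillators prod_{n>=1} 1/(1 - q^n) times the sum over
# "momentum" operators exp(i*k*phi) of dimension k^2/2, i.e. sum_k x^(k^2)
bos = [0] * (N + 1)
for k in range(-6, 7):
    if k * k <= N:
        bos[k * k] += 1
for n in range(1, N + 1):
    geom = [1 if i % (2 * n) == 0 else 0 for i in range(N + 1)]  # 1/(1 - x^(2n))
    bos = mul(bos, geom, N)

# The level-by-level degeneracies agree: two fermions = one boson
assert ferm == bos
```

The equality of the two coefficient lists is nothing else than the Jacobi triple product identity evaluated at a special point – one of those "too many things seem to work" moments.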
In effect, the exponential of the bosonic field creates a "kink", a discontinuity that can't be combined with another copy of the same discontinuity, so it ends up having the Fermi statistics (Pauli's exclusion principle).<br /><br />The inverse relationship is bilinear because you need to cancel the charges of the fermionic fields: the current for this \(U(1)\) charge is \(\partial\phi\sim\bar\psi \psi\), with the normal ordering implied. You need to study this equivalence – bosonization or fermionization – to be sure that the mundane QFT viewpoint prevented you from seeing some relationships that are clearly true and almost certainly important.<br /><br /><b>Operator product expansions (OPEs)</b><br /><br />The mundane QFT apparatus allows you to think in terms of "states" most of the time, like the people who think that QM is about states and not operators. However, the advanced QFT or string theory really forces you to admit that actual physics is about operators. So for example, in a QFT textbook, you could have learned about the anomalous dimensions of operators. But you didn't care – you didn't need such stuff for the computation of scattering amplitudes which seemingly included "all the interesting physics".<br /><br />In CFT, you need anomalous dimensions of operators. In mundane QFTs, the anomalous dimensions start with terms proportional to \(g^2\) etc., the squared coupling constant (that's also how the couplings "run" etc.). In CFTs, the anomalous dimensions may be "non-adjustable", fractional numbers such as \(1/16\). It's all very exciting. You may see interesting CFTs, both free and interacting, that can't be understood as deformations of a free QFT with an interaction that has a coupling constant. Instead, the coupling constant seems to be "fixed". 
Even for the free fermion, the spin field that creates a special point making the fermion antiperiodic around the location of the spin field insertion happens to have the dimension \(\Delta=1/16\). You couldn't have constructed fields of dimension \(1/16\) in the mundane QFT, could you? All dimensions were integer multiples of \(1/2\). You thought that only "de facto polynomial" functions of the fields and their derivatives were possible and more exotic dimensions were impossible for that reason. But that conclusion was premature.<br /><br />So you need to learn what happens when two operators are inserted next to each other. There is some singularity. You know that the commutator of two operators \(F(\vec x)\) and \(G(\vec y)\) in mundane QFTs may produce a delta-function. But the simple product is harder – and the leading term when \(|\vec x-\vec y| \to 0\) is encoded in some Green's functions. In CFT, you need to focus on these things from advanced chapters of QFT textbooks that looked like "useless complications".<br /><br />The insertion of the two operators at points \(\vec z\) and \(0\) may be replaced by the insertion of one operator at \(\vec z =0\). You may expand this new operator in some power series in \(\vec z\). The leading terms are the singularities, usually \(c\)-numbers, that may be extracted from the Green's functions. These OPEs end up being important because they encode the transformation of operators under various symmetries generated by other operators, stringy scattering amplitudes in some limits, and more.<br /><br /><b>Monodromies: operators orbiting each other</b><br /><br />I mentioned many new things about QFTs that emerge when you study CFTs in any dimension. But the stringy world sheet has \(d=2\) where many new things occur. In particular, in a plane, a point may orbit another point and this is a topologically non-trivial operation. 
One may generate a phase or something nontrivial when one operator completes a full orbit around another.<br /><br />You need to understand how these operations may be linked to boundary conditions on a closed string. You need to understand that the situation in which the orbiting does "nothing" is special, we say that the operators are mutually local. And you need to learn how to calculate such things not only for the "basic" operators such as \(\phi,\psi,\bar\psi\) I mentioned above; but also for operators such as \(\exp(a\phi)\).<br /><br />Quite generally, the calculations involving the operator \(\exp(a\phi)\) where \(a\) is a number and \(\phi\) is a bosonic field are very important in CFTs. That's another fact that would look shocking from a mundane QFT viewpoint – that viewpoint only "encouraged" you to consider polynomial operators made of the basic fields. But I mentioned that these exponential operators with a particular value of \(a\) – well, there should have been \(1/2\) in my bosonization exponents, I can tell you now, at least in the normal conventions – are important for bosonization and fermionization. <br /><br />But these operators are needed to define string states with a generic momentum, too. You should learn how to compute their anomalous dimension, which scales like \(a^2\) and is related to the mass of the string. You should learn how to orbit these operators around each other, and more. There was nothing special about "exponentials of fields" in the mundane QFTs but these objects are important and omnipresent in CFTs and string theory that uses the world sheet CFT.<br /><br /><b>Virasoro algebra</b><br /><br />It is an infinite-dimensional Lie algebra generating all the reparameterizations of a circle, a periodic \(\sigma\) variable. It's generated by \(L_m\) and the commutator is \[<br /><br />[L_m,L_n] = (m-n) L_{m+n}<br /><br />\] in the simplest case. 
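This commutator can be checked directly in the classical representation \(L_m = -z^{m+1}\,d/dz\) acting on monomials \(z^k\) – a sketch with my own helper names; the central term, which only appears in the quantum theory, is ignored in this representation:

```python
def L(m, mono):
    """Apply L_m = -z^(m+1) d/dz to a monomial, stored as (coefficient, power)."""
    c, k = mono
    return (-c * k, k + m)

# Verify [L_m, L_n] = (m - n) L_{m+n} on the basis monomials z^k.
# (This is the classical, centerless algebra; the central extension
# proportional to c*(m^3 - m)/12 only shows up quantum mechanically.)
for m in range(-4, 5):
    for n in range(-4, 5):
        for k in range(-5, 6):
            lhs = (L(m, L(n, (1, k)))[0] - L(n, L(m, (1, k)))[0], k + m + n)
            coeff, power = L(m + n, (1, k))
            assert lhs == ((m - n) * coeff, power)
```

The same three-line computation done with pen and paper is a standard warm-up exercise before the central charge makes its appearance.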
You should understand how it works, perhaps learn the central charge extension of the algebra as well, and the basics of how to look for its representations. It is important because this algebra is a residual symmetry on the world sheet. It plays a similar role as the Yang-Mills symmetry or the diffeomorphism symmetry (of GR) in the spacetime. On the other hand, the unphysical states of the Virasoro symmetry on the world sheet may be <em>matched</em> to the unphysical states in the spacetime – due to the Yang-Mills and diff symmetries. The world sheet gauge symmetry principles "produce" all the spacetime gauge symmetries that you need.<br /><br />There are less and more rigorous ways to deal with the Virasoro algebra, the BRST treatment is a modern advanced one.<br /><br /><b>Topologies of world sheets, cohomology etc.</b><br /><br />The higher-order string scattering amplitudes may be written as path integrals over world sheets of harder topologies – pants-like diagrams where strings merge and split, a sort of thickened version of Feynman diagrams. Up to conformal transformations, the moduli spaces of possible shapes of such higher-genus Riemann surfaces are finite-dimensional. You should understand what the dimensions are, why they're finite at all, how the moduli spaces roughly look, and understand something about why the unitary S-matrix in string theory requires you to integrate over the moduli spaces in the most natural way, and what the most natural way is.<br /><br />The genus \(h\) topologies have some non-contractible loops. This is a kind of "topology 101" – and algebraic geometry – that you may need to analyze spacetime (compactification spaces), too. 
Homology, cohomology, their relationships with forms and cycles matter.<br /><br /><b>CFT on sphere, torus, and other important low-genus topologies</b><br /><br />The world sheet is normally considered compact – because all the infinite cylinders corresponding to the external particles may be "shrunk" and conformally mapped to disks. You should know the moduli space of such low-genus diagrams, with and without extra operator insertions. For the sphere, which is conformally equivalent to a plane by a stereographic projection, you need to see the Möbius \((az+b)/(cz+d)\) transformations.<br /><br />A half-plane is a \(\ZZ_2\) quotient of the plane, the \(\ZZ_2\) is generated by the complex conjugation of \(z\).<br /><br />But the torus is a "one-loop" diagram and has some special mathematics. A torus is a plane modulo a 2D lattice. The lattices that produce the same tori are equivalent via \(SL(2,\ZZ)\), the modular group. The 2D torus may be read as a spacetime diagram in two different ways: the Euclideanized time is either the vertical or the horizontal direction. This gives you an equality between two different partition sums for different, basically inverse, temperatures! You should roughly know why it works – and then how it works precisely.<br /><br />At the mathematical level, you have a great opportunity to learn the modular forms, eta and theta functions, and similar stuff to express these partition sums and their symmetry properties (under the modular group in particular).<br /><br /><b>T-dualities and other equivalences</b><br /><br />The T-duality is a reparametrization of the fields on the world sheet that is somewhat analogous to the fermionization and bosonization but the basic form only requires bosonic fields. There's a way to switch from a bosonic field \(X\) to the T-dual field \(\tilde X\) on the world sheet. 
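As an aside on the torus paragraphs above: the "inverse temperature" equality boils down to the modular S-transformation of the Dedekind eta function, \(\eta(-1/\tau)=\sqrt{-i\tau}\,\eta(\tau)\), which a skeptical reader can check numerically – a sketch assuming only the product definition of \(\eta\); the helper name is mine:

```python
import cmath

def eta(tau, nmax=200):
    """Dedekind eta: q^(1/24) * prod_{n>=1} (1 - q^n) with q = exp(2*pi*i*tau)."""
    q = cmath.exp(2j * cmath.pi * tau)
    prod = 1 + 0j
    for n in range(1, nmax + 1):
        prod *= 1 - q ** n   # |q| < 1 in the upper half-plane, so this converges fast
    return cmath.exp(2j * cmath.pi * tau / 24) * prod

# The S-transformation tau -> -1/tau swaps the two "time directions" of the
# torus (roughly: it inverts the temperature); eta transforms covariantly.
tau = 0.3 + 1.1j                      # any point in the upper half-plane works
lhs = eta(-1 / tau)
rhs = cmath.sqrt(-1j * tau) * eta(tau)
assert abs(lhs - rhs) < 1e-10
```

The other generator of the modular group, \(\tau\to\tau+1\), only multiplies \(\eta\) by the phase \(e^{i\pi/12}\), which is equally easy to verify with the same function.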
What actually happens is that \(X\) may be split into the left-moving and right-moving part (or the holomorphic and antiholomorphic modes, if you use the Euclideanized world sheet). <br /><br />And the T-duality is the reflection \(X_L \to -X_L\) that "mirror reflects" the spacetime coordinate \(X\), a field describing the embedding of the world sheet into the spacetime, but the T-duality only reflects the left-moving part of \(X\) while the right-moving one is conservatively kept fixed! (Or vice versa, but physicists' conventions admit that it's more natural for the right-movers and right-wingers to be conservative.)<br /><br />If you already know how string theory amplitudes are extracted from the 2D world sheet CFT, you will realize that this implies the equivalence of string theory on two totally different spacetimes.<br /><br /><b>Derivation of Einstein's equations and other spacetime effective equations</b><br /><br />2D CFTs are rather rare. They include the free bosons, free fermions – with lots of equivalences between the two – then things like the Ising models and minimal models. The latter are basically "countable", there is a spectrum of "exceptions" that still manage to be CFTs.<br /><br />But there are also CFTs with lots of parameters. The non-linear sigma model is the most important master example. The kinetic term \(\partial_\alpha X^\mu \partial^\alpha X_\mu\) in the world sheet Lagrangian is generalized by its being multiplied and contracted with a general function of \(X\),\[<br /><br />g_{\mu\nu}(X^\gamma)\cdot \partial_\alpha X^\mu \partial^\alpha X^\nu<br /><br />\] So all the values of the function \(g_{\mu\nu}\), for every value (point in spacetime) \(X^\gamma\) of the argument and for every choice of spacetime vector indices \(\mu,\nu\), are adjustable. It's exactly the information that defines a metric tensor field in the spacetime. Great. 
For every spacetime geometry, you may write down a theory for strings propagating on that spacetime.<br /><br />This theory looks conformal for every choice of the tensor at the classical level. However, there are quantum effects that generally violate the scale invariance. In particular, for each point \(X^\gamma\) and each choice of \(\mu,\nu\), the coupling constant \(g_{\mu\nu}\) has its \(\beta\)-function encoding its "running with scale", and that \(\beta\)-function has to vanish for the world sheet theory to be actually scale-invariant at the quantum level.<br /><br />And the cancellation of these "anomalies" actually tells you that the spacetime metric tensor must obey Einstein's equations! The \(\beta\)-function for the coupling \(g_{\mu\nu}(X^\gamma)\) ends up being basically the Ricci tensor at the same point, \(R_{\mu\nu}(X^\gamma)\). Its vanishing requires the Ricci flatness i.e. Einstein's equations in the vacuum. You may derive the defining equations of general relativity just from the requirement that the "conformal" strings may propagate on that spacetime!<br /><br />This is true for all other effective field equations in the spacetime. If open or closed string modes produce gluons or electrons, their Yang-Mills or Dirac equations may be deduced from the conformal invariance of the world sheet theory at the quantum level! The right-hand side of Einstein's equations (and all other spacetime equations) also correctly emerges if you calculate other contributions to the \(\beta\)-function.<br /><br /><b>Lots of extra technicalities</b><br /><br />Weyl and diffeomorphism symmetry of the world sheet dynamics, fixed into the conformal symmetry, the \(bc\) ghosts needed for that. Closed and open strings, various boundary conditions, how they affect both the states and the operators (open string vertex operators live on the boundary of open world sheets). 
Orbifolds and how their consistency requires something to work for the toroidal world sheets (the modular invariance I mentioned). D-branes and how T-duality changes the dimension of the locus where open strings end. How the D-branes carry new fields. Why their dynamics is often Yang-Mills-like. Addition of fermions to the world sheet, superstrings. Unorientable strings, orientifolds, and world sheet diagrams that are the projective plane, Möbius strip, and Klein bottle – all those may be obtained from the sphere and the torus. And infinitely many harder topologies with boundaries and crosscaps.<br /><br />And of course the critical dimension. Why the scale invariance of the world sheet theory at the quantum level implies \(D=26\) for bosonic string theory and \(D=10\) for the superstring. Polchinski calculates \(D=26\) in seven different ways, to assure a sensible reader that there's some "deep truth" about that result.<br /><br /><b>Summary</b><br /><br />There are lots of wonderful insights about QFTs that happen to be CFTs – and especially CFTs in \(d=2\) which is appropriate for a string world sheet. These things can't ever <em>disappear from physics again</em> because they're really <em>established mathematical facts</em> about some classes of QFTs. If and when you study these things, and if you're intelligent, you will realize that it has been silly for you to be ignorant about them. You will know that they cannot be ignored. To "ban them" would be about as weird as banning molecular or nuclear physics or condensed matter physics (e.g. crystal lattices) for someone who has just mastered atomic physics.<br /><br />Lots of special identities hold in CFTs or \(d=2\) CFTs and lots of new consistent objects may be defined and many consistent operations may be performed. There's a way to define a unitary S-matrix for states in the spacetime that looks just like one from an "advanced QFT" but also includes consistent quantum gravity. 
All these things look at least as natural as those in spacetime QFTs – but gravity is added on top of that.<br /><br />You will encounter some old objects – anomalous dimensions etc. – more often than in mundane QFT. You will learn some new functions: gamma functions for the tree-level amplitudes; eta and theta functions and modular forms for the toroidal partition sums and correlators. You will deal with some previously "unnatural" operators such as exponentials of bosonic fields. You will often treat the left-moving and right-moving (or holomorphic and antiholomorphic) parts of the fields separately, something that is impossible in \(d\gt 2\). Mundane QFT was telling you that "you shouldn't do certain things" but many of these things are extremely important, useful, and lead to new deep insights.<br /><br />Already at the level of perturbative string theory, basically Volume I of Polchinski, you will see that too many things seem to work. The amount of great surprises and unbelievable consistency gets even more formidable once you study non-perturbative string theory, S-dualities, string-string duality, maps between D-branes and black \(p\)-branes, once you can microscopically calculate black hole entropy, geometrize the gauge symmetries in many new ways, find many more dualities (unexpected equivalences between vacua of string theory or QFTs), and more. The existence of string/M-theory "explains" all these particular coincidences and equivalences as well as other unexpectedly constrained yet consistent constructions – and it also "happens" to be a theory that is capable of producing the same class of predictions as QFTs (plus consistent quantum gravity amplitudes).<br /><br />At some psychological level, the transition from "one in 100,000 QFTists" to "one in 2,000 string theorists" in the world starts by realizing that the mundane QFT picture is not the whole story. 
It hides many wonderful, mathematically natural things that may be done with quantum fields and many of their properties. It hides many special QFTs, like CFTs or supersymmetric QFTs or superconformal theories, and even more special kinds of those, that have even more striking properties. You will only make the transition from a "quantum field theorist" to a "string theorist" if you have sufficient curiosity and desire to understand how "things really work"; and sufficient intelligence – so that you know that you haven't run out of your mental capacity once you got to the mundane QFT level.<br /><br />Academically speaking, you don't need to be "certain" that string theory correctly describes our real Universe at a much better accuracy than any spacetime QFT. But if you actually master this material, so that you could get an A or B from most of the exercises e.g. in Polchinski's book, you will surely agree that it's utterly idiotic to <em>ignore</em> the existence of string theory or pretend that theoretical high-energy physics may continue or should continue while carefully <em>avoiding</em> all these stringy and similar (or similarly advanced) insights, constructions, and coincidences (that aren't quite "coincidental" because they're really "explained" by the existence of a unifying, deeper theory that unifies them all, string theory).<br /><br />String theory is more than the mundane QFT but they are tightly connected and inseparable. They form one continuum of insights – one may be more or less familiar with that continuum but there exists no meaningful framework that could present "less familiar" as an advantage. 
You clearly become a better expert in the properties of QFTs once you master at least the basics of string theory.<br /><br /><hr><br /><iframe width="407" height="277" src="https://www.youtube.com/embed/NSYn8QArAD4" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe><br /><br />The left-wing establishment has restored propaganda, censorship, politically motivated dismissals etc. but they haven't revived the tradition of huge May Day parades yet. Check what the May 1st, 1986 rally in Prague, five days after Chernobyl, looked like. Included are kids who are there for the first time, excited black students of biochemistry, history's criminals such as Marx, Engels, Lenin, and Gottwald, as well as the glorious Czechoslovak leaders of the mid-1980s. At the beginning of the march, you might have met the workers from the technological Tesla factory – some things aren't changing at all. I remember that such parades looked rather high-tech to me, at least in Pilsen, but when I watch this video, it is embarrassingly low-tech.<br /><br /><em>Luboš Motl</em><br /><br /><hr><br /><b>String theorists approach the status of heliocentric heretics</b> (April 28, 2019)<br /><br /><a href="https://en.wikipedia.org/wiki/Galileo_affair">Galileo Galilei was legally harassed</a> between 1610 and 1633. Most of us agree that the Inquisition was composed of dogmatists who were suppressing science. Some of them were rather smart but they were still dogmatists. However, what would be wrong to imagine is that Galileo was tortured in a dungeon. 
<br /><br /><a href="https://artuk.org/discover/artworks/milton-visiting-galileo-when-a-prisoner-of-the-inquisition-125949" rel="nofollow"><img src="https://d3d00swyhr67nd.cloudfront.net/w1200h1200/CDN/CDN_WELL_L_51761.jpg" width="407"></a><br /><br />Instead, this is how Solomon Alexander Hart (1806-1881) saw Milton's visit to Galileo when the latter was imprisoned. Galileo lived in a pretty fancy prison, right? He had what he needed to keep on thinking. You may compare Galileo's fancy spaces to <a href="https://www.google.com/search?q=witten+office&um=1&ie=UTF-8&hl=en&tbm=isch&source=og&sa=N&tab=wi&biw=1317&bih=708" rel="nofollow">Edward Witten's modest, prison-like office</a> or, if your stomach is strong, to <a href="https://www.google.com/search?q=guth+office&um=1&ie=UTF-8&hl=en&tbm=isch&source=og&sa=N&tab=wi&biw=1317&bih=708">Alan Guth's office</a>, voted the messiest office in the Solar System. ;-)<a name='more'></a><br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:block; text-align:center;" data-ad-layout="in-article" data-ad-format="fluid" data-ad-client="ca-pub-8768832575723394" data-ad-slot="4218709518"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />Hasn't the Catholic Inquisition provided Galileo with a kind of luxury that Guth can't dream about? (Sorry, Alan, I have abused the fact that no one has access to my rooms LOL.)<br /><br />OK, Galileo wasn't murdered by those intellectually inferior Catholic apparatchiks. Even his personal comfort wasn't really reduced. 
He was really "just" prevented from enjoying the freedom to interact with mankind and to publish anything he wanted, from fully and directly influencing the intellectual world – something a man of Galileo's caliber had deserved and something that would have been beneficial for mankind.<br /><br />These days, it's happening to conservative philosophers and also to thinkers who study ideas more deeply than the masses indoctrinated by embarrassing antiscientific superstitions such as the climate change panic, the psychological equality of men and women, and similar nonsense which may be classified as overwhelmingly far leftist these days.<br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:inline-block;width:336px;height:280px" data-ad-client="ca-pub-8768832575723394" data-ad-slot="0363397257"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />I am convinced that the number of young people who want to do very high-brow things – like string theory research – has dropped sharply in the most recent decade. I still try to follow who these people are. But as recently as two decades ago, the identity of these smartest people on Earth would be a matter of exciting debates. Who is the new young Susskind, Witten, or Schwinger? 
These days, I don't want to mention the names of the smartest theoretical physicists below 30 or stuff like that because I feel that the very publicity would hurt them.<br /><br />These ingenious people have to hide from the public eye because the mass culture of 2019 prefers mediocrity, mindless obedience, laziness, and superficial spitting on all the essential structures and mechanisms in Nature and the society (Greta Thunberg is quite a symbol for many pathologies of the present) and these people don't fit into that picture.<br /><br />Under the most recent post "Falsifiability and physics" (promoting the <a href="https://motls.blogspot.com/2019/04/popper-self-described-anti-dogmatist.html?m=1">dogmatist and fundamentally flawed Popperist memes</a>), an <a href="http://www.physics.rutgers.edu/~lath/">experimental (and therefore impartial) particle physicist</a> from Rutgers, my Graduate Alma Mater, has pointed out that the students planning to learn and do string theory are the cream:<br /><blockquote><b>Amitabh Lath</b>: the longevity of string theory is not due to the middle-aged practitioners you mention but kids in their early 20s who continue to choose to go into the field. Some of the best undergraduate students in our high energy experiment group have over the years chosen to go to grad school in theoretical physics 🙁 <br /><br />Some go into phenomenology but some are indeed doing string theory. <br /><br />These students are the smartest and most sensible I have ever met, the cream of the Garden State [New Jersey]. They devour the literature, they are fully aware of the arguments on all sides. I cannot in any seriousness entertain the idea that they are led astray by hyperbole. 
I believe all the arguments about string theory not having made any progress in decades, not producing any testable results, being stuck in a made-up universe nothing like our own reality; these are not deterrents but attractions for this type of student.<hr><br />I understand your point but the decisions made by these top-tier students do much more to sway these “people who might have something to say about whether string theory research gets supported” than some national lab’s public outreach ‘zine.<br /><br />Every grad program wants these students: sky-high physics GRE, letters dripping with superlatives, transcripts with half a dozen graduate-level courses completed as an undergrad. They are courted with fellowships and awards. Their eagerness to join the field is seen as proof of vibrancy. If a big-name string theorist leaves your department and the acceptance rate for these blue-chips drops, you know the search committee will form quickly.<br /></blockquote>It's natural that this is how it works. A young person who has the ability to master these cutting-edge questions in physics has a significant probability of <em>exploiting the ability</em> and actually trying to move the cutting edge a little bit further. This is an instinct. An instinct that starts with curiosity. When they have an intellectual weapon in their skull, they're rather likely to realize it and they don't want the weapon to be wasted. Finding important things in physics is, in many ways, more exciting than sex. But in many ways, these two instincts are analogous. 
The men who have a very potent weapon in between their limbs also want to exploit it in many cases.<br /><br /><iframe width="407" height="277" src="https://www.youtube.com/embed/d5TUFF0G79w" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe><br /><br /><em><a href="https://motls.blogspot.com/2019/04/alessandros-essay-in-quillette.html?m=1">The Czecho-Slovak Easter Monday Whip</a> is over. On April 30th, be ready for another nice tradition, the <a href="https://en.wikipedia.org/wiki/Walpurgis_Night#Czech_Republic" rel="nofollow">burning of the witches</a>. Prepare every witch – basically every obnoxious woman or a female thief who has a visually understandable problem with beauty, even if she claims not to be a witch – pour some petrol into her glass to improve her mood, and burn her on Tuesday.</em><br /><br />Moving the cutting edge of physics a little bit (or a big step) forward isn't easy, as all these people know. If one succeeds, he or she unavoidably establishes that the likes of Edward Witten, Andy Strominger, Leonard Susskind, and/or others have been doing something silly. Or that they overlooked some known insights that were relevant elsewhere. To have a chance to establish these far-reaching insights and change the <em>status quo</em>, one has to be smart <em>and</em> hard-working, as these young people know.<br /><br />It seems that the critics of science don't even get this simple point – that one needs intelligence and hard work to move theoretical physics forward.<br /><blockquote><b>A vanilla science critic:</b> ...but if you are writing about a controversy, aren’t you supposed to contact people on both sides?<br /></blockquote>What the author of these comments apparently fails to get is that science isn't a subset of journalism and scientists aren't assistants to journalists. 
You may call some people's irrational hostility towards a theory that they haven't mastered a "controversy". Be my guest. There is a "controversy" about string theory – there is a "controversy" about everything else, too. It's still much more accurate to call it a "difference between experts and ignorant yet self-confident simpletons". A journalist may cover the "controversy" but by doing so, he doesn't contribute to the science itself, and the audience expected to consume stories about these "controversies" can't be scientists, either, because the scientists know that the opinions of those who just haven't mastered the subject that they scream about are 100% worthless.<br /><br />In particular, the "journalistic" critique above was about the Symmetry Magazine. But the <a href="https://motls.blogspot.com/2019/04/physicists-views-have-been-confined-to.html?m=1">Symmetry Magazine isn't a generic journalists' outlet</a> describing "controversies" between the experts and the laymen. The Symmetry Magazine is an outlet whose purpose is to inform particle physicists and people who feel close to that field about the events in their field – not another outlet for the scientifically illiterate public that wants to read about "controversies" and unreasonably thinks that both sides are always equal. Angry ignorant laymen's rants <em>aren't</em> events in particle physics. The readers who read about such "controversies" are scientifically illiterate and scientifically inconsequential simpletons themselves. People who understand what the scientific method is <em>know</em> that the actual controversies in science are fought with scientific arguments, not just with the screaming of random angry men and their mobs in popular books, the mainstream press, or comment sections of random websites on the Internet.<br /><br />The simple fact that the screaming by these critics is 100% irrelevant for science is sometimes proven to a comical extent. 
Peter Shor of MIT, the guy who invented the famous factoring algorithm for quantum computers, wanted to discuss whether a recent AdS/CFT paper involving quantum error correction was right and consistent with another one – whether a deformation brought the authors outside the error-correcting codes, and whether this fact invalidates the analysis (these are two different questions, a point Shor seems to misunderstand). The host intervened:<br /><blockquote><b>A vanilla critic of science</b>: All, I fear this is the wrong place to debate the issues raised by Peter Shor about CFT and error-correcting codes, partly because the moderator knows nothing about the topic (he would like to someday understand what that’s about, but today is not the day…)<br /></blockquote>You know, this emerging discussion was a part of something broader – they weren't sure whether the conclusions of AdS/CFT ("quantum gravity in a box") were telling us something about our dS Universe, too. But as soon as any <em>actual scientific arguments</em> start to emerge, the host immediately stops the discussion because <em>science isn't allowed there at all</em>. The host even admits that the reason is that he actually knows <em>nothing</em> about the relevant science himself. Not only does the host fail to encourage science (as I do here) – he actively bans it. Only superficial, prejudiced, would-be philosophical slogans are allowed. It's not even wrong. It's not even wrong. Orange man is bad. Orange man is bad. A worthless website run by mindless NPCs. In spite of that complete isolation from any insights about AdS/CFT, the host still loves to make far-reaching claims about AdS/CFT. 
How dumb does a reader have to be to take any of these statements seriously?<br /><br />I think that every person whose IQ is above 80 understands that the relevance of such discussions for cutting-edge theoretical physics is much closer to the relevance of the opinions of the cattle utilized by the McDonald's Corporation than to the relevance of young or old string theorists' opinions. But we are surrounded by mobs that tend to threaten you even if and when you make this self-evident, innocent point. A journalist may write for readers with an IQ below 80 who think that the string theorist's and the critic's opinions about string theory are equally valuable. But journalists aren't and mustn't be <em>obliged</em> to address all their texts to moronic readers!<br /><br />Before the discussion about any detailed issues in AdS/CFT, some participants mentioned the question whether string theorists are actually doing string theory these days:<br /><blockquote><b>A vanilla critic of science</b>: What I see happening now (at least in the US) is that the best students are, as always, going to a small number of the top graduate programs (e.g. Harvard, Princeton, Stanford), where most of the theory faculty often identify tribally as “string theorists”, but are now working on topics in GR/QFT/quantum information, etc. that have nothing to do with quantized strings or with string-theory based unification. The odd thing I keep hearing is that such students arriving at such a grad program are encouraged to spend a lot of time studying actual string theory (e.g. by reading Polchinski’s two volumes) to prepare to start research, even though the research likely won’t use any of this. <br /></blockquote>What's going on here? The smartest undergraduate students are still capable of figuring out which places actually have the best theoretical high-energy physics in the world – and be sure that Harvard, Princeton, and Stanford are <em>at least</em> near the top of the list. 
So they go there and the top physicists over there push them to study string theory.<br /><br />Is that right?<br /><br />Of course it's right. If you are a graduate student who says that your specialization is formal, non-phenomenological theoretical high-energy physics, you simply <em>have to</em> master string theory, which is the state-of-the-art picture of theoretical high-energy physics as of 2019. In fact, string theory was born 51 years ago. It would be ludicrous to say that it's some recent fad or something that theoretical high-energy physicists may ignore in 2019. And if graduate students at Princeton, Harvard, and Stanford were ignoring it, it would be really, really bizarre.<br /><br />In principle, you may make some important contribution to theoretical physics <em>without</em> knowing the state-of-the-art apparatus. You deserve a PhD if you do so. But you don't deserve a PhD just for a <em>chance</em> that it happens. If you haven't made a real breakthrough, you only deserve a theoretical physics PhD if you have mastered tools close enough to the cutting edge of a sub-discipline to give you a reasonable chance to make a breakthrough later. In particular, <em>you should learn the damn string theory</em>.<br /><br />I think that the percentage of non-stringy papers written by string theorists is much higher than two decades ago or even one decade ago. I also believe that the political atmosphere in the broader society is one of the main culprits – probably the main culprit. So people do various things – string theorists may do <em>many things</em>, indeed. I am also convinced that most of the stringy authors of such non-stringy articles realize that their non-stringy research is less profound than the string research they could do a decade or two ago. But it's OK enough for them.<br /><br />The situation is completely analogous to the times of Johannes Kepler and Tycho Brahe. 
They were employed by our glorious and playful leader, Rudolph II, who reigned from Prague (I just watched the hilarious Czech movies The Baker's Emperor and The Emperor's Baker), and they were getting much of their income for <em>astrology</em>. Is it right to criticize these famous astronomers for getting some money from astrology? I don't think so. It wasn't primarily their fault. They preferred to do things that would soon lead to Newton's physics. But the <em>society and the powerful</em> wanted them to do things like horoscopes. And these activities were easy enough for the astronomers because the skills are similar to those of <em>serious astronomy</em> – which helped to make sure that they actually did some astrology. Well, these old physicists and astronomers actually liked astrology to some extent, too. But this positive attitude wasn't a <em>characteristic</em> trait of theirs. They were also products of their epoch.<br /><br />Obviously, what the string theorists do outside string theory is much more scientific than horoscopes but the basic dynamics is the same. What the researchers do <em>is affected</em> by the societal pressures, the pressures from the sponsors etc. And because lots of ignorant activists have pushed the image of string theory to something similar to the heliocentric heresies 4 centuries ago, string theory is also being hidden from the public eye to a similar extent as heliocentrism was 4 centuries ago. It became at least questionable whether you may materially benefit from the stringy results that you produce – even if they are rather important ones. It doesn't mean that there's something non-essential or even wrong about heliocentrism or string theory. It is just a reflection of the irrational beliefs that are prevalent among the laymen in one epoch or another.<br /><br />The top theoretical physicists still have the duty – and internal instincts – to preserve the field. 
So even when the pressures make it likely for the new PhDs to work on something else or to produce horoscopes, it's still essential that the knowledge of string theory doesn't evaporate when a new generation replaces the previous one. A top university simply cannot give a theoretical physics PhD to someone who just solves the average exercises in a textbook of quantum field theory – or who writes diatribes against theoretical physics. If this became normal at such a university, that university would clearly cease to be a top one because <em>almost everybody can do such things</em>. People who succeed as writers of anti-scientific diatribes aren't exceptional because they are <em>exceptionally good</em>. They succeed because they are <em>exceptionally close to the average people</em>.<br /><br />If the last 20 papers by a string theorist were about "non-string theory", does it make sense for him or her to be called "a string theorist"? You bet. If he or she hasn't forgotten the theory, it's still the most accurate description of his or her expertise. A string theorist is someone who has mastered and/or done some research on string theory – which also required him or her to become a good enough expert in quantum field theory (and all of its prerequisites; and it's likely that an average "string theorist" understands QFT better than an average "quantum field theorist" does), some algebraic geometry, general relativity, quantum information, and more. These folks first needed to master the prerequisites and <em>then</em> they could jump to string theory, which added some expertise that is equivalent to a few more years of studying.<br /><br />The reason why such people – regardless of the detailed content of their recent papers – call themselves "string theorists" and not e.g. "quantum field theorists" is exactly the same as the reason why a person with both a bachelor's and a doctoral degree prefers to call herself a "doctor": it's simply the superior degree! 
Being a string theorist <em>does incorporate</em> being a quantum field theorist and other things. So why would a string theorist call himself a quantum field theorist? Why would a doctor call himself a bachelor? Why would Kepler call himself an astrologer (as the primary job description) if he were also, and primarily, an astronomer?<br /><br />During Kepler's times, certain people wanted to turn astronomy, especially the heliocentric astronomy, into a heresy. Some of them might have preferred the masses to imagine that being an astrologer was more important. But the actual <em>experts</em> knew it wasn't the case. They <em>already knew</em> that astronomy was more important than astrology. It was more scientific. It also required more time and hard work to be studied and researched. The smartest folks actually knew that the astronomers (and the consumers of astronomy) were smarter on average than the astrologers (and the consumers of horoscopes in particular). This knowledge has always affected what they emphasized while talking to each other.<br /><br />Completely analogously, some people want to mislead the masses and hide the simple basic fact that e.g. the <em>string theory graduate students are generally smarter and more advanced</em> than the average graduate students who have learned quantum field theory at a decent level. And this misinformation of the masses may work. But by definition, it doesn't affect the genuine experts who have actually studied these things and who interact with string theorists as well as the people in adjacent fields. Those still <em>know the truth</em>.<br /><br />Most of the critics of string theory <em>know</em> that they're simply lying 24 hours a day, 7 days a week. And I have doubts that they get some real psychological relief from the deception. Why? 
Because they still know that the people who buy this ridiculous garbage – such as the claim that it's just OK for an intelligent theoretical physicist to dismiss string theory – are just easily manipulated dimwits. They still know that these dimwits' opinions don't matter for the science itself. One can manipulate hundreds of easy-to-manipulate dimwits but it still doesn't change the underlying truths. Is it more psychologically pleasing when some uncritical readers repeat some slogans? Does it make the "trainer" more psychologically satisfied than when he trains a parrot – an actual bird – to repeat a sentence? The parrots' achievements look more remarkable to me because the birds are punching above their weight. The average human's repetition of average stupid slogans is business as usual.<br /><br />I find it staggering how completely these critics of science misunderstand what science is and how it works:<br /><blockquote><b>A vanilla critic of science:</b> What’s disturbing to me is that, increasingly, the string unification/SUSY research program seems to have moved from “evaluate us by LHC results or progress on these crucial problems that are in between us and a testable theory” to “there is no way to evaluate us, you just have to believe us, because there are so many of us and we’re so smart.” That’s not the way science is supposed to work, for good reason.<br /></blockquote>The number of string theorists isn't large (as in "many of us"). The currently active professionals number around 1,000 – a big part of the intellectual cream of mankind. But the claim that "the layman has no way to evaluate them" is self-evidently true and it was always true. A non-expert who hasn't mastered even the basic chapters of a textbook about a given field obviously cannot evaluate – and could never have evaluated – the statements about the field. Why would someone doubt this self-evident fact? 
Only if one actually works and becomes an expert herself, to one extent or another, can she start to (meaningfully) evaluate the statements about the field.<br /><br />In theoretical high-energy physics, the results from the LHC influence the physicists' beliefs about many questions (but surely not all the questions that these physicists investigate), but you still need expertise to figure out what the LHC collisions actually imply for the validity of various big statements about particle physics. The layman just doesn't know and can't know how to deduce some truths about deep questions from the LHC collisions. For example, a layman just cannot have a reasonably justified opinion on whether the Standard Model or the MSSM is more likely at this point, after some 160/fb of data collected by the major LHC detectors (and after many theoretical advances). Everyone who has been persuaded that this is possible without real expertise in theoretical physics has been deceived.<br /><br />The non-expert may choose to believe or not believe, it's his psychological dilemma, but whatever he chooses doesn't <em>affect</em> the scientific truth in the field – in this case string theory. A rational non-expert who doesn't really understand anything about the theory at the technical level should primarily realize that <em>he doesn't know</em> what the truth is. To some extent, even the experts <em>do not know</em> the answers to many questions, even the very important ones. The ability to live in a state of ignorance is one of the first conditions for the scientific attitude to the world. A person who just "needs" to pick some answers, even if they have at least a 50% probability of being wrong, just isn't approaching the truth in the scientists' way.<br /><br />The very act of choosing to "believe" – or "not believe" – is an irrational move. And "not believing" is obviously as irrational as believing! 
Well, it's a bit more irrational because even a layman should be able to figure out (a sociological argument) that the critics are less informed and less intelligent and therefore less likely to be right than the experts selected for their intelligence.<br /><br />And the actual dynamics of the funding and support of pure science in a healthy society <em>should work</em> exactly in the way that the vanilla critic of science tries to mock: the society <em>should</em> give some support and funding to the <em>smartest yet curious people to do the research wherever it seems to lead</em>, whatever the generic members of the society think about the direction in advance. This really <em>is</em> the cornerstone of science or any honest research (or police investigation). You just follow the evidence wherever it leads. And it's only the smart yet hard-working people who can meaningfully work with the evidence who have a high enough chance to move theoretical physics forward.<br /><br />The researchers must have the freedom to do their pure scientific research as they see fit. And the people who are allowed (and given the material backing) to do this kind of job should be chosen meritocratically – as the most intelligent and those who have mastered the "previous" picture of physics better than others – and not according to the proximity of their opinions and beliefs to the opinions of the masses! The other, critic's approach would liquidate the science and it would turn the ex-scientists into corrupt defenders of the sponsors' or the public's prejudices.<br /><br />So yes, please. 
When these dark ages are over and almost all people realize once again why the ideology and methods of the critics have been medieval and pathological, the nations will support at least a few thousand of the smartest yet curious people to do pure scientific research according to <em>their own judgement and their own evaluation of the evidence as they see it</em> – not according to the judgement and prejudices of the society – because this picture where "only the scientific evidence matters" is how science differs from the irrational and oppressive enforcement of orthodoxies for the masses!<br /><br />And that's the memo.<br /><br /><em>Luboš Motl</em><br /><br /><hr><br /><b>Four interesting papers</b> (April 25, 2019)<br /><br />On hep-ph, I choose a paper by <a href="https://arxiv.org/abs/1904.10809">Benedetti, Li, Maxin, Nanopoulos</a> about a natural, D-braneworld-inspired (supersymmetric) model. The word "inspired" means that they believe that similar models (effective QFTs) arise from a precise string theory analysis of some compactifications but, as phenomenologists, they don't want to do the string stuff precisely. ;-) It belongs to these authors' favorite flipped \(SU(5)\) or \({\mathcal F}\)-\(SU(5)\) class of models and involves new fields, flippons, near \(1\TeV\). The LSP is Higgsino-like and the viable parameter space is rather compact and nice. 
The model seems natural – one may get the fine-tuning measure \(\Delta_{EW}\) below \(100\).<br /><br />It's an example showing that with some cleverness, natural models are still viable – and Nature is almost certainly more clever than the humans.<a name='more'></a><br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:block; text-align:center;" data-ad-layout="in-article" data-ad-format="fluid" data-ad-client="ca-pub-8768832575723394" data-ad-slot="4218709518"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />On hep-th, I start with <a href="https://arxiv.org/abs/1904.10577">Shirai-Yamazaki</a> who point out an interesting tension between two types of ideas, both of which arise from Cumrun Vafa's swampland program. One of them is a scalar-force generalization of our weak gravity conjecture; the other is the more recent Vafa-et-al. de Sitter swampland conjecture.<br /><br />The tension has a simple origin: the (scalar-force-type) weak gravity conjectures tend to prohibit light scalars like quintessence because those would yield "weaker-than-gravity forces mediated by scalars". On the other hand, as you know, the de Sitter swampland conjecture wants to outlaw inflation and replace it with quintessence, so it needs quintessence. Whether there is a real contradiction depends on the precise formulation of both conjectures, especially the scalar-force weak gravity conjecture, they conclude.<br /><br />It's very interesting. We're apparently still not certain about the precise form of these inequality-like principles. There are contexts in which I am even uncertain about the direction of the inequalities. 
So the vagueness may be as bad as saying that "something new happens when some ratio of parameters approaches some value from either side".<br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:inline-block;width:336px;height:280px" data-ad-client="ca-pub-8768832575723394" data-ad-slot="0363397257"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />Finally, some cool research on scattering theory involving new or understudied asymptotic states. There are two new hep-th papers, by <a href="https://arxiv.org/abs/1904.10831">Pate, Raclariu, and Strominger</a>; and by <a href="https://arxiv.org/abs/1904.10940">Nandan, Schreiber, Volovich, Zlotnikov</a>. They have some overlap – the authors know each other (Volovich was a student of Strominger's; yes, Strominger's current two co-authors are nice young ladies, too) so they probably knew about their shared interests in advance.<br /><br />Normally, we study scattering with asymptotic states which are particles with nicely well-defined momentum vectors \(p^\mu\). Those are eigenstates under spacetime translations. However, these papers study "celestial" scattering amplitudes of asymptotic states that are eigenstates of the boosts, basically of the Lorentz group \(SO(3,1)\). These asymptotic states should be eigenstates with weights \((h,\bar h)\) under the \(SL(2)\times SL(2)\) complexified version of the Lorentz algebra. OK, you want eigenstates of \(j_L^2,j_R^2,j_{L,3},j_{R,3}\) in the complexified \(SL(2,\CC)\times SL(2,\CC)\).<br /><br />One of the papers – or both – studies a new kind of soft theorems, designed for these "celestial" states instead of the momentum eigenstates. Pate et al. say these new theorems cannot be derived from the low-energy effective action. 
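<br /><br />If I may spell out the standard construction from this literature (my gloss, not a detail from the papers above): the change of basis from momentum eigenstates to boost eigenstates is a Mellin transform in each external energy,

\[ \tilde{\mathcal A}\left(\Delta_j, z_j,\bar z_j\right) = \prod_j \int_0^\infty d\omega_j\, \omega_j^{\Delta_j-1}\, {\mathcal A}\left(\omega_j\, \hat q(z_j,\bar z_j)\right), \]

where \(\hat q(z,\bar z)\) is a null direction labeled by a point \((z,\bar z)\) on the celestial sphere and the \(SL(2,\CC)\) weights of the \(j\)-th state are \(h_j=(\Delta_j+J_j)/2\) and \(\bar h_j=(\Delta_j-J_j)/2\), with \(J_j\) the helicity.<br /><br />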
It seems rather incredible or extraordinary because they seem to work just with another "differently singular" basis of the states – all their laws should be just some rearrangement of the usual ones. OK, but we are told that they are not. These new claims about soft theorems etc. spiritually agree with Strominger-and-collaborators' recent claims that the information about the infalling matter may be stored in rather "classical" properties (or hair) of the black hole, and similar stuff.<br /><br />Nandan et al. analyze the "celestial" scattering and try to reproduce the counterparts of the constructions we normally perform with momentum-based scattering: a new, "celestial" partial wave decomposition, crossing symmetry, and the optical theorem. They also study some soft limits.<br /><br />The momentum scattering states are the gold standard and I believe they will remain dominant in the literature in coming years. But these "celestial" states and symmetries – and all the laws in the new bases – are arguably "comparably" fundamental. To focus on the momentum eigenstates and not the "celestial", Lorentz group eigenstates means to be biased in a certain way and the authors of both of these papers are trying to remove the bias from the literature and fill the gaps.<br /><br />The claim that these new states – with some singular support near the light-cone or high-energy particles – allow us to derive something that was previously unknown is extremely provocative or exciting. It should be possible according to some intuition, like mine, but the intuition isn't backed by any real evidence. 
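For the reader who wants the dictionary: in the standard conventions of the celestial amplitudes literature (this is the Mellin-transform construction of Pasterski et al., not a new result of the two papers above), the boost eigenstates are obtained from the momentum eigenstates by trading each frequency \(\omega_i\) for a conformal dimension \(\Delta_i\),\[

\tilde{\mathcal A}(\Delta_i, z_i,\bar z_i) = \prod_i \int_0^\infty d\omega_i\, \omega_i^{\Delta_i-1}\, {\mathcal A}(\omega_i, z_i,\bar z_i),

\] where \(z_i,\bar z_i\) label the points where the particles pierce the celestial sphere and the weights are \(h=(\Delta+J)/2\), \(\bar h=(\Delta-J)/2\).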
Similar patterns obtained from the "celestial" viewpoint could be relevant for deep questions in physics as well, including the clarification of the twistor/amplituhedron forms of some amplitudes, the information loss problem, and naturalness, I vaguely think.Luboš Motlhttp://www.blogger.com/profile/17487263983247488359noreply@blogger.com0tag:blogger.com,1999:blog-8666091.post-54331821131527249632019-04-07T09:21:00.000+02:002019-04-07T09:57:55.564+02:00Physics knows a lot about the electron beyond the simple "Standard Model picture"Ethan Siegel wrote a text about the electron, <a href="https://www.forbes.com/sites/startswithabang/2019/04/06/ask-ethan-what-is-an-electron/">Ask Ethan: What Is An Electron?</a>, which includes some fair yet simplified standard conceptual facts about the electron's being a particle and a wave, about its properties being statistically predictable, and about the sharp values of its "quantum numbers", some discrete "charges" that are either exactly or approximately conserved in various interactions.<br /><br /><span class="noborimg"><img src="https://upload.wikimedia.org/wikipedia/commons/thumb/0/00/Standard_Model_of_Elementary_Particles.svg/400px-Standard_Model_of_Elementary_Particles.svg.png"></span><br /><br />While his statements look overwhelmingly right, there is a general theme that I expected to bother me and that bothers me: Siegel presents a frozen caricature of the particle physicists' knowledge that could be considered "a popularization of the snapshot from 1972 or so". There doesn't seem to be any added value of his text relative to e.g. the Wikipedia article on the <a href="https://en.wikipedia.org/wiki/Standard_Model">Standard Model</a>. 
After all, the images such as the list of particles above were just taken from that article.<a name='more'></a><br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:block; text-align:center;" data-ad-layout="in-article" data-ad-format="fluid" data-ad-client="ca-pub-8768832575723394" data-ad-slot="4218709518"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />The final two sentences of Siegel's article suggest that he realizes the "work in progress" character of science – and particle physics is what matters here – and he supports the research:<br /><blockquote>...Why it works the way it does is still an open question that we have no satisfactory answer for. All we can do is continue to investigate, and work towards a more fundamental answer.<br /></blockquote>The only problem is that he talks the talk but doesn't walk the walk. This text he wrote – and basically everything he wrote about particle physics – indicates that he has displayed no interest whatsoever in learning something new about the electron that would go e.g. beyond the elementary undergraduate freshman factoids from his article.<br /><br />The same can be said about virtually all popular writers about particle physics these days. 
They either try to sell pseudoscientific theories that were created from scratch and that have nothing to do with the accumulated knowledge of physics – such theories are really not capable of reproducing the Standard Model predictions and they are usually very far from passing this test.<br /><br />Or they avoid such theories but then they avoid <em>all research</em> altogether and spread the misconception that the frozen high school caricature of particle physics from 1972 will remain a sufficient "summary of physics" forever.<br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:inline-block;width:336px;height:280px" data-ad-client="ca-pub-8768832575723394" data-ad-slot="0363397257"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />But the scientific truth and the scientific approach are something completely different – in some sense, they are in between the extreme approaches by the popular writers. Scientific research does take all the established knowledge into account – as compressed e.g. into the Standard Model, an effective theory – but it just didn't freeze when the last terms of the Standard Model Lagrangian were written down.<br /><br />I think that he was actually asked – pretty much explicitly – about some things that transcend the simplistic picture he has summarized, but his answer included nothing to address that question whatsoever. OK, first, Siegel hasn't really explained any <em>conceptual</em> ideas relevant for the answer, not even at the level of the Standard Model. Note that the original question read:<br /><blockquote>Please will you describe the electron... explaining what it is, and why it moves the way it does when it interacts with a positron. If you'd also like to explain why it moves the way that it does in an electric field, a magnetic field, and a gravitational field, that would be nice. 
An explanation of charge would be nice too, and an explanation of why the electron has mass.<br /></blockquote>OK, Siegel responded with the tables of elementary particles and lists of quantum numbers. But I think that those really don't explain any of these matters – why the electron moves the way it moves in the electromagnetic field (and analogously gravitational field), why it may annihilate with the positron, what gives the electron the mass (the Higgs mechanism and some Yukawa couplings, and those may be approximations of something more fundamental) and more.<br /><br />Even at the level of field theory, there is a lot of <em>conceptual stuff</em> to say. OK, all the electromagnetic interactions of the electron boil down to the term in the action\[<br /><br />S_{\rm int} = \int d^4 x\, j^\mu A_\mu = e\int d^4 x\, \bar \Psi A_\mu\gamma^\mu \Psi.<br /><br />\] Add your standard coefficients or signs or \(2\pi\) factors if you hate my schematic picture. OK, there's apparently some electromagnetic 4-potential \(A_\mu\) that interacts with the current – the density of the electric charge and the vector-valued density of the flux of that charge. And this term increases or decreases the particles' energy in the electromagnetic potential, bends the paths in magnetic fields, and more.<br /><br />The existence of the \(A_\mu\) electromagnetic gauge field may be derived from the \(U(1)\) gauge symmetry, by requiring that the apparent freedom to change the phase of the field \(\Psi\) is extended to become the independent freedom at each spacetime point. 
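In formulas (schematic ones again, with my conventions): promoting the global phase rotation \(\Psi\to e^{i\alpha}\Psi\) to a local one, \(\alpha=\alpha(x)\), forces you to replace the ordinary derivative by the covariant one,\[

D_\mu \Psi = \left(\partial_\mu - ieA_\mu\right)\Psi, \qquad A_\mu \to A_\mu + \frac{1}{e}\partial_\mu\alpha(x),

\] so that \(D_\mu\Psi\) transforms with the same phase as \(\Psi\) itself and the Lagrangian stays invariant.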
Once you accept that the field \(A_\mu\) is needed for that and you have fields like \(\Psi\) for the electron and \(A_\mu\) for the electromagnetic field, the structure of the terms in the Lagrangian is mostly dictated by "nonzero physical content", "consistency", "Lorentz covariance", and "gauge symmetry".<br /><br /><img src="https://www.researchgate.net/profile/Samuel_Marchetti/publication/319552399/figure/fig4/AS:668802924679180@1536466451817/Feynman-Diagram-of-electron-positron-annihilation-showing-how-the-collision-produces-two.ppm" width=407><br /><br />Once \(\Psi,A_\mu\) are understood to be quantum fields i.e. operators, all the probabilities of all conceivable outcomes may be calculated perturbatively (at least if a good enough approximation suffices for you) from expressions such as the annihilation Feynman diagram above. There is quite an interesting question for the beginning: "Why can the point-like electron and the point-like positron exactly hit each other at all?"<br /><br />If you imagine that they're like point-like planets, the probability that their initial velocities exactly lead them to a head-on collision is infinitesimal i.e. zero. And with a high enough energy, it should also be impossible that they lose enough energy to spiral and fall on one another. In classical physics, the annihilation of point-like particles would be impossible.<br /><br />But quantum mechanics – its subtype called quantum field theory in this case – saves the day. Quantum mechanics calculates probabilities. They're calculated as squared absolute values of probability amplitudes. And probability amplitudes of quantum field theories involve terms (something that is added with other terms by the plus signs) that are integrals over histories. 
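Symbolically (again a schematic sketch with the factors suppressed), the amplitude is the Feynman path integral over all field configurations,\[

{\mathcal A} \sim \int {\mathcal D}\Psi\, {\mathcal D}\bar\Psi\, {\mathcal D}A_\mu\; e^{iS[\Psi,\bar\Psi,A]},

\] in which every conceivable intermediate history contributes a phase.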
Alternatively, they may be rearranged as Feynman diagrams which are integrals over the spacetime points where the \(\bar\Psi \cdot A\cdot \Psi\) interaction took place.<br /><br />Because quantum mechanics allows all intermediate histories, it also allows the intermediate history in which the electron and positron precisely collide – their position is the same at some point (which is reflected by the continuous electron-positron line in the diagram). This single "infinitely unlikely point" of the history space (which is located at the cubic vertex or vertices of the Feynman diagram) still contributes a finite amount because some terms are weighted by a delta-function (whose integral over the space is one). So just because the precise encounter of the electron and the positron is a <em>possibility</em>, quantum mechanics guarantees that it changes the final probabilities of the elastic collisions, inelastic collisions, or annihilation by a finite amount!<br /><br />In classical physics, just the existence of an infinitely unlikely <em>possible intermediate history</em> couldn't change the predictions. Classical physicists (or critics of quantum mechanics) could <em>passionately argue</em> that it's impossible for the predictions to change due to infinitely fine-tuned <em>possibilities</em>. But quantum mechanics disagrees and says that the predictions of the probabilities of virtually <em>any outcome</em> are unavoidably affected if there exist some extra <em>potential</em> intermediate histories.<br /><br />You know, I am sure that this is the kind of <em>conceptually important wisdom</em> that should be explained by popular articles about similar questions – exactly because the uninformed beginner is likely to make wrong assumptions and incorrect guesses. And because these snippets are great examples showing how some general principles of quantum mechanics (e.g. 
summation over histories) qualitatively affect particular processes such as the annihilation (they make it possible). But it's never done – perhaps because none of the popular writers actually understands any of these things.<br /><br />Obviously, the answer to the question posed to Ethan may be a more or less detailed summary of a quantum field theory course. It makes no sense to try to compress a whole quantum field theory course into this blog post or any other single blog post. But I think that even the point "to understand these things properly, you need to learn quantum field theory well" is simply not being communicated. These laymen clearly maintain some kind of belief that they may circumvent all the difficult stuff of QFT (and maybe even circumvent all of quantum mechanics) while properly understanding the answers to all questions about Quantum Electrodynamics. But it's simply not the case.<br /><br />But the laymen – and maybe even young prospective physicists – are much more misdirected when it comes to any "thinking beyond the old-fashioned Standard Model machinery and pictures". They're deliberately indoctrinated by the completely wrong meme that nothing has changed about the physicists' knowledge or thinking about these matters since 1972, a year I semi-randomly picked. Near the beginning, Siegel wrote:<br /><blockquote>They [electrons] were the first fundamental particles discovered, and over 100 years later, we still know of no way to split electrons apart. <br /></blockquote>Viewed from an appropriate angle, this is a perfectly valid statement. And I have almost certainly made an almost identical statement many times, too. The electron still looks like an elementary particle, despite the more than 100 years since its discovery. 
In this sense, the electron differs from molecules, atoms, nuclei, protons, and neutrons that have been shown to have some particular internal architecture.<br /><br />However, between the lines and in the broader context, you may see that Siegel and others are conveying something stronger that actually isn't right. They want the reader to think that nothing has changed about our view that "the electron could be or should be a structureless point-like particle" since 1919 or since 1972. That is simply untrue.<br /><br />After 1972, physics has understood lots of deep things that <em>have</em> changed the physicists' understanding of all these questions. It hasn't looked helpful to teach these things to the laymen or high school students, so the laymen and the high school students just remain largely ignorant about them and keep the naive Wikipedia-style or high-school-level pictures as frozen in 1972.<br /><br />But that doesn't mean that the <em>scientists' opinions and expectations</em> haven't changed since 1972. They have and the popular writers who try to deny these transformations are simply creating an abyss between their readers and the actual scientific research.<br /><br />If we focus on the "internal architecture of the electron", we may say that the main changes since 1972 took place in two realms:<br /><ul><li>renormalization group – and more generally, a deeper, non-perturbative etc. understanding of quantum field theories, what they are, what their limits are, where the parameters come from, and which of them are natural etc.</li><li>string theory – and more generally our understanding of particular mechanisms beyond quantum field theory that clarify in what sense the Standard Model is generated or may be generated as an approximate description of a more fundamental theory</li></ul>OK, Siegel – and most other popular writers – love to deny and hide both. 
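To attach at least one equation to the first item: the renormalization group says that every coupling \(g\) depends on the resolution scale \(\mu\) at which you probe the physics,\[

\mu\frac{dg}{d\mu} = \beta(g),

\] so the low-energy values of the couplings, including the electron's charge, are outputs of this flow rather than God-given constants.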
Last week, I discussed some <a href="https://motls.blogspot.com/2019/03/its-irrational-to-both-worship-and.html?m=1">emotional ideas of Eric Weinstein</a>. One of his proposals was that the task for theoretical physicists should be to popularize the renormalization group, a gem they are sitting on, to spread it to other fields, and to turn it into general knowledge.<br /><br />I have explained that my side is even losing the battle to preserve \(x,y\) at the Czech elementary schools, let alone seeds of the renormalization group – and I have argued that top physicists simply shouldn't be understood as teachers or communicators because they're ultimately doing a much more selective and special kind of work. But articles such as Siegel's Forbes column are the perfect examples of the venues in which the renormalization group thinking should be promoted. And the question about the internal architecture of the electron was a perfect opportunity to make a small step for a man but a larger step for mankind in this direction. Whether because Siegel doesn't know the renormalization group or doesn't like it, he hasn't used the opportunity at all. It's not just him – it's most of the mass writers. The underlying reason might be that the writers simply do not have a deeper conceptual understanding of the issues than the readers – so the writers just don't have anything substantial to teach to the readers, aside from some boring tables.<br /><br />So the readers are expected to keep their belief that nothing has changed since 1972 and the electron is just point-like. 
The end of the story.<br /><br />In real physics, while we don't have any promising quantum field theories in which the electron is composite, we do understand that the electron could <em>very well be</em> composite – it's a possibility that we must simply always consider viable because we know that seemingly simple theories such as Quantum Electrodynamics with its simple point-like electrons generally arise as long-distance limits of different (sometimes simpler, sometimes more complex) theories with different degrees of freedom.<br /><br />Even if "the electron" stayed point-like up to the Planck scale, the field that produces the electron is "renormalized" between the high-energy regime, which is close to the fundamental laws of Nature, and the low-energy regime that suffices for a good enough description of low-energy phenomena involving the electron.<br /><br />In fact, the electroweak theory that <em>has</em> already been established forces you to isolate the electron's spinor fields from some doublets (in which the neutrino fields start as indistinguishable fields at high energies, before the Higgs has a vev) – and also diagonalize some mass matrix that involves three generations of charged (and neutral) leptons. We also know that the fine-structure constant \(\alpha\approx 1/137.036\) isn't really fundamental. First of all, it's a function obtained from two electroweak couplings that mix and these electroweak fine-structure constants are more fundamental than the electromagnetic one; and all these constants "run" with the energy scale, while the high-energy values of the couplings are different from the notorious \(1/137\) yet more fundamental.<br /><br />My point is that Siegel deliberately tries to enforce the readers' belief that <em>there is nothing conceptual about the electron that goes beyond Quantum Electrodynamics</em>. 
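To make the "running" of \(\alpha\) mentioned above tangible, here is a leading-log sketch in Python. It keeps only the electron loop, a crude truncation I chose for illustration; including all the charged fermions of the Standard Model is what pushes the value close to the measured \(1/128\) near the Z mass:

```python
import math

ALPHA_0 = 1 / 137.036   # measured fine-structure constant at low energies
M_E = 0.000511          # electron mass in GeV
M_Z = 91.19             # Z boson mass in GeV

def alpha_qed(Q, alpha0=ALPHA_0, m=M_E):
    """One-loop (leading-log) QED running with only the electron loop.

    A schematic illustration: the full Standard Model running includes
    all charged fermions and brings alpha(M_Z) close to 1/128.
    """
    return alpha0 / (1 - (2 * alpha0 / (3 * math.pi)) * math.log(Q / m))

# The effective charge grows with energy: 1/alpha drops from ~137
# at low energies to roughly 134.5 at the Z mass in this truncation.
print(1 / alpha_qed(M_Z))
```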
Even the fermion mass matrices of the Standard Model and the doublets – some aspects of the electron beyond Quantum Electrodynamics – are being deliberately obfuscated while the beyond-the-Standard-Model thinking is being hidden altogether.<br /><br />This brings me to the second class of "hidden secrets". Siegel doesn't want the readers to embrace any insights or arguments from the renormalization group era – which also began in the mid 1970s or so. But he also wants to hide all the actual known types of "more fundamental theories" that produce the Standard Model as an approximation.<br /><br />They include grand unified theories, supersymmetric theories, and – most ambitiously and rigorously – compactifications of string theory. While no example has been picked as the <em>single, provably right</em> theory beyond the Standard Model, they have provided us with many working <em>proofs of the concept</em> that theories compatible with all the known observations exist where the Standard Model terms are not the "end of the explanatory story".<br /><br />In particular, the electric charge (of the electron, the lightest charged particle) may emerge as the quantized Kaluza-Klein momentum \(p_5=Q/R\), \(Q\in\ZZ\) of a particle moving in a higher-dimensional Universe with a circular dimension \(x^5\). The electric charge may also arise as the winding number of a string, or a wrapping number of a membrane, counting how many times the string or brane is wound around a non-contractible circle or a non-contractible higher-dimensional submanifold of the manifold of extra dimensions.<br /><br />Also, the electric charge may be reduced to a topological invariant in some field configurations – e.g. those involving tachyons within higher-dimensional annihilating D-branes, along the lines initiated by Ashoke Sen. We could find a few more interesting effects in which "the electric charge has some deeper explanation or geometrization" within string theory. 
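The Kaluza-Klein quantization of the charge mentioned above is easy to display explicitly. A field living in a spacetime with a circular fifth dimension of radius \(R\) must be single-valued, so its Fourier expansion reads\[

\Phi(x^\mu, x^5) = \sum_{Q\in\ZZ} \phi_Q(x^\mu)\, e^{iQx^5/R},

\] and the mode \(\phi_Q\) carries the quantized momentum \(p_5=Q/R\) which behaves exactly like the electric charge \(Q\) from the four-dimensional viewpoint.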
We could divide the stringy compactifications – heterotic, type I, type IIA brane worlds, M-theory with boundaries, M-theory on \(G_2\) manifolds, and F-theory of several subfamilies – into groups. In each group, some particular "geometrization" of the electric charge would be more important than others.<br /><br />And we could see that the electron itself is most likely to "be" either a vibrating closed string, or a vibrating open string of some kind, although it's really a shrunk wrapped membrane in some M-theory models etc. As a vibrating string, the electron is composite, after all, although the "string bits" – building blocks in a regularized picture – are just a string length away from each other, an inaccessibly tiny distance.<br /><br />Again, none of them has been <em>established as the final answer yet</em>, which is why the research of them is <em>ongoing</em>. But the fact that none of them has been <em>established</em> doesn't mean that the research is <em>meaningless</em>. If one had adopted this utterly stupid anti-research "logic" in the past, then all the scientific progress would have been impossible. Why? Simply because every insight, however rock-solid at the end, must first be investigated by someone who isn't immediately sure that the idea is correct.<br /><br />If someone spits upon research just because the conclusions aren't rock-solid yet, then he or she is spitting on <em>all research</em> and on <em>science as a whole</em>. I find it amazing that so many people seem unaware of this <em>elementary point</em>.<br /><br />So Siegel's reader was asking a question of the type "I want to understand electrons at a deeper level" and Siegel responded with "don't ask, nothing interesting to be seen relative to the high school summaries from 1972". 
If Siegel and/or his readers were this uninterested in the actual <em>answers and possible answers</em> to these questions, as they are being refined by actual researchers, why do they ask the questions at all? And why do they pretend to answer them?<br /><br />It makes no sense and this whole question-and-answer ritual seems to be a deception. You are either interested in the deeper origin of the electric charge and the electron – which means that you <em>want</em> to look at some of the best papers in which the best scientists are addressing this question – or you are uninterested. You shouldn't pretend that both answers are possible at the same moment. If you haven't followed any ideas since 1972 about the deeper explanations of the electron or the electric charge, then you are just a superficial layman uninterested in the research of particle physics. Period. It's just fraudulent for you to pretend something else.<br /><br />Real theoretical high-energy physicists are very interested and they have made huge progress, even after 1972.Luboš Motlhttp://www.blogger.com/profile/17487263983247488359noreply@blogger.com0tag:blogger.com,1999:blog-8666091.post-83712779861804447382019-04-01T08:04:00.000+02:002019-04-01T16:14:21.875+02:00Skepticism about Standard Models in F-theory makes no senseFour weeks ago, I discussed <a href="https://motls.blogspot.com/2019/03/one-quadrillion-standard-models-in-f.html?m=1">a quadrillion Standard Model</a> compactifications that were constructed within F-theory by Cvetič et al. 
For some happy reasons, Anil at Scientific American wrote his own version of that story four days ago:<br /><blockquote><a href="https://www.scientificamerican.com/article/found-a-quadrillion-ways-for-string-theory-to-make-our-universe/">Found: A Quadrillion Ways for String Theory to Make Our Universe</a><br /></blockquote>I think that Scientific American hasn't been publishing this kind of article about some proper scientific research – and Anil hasn't been writing those – for years. Some adult who works behind the scenes must have ordered this one exception, I guess. So I am pretty sure that the readers of SciAm must have experienced a cultural shock because the article is about a very different "genre" than the kind of pseudoscientific stuff that has dominated SciAm for years.<a name='more'></a><br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:block; text-align:center;" data-ad-layout="in-article" data-ad-format="fluid" data-ad-client="ca-pub-8768832575723394" data-ad-slot="4218709518"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />Well, there are some differences between my and Anil's comments about that article. But there exists "a very different" way of talking about these matters – a rant titled <a href="http://www.math.columbia.edu/~woit/wordpress/?p=10915" rel="nofollow">This Week's Hype</a> (this title has been recycled thousands of times) written by Mr Peter Woit.<br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:inline-block;width:336px;height:280px" data-ad-client="ca-pub-8768832575723394" data-ad-slot="0363397257"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />OK, so he seems <em>dissatisfied</em> that SciAm writes about this fancy, rigorous research at all. 
If some people still read those vacuous anti-science tirades, Mr Peter Woit serves them the usual emotional gibberish. First, the paper and the SciAm summary are said to be "hype". Well, there is absolutely no hype in those articles. It's just very technical research on 8-dimensional geometries that are relevant for particle physics thanks to the F-theory constructions; and a semi-popular summary of that research.<br /><br />The first full sentence with a complaint says:<br /><blockquote>As usual in these things, the only physicists quoted are the authors of the article, as well as some others (Cumrun Vafa and Washington Taylor) who are enthusiastic about the prospects for getting the Standard Model out of “F-theory”.<br /></blockquote>People who have been doing research on F-theory – and especially phenomenology of F-theory – were chosen to provide their opinion on the paper by Cvetič et al. for a simple reason: They are the experts and the "only" experts. The opinions of non-string theorists would clearly be nothing but random incoherent noise that would brutally lower the quality of the story in SciAm. <br /><br />Even most string theorists could be expected to say misleading things about F-theory and its realizations of the Standard Model. It's just wrong to fill articles about technical research, where many details matter a great deal, with some non-experts' emotions. These emotions wouldn't add any positive value. And you know, there is a very good reason why Cumrun Vafa was asked about his opinion. He is the <em>father of F-theory</em>. A detail, right? 
Maybe some people think that shouting "F-theory is evil" is as good a credential as being the father of F-theory but I don't.<br /><br />Wati Taylor is also highly qualified to comment; among other things, he co-authored a <a href="https://motls.blogspot.com/2015/11/there-are-many-more-flux-vacua-in.html?m=1">truly gigantic class of flux vacua</a> (not resembling the Standard Model) in F-theory in 2015.<br /><br />If someone doesn't know e.g. how to describe a torus by a twelfth-degree complex polynomial equation in \(x,y,z\), then he or she has almost certainly nothing useful to say about F-theory, period. And be sure that Peter Woit as well as over 99.99999% of mankind belongs to this "not really promising" set. Science builds on evidence and calculations, not on "opinions" of the people who don't understand anything about the issue. People who can't reasonably say things like "oops, Mirjam, you forgot a term contributing to the first Chern class from a brane" should exploit the opportunity to shut their mouths because they have clearly nothing to contribute and it's terrible if some mass culture is trying to pretend otherwise. The scientific value of some knee-jerk "critics of F-theory" is exactly the same as the musical value of a drunk guy who penetrates into a concert hall and throws up on the orchestra. They should be a task for bodyguards, not researchers or musicians.<br /><br />But Woit's whining gets more intense:<br /><blockquote>No one skeptical of the idea of F-theory compactifications of string theory...<br /></blockquote>A person who is "skeptical of the idea of F-theory compactifications of string theory" is exactly analogous to a person who is skeptical about the other planets in the Solar System or skeptical about the primes greater than 100. 
He or she is clearly a person who doesn't have the slightest idea about the topic that was discussed in the SciAm article.<br /><br />The existence of F-theory compactifications of string theory is pretty much a rigorous mathematical fact. There is no "hype" or "commercial" or "exaggeration" hiding in this statement. It is really literally true. Even some 30 years after the First Superstring Revolution, the only known consistent theories of quantum gravity coupled to particle physics are the constructions linked to string theory. And they may be divided into five or so classes of compactifications – here I clump all of F-theory as one class. That's why actual physicists working on top-down particle physics take string compactifications – and even F-theory compactifications, let's say a 20% market share of the stringy model building industry – very, very seriously.<br /><br />One simply can't be both intelligent and "skeptical about them".<br /><blockquote>If such a person had been consulted, he or she might have pointed out: Models like this have been around for over two decades, see for instance this from 23 years ago.<br /></blockquote>It's nice if someone is capable of noticing that F-theory has been investigated since <a href="https://arxiv.org/abs/hep-th/9602022">February 1996</a> (OK, is there something shocking about that timing information?) but this knowledge of a historical factoid is extremely far from turning someone into an F-theory expert who could reasonably be "consulted" in articles about F-theory.<br /><blockquote>They have always come with claims that some sort of connection to experiment was right around the corner.<br /></blockquote>There is no comment about any "experiments around the corner" in the paper by Cvetič et al. 
and there is absolutely no reason why such remarks should be "mandatory" in papers that map the landscape of possibilities to get a realistic theory of particle physics from a consistent framework of quantum gravity.<br /><blockquote>This new work doesn’t even bother trying to make “predictions”. It just works backwards, trying to match the crudest aspects of Standard Model, ones determined by a small set of small integers. <br /></blockquote>There is absolutely nothing wrong about "working backwards". Indeed, the search for the right theoretical explanation of Nature is an inverse problem of a sort. To one extent or another, everyone who has ever searched for a better theory of Nature was "working backwards": Kepler, Newton, Maxwell, Einstein, Feynman, Glashow, and everyone else. It's just extremely embarrassing if someone misunderstands even such <em>totally rudimentary facts</em> about science.<br /><br />One quadrillion Standard Models in the paper refer to one quadrillion of 8-dimensional topologies that, if used as the hidden dimensions of F-theory, produce a particle physics spectrum whose low-energy part agrees with the Minimal Supersymmetric Standard Model. It is nontrivial that all the quantum numbers work – and the authors were capable of translating these conditions into geometric constraints on the 8-dimensional topology. It is an impressive piece of work whether or not an anti-physics heckler prefers to spit on it.<br /><br />Also, it's laughable to describe the reproduction of the Standard Model spectrum just as the "crudest aspects" of the model. All physical predictions are totally determined by the theory given the knowledge of the spectrum <em>and</em> the values of some continuous parameters. 
In this sense, the correct quantum numbers describing the spectrum are about "one-half" of all of physics (and virtually all of the "qualitative aspects" of physics), not just the "crudest aspects".<br /><blockquote>Given the huge complexity and number of choices of these F-theory constructions, that some number of them would match this set of small integers is not even slightly surprising.<br /></blockquote>One quadrillion is much more than any "package of explicitly constructed Standard Models" that was ever found before. So by its sheer size, it is surprising. Cvetič et al. deliberately tried to look for such a class and they found a quadrillion solutions. Some people could have expected more, some people could have expected less. Surprises are a subjective matter. It is meaningless to talk about "surprises" in an objective way.<br /><br />What is surprising to me is the very concise way in which the geometric conditions equivalent to the "Standard Model spectrum" may be written down. I find the topological condition with some "three terms" to be much more economical than the usual QFT ways to describe the Standard Model spectrum.<br /><blockquote>The authors seem to argue that it’s a wonderful thing that they have found quadrillions of complicated constructions with this kind of crude match to the SM. The problem is that you don’t want quadrillions of these things: the more you find, the less predictive the setup becomes.<br /></blockquote>These assertions are absolutely irrational. Every consistent theory of quantum gravity that also includes the Standard Model spectrum and perhaps a few more things that are needed is a <em>viable candidate to describe Nature in detail</em>. So until they're ruled out by a wrong detailed prediction, these quadrillion F-theory vacua are <em>viable candidates</em>, too. In science, one simply can't refute or eliminate possible theories by incoherent emotional rants. 
Only the falsification by conflicting empirical evidence may eliminate models – that's true for every element of this "set of one quadrillion vacua", too!<br /><br />At this level of detail, it is simply another <em>mathematical fact</em> that the number of viable candidates is at least one quadrillion. Someone could find a number smaller than one quadrillion "more philosophically pleasing" but in that case, he would simply be discarding most of the <em>real possibilities</em> and therefore heavily <em>reducing the probability of finding the right theory</em>.<br /><br />Realistic models of quantum gravity coupled to the Standard Model are rather rare (string theory is the unique solution, it seems), but because of the multiplicity of the stringy vacua, they may also be considered numerous. Is the number of possibilities large or small? It depends on what you mean by "large" or "small" – for example, on what you expected. At any rate, this is the relevant class one has to work with and to say otherwise means to be detached from the basic facts.<br /><br />So of course it is a wonderful thing that these one quadrillion F-theory Standard Models were found and explicitly constructed.<br /><blockquote>What’s being promoted here is a calculation that not only predicts nothing, but provides evidence that this kind of thing can’t ever predict anything. A peculiar sort of progress…<br /></blockquote>These compactifications are (supersymmetric) Standard Models. So they make the same qualitative predictions – of the spectrum and the particles' interactions etc. – as the Standard Model as a QFT, or any other realization of the Standard Model within a complete theory. F-theory isn't just some numerology producing quantum numbers; it actually does include all of the QFT dynamics as a limit. So to say that the F-theory vacua make "no predictions" is as silly as saying that the Standard Model makes no predictions. But an intelligent person understands it. 
He doesn't feel the need to talk about "predictions" all the time.<br /><br />Woit's pathologically obsessive usage of the word "predict" is a sign for every intelligent reader indicating that he's just doing propaganda for the least demanding readers, not anything that is related to science. While his short emotional rant uses the word "predict" a whopping six times, this verb doesn't appear on the 6 dense pages of the Cvetič et al. preprint at all.<br /><br />Science isn't being done and cannot be done by obsessively screaming buzzwords. It is obvious to everybody with a brain why the mapping of Standard-Model-like compactifications in F-theory, 1/5 of string theory, is important enough research. SciAm hasn't consulted "F-theory skeptics" because it realized that consulting people who are totally and completely unfamiliar with the topic would be heavily counterproductive for the quality of the resulting article.<br /><br />And that's the memo.<br /><br />P.S.: Some commenters realize that Woit's negative remarks are just "mean" and uninformed. But there's one other commenter who cannot understand anything about the derivation but who still feels entitled to demand a "worsening" of the title. "A quadrillion standard models in F-theory" is no good for that commenter, you know, because they're supersymmetric models and they may have various proton decay operators. So the title will be bastardized by a reviewer, the commenter hopes.<br /><br />Holy cow. Every reader who has at least a 1% chance of getting <em>anything</em> useful out of the paper knows that all realistic, detailed models of particle physics incorporated in a theory of quantum gravity must be <em>supersymmetric models</em>. In fact, all the promising potential readers almost certainly know that all models ever described by Cvetič were always supersymmetric. So all these people simply <em>know</em> that the title refers to the "MSSM" via a verbally nice shortcut. 
The MSSM really <em>is</em> the "standard" model in the string model building community.<br /><br />Also, it's terminologically correct to use the term "Standard Model" even if e.g. the proton is much less stable there. The "Standard Model" is a standard phrase defining models that have the same qualitative <em>low-energy spectrum</em> as the theory we need to explain the LHC data. I sincerely hope it's still impossible for these individuals from comment sections on the Internet to corrupt the peer review process but sometimes I am no longer sure.Luboš Motlhttp://www.blogger.com/profile/17487263983247488359noreply@blogger.com0tag:blogger.com,1999:blog-8666091.post-74924894826313555812019-03-28T10:30:00.003+01:002019-03-28T11:53:59.232+01:00Some reasons why the West won't stop building collidersMany reasons why it's right to keep on building larger, more powerful colliders are often described in rather mainstream articles. But I happen to think that some of them, while fundamentally true, sound like clichés, politically correct astroturf theses. Like the correct statement that scientific research is a universal value that unites nations – and people from different nations peacefully cooperate on something that boils down to the same humanity inside all of us. Just to be sure, I totally believe it and it's important, too!<br /><br />As you know, my emphasis is a bit different... and I want to start with the reasons that are related to the "competition between civilizations". The first assumption of mine that you need to share is that the decisions in the West and the decisions in Asia are made very independently and they may have very different motivations. In particular, the anti-collider activists in the West influence the thinking of the VIPs in China about as much as the P*ssy Riot group does. 
They're just another strange aspect of the Western mass culture.<br /><br />China – as the place of the CEPC, a planned future collider – has its own discussions about the colliders but only big shots seem to matter in those. <a href="https://motls.blogspot.com/2016/09/chen-ning-yang-against-chinese-colliders.html?m=1">Chen-Ning Yang</a>, a physics titan (Lee-Yang and Yang-Mills), turned out to be the most prominent antagonist. Yang's reasons are really social. He thinks China is too poor and should pay for the people's bread instead. Well, ironically enough, this social thinking won't necessarily be decisive for the leaders of the communist party. Shing-Tung Yau – a top mathematician who comes from a pretty poor family – is among the numerous champions of the Chinese collider.<a name='more'></a><br /><br /><b>OK, the pride of the West can't be hurt for too long: the astronauts' precedent</b><br /><br />Approximately like the space research, high energy physics is one of the major litmus tests that determine which country or continent or civilization is technologically ahead – ahead when it comes to the high-brow newest types of scientific and technological power and/or brute force.<br /><br />There's one recent precedent that can already teach us something. In the 1960s, America was catching up with the Soviet Union in outer space. Sputnik, Laika, and Gagarin really humiliated America and a response was needed. 
After a decade of huge investments (at the peak, NASA was consuming about 4.5% of the federal budget, i.e. well under 1% of the GDP), America arguably became #1 in most "disciplines" and the 12 men on the Moon were among the most visible cherries on the pie that supports my assertion. But just to be sure, the manned spaceflights are the "mass culture-oriented" part of the space program. <br /><br />Numerous scientific experiments deployed by NASA are probably more important, at least from scientists' viewpoint, and the U.S. is well ahead of Russia there, I think.<br /><br />However, a decade ago or so, already during the presidency of George W. Bush, the U.S. was drifting towards a new long-term strategy that ultimately resulted in America's inability to send people into space. Lots of science experiments were still administered by NASA but America had to beg its friend, Russia, to send the people into outer space.<br /><br />This situation has continued since the Space Shuttle was retired in mid 2011 – and it has been a very humiliating situation, indeed, especially with the growing American-Russian political tensions in the background. So the situation – which was nothing else than a "1960s v2.0 Lite" – just wasn't sustainable.<br /><br />SpaceX, Elon Musk's space company, became famous – and the only "miracle" that the company performed was to partially revive an activity that the U.S. government had previously <em>deliberately shut down</em>. SpaceX basically argued "but we still want to send people to space, don't we?" and did it privately, while recycling some 20-year-old tricks with reusable rockets. 
And regardless of the government decision to shut down the U.S. manned spaceflights, there was still a clear <em>demand</em> and SpaceX earned lots of money from the government contracts, mainly because it was politically close to the source of money. Once other companies were allowed to compete with SpaceX, the importance of that company began to drop.<br /><br />Trump is already talking about returning people to the Moon before 2025 and other things. It makes sense: If you want to make America great again, you had better make sure that it's capable of sending people to the Moon again, too. The message of this story is that even if a fad may lead to the abolition of such a high-tech activity in a country like America, it can't last in the long term. And the main reason for the ongoing or looming revival of U.S. manned spaceflights may be called <em>pride</em>.<br /><br />Now, no one cares much about the small "competition" between America and Europe because these two worlds are almost "united", they live together – especially their physicists do. So the Tevatron was shut down and the LHC took over. There are American physicists working there and even those who aren't there know that it is "our" experiment. But the competition between the West and Asia or the East is a different level.<br /><br /><b>Brain drain: from the West to Asia</b><br /><br />The reasons for the U.S. or Europe not to remain inferior for a long time are not <em>only</em> about pride. There are more practical considerations involved. One of the most important ones is the brain drain. A century ago, the U.S. was just a "potential" superpower. The British Empire and Germany were "stronger" in most respects.<br /><br />As you know, that dramatically changed after 1945. America became the world's main – or only – superpower. It wasn't damaged by the Second World War. In many respects, the U.S. has benefited from the war. 
And the brains were among the important things that the U.S. could acquire at that time. Ironically enough, America's science and technology were strengthened both by the Jews and by the ex-Nazi scientists and engineers. Wernher von Braun is an example of the second group.<br /><br />You might dispute the importance of the transferred scientific and technological elite for the American leadership. I think it's important. The post-war brain drain was important – and the brains drifting to the American universities, research centers, and corporations are important for the current American technological edge, too. America has a healthier "job market" in these high-tech matters than any other country. It can pay the promising people well – so they often go to America.<br /><br />According to some, China may become the leader in an analogous way in the future. I think that these rosy predictions for China are a bit exaggerated but it's possible. That country's ability to attract the scientific and technological talent is a likely part of the potentially successful Chinese efforts to become a real superpower.<br /><br />Now, particle and fundamental physicists represent a certain fraction, perhaps 5% or 10%, of the world's 1 million smartest folks. The reason is understandable: a significant percentage of the smart folks simply <em>are</em> curious about important enough questions – and particle and fundamental physics are top examples. <br /><br />From some broader perspective, these people are analogous to other smart folks who do other things than particle and fundamental physics. But my explanation assumes that we <em>define</em> particle and fundamental physicists as those smart enough folks (here I am assuming that a meritocratic process actually chooses them as smart, not that they and their allies just describe themselves as smart!) 
who can be <em>lured</em> to physics research because they naturally consider it important.<br /><br />At any rate, by becoming the new headquarters of particle physics, China (or even someone else?) can attract a very large portion of the particle physicists – i.e. not really a negligible percentage of the world's top talent. This could have implications for China because, as Eric Weinstein noted, these people may also invent things such as the World Wide Web, molecular biology, and transistors in their leisure time.<br /><br />America has usually respected the researchers' freedom; it has been very professional. In late 2017, I discussed the likely hypothesis that <a href="https://motls.blogspot.com/2017/11/chinese-communist-bosses-may-poison.html?m=1">China would try to shape these people's behavior</a> a little bit more assertively, for them to serve China's interests. I don't want to be very specific about the possible exploitation of these people – but I am sure most of you will agree and have a good enough imagination to find examples.<br /><br />Some apparent fads in the mass culture – e.g. the ordinary people's hobby of saying "we hate fundamental physics now" – aren't just inconsequential fads. They may have far-reaching consequences for the global balance of power. Some people have had jobs in science they didn't deserve, which is why they consider scientists' jobs to be on par with welfare – it is a correct description in their own case. But if some people got a job easily and without meritocratic reasons, it doesn't mean that all scientists did. Real scientists who deserve the job aren't on welfare (they are often picked from 50+ candidates) and they're likely enough to influence the world in tangible ways, whether people on welfare want to deny this fact or not.<br /><br /><b>Most people in the West actually understand the debt to the physicists</b><br /><br />The U.S. 
physicists built the first nuclear weapons, which helped persuade an extremely stubborn enemy, Japan, to surrender. The bombardment of Hiroshima and Nagasaki was tragic and scary and some physicists didn't feel good about it. On the other hand, it's very likely that it shortened the war by a year or years – and therefore saved millions of lives. Equally importantly, the bombs made sure that America wouldn't be a loser in that war.<br /><br />"The shorter war" and "the victorious war" were gifts worth at least a trillion dollars if not ten trillion. The success of this nuclear research ignited the massive funding of high energy physics in the U.S. Many young researchers have surely wondered why they are paid from grants of the "Department of Energy". Why is that? Yes, "energy" sounds a bit physical but the details don't add up, do they? Well, they do if you think about the whole nuclear prehistory of that research.<br /><br />There are several ways to look at the funding for the fundamental, pure physics research from the financial sources that care about very practical matters. One of them is that e.g. the theoretical physicists and purely curiosity-driven experimental particle physicists are helping to establish fields that are adjacent to the practical nuclear research – and even for practical reasons, it's good for the foundations to be firm. <br /><br />The previous paragraph elaborates on the perspective that the government represents the people's interests, it is in charge of things, it knows what it's doing, and it is rationally funding similar things that were helpful in the past. It also pretends that the government "creates" the physicists by hiring them.<br /><br />Well, it's not really the most civilized or most ethical perspective because the scientists' passions aren't – or at least shouldn't be – dictated by the government. These passions exist independently of the government. 
A different perspective is that the results of the Manhattan Project (aside from other practical things) were "gifts" that a subset of the population defined by some <em>traits</em> has given to the overall U.S. population. The traits that define this subset of the people are their 1) intelligence and 2) curiosity to do the fundamental research of the Universe.<br /><br />None of the people from the Manhattan Project are very active in physics research today. So if you formulated that "debt" too personally, you would end up with nothing. But you know, even nations and companies sometimes owe money and things to each other for a very long time – even when the original borrowers are already dead. There may be a "debt" between nations or their parts as "abstract sets of people" whose identity lasts longer than the human lifetimes.<br /><br />I think that most of the civilized people actually understand this kind of "debt" – which is enough to build dozens or hundreds of LHC-like colliders. Japan wasn't defeated by "all people in the U.S. equally". The kind of people who have contributed more deserve some lasting compensation. At some level, this compensation isn't too different from the lasting salaries for the U.S. troops – and especially the compensation for the veterans (and sometimes their widows). I focused on the nuclear bombs but similar comments apply to transistors, molecular biology, the World Wide Web, and more.<br /><br /><b>Reasonable people know that concentrated, ambitious projects are needed to avoid the universal waste of money</b><br /><br />The world's annual GDP approaches $100 trillion – which is equal to the price of 5,000 FCC colliders. The people earn an amount comparable to $100 trillion a year – and they spend it. For a very long time, people were able to produce more than they needed to survive. They did various things with the surplus. They invested it. They built cathedrals. Or colliders. 
But they could also buy a greater amount of more expensive cigarettes.<br /><br />Again, I am not saying that "cathedrals" are really the same kind of expenses as "colliders". The differences between science and religion are profound – but so are some basic similarities. Instead, my main point is that there exists a basic difference between "diluted spending" and "concentrated spending". And it's generally true that whenever the "diluted spending" completely dominates, the society stagnates because the money is being spent on consumption, not investment – e.g. on cigarettes.<br /><br />Unlike cigarettes, the construction of a new collider requires some real work that won't be done automatically, the clearing of some hurdles that can't be overcome without some focused work. Even the very fact that the constructors of the collider – and those who maintain it and use it to do experiments – need to do some mental exercise is nontrivial. Without concentrated, ambitious projects, the skills of these groups of people may fade away.<br /><br />If things get really bad, the society should at least preserve the patent offices that could be stimulating for a new Einstein. If you only allow truly low-brow occupations, you may really lose the potential for discoveries. People can invent or discover amazing things – but these discoveries and inventions become very unlikely if the people spend all their time with much dumber activities.<br /><br /><b>Higgs and top can't become abstract myths from old, dusty books</b><br /><br />Let's optimistically assume that in 2050, the texts (from 2023 or older) revealing the existence of the Higgs boson or the top quark won't be burned and banned yet. And imagine that between 2023 and 2050, there won't be any high-energy collision that would produce these particles with masses above \(100\GeV\). 
What will the scientifically literate people think about these texts?<br /><br />They will be surrounded by some nice technology that mostly depends on Quantum Electrodynamics only – and many condensed-matter-physics applications of it. Will the smart young physicists actually believe that the Higgs boson and the top quark exist? Will they trust the texts – which will look historical because they will be over 25 years old?<br /><br />Some people will surely believe it. Even though the LHC tunnel will be used to grow mushrooms, they will rightfully think it's a conspiracy theory to suggest that all the texts up to 2023 talking about the production of the top quark and the Higgs boson at some colliders were just "old myths". On the other hand, there will also be a real widespread doubt about the very existence of the particles that won't have been produced for more than 25 years.<br /><br />Unless all the people in 2050 lose their interest in the laws of the Universe altogether, there will be a big enough reason to build some new collider, anyway. At least another repetition of the LHC – which would be much better than nothing. <br /><br />High-energy physics wasn't meant to be a temporary stunt. It's a long-term discipline of physics in which the people's understanding of the basic laws of the Universe is known to get deeper as the center-of-mass collision energy gets higher – and this basic relationship works all the way to the Planck energy (where the growing black holes change the rules of the game). Our colliders are nowhere close to the Planck energy (and they arguably never will be) which is the simplest reason why there's no reason to stop pushing the energy frontier.<br /><br /><b>Intimidation always ends at some point and people realize that new physics may be found</b><br /><br />It's conceivable that new colliders may only produce the particles we already know – and study their interactions at higher energies, perhaps with a better precision. 
Well, it may happen. It wouldn't be the first time. One could argue that even the tallest cathedrals have failed to persuade God to climb down from Heaven to Earth and personally visit the believers.<br /><br />But new physics – beyond the latest, 2012 discovery of the Higgs boson – may also materialize at higher energies. It's possible that a discovery is waiting in the LHC data that have already been collected (in the run that ended in late 2018) but haven't been analyzed yet. If there's no discovery in that dataset, a new possibility exists when the LHC collects a greater amount of data (integrated luminosity). Or – especially – when new colliders with a higher collision energy are launched.<br /><br />These days, it's fashionable to say that there won't be a new discovery. And even to scream at the people who would dare to suggest that a \(2\TeV\) gluino is totally possible within the LHC 2018 data, among other things. Some people allow themselves to be intimidated in this way. But you know, this intimidation cannot last indefinitely because it fundamentally makes no sense.<br /><br />If and when colliders and detectors perform and analyze collisions at energies that are higher than ever before, it's always possible for them to discover a new particle or effect that was previously unknown to the experimenters. Science clearly doesn't know any valid argument that would exclude such new discoveries – or even arguments that would make this scenario very unlikely. The bullies may scare the physicists for a while but they ultimately run out of energy because what they say isn't backed by anything that makes any sense.<br /><br />The discovery of new physics at higher, previously untested energies is always possible and it is always an important natural reason for people to want the new gadget. 
We don't know of really solid derivations that would tell us what these discoveries are going to be but that's just another reason why the experiments are desirable.<br /><br /><b>High energy collisions are the most agnostic way to look for new phenomena</b><br /><br />One may think about many smaller experiments and phrase them as "competitors" of the next colliders. Well, first, this suggestion that "you may only have this or that" is just wrong. People have built Superkamiokande and the LHC in a similar epoch. They have done various types of research simultaneously and the overall cost was still small relative to the countries' GDP – one or two percent of the GDP goes to research.<br /><br />Second, even if there were some "real competition", it's very likely that the colliders would win the meritocratic contest because <em>the energy is the most useful variable to parameterize physical effects</em>. People have known the concept of energy for centuries and since the early 20th century, it's been really helpful in particle physics.<br /><br />But the importance of energy for the <em>classification of knowledge</em> has increased further, in the 1970s, with the birth of the renormalization group thinking – by Ken Wilson and others. Since that time, physicists have been well aware that much of their knowledge about the world around us is phrased in terms of "effective theories" that are optimized for phenomena with a particular value of energy – an order-of-magnitude estimate of the energies in the process.<br /><br />You may attach various particles and phenomena to the energy axis. In this sense, walking along the energy axis produces <em>qualitatively new</em> particles and physical phenomena. So this way of looking at "many energy scales" gives you a bigger picture than just trying to measure <em>one particular parameter of Nature</em>, like the proton lifetime, more accurately than before. 
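<br /><br />To make the renormalization group point concrete, here is a minimal numerical sketch – not taken from any paper discussed here; the inputs \(\alpha_s(M_Z)\approx 0.118\) and the one-loop coefficient \(b_0 = 11 - 2n_f/3\) are standard textbook values – of how the strong coupling changes as you slide along the energy axis:

```python
import math

# One-loop running of the strong coupling constant alpha_s.
# Standard textbook inputs (an illustration, not specific to any paper above):
# alpha_s(M_Z) ~ 0.118 at M_Z ~ 91.19 GeV, and the one-loop beta-function
# coefficient b0 = 11 - 2*n_f/3 with n_f = 5 active quark flavors.
ALPHA_MZ = 0.118
M_Z = 91.19  # GeV
B0 = 11.0 - 2.0 * 5 / 3.0

def alpha_s(mu_gev):
    """One-loop strong coupling at the energy scale mu (in GeV)."""
    return ALPHA_MZ / (1.0 + ALPHA_MZ * B0 / (2.0 * math.pi) * math.log(mu_gev / M_Z))

# Asymptotic freedom: the coupling shrinks as the energy grows,
# so the effective description of physics differs at each scale.
for mu in (10.0, M_Z, 1000.0, 14000.0):
    print(f"alpha_s({mu:7.1f} GeV) = {alpha_s(mu):.4f}")
```

At two loops and with proper quark-mass thresholds the numbers shift by a few percent, but the qualitative lesson survives: the effective theory reorganizes itself scale by scale, which is exactly why probing a new range of energies is the most agnostic way to search.<br /><br />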
If you walk from one energy scale to another, <em>all</em> the possible phenomena and their parameters emerge and disappear. So the increase of the energies – which allows you to study matter at a more fundamental level – is a "more general way of looking" for new possibilities.<br /><br />On the other hand, the somewhat cheaper experiments are more specialized. They will only discover something new if they're really lucky – if the only new hypothesized physical phenomenon that they are testing happens to be realized in Nature, with the values of parameters that are accessible by the experiment. That chance should be expected to be smaller than the chance to find <em>any</em> new physics in some new range of energies.<br /><br /><b>Most people will realize that the anti-collider folks are Luddites dragging us back to the Middle Ages real fast</b><br /><br />Professional physicists solve lots of technical questions. For example, they decide whether one version of a cutting-edge theory – one that has only been partially proven, like inflationary cosmology – is more viable than another. The physicists and sponsors who plan the new experiments are deciding which of the two experiments – or two possible designs of the same kind of an experiment – is more economically feasible or scientifically useful.<br /><br />Such research – and disagreements – are subtle and only experts really understand most of them.<br /><br />However, it seems very clear to me that the contemporary anti-collider and anti-theoretical-physics fad has almost nothing to do with the nuances and careful research. It is a movement represented by the people who have no respect towards science and research in general. 
They have no respect towards science that has already been found, science that is being found or proposed, no respect towards the proposed theories, no respect towards the experiments that play the role of the judges that favor one theory over another, no respect for the curiosity, patience, intelligence, and other character traits that describe great scientists.<br /><br />Up to some moment, a similar populist movement may grow. But later, it reaches a point where the growth stops. New people will stop joining the anti-scientific movement simply because they will realize that they're better than that. At some moment, the qualitative difference between the "two camps" will become obvious. They will ask: Do I really want to be similar to these Luddites? To the people who just sling mud on everything that is fancier than some superficial laymen's sentiments? Am I not closer to this fancier pro-physics camp instead? And most of these people will just respond to themselves: I am way better than these folks (even if it won't be true in some cases – but it will be a better choice for their image). 
The people will suddenly say: I actually do have some respect for knowledge accumulated by mankind, science, the process of accumulating new knowledge, impartiality, integrity (similar to the scientific one), and plans, dreams, and expenses that transcend the everyday life.<br /><br />Once this "peak of the anti-science movement" is reached, and it may be very soon, the trend will reverse and people will start to enjoy talking openly about the sexiness of science as well as the vices of the anti-science activists who prefer the dark ages.Luboš Motlhttp://www.blogger.com/profile/17487263983247488359noreply@blogger.com0tag:blogger.com,1999:blog-8666091.post-89529629605935491702019-03-26T08:47:00.001+01:002019-03-29T17:09:48.253+01:00It's irrational to both worship and completely distrust a thinker<b>People like Weinstein hide their fanatical desire to silence thinkers in some "flattering" mumbo-jumbo</b><br /><br />Peter Thiel has hired Eric Weinstein as a part-time economist, part-time talking head about science – someone who produces far-reaching and emotionally loaded statements about the value of science, its future, the relationship between scientists and the establishment and, as we will see... the need for the majority society or the rich to conquer the scientists' brains and turn the scientists into obedient slaves.<br /><br />Last week, Weinstein gave an <a href="https://www.youtube.com/watch?v=2wq9x2QcZN0">80-minute-long</a>, very unfocused interview about music, humor, labor... (I don't have patience for all this cheesy and distracting stuff and, sorry to say, it is very clear that I don't belong to the target audience – it's just talk addressed to the mass culture) and after 50:00 or so, he talks about his "love-hate relationship" with theoretical physicists. 
<a name='more'></a><br /><br />On one hand, Weinstein sometimes seems to understand mathematical logic and the problems with logical contradictions. For example, hours ago, he tweeted<br /><br /><blockquote class="twitter-tweet" data-lang="cs"><p lang="en" dir="ltr">A lot of people think transparency is simply a good thing. Those same people who feel comfortable working at that level of generality tend to think privacy is also a good thing. <br /><br />I guess I don’t know how those two thoughts don’t bump into each other in the hallways of the mind.</p>— Eric Weinstein (@EricRWeinstein) <a href="https://twitter.com/EricRWeinstein/status/1110342298297094144?ref_src=twsrc%5Etfw">March 26, 2019</a></blockquote><script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script><br />And I "liked" the tweet because indeed, there is a general or potential contradiction between transparency and privacy. Some people advocate both principles and they do so to the extent that they're running into rather sharp contradictions. 
I still believe that many of us have a reasonable taste where the boundary should lie and where transparency should replace privacy (the most important principle is that the more personal, individual, and non-essential for other people's lives some information is, the more privacy should be respected).<br /><br />But some people don't seem to realize that they're sometimes religiously defending words that contradict each other in the zeroth approximation.<br /><br />Great. So Weinstein sometimes realizes that logical contradictions are a problem. But in the interview, he says things like<br /><blockquote>There is nothing that could intellectually match or beat the theoretical physics community. They do amazing things and may produce things like molecular biology as a small side effect of their research.<br /><hr>Theoretical physicists have been on the wrong track for half a century or so and they need people like me to fix it and end the epoch of failures.<br /></blockquote>Can't you see the obvious contradiction, Eric? According to proper logic, you either believe that it's constructive to allow some bright people like XY (or a vaguely defined group of folks similar to XY) to think for themselves and reach their own opinions about what is true and what is worth thinking about (related to physics); or you don't. Assuming logical consistency, your answer just cannot be both Yes and No!<br /><br />But your answer <em>is</em> both Yes and No, Eric. 
You believe that there's some "pile of mental gold" but you seem to believe that its powers may only be exploited if someone like you who isn't a part of that pile of gold makes all the important decisions – and probably determines the rough conclusions that the scientists in the pile should reach. Sorry, science just cannot work like that. It makes absolutely no sense because the gold <em>is</em> composed of all the mental steps and decisions that you apparently want to take from them.<br /><br />In Germany of the 1930s, they enjoyed the Aryan Physics. The politicians also saw some potential in the body of physicists but to optimize the usage of the potential, the physicists had to be constrained. For example, they had to be shielded from the evil Jewish and theoretical physics – starting with Einstein's relativity. Folks like Werner Heisenberg found themselves between a rock and a hard place, seeing their patriotism collide with their scientific knowledge, passion, and integrity (Heisenberg obviously knew that relativity was right and he could mostly keep that view which indicates <em>some</em> tolerance of the system of that time, perhaps higher than what we are seeing today). Too bad, the "mainstream" thinking about these matters that you represent has returned to this discourse of the 1930s.<br /><br />Theoretical physicists may be average or worse in tons of ordinary things. But what defines them is that they're better – or reasonably expected to be better – exactly in the kind of abstract, demanding, extraordinary things that the ordinary people are naive about. <em>This</em> is exactly where their freedom to think is absolutely essential for the exploitation of their intellectual potential. 
To suggest that they should be "led" by some outsiders or ordinary people when it comes to the <em>big questions directly linked to the topic of the research</em> means to say that they're really useless.<br /><br />Weinstein dramatically discusses that in the mid-1980s, he could have joined theoretical physics but he didn't because he disliked string theory. Why do you discuss it so dramatically, Eric? You just never became a theoretical physicist. Many other people considered becoming astronauts but they didn't become astronauts, e.g. because NASA decided that their amputated leg was a problem, after all. What's the difference? Or consider a lad in the mid-19th century: "Dear professor, I want to become a famous physicist but I don't like thermodynamics and electrodynamics, those are useless failures. Kepler's laws were nice." What can the professor do with him? You were born in 1965 and around 1985, you were approximately 20 years old, a not very mature man, you weren't "getting" string theory and similar things that defined the field at that time, and you made a bet that string theory would be a fad that would go away – and that maybe you could then return to theoretical physics.<br /><br /><em>Sadly, people born around 1965 are already leaving us. A crazy and eccentric yet privately introverted blonde Czech singer who loved tropics, delights, and plush toys, Daniel Infinite (Daniel Nekonečný) who was born in 1966 as Daniel Finite (Daniel Konečný, no kidding), and a key person in the bands "<a href="https://www.youtube.com/results?search_query=laura+a+jej%C3%AD+tyg%C5%99i" rel="nofollow">Laura And Her Tigers</a>" and the "<a href="https://www.youtube.com/results?search_query=%C5%A1um+svistu" rel="nofollow">Roaring of the Swist</a>", suddenly died of a heart attack today (well, a few days ago, but was found today). See e.g. 
his <a href="https://www.youtube.com/watch?v=Co7UZWyA4Dg" rel="nofollow">I Am the Boss/Barefoot</a> [And You Are Bosa Nova/Barefoot, a pun] or <a href="https://www.youtube.com/results?search_query=daniel+nekone%C4%8Dn%C3%BD" rel="nofollow">more</a>.</em><br /><br />Well, 34 years later, this bet still seems to be completely wrong and in these subsequent 34 years, you haven't done any – stringy or non-stringy – stuff that would be considered a valuable contribution to physics by anyone similar to Mr XY mentioned above. But you're trying to paint yourself as a <em>hero</em> because you didn't ever become a physicist. What sort of hero status is that? You can see that you're just pandering to the egos of the most ordinary people who just want to hear that physicists are bad in some way, right? <br /><br />You didn't become a physicist because you weren't capable of doing any research that would be considered interesting at that time. So you weren't hired by the people who really understand stuff. Hypothetically, you could have had some interesting non-stringy results but you didn't. In the subsequent 34-year-long era, the outcome would have probably been the same. You may be hired as a theoretical physicist by Peter Thiel who knows virtually nothing about theoretical physics. Great. Why do you think it is a reason to brag? It's not.<br /><br />And now, in Weinstein's comments, there are some real gems such as:<br /><blockquote>The youngest person who has contributed to the Standard Model is Frank Wilczek now.<br /></blockquote>Right. The youngest person who has co-built the Standard Model is a rather old man now – simply because the Standard Model is a rather old theory, too. And, if you have missed it, Albert Einstein and Isaac Newton have already died. Rest in peace, Isaac and Albert. What's the big issue here? The Standard Model was really completed in the early to mid 1970s. 
In other words, the Standard Model hasn't been the cutting edge of theoretical physics that could attract the brightest minds for some 45 years.<br /><br />Other topics have become the hot topics since the 1970s. Many questions have been basically settled while in others, theoretical physicists have found many possibilities and we don't know which of them is right if any. You don't appreciate these advances which is too bad. But the reason why you don't appreciate them is that you're just another ordinary layman. Your lack of appreciation isn't any different from the lack of appreciation for the cutting-edge science that most laymen have displayed in any other previous epoch of physics – or science.<br /><br />It is frustrating that the broader society doesn't appreciate amazing things that were settled by theoretical physicists since that time – the fact that the spacetime we inhabit has some 6-8 extra dimensions, elementary building blocks are extended and may melt into each other, dualities imply that seemingly very different pictures of the Universe are actually equivalent, black holes evaporate yet preserve the information, there is AdS/CFT and its pp-wave limit and F-theory compactifications with fluxes that are at least cousins of our Universe, and so on. <br /><br />And indeed, it has a personal dimension. Honestly, I also think it is utterly terrible that the broader public doesn't understand or appreciate e.g. matrix string theory and its founder! You could do better than the average member of the general public but you don't. It is very clear from the "worshiping" part of your monologue that you treat the membership in the "theoretical physics community" as a matter of one's big ego and (like in the case of a couple of other people) this ego was simply hurt when you didn't become a real theoretical physicist. So you're simply trying to take revenge for that hurt ego.<br /><br />There are tons of other weird statements made in the monologue. 
For example:<br /><blockquote>The theoretical physics also sits on some golden knowledge such as the renormalization group techniques which could be used everywhere. And theoretical physicists fail to communicate it...<br /></blockquote>Renormalization group is indeed considered a great conceptual discovery by the "likes of XY" above. But what you don't seem to understand is that a physics researcher is something else (and, within the intellectual hierarchy, much more) than a communicator or a teacher or a journalist. People do various things. Particle physicists use the renormalization group in the particle physics research. Condensed matter physicists use it in condensed matter research. And the renormalization group philosophy and techniques may also gradually penetrate to "less hard" disciplines of the human activity because it can be useful there, too. But it is probably not quite as useful there, and it is also harder for the people in those "softer" disciplines, for whom the renormalization group techniques may simply be too difficult.<br /><br />Maybe the renormalization group techniques could advance many other fields. I find it totally plausible. Maybe it's a great project for very smart people similar to the physicist XY above. But "spreading the gospel of the renormalization group" simply isn't the job for the top minds in research. They have more important things to do. To "spread the gospel", it is enough to have "less special" people to do it. If the gospel isn't being spread, it's primarily the fault of the communicators, not the researchers. You don't really seem to understand the differences between the different occupations, Eric. The top researchers with extraordinary brains should have the room to do research – and especially the room to investigate possible ideas with far-reaching, surprising, and counterintuitive conclusions because that's where their comparative advantage lies. 
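An aside for readers who have never seen a renormalization group calculation: the simplest textbook example (my choice of illustration, not Weinstein's) is the exact decimation step for the one-dimensional Ising chain, where summing over every other spin yields a new, coarse-grained coupling.

```python
import math

def decimate(K):
    """One exact block-spin (decimation) RG step for the 1D Ising chain:
    summing over every other spin maps the dimensionless coupling
    K = J/kT to K' = (1/2) ln cosh(2K)."""
    return 0.5 * math.log(math.cosh(2.0 * K))

# Follow the RG flow starting from a fairly strong coupling.
K = 1.5
flow = [K]
for _ in range(8):
    K = decimate(K)
    flow.append(K)

# The coupling flows monotonically to the trivial fixed point K* = 0 --
# the RG way of seeing that the 1D Ising chain has no phase transition.
print([round(x, 4) for x in flow])
```

The same logic of "integrating out short-distance details and watching how the effective parameters flow" is what the techniques would bring to other fields.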
No one should intimidate them and force them to accept a layman's opinion about the number of dimensions in the Universe, if I pick a very simple yet important example. If you hire ordinary people to decide the big questions – like whether string theory is correct, you know – while you violently downgrade top researchers to some journalists who promote 40-year-old physics discoveries such as the renormalization group, you will basically destroy science as a part of our civilization. This is no detail, it's no laughing matter.<br /><br />Incidentally, I would be thrilled to join as a champion of the "renormalization group for other fields". But back in the real world, I have to be one of the warriors who want to preserve variables such as \(x\) and \(y\) in elementary schools, among similar things, and we're still apparently losing even this battle! How do you want to increase the knowledge of the renormalization group by folks in other fields in a society that is increasingly hostile towards science (and mathematics)? And you, Eric, are contributing to this anti-science hostility – much more than you have ever contributed to science itself.<br /><br />There is nothing generally wrong with theoretical physics since the 1970s and all the propaganda claiming otherwise is just the postmodern version of the Aryan Physics or Aryan Physics v2.0. Everyone who participates in it should be deeply ashamed. What is <em>actually</em> wrong are the external political pressures acting against the scientific environment and the scholars' very freedom of thought.<br /><br />And don't make a mistake about it: Superstring/M-theory is the language in which God wrote the ten-dimensional world.<br /><br />Amen to that – no one else says this for me, either. ;-)<br /><br /><hr><br /><b>A rant against the relevance of quantum computers</b><br /><em>A bonus example showing how journalists are serving anti-science sentiments everywhere</em><br /><br />Reader P.F. 
has enjoyed an article <a href="https://www.nextplatform.com/2019/03/25/quantum-no-threat-to-supercomputing-as-we-know-it/">Quantum No Threat to Supercomputing As We Know it</a>. It's quite annoying because that text is almost certainly the worst demagogic text I have read about quantum computers in years.<br /><br />The girl who wrote it created a heroic story from the proposition "Cray, a supercomputer company, hasn't joined efforts to build a quantum computer". Great, it hasn't but what's so wonderful about it? Just a small number of companies did – and those are more interesting here, aren't they? Most companies in the world didn't. Exxon, Tesla, Nestle and other not-really-high-tech companies don't work on their quantum computer. If you really appreciate that passivity, Peter, let me say that I don't own any experimental lab for quantum computing, either! ;-) <br /><br />And indeed, quantum computers aren't "direct competitors" to supercomputers. Like Philip Morris International, Cray will probably do fine <em>commercially</em> without any quantum computing platform. But the research of quantum computers isn't just some <em>business as usual</em>. It's a disruptive activity – to exploit a buzzword that the journalists who love to spread hype choose in tons of wrong contexts but don't pick e.g. here where it's appropriate – meant to create a whole new industry. Some companies that have worked in adjacent industries are working on it. The current stage is a gradual transition from applied physics to commercially feasible products. It's not yet an established industry which is why it's wrong to look at this activity from a "business as usual" perspective. The companies that invest in it had better not overpay. 
But that doesn't mean that there aren't wonderful reasons to join these efforts.<br /><br />Supercomputers and quantum computers are like the companies producing high-caffeinated sodas (Kofola) and alcoholic beverages (Stock Spirits) – choose which of them is the high-energy novel field LOL. They just don't "directly" compete because the beverages are qualitatively different and have different audiences and contexts when they're consumed. Or a more technical analogy: producers of tanks vs anti-tank missiles. They don't directly compete with each other except that the products sometimes do fight against each other. An anti-tank missile may do a very special task – like a quantum computer – a task that rips a tank apart.<br /><br />Are anti-tank missiles good or bad for the producers of tanks? They're probably good. When tanks are being eliminated by missiles, a straightforward solution is to replenish the reservoir of tanks. So tank companies produce more and have higher profits. It's analogous with supercomputers whose applications may be "ripped apart" by some quantum algorithms. When some codes are broken by quantum computers, the first defense strategy will probably be to make the codes harder by using <em>more</em> of the ordinary supercomputer tricks, won't it?<br /><br />So the claim that supercomputers and quantum computers don't directly compete is probably true – but it's not new at all. And everything else that is being served to the reader is just some kind of misleading delusion. Between the lines, the reader is being served the idea that quantum computers can't have far-reaching implications. They may very well have very far-reaching (and perhaps dangerous) implications for our IT world that depends on cryptography. The reader is also told that quantum computers aren't a big deal, or that they're not very new and fancy applied science. They are a very big deal, surely from a scientific viewpoint. 
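The remark about "making the codes harder" can be quantified a bit. For symmetric ciphers, the relevant quantum attack is Grover's search, which only gives a quadratic speedup, so the textbook countermeasure (my illustration, not the article's) is simply doubling the key length:

```python
import math

# Grover's algorithm finds a marked item among 2^n possibilities in
# roughly (pi/4) * 2^(n/2) quantum queries, versus ~2^(n-1) expected
# classical brute-force trials -- a quadratic, not exponential, speedup.

def classical_trials(n_bits):
    return 2.0 ** (n_bits - 1)

def grover_queries(n_bits):
    return (math.pi / 4.0) * 2.0 ** (n_bits / 2.0)

# A 128-bit key quantumly costs only ~2^64 Grover iterations...
print(f"AES-128: {grover_queries(128):.2e} quantum queries")
# ...so doubling the key length (AES-256) restores the old security margin.
print(f"AES-256: {grover_queries(256):.2e} quantum queries")
```

Asymmetric cryptography (RSA, elliptic curves) is the genuinely threatened part because Shor's algorithm breaks it exponentially faster, which is why the implications for the IT world may be far-reaching indeed.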
Unlike all the stuff that average journalists love to hype – climate change or electric cars, for example (which aren't new or scientifically interesting at all) – quantum computers <em>are</em> both novel and deep.<br /><br />And the readers are being persuaded that it's heroic for Cray to ignore this potentially emerging field, quantum computing. There is nothing heroic about it at all. And there's nothing intelligent about praising similar would-be high-brow articles written by girls who don't have a clue about the things that actually matter. By the way, this article is what the "politically correct" articles about quantum computing will look like – hostile rants by authors who don't have a clue how quantum mechanics works but who present themselves as important pundits by spitting on its tangible applications.Luboš Motlhttp://www.blogger.com/profile/17487263983247488359noreply@blogger.com0tag:blogger.com,1999:blog-8666091.post-53121613692443212782019-03-24T07:12:00.000+01:002019-03-27T16:21:41.943+01:00CMS: a 3.5-sigma excess in CP-odd Higgs to tops decaysThe CMS collaboration has apparently resumed its mass production of deviations from the Standard Model. 
After the hints of a gluino in gauge-mediated supersymmetry breaking, we have a new anomaly:<br /><blockquote><a href="http://cds.cern.ch/record/2668686/files/HIG-17-027-pas.pdf" rel="nofollow">Search for heavy Higgs bosons decaying to a top quark pair in proton-proton collisions at \(\sqrt{s} = 13\TeV\)</a><br /></blockquote>The excess is locally 3.5 sigma and globally 1.9 sigma.<a name='more'></a><br /><br />A hypothetical new Higgs boson – which is searched for between \(400\GeV\) and \(700\GeV\) and for various small values of \(\tan\beta\) – is assumed to decay to top quarks which decay further. The maximum deviation occurs near the lightest edge of the possible mass, around \(m_A=400\GeV\) for the mass of the CP-odd Higgs boson \(A\).<br /><br />The situation is most clearly summarized by the last figure:<br /><br /><img src="https://cds.cern.ch/record/2668686/files/Figure_006.png"><br /><br />The black line is the expected exclusion limit – everything below this line should have been excluded. The two grey strips indicate the 68% and 95% bands, understood as a percentage of models for the given values of \(m_A,\tan\beta\) that are excluded.<br /><br />The blue region is the actual excluded part of the parameter space. 
You may see that much less is excluded than expected, especially on the left side. As I said, the excess is as high as 3.5 sigma on the left boundary of the diagram. It was expected that everything with \(\tan\beta\lt 2.3\) would be excluded but only \(\tan\beta\lt 1\) was excluded.<br /><br />It's all great and they have even taken some interference with top quark pairs into account. However, the possible and rather strong excess has one big disadvantage. It might be an artifact of the leading order (LO) approximation chosen for the MADGRAPH5_AMC@NLO calculations.<br /><br />That lousy approximation might matter because higher-order processes could predict significantly higher cross sections for the top quark pair production near the threshold. And because the minimum mass of a top quark pair is some \(350\GeV\) or so, \(400\GeV\) is near the threshold, indeed. After all, these risks near the "top quark pair threshold" are the likely reason why they didn't try to look at lower values of the CP-odd Higgs mass.<br /><br />That's why this excess, despite its being signal-like, could be more likely to be explained by some neglected Standard Model processes than other, smaller excesses I discussed recently. 
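As an aside, the gap between the local 3.5 sigma and the global 1.9 sigma is just the look-elsewhere effect. A back-of-the-envelope sketch (the effective number of trials below is fitted to the two quoted numbers, not taken from the CMS paper):

```python
import math

def sigma_to_p(z):
    """One-sided tail probability of a standard normal at z sigma."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def p_to_sigma(p):
    """Inverse of sigma_to_p via bisection (stdlib only)."""
    lo, hi = 0.0, 10.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if sigma_to_p(mid) > p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

p_local = sigma_to_p(3.5)   # ~2.3e-4

# With N effectively independent places where a fluctuation could show
# up, p_global ~= 1 - (1 - p_local)^N.  Find the N that dilutes the
# 3.5 sigma local significance down to the quoted ~1.9 sigma global one.
N = 1
while p_to_sigma(1.0 - (1.0 - p_local) ** N) > 1.9:
    N += 1
print(N)   # an effective trials factor of order a hundred
```

The real CMS number is computed from the actual search region in \(m_A,\tan\beta\), but the order of magnitude is what matters for intuition.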
But it's still signal-like which is why there exist good reasons to think that it could be a sign of a new Higgs boson, too.<br /><br />That excess could be due to the same new physics as this similar <a href="https://arxiv.org/pdf/1802.03158.pdf#page=20">L200 signal region of ATLAS</a> in a similar search (3rd line from the bottom on that page).<br /><br />An isolated 1.9-sigma excess is also found in a bin on <a href="http://cds.cern.ch/record/2668677/files/HIG-18-008-pas.pdf#page=7">Figure 2b</a> in a different paper.<br /><br />Some huge, over 5 sigma, <a href="http://cds.cern.ch/record/2668754/files/BPH-18-005-pas.pdf">discrepancies were found for the \(J/\psi\) physics</a> but I don't understand it well and it seems too messy, not a reason for new physics.Luboš Motlhttp://www.blogger.com/profile/17487263983247488359noreply@blogger.com0tag:blogger.com,1999:blog-8666091.post-91526628977623489582019-03-22T08:13:00.000+01:002019-03-22T09:24:02.642+01:00A scalar weak gravity conjecture seems powerful<b>Stringy quantum gravity may be predicting an \(r=0.07\) BICEP triumph</b><br /><br />Many topics in theoretical physics seem frustratingly understudied to me but one of those that are doing great is the Weak Gravity Conjecture (WGC) which is approaching 500 followups at the rate of almost a dozen per month. WGC hasn't ever been among the most exciting ideas in theoretical physics for me – which is why the activity hasn't been enough to compensate my frustration about the other, silenced topics – but maybe the newest paper has changed this situation, at least a little bit.<br /><br /><iframe width="407" height="277" src="https://www.youtube.com/embed/Ifimsaport8" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe><br /><br /><em>Nightingales of Madrid by Waldemar Matuška. 
Lidl CZ goes through the <a href="https://www.lidl-flyer.com/3f701db6-43d7-11e9-917e-005056ab0fb6/locale/cs-CZ/view/flyer/page/1?_ga=2.20226929.852735331.1553242434-2067842266.1517309103" rel="nofollow">Spanish week</a> now.</em><br /><br />Eduardo Gonzalo and Luis E. Ibáñez (Zeman should negotiate with the Spanish king and conclude that our ň and their ñ may be considered the same letter! Well, the name should also be spelled Ibáněz then but I don't want to fix too many small mistakes made by our Spanish friends) just released:<br /><blockquote><a href="https://arxiv.org/abs/1903.08878">A Strong Scalar Weak Gravity Conjecture and Some Implications</a><br /></blockquote>and it seems like a strong cup of tea to me, indeed. The normal WGC notices that the electron-electron electric force is some \(10^{44}\) times stronger than their attractive gravity and figures out that this is a general feature of all consistent quantum gravity (string/M/F-theory) vacua. This fact may be justified by tons of stringy examples, by the consistency arguments dealing with the stability of near-extremal black holes, by the ban on "almost global symmetries" in gravity which you get by adjusting the gauge coupling to too small values, and other arguments. <br /><br />Other authors have linked the inequality to the Cosmic Censorship Conjecture by Penrose (they're almost the same thing in some contexts), to other swampland-type inequalities by Vafa, and other interesting ideas. However, for a single chosen Universe, the statement seems very weak: a couple of inequalities. 
The gravitational constant is smaller than the constant for this electric-like force, another electric-like force, and that's it.<a name='more'></a><br /><br />Yes, this Spanish variation seems to be stronger. First, we now talk about interactions mediated by scalars instead of gauge fields. At some level, this generalization must work. A scalar may be obtained by taking a gauge field component \(A_5\) and compactifying the fifth dimension. If the force mediated by the gauge field was strong, so should be one mediated by the scalar.<br /><br />To make the story short, they decide that the scalar self-interactions must be stronger than gravity as well and decide that an inequality for the scalar potential should hold everywhere, at every damn point of the configuration space\[<br /><br />2(V''')^2 - V'''' \cdot V'' - \frac{(V'')^2}{M_P^2} \geq 0.<br /><br />\] It's some inequality for the 2nd, 3rd, 4th derivatives of the potential. The self-interaction's being strong says that the third derivative should mostly dominate, in some quantitative sense. That's a bit puzzling for the purely quartic interactions. 
For \(A\phi^2+B\phi^4\), the inequality seems violated for \(\phi=0\) because there's a minus sign in front of the fourth derivative term and the "purely second" derivative term, too (the third derivative term vanishes in the middle). Do we really believe that this first textbook example of a QFT is prohibited? Does quantum gravity predict that the Higgs mechanism is unavoidable? And if it does, couldn't this line of reasoning solve even the hierarchy problem in a new way?<br /><br />OK, they decide this is their favorite inequality in two steps: the fourth-derivative term is added a bit later, for some consistency with axions.<br /><br />The very fact that they have this local inequality is quite stunning. In old-fashioned effective field theories, you could think that you may invent almost any potential \(V(\phi)\) and there were no conditions. But now, calculate the left hand side of the inequality above. You get some function and of course it's plausible that it's positive in some intervals and negative in others. It's unlikely that you avoid negative values of the left hand side everywhere. But if it's negative anywhere, this whole potential is banned by the new Spanish Inquisition, I mean the new Spanish condition! Clearly, a large majority of the "truly man-made" potentials are just eliminated.<br /><br />Now, the authors try to find a potential that saturates their inequality. It has two parameters and is the imaginary part of the dilogarithm. It's pretty funny how complicated functions can be obtained just by trying to saturate such a seemingly elementary condition – gravitation is weaker than self-interactions of the scalars – that is turned into equations in the most consistent imaginable way.<br /><br />The potentials they're led to interpolate between asymptotically linear and perhaps asymptotically exponentially decreasing behavior. 
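The worry about the quartic example is easy to check numerically; with the derivatives of \(V = A\phi^2 + B\phi^4\) computed by hand, the left-hand side of the inequality at \(\phi=0\) is manifestly negative:

```python
# Left-hand side of the Gonzalo-Ibanez condition
#   2(V''')^2 - V''''*V'' - (V'')^2 / M_P^2 >= 0
# for V = A*phi^2 + B*phi^4, using the hand-computed derivatives
#   V'' = 2A + 12B*phi^2,   V''' = 24B*phi,   V'''' = 24B.
def lhs(A, B, phi, Mp=1.0):
    v2 = 2.0 * A + 12.0 * B * phi ** 2
    v3 = 24.0 * B * phi
    v4 = 24.0 * B
    return 2.0 * v3 ** 2 - v4 * v2 - v2 ** 2 / Mp ** 2

# At phi = 0 the cubic term vanishes and lhs = -48*A*B - 4*A^2/Mp^2,
# which is negative for any A, B > 0 -- the condition is violated there.
print(lhs(1.0, 1.0, 0.0))   # -52.0
```

So if the condition is taken at face value at every point of field space, the symmetric point of the textbook quartic potential is indeed excluded, which is exactly the puzzle raised above.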
They also derive some swampland conjectures and find a link to the distance swampland conjecture, another somewhat well-known example of Vafa's swampland program.<br /><br />The WGC-like thinking has been used to argue that string/M-theory prohibits "inflation with large excursions of the scalar field". The "large excursion" is basically prohibited in analogy with the "tiny gauge coupling", it's still morally the same inequality. And it's a "weak" inequality in the sense that there's one inequality per Universe.<br /><br />But these Spaniards have a finer resolution and stronger claims – they study the inequalities locally on the configuration space. And in the case of inflation, they actually weaken some statements and say that large excursions of the inflaton are actually allowed if the potential is approximately linear. As you know, I do believe that inflation is probably necessary and almost established in our Universe. But the swampland reasoning has led Vafa and others to almost abandon inflation (and try to replace it with quintessence or something) because the swampland reasoning seemed to prohibit larger-than-Planck-distance excursions of the inflaton. Others were proposing monodromy inflation etc.<br /><br />But these authors have a new loophole: asymptotically linear potentials are OK and allow the inflaton to go far and produce 50-60 \(e\)-foldings. If they were really relevant as potentials of the inflaton, you would have a very predictive theory. In particular, the tensor-to-scalar ratio should be \(r=0.07\) which is still barely allowed but could be discovered soon (or not). Do you remember the fights between BICEP2 and Planck? Planck has pushed BICEP2 to switch to publishing papers saying "we don't see anything" but I still see the primordial gravitational waves in their picture and \(r=0.07\) could explain why I do. 
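The number \(r=0.07\) is just the standard single-field slow-roll result for a linear potential: in units \(M_P=1\), \(\epsilon = (V'/V)^2/2 = 1/(2\phi^2)\) and \(N \approx \phi^2/2\), so \(r = 16\epsilon \approx 4/N\). A trivial numerical check (textbook slow roll, not specific to the paper):

```python
# Slow roll for V = lambda*phi in Planck units (M_P = 1):
#   epsilon = (V'/V)^2 / 2 = 1/(2*phi^2)
#   N       ~= phi^2 / 2   (e-foldings before the end of inflation)
#   =>  r = 16*epsilon ~= 4/N
def r_linear(N_efolds):
    return 4.0 / N_efolds

for N in (50, 57, 60):
    print(N, round(r_linear(N), 3))   # 0.08, 0.07, 0.067
```

So the usual 50-60 e-foldings put the linear potential right at the edge of what the current bounds on the tensor modes allow.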
According to some interpretations, <a href="https://arxiv.org/pdf/1502.01334.pdf">Planck+BICEP2 still hint at</a> \(r=0.06\pm 0.04\), totally consistent with the linear potential. BICEP3 and BICEP Array have been taking data in the past year or two. Do they still see something? Perhaps I should ask: Do they see the tensor modes again? Hasn't the Brian guy who did it for the Nobel Prize given up? Are there others working on it?<br /><br /><img src="https://cosmos-images1.imgix.net/file/spina/photo/5356/img.png?ixlib=rails-2.1.4&auto=format&ch=Width%2CDPR&fit=max&w=835" width=407><br /><br />These new authors also claim that a near-saturation of their inequality naturally produces the spectrum of strings on a circle, with momenta and windings related by T-duality. In the process, they deal with the function \(m^2\sim V''\) and substitute integers into some exponentially reparameterized formulae... Well, I don't really understand this argument, it looks like black magic. Why do they suddenly assume that some of the parameters are integers and these integers label independent states? But maybe even this makes some sense to those who analyze the meaning of the mathematical operations carefully.<br /><br />We often hear about predictivity. The swampland program and the WGC undoubtedly produce some predictions (like "gravity is weak") – it's a reason I was naturally attracted to these things because by my nature, I usually and slightly prefer to disprove and debunk possibilities than to invent new ones – but these predictions have looked rather isolated and weak, a few inequalities or qualitative statements per Universe. But when studied more carefully, there may be tons of new consequences like inequalities that hold locally in the configuration space. 
Functions that nearly or completely saturate these conditions are obviously attractive choices of potentials (I finally avoided the adjective "natural" so as not to confuse it with more technical versions of "naturalness").<br /><br />And these functions may have the ability to turn stringy inflation into a truly predictive theory because they would imply the \(r=0.07\) tensor modes. Maybe the WGC is pretty exciting, after all. (Just to be sure, it's been known for a long time that linear potentials produce this tensor-to-scalar ratio.)<br /><br />Even if it is truly exciting, I still compare it to the uncertainty principle. Imagine that you have some inequalities that look like the uncertainty principle for various pairs of variables. Some of these inequalities might be a bit wrong, a bit too weak etc. But you also want to consolidate them (into the general inequality for any two observables) and derive something really sharp and deep, e.g. that the observables have nonzero commutators. (This is not how it happened historically: Heisenberg had the commutators first, in 1925, and the inequality was derived in 1927.)<br /><br />Maybe we're in a similar situation. They're asking the reader whether the WGC is a property of black holes only or of quantum gravity in general. I surely think it's both and the latter is more general. Black holes are just important in quantum gravity – as some extreme and/or generic localized objects (which produce the whole seemingly empty interior and the paradoxes associated with it). But in the end, I do think that the WGC or its descendants should be equivalent even to holography and other things that are not "just" about black holes.<br /><br />Quantum gravity is <em>not</em> quite the same as an effective field theory. And the difference between the two <em>may</em> be very analogous to the difference between classical and quantum physics. 
The WGC and its gradually thickening variations could be the first glimpses of a new understanding of quantum gravity – first glimpses that might hypothetically make the full discovery and understanding unavoidable.Luboš Motlhttp://www.blogger.com/profile/17487263983247488359noreply@blogger.com0tag:blogger.com,1999:blog-8666091.post-51787902366966958222019-03-19T15:08:00.002+01:002019-03-21T07:25:24.109+01:00CMS: 2.4-sigma excess in the last gluino bin, photons+MET<img src="https://ae01.alicdn.com/kf/HTB1KDA.RXXXXXXcXpXXq6xXFXXXI/Gluino-Vampire-Alchemist-human-free-eyes-1-3-bjd-popular-bjd-gift-dolls-resin-figures.jpg_640x640.jpg" width=407><br /><br /><em>Gluino, a vampire alchemist with human eyes</em><br /><br />I just want to have a separate blog post on this seemingly small anomaly. We already saw the preprint one day in advance but the CMS preprint finally appeared on the hep-ex arXiv:<br /><blockquote><a href="https://arxiv.org/abs/1903.07070">Search for supersymmetry in final states with photons and missing transverse momentum in proton-proton collisions at 13 TeV</a><br /></blockquote>OK, they look at events in which two photons are created and seen in the calorimeters, plus the momenta don't seem to add up. The sum of the initial protons' momenta, \(\sum\vec p_i\), seems to differ from that of the final particles, \(\sum \vec p_f\). 
The difference is the "missing transverse momentum" but because such a momentum is carried by particles which must have at least the same energy, it's also referred to as MET, the missing \(E_T\) or missing transverse energy.<a name='more'></a><br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:block; text-align:center;" data-ad-layout="in-article" data-ad-format="fluid" data-ad-client="ca-pub-8768832575723394" data-ad-slot="4218709518"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />OK, CMS has picked the collisions with the qualitatively right final state, photons plus MET, and divided them into bins according to the magnitude of MET. The last bin has MET between \(250\GeV\) and \(350\GeV\). It's very hard to produce such a high missing transverse momentum at the LHC – in the Standard Model, MET is carried by the invisible neutrinos only. And although the protons carry \(13\TeV\) of energy in total, it's divided between many partons on average, and it's unlikely that a neutrino created in the process can steal more than \(0.35\TeV\) of energy for itself.<br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:inline-block;width:336px;height:280px" data-ad-client="ca-pub-8768832575723394" data-ad-slot="0363397257"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />In this last bin of photons+MET, 5.4 events were expected, plus or minus a systematic error of 1.55 events or so. However, a whopping 12 events were observed. If you combine the 1.55 systematic error with the \(\sqrt{5.4}\) statistical error in the Pythagorean way, you get some 2.8 events for the total one-sigma error, and 12 is some 2.4 sigma above the predicted 5.4. Sorry if my calculation is wrong – but if so, it's a mistake I have probably repeated a few times. 
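The back-of-the-envelope arithmetic can be checked in a few lines of Python – a sketch assuming the naive Gaussian combination of a Poisson statistical error with the quoted systematic one (the collaboration's own likelihood treatment is surely more refined):

```python
from math import sqrt, erf

expected = 5.4        # SM prediction in the last MET bin
observed = 12         # events observed by CMS
syst = 1.55           # systematic error quoted above

stat = sqrt(expected)                # naive Poisson statistical error, ~2.3 events
total = sqrt(stat**2 + syst**2)      # Pythagorean combination, ~2.8 events
significance = (observed - expected) / total   # ~2.4 sigma

# one-sided Gaussian p-value of such a fluctuation, roughly 1%
p_value = 0.5 * (1 - erf(significance / sqrt(2)))

print(f"total 1-sigma error: {total:.2f} events")
print(f"significance: {significance:.2f} sigma, p ~ {p_value:.3f}")
```

The one-sided p-value of roughly 1% is the "99% confidence level" usually quoted for a 2.4 sigma excess.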
They seem to say that the error 1.5-1.6 already includes the statistical error and I can't see how that can be true because 1.6 is smaller than the square root of 5.4, about 2.3.<br /><br />A 2.4 sigma excess corresponds to a 99% confidence level. It means that the probability that chance alone would produce an increase from 5.4 to 12 events is just some 1% or so. It means that the odds that it is due to a real signal are about 100 times higher than your prior odds. That's a significant increase. I think it's basically fair to interpret this increase as a reason to increase the visibility of this particular CMS search by a factor of 100 (for those who look for new physics).<br /><br />Some people have learned to react instinctively and easily: if the excess is just 2.4 sigma, it can't be real. I think that's sloppy reasoning. It's <em>right</em> to be affected by 2.4 sigma excesses. In softer sciences, this is more than enough to claim a sure discovery and demand a world revolution that is proven necessary by that discovery. Particle physicists want to be hard-nosed – nothing below 5 sigma counts – but the truth is fuzzy and in between. It's just right to marginally increase one's belief, hopes, or attention that something might be true if there is a 2.4 sigma excess.<br /><br />And this excess is in the last bin – the cutting edge of energy and luminosity. With the full dataset, if things just scale and it's a signal, there could be 50 events instead of the predicted 22, just on CMS. The same could be seen by ATLAS, in total, 100 events instead of 44. Maybe some events would show up above \(350\GeV\). 
If true, it would surely be enough for a combined or even separate discovery of new physics.<br /><br /><iframe align="left" scrolling="no" frameborder="0" style="width:140px;height:245px;" marginheight="0" src="//ws-na.amazon-adsystem.com/widgets/q?ServiceVersion=20070822&OneJS=1&Operation=GetAdHtml&MarketPlace=US&source=ac&ref=tf_til&ad_type=product_link&tracking_id=lubosmotlsref-20&marketplace=amazon®ion=US&placement=0000000000&asins=B009YXFX0G&show_border=false&link_opens_in_new_window=false&price_color=BBBBBB&title_color=FFAA44&bg_color=002211" marginwidth="0"/></iframe>And yes, this new physics looks just damn natural to me. Gluinos, the superpartners of gluons, may appear in this photons+MET channel assuming a popular version of supersymmetry breaking, gauge-mediated supersymmetry breaking (GMSB). In that setup, the lightest supersymmetric particle and therefore the dark matter particle is the gravitino \(\tilde G\), the superpartner of the graviton, whose mass could be just \(1\eV\) or so, which is <a href="https://arxiv.org/abs/0705.0219">nice in cosmology</a>. Its interactions are very weak and the particle is predicted to be invisible to doable direct detection experiments, because the gravitino's interactions are the superpartners of gravitational ones – and gravity is the weakest force, you know – thus easily explaining the null results (and suggesting that these direct search experiments were "wasted money" – such things are unavoidable somewhere in science; science is exploration, not a guaranteed, insured baking of breads).<br /><br />The gravitino could still be unstable, i.e. R-parity could be broken (but the lifetime could be billions of years so the gravitinos are still around as dark matter) – in which case the gravitino decays could be seen in cosmic rays. On top of that, the NLSPs' decays to the gravitino could be seen in colliders. 
If the NLSP is too long-lived, there's a problem with Big Bang Nucleosynthesis.<br /><br /><img src="https://vignette.wikia.nocookie.net/nomanssky/images/8/8c/Gravitino_ball_desc.png/revision/latest?cb=20160820150842" width=407><br /><br /><em>If gravitinos are dark matter, they're probably overpaying for the commodity. See how these <a href="https://www.youtube.com/watch?v=275dxdYFBp4" rel="nofollow">gravitino balls are farmed</a>.</em><br /><br />We also need to assume a neutralino NLSP, the next-to-lightest superpartner, and it's not just any neutralino. It must be close to the bino, the superpartner of the B-boson, the gauge boson for the electroweak \(U(1)\) hypercharge gauge group (the bino is closer to a photino than to a zino under the Weinberg-angle rotations of the bases – correct my mistakes, please). There are many other possibilities but for most of my tenure as an amateur supersymmetry phenomenologist, I have considered this assignment of the LSP and NLSP to be the more natural one despite lacking a clear grasp of which stringy vacua predict it. (Later, I looked at <a href="https://arxiv.org/pdf/hep-th/0512170.pdf">an American-Romanian-Slovak 2005 paper</a> to get an idea that it's possible at all.)<br /><br />Just to be sure, it's different from Kane's \(G_2\) M-theory compactifications, from Nima-et-al. split supersymmetry, and from many other things you've heard. The number of choices for the qualitative character of supersymmetry breaking and for the hierarchy of the superpartner masses is somewhat large.<br /><br />The mass of the gluino indicated by the small excess would be about \(1.9\TeV\). Some squarks could be around \(1.8\TeV\) but detailed numbers are too model-specific because there are many squarks that may be different – and different in many ways.<br /><br />As Edwin and I discussed, Feynman warned against the "last bin" in an experimental paper claiming that the FG theory wasn't FG. 
But I think that he talked about a situation in which the systematic error in the last bin was high – the experimenters lost the ability to measure something precisely due to complications in their gadget. Here, the error in the last bin is already mostly statistical. So one can't add a new bin simply because no events were detected with MET above \(350\GeV\) – zero is too little.<br /><br />In this sense, I think it's right to say the exact opposite of Feynman's sentence here. Feynman said that if the last bin were any good, they would draw another one. Well, I would say that if the last bin were really bad, it would have zero events and they wouldn't include it at all. ;-)<br /><br />There's a significant probability it's an upward fluke, some probability it's a sign of new physics. Neither of them is zero. To eliminate one possibility means to be sloppy or close-minded. This last bin of this channel is arguably one of the most natural bins-of-channels where the superpartners could appear first, and the gradual appearance that grows from 2 sigma to 5 sigma and then higher is the <em>normal way a discovery should proceed</em>. It's no science-fiction speculation. The LHC is struggling to see some events with very high energies of final particles that the LHC is barely capable of producing – it just produces them too rarely. In such a process, it's just rather natural for new physics to gradually appear in the last bin (what is the last bin in one paper may no longer be the last in higher-luminosity papers).<br /><br />The Higgs boson also had some small fluctuations first. At some time in 2011, we discussed a possible bump indicating a \(115\GeV\) Higgs, if you remember. 
It went away and by December 2011, a new bump emerged near \(125\GeV\) and I was <a href="https://motls.blogspot.com/2011/12/higgs-at-124-126-gev-is-sure-thing.html?m=1">certain</a> it was the real one and I was right (I wasn't the only one but the certainty wasn't universal). This last bin may be analogous to the wrong \(115\GeV\) Higgs. But it may also be analogous to the correct \(125\GeV\) Higgs and it may grow.<br /><br />You see that this paper was only published now and uses just 1/4 of the available CMS dataset. It sees a minor fluke. The LHC still has enough not-yet-analyzed data that could produce new physics – although the LHC has already been idle for several months. It's just plain wrong for anyone to say that "the discovery of new physics in the LHC data up to 2018 has already been ruled out".<br /><br /><hr><br /><b>Update, March 21st:</b><br /><br />There is a new <a href="http://cds.cern.ch/record/2668332/files/SUS-18-005-pas.pdf">CMS search for gauge mediation</a> which also has 35.9/fb, includes the diphoton channel above, plus three more channels. One of them is one photon+jets+MET which was already <a href="https://arxiv.org/abs/1707.06193">reported in July 2017</a> when I naturally ignored it because that paper seemed like "clear no signal" in the absence of the diphoton analysis in it. But there's an excess in the (next-to-last) bin with the transverse energy \(450\)-\(600\GeV\), one photon, jets (or "transverse activity"), and missing energy. Instead of 3 expected events, one gets 10.<br /><br />In combination, there are some 2-sigma-ish excesses everywhere, although the diphoton is probably still the most important source of them. The charginos are mostly excluded below \(890\GeV\) but the expected exclusion was \(1080\GeV\). The wino mass \(M_2\) seems to be excluded below \(1100\GeV\) although the expected exclusion was everything below \(1300\GeV\), and so on. 
I finally decided that the degree to which this increases the odds that the CMS is seeing hints of gauge-mediated SUSY breaking (if it is an increase at all, not a decrease) is too small and doesn't justify a full new blog post.Luboš Motlhttp://www.blogger.com/profile/17487263983247488359noreply@blogger.com0tag:blogger.com,1999:blog-8666091.post-60394667943781901092019-03-08T07:07:00.001+01:002019-03-08T07:25:51.968+01:00CERN fires Strumia: the silence is deafeningAfter five months of "investigations" that weren't investigating anything, the vicious, dishonest, and ideologically contaminated individuals who took over CERN have said "good-bye" to Alessandro Strumia, a top particle phenomenologist with <a href="https://scholar.google.com/scholar?q=%22a+strumia%22&hl=en&lr=&btnG=Search" rel="nofollow">38k</a> citations according to Google Scholar and <a href="http://inspirehep.net/search?ln=en&ln=en&p=find+a+strumia%2Ca&of=hcs&action_search=Search&sf=&so=d&rm=&rg=25&sc=0">32k</a> according to Inspire.<br /><br />See e.g. <a href="https://www.bbc.co.uk/news/science-environment-47478537" rel="nofollow">the BBC</a>, <a href="https://www.dailymail.co.uk/sciencetech/article-6782085/CERN-scientist-SACKED-sexism-six-months-saying-physics-built-men.html">The Daily Mail</a>, <a href="https://gizmodo.com/cern-drops-italian-scientist-who-complained-about-women-1833131224">Gizmodo</a>, <a href="https://www.science20.com/tommaso_dorigo/cern_deliberates_against_strumia-236979">Dorigo's despicable defense</a> of the Soviet tactics.<a name='more'></a><br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:block; text-align:center;" data-ad-layout="in-article" data-ad-format="fluid" data-ad-client="ca-pub-8768832575723394" data-ad-slot="4218709518"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />Why were his ties to CERN cut? 
In his quantitative, carefully researched October 2018 CERN talk at a conference dedicated to this very question – the representation of men and women in particle physics and the causes of the patterns – he dared to tell the truth. You may recall some <a href="https://motls.blogspot.com/search?q=Strumia&m=1&by-date=true">TRF blog posts about Strumia</a> since that time.<br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:inline-block;width:336px;height:280px" data-ad-client="ca-pub-8768832575723394" data-ad-slot="0363397257"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />My country has spent about one-half of the 20th century in two totalitarian regimes, the Nazi and the communist one, where people were punished for expressing opinions that were politically inconvenient for the ruling class and its ideology. And I am exactly the kind of person who fought against the later, communist regime, and who would have had serious problems with the previous, Nazi regime as well. <br /><br />My maternal grandfather, academic painter Francis Koliha, was a civilized antagonist of both regimes and in these matters, he's clearly been my role model within the family. Also, both of my uncles – mother's brother and father's brother – ended up in emigration, in Australia and Bavaria, respectively. So I guess it's true that I have some background that increases my sensitivity about these matters. But I do think that every decent person is sensitive about the violation of basic principles that make Western society civilized.<br /><br />Sadly, the aggressiveness with which the current "politically correct" views are being imposed on society is comparable to that of the Nazi and communist times. In some respects, the current situation is better, in others it is worse. 
It is better because the bullies of the present don't have any kind of official control over the whole society. They "only" control most of the information-related industries such as the media, education systems, and social networks. For this reason, this regime isn't really "totalitarian" because they don't have "total" control over all aspects of society and all parts of human lives.<br /><br />On the negative side, the contemporary cultural Marxists seem more fanatical than their Nazi and communist predecessors. Also on the negative side, the theses they defend are much more self-evident lies than the theses that were defended by the two major 20th century totalitarian regimes. The previous regimes were defending an ideology that was mostly composed of preferences and interpretations. After all, whether one likes Jews or capitalists is up to his or her subjective choices.<br /><br />On the other hand, those who would like to dictate the only allowed opinions today deny <em>elementary facts about the world</em>. They deny the differences between sexes, nations, and races. They deny that the new enemies of the people – like the proud white men whose status increasingly resembles that of the Jews in Nazi Germany or capitalists in Soviet Russia – are actually being heavily disadvantaged instead of privileged. The Nazi Germans and Soviets never spread views that were this obviously false.<br /><br />Every day, we are seeing remarkable proof that the claims about the "discrimination against women" are just pure lies. Just yesterday, dozens of news outlets including NYT revealed that <a href="https://tech.economictimes.indiatimes.com/news/internet/google-moves-to-address-wage-equity-and-finds-its-underpaying-many-men/68294614">Google has analyzed whom it was underpaying</a>. Unsurprisingly for us, it found out that men were being underpaid. It wasn't some fringe research or opinion. 
It was the most official analysis of the situation in the company and according to predetermined mechanisms, the company had to take the results into account and increase the salaries of the men – men very similar to James Damore, who was previously fired from Google for reasons similar to those for which Alessandro Strumia was fired from CERN. Everyone who claims that the low percentage of women in HEP is due to "discrimination against women" is a dirty liar because he or she must know it's complete nonsense and the actual discrimination works in the opposite direction (and it's been so for decades) – and some examples of that discrimination, the cases of Damore and Strumia, are truly extreme.<br /><br />Show me the analogous female engineer at Google or particle phenomenologist at CERN who was fired for saying that women were discriminated against. And such a dismissal would be far more justified because her statement would be a lie while Damore's and Strumia's statements were the truth or an accurate enough approximation of it.<br /><br />As I hinted in the title, I am terrified by the silence that surrounds this shocking act. I am terrified that none of the people whom I have known has complained. I could add a hundred names of people who have greatly disappointed me as human beings. They just suck from an ethical perspective; they would probably create the "support at the bottom" for the Holocaust as well if it brought them some advantages, and I pretty much don't want to meet them again. I despise all of you who enable these disgusting things to take place. 
You're just pieces of junk and most of you have done much more to hurt science than you have positively contributed to it.<br /><br />You may fire as many people as you like but you will still be wrong and you will still at least subconsciously know that your statements are lies and your lives are built on a stinky pile of trash.<br /><br />P.S.: Some committee at the University of Pisa has decided that Strumia has violated a "code of conduct" of theirs that apparently says that one should be "nice" in some vague sense. It isn't clear to me whether this conclusion has tangible consequences.Luboš Motlhttp://www.blogger.com/profile/17487263983247488359noreply@blogger.com0tag:blogger.com,1999:blog-8666091.post-29007147214872422762019-03-06T07:21:00.000+01:002019-03-06T07:47:25.948+01:00Young diagram hooks for a fermionic matrix modelThe third hep-th paper in today's listings is very interesting.<br /><blockquote><a href="https://arxiv.org/abs/1903.01628" rel="nofollow">Gauged fermionic matrix quantum mechanics</a><br /></blockquote>First of all, the authors are nice because they are deniers. One of them is a Koch brother, Robert, and the other one is David Berenstein, the long-haired guy who sings "<a href="https://www.youtube.com/watch?v=Vx-t9k7epIk" rel="nofollow">I'm a Denier</a>" along with Al Gore, Michael Mann, and Chicken Little. ;-)<br /><br />OK, more seriously, they study matrix models, which are clearly relevant for full-blown definitions of quantum gravity. 
Lots of descriptions of vacua (or superselection sectors) of quantum gravity are given by \(U(N)\) or \(SU(N)\) gauge theories in various numbers of dimensions – that includes the \(AdS_5\)-like vacua in AdS/CFT and the BFSS matrix theory.<br /><br />Some half-supersymmetric subsets of operators in such theories are fully understood etc.<a name='more'></a><br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:block; text-align:center;" data-ad-layout="in-article" data-ad-format="fluid" data-ad-client="ca-pub-8768832575723394" data-ad-slot="4218709518"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />A non-supersymmetric case of a matrix model is the "old matrix model", just a Hamiltonian with a natural kinetic and potential term written for bosonic degrees of freedom arranged in a Hermitian matrix. One may cleverly solve it, see that the eigenvalues of the matrix effectively behave as particles, and those particles happen to be fermions. That treatment may be shown to be dual to the Liouville theory etc.<br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:inline-block;width:336px;height:280px" data-ad-client="ca-pub-8768832575723394" data-ad-slot="0363397257"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />Koch and Berenstein study the seemingly simpler, fermionic version of that matrix model,\[<br /><br />S = \int dt\,{\rm tr} \zzav { \bar\psi \cdot iD_t \cdot \psi - m\bar\psi \cdot \psi }<br /><br />\] A very simple theory of a massive fermion in 0+1 dimensions extended to a whole matrix of fermions. The solution of this problem for any \(N\) must be sort of analogous to the bosonic case with eigenvalues. 
<br /><br /><iframe align="left" scrolling="no" frameborder="0" style="width:140px;height:245px;" marginheight="0" src="//ws-na.amazon-adsystem.com/widgets/q?ServiceVersion=20070822&OneJS=1&Operation=GetAdHtml&MarketPlace=US&source=ac&ref=tf_til&ad_type=product_link&tracking_id=lubosmotlsref-20&marketplace=amazon®ion=US&placement=0000000000&asins=0817642595&show_border=false&link_opens_in_new_window=false&price_color=BBBBBB&title_color=FFAA44&bg_color=002211" marginwidth="0"/></iframe>OK, there are some gauge-invariant operators. The simplest way to make operators gauge-invariant is to trace a product of fields. Here, there is just one field, \(\psi\), so you may take the trace of the \(n\)-th power of this \(\psi\) (only odd powers are nonzero), or a product of such traces. That's a basis of gauge-invariant operators here.<br /><br />Alternatively, you may think about a single trace only but allow the traces over various representations \(R\) of \(U(N)\). Representations of \(U(N)\) may be labeled by Young diagrams. So there's another nice basis of the gauge-invariant operators whose elements coincide with Young diagrams. This Young diagram basis is really called the "Schur function basis".<br /><br />Koch and Berenstein show that these two bases are actually exactly the same, up to the normalization of each basis vector. The translation is done by hooks in the Young diagrams. A hook is an (upside-down-inverted) L-shaped sequence of boxes. You may divide every diagram into hooks, starting from the upper left corner, see e.g. <a href="https://en.wikipedia.org/wiki/Hook_length_formula">this hooky Wikipedia page</a>.<br /><br /><img src="https://2.bp.blogspot.com/-4FwUO4zg6-U/UhT0aWjgDyI/AAAAAAAACQ4/FF3Qhzg-xEs/s1600/PartitionHooked.jpg"><br /><br />It's surprising that the identities showing that the basis vectors are the same, up to a normalization, are said to be newly discovered. 
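To make the hook bookkeeping concrete, here is a small Python sketch (my illustration, not code from the paper). It computes the hook length of every box of a Young diagram, the number of standard tableaux via the hook length formula, and the principal hooks – the upside-down L's peeled off from the upper left corner. For self-conjugate diagrams, the principal hook lengths are distinct odd numbers, which at least rhymes with the fact that only the odd traces \({\rm tr}\,\psi^n\) are nonzero in the fermionic model:

```python
from math import factorial

def conjugate(lam):
    """Transposed partition, e.g. (3, 2, 1) -> (3, 2, 1) (self-conjugate)."""
    return [sum(1 for part in lam if part > i) for i in range(lam[0])]

def hook_lengths(lam):
    """Hook length (arm + leg + 1) of every box of the diagram."""
    conj = conjugate(lam)
    return [[lam[i] - j + conj[j] - i - 1 for j in range(lam[i])]
            for i in range(len(lam))]

def principal_hooks(lam):
    """Hook lengths of the diagonal boxes -- the peeled-off L-shaped hooks."""
    conj = conjugate(lam)
    return [lam[i] + conj[i] - 2 * i - 1 for i in range(len(lam)) if lam[i] > i]

lam = (3, 2, 1)                       # a self-conjugate Young diagram with 6 boxes
hooks = [h for row in hook_lengths(lam) for h in row]
print(hook_lengths(lam))              # [[5, 3, 1], [3, 1], [1]]
print(principal_hooks(lam))           # [5, 1]: distinct odd numbers summing to 6

# hook length formula: the number of standard Young tableaux of this shape
product = 1
for h in hooks:
    product *= h
print(factorial(sum(lam)) // product)  # 720 // 45 = 16
```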
So the single fermionic matrix model simplifies.<br /><br />For years, I have thought that the proper mastery of many such matrix models, their operators in assorted regimes, and their clever generalizations is important for a complete understanding of a theory of everything – or the most universal possible description of string/M-theory or quantum gravity. Note that the word <a href="https://motls.blogspot.com/search?q=Schur&m=1&by-date=true">Schur</a> appears in 7 TRF blog posts so far, if I include this one. In some of them, I tried to make the Schur bases relevant for the ER-EPR correspondence.<br /><br />There's another hypothetical connection (of mine) to quantum gravity that needs to be refined. Note that Vafa et al. wrote about <a href="https://arxiv.org/abs/hep-th/0309208">quantum Calabi-Yaus</a> and <a href="https://arxiv.org/abs/hep-th/0312022">classical crystals or quantum foam</a> where the partition sum may be written as a sum over generalized, 3D "Young diagrams" labeling topologies of Calabi-Yau three-folds. I believe that there is some generalization involving two-complex-dimensional topologies in which matrix models of the Koch-Berenstein style are dual to a theory in a two-complex-dimensional world volume – a generalized definition of a world sheet.Luboš Motlhttp://www.blogger.com/profile/17487263983247488359noreply@blogger.com0tag:blogger.com,1999:blog-8666091.post-12046760842776001672019-03-04T10:28:00.000+01:002019-03-04T18:45:47.246+01:00One quadrillion standard models in F-theoryI want to pick two papers on the arXiv today. 
In<br /><blockquote><a href="https://arxiv.org/abs/1811.00407">Signatures of supersymmetry and a \(L_\mu−L_\tau\) gauge boson at Belle-II</a>,<br /></blockquote>Banerjee and Roy from the <em>Indian Association for the Cultivation of Science</em> (I couldn't resist writing out this cute name of an institution) point out that <a href="https://en.wikipedia.org/wiki/Belle_II_experiment">Belle-II</a>, a Japanese B-factory experiment that began to take limited data one year ago and has been taking full data since early 2019, may observe a smoking gun for a class of supersymmetric theories that recently looked very intriguing to many physicists, for lots of reasons. <br /><br /><a href="https://en.wikipedia.org/wiki/Maribor"><img src="https://upload.wikimedia.org/wikipedia/commons/thumb/9/90/Maribor_Lent.jpg/400px-Maribor_Lent.jpg"></a><br /><br />These are models with the extra \(L_\mu-L_\tau\) gauge symmetry which may be good enough to explain the masses of the generations of leptons, dark matter, the baryon asymmetry, and the discrepancy in the muon magnetic moment. Belle-II could see the reaction\[<br /><br />e^+ + e^- \to \gamma Z' \to \gamma+\met<br /><br />\] where \(Z'\) is the new gauge boson and the reaction is possible due to its kinetic mixing with the photon. Looking at some of the highest-energy bins, Belle-II could discover the \(Z'\) boson even if it were too heavy to be accessible by the LHC. 
This is an example of a cheaper experiment that could beat a "brute force energy frontier" collider such as the LHC or FCC – but the price you pay is that such reactions are very special and you must hope that a rather particular scenario is picked by Mother Nature, otherwise you see nothing.<a name='more'></a><br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:block; text-align:center;" data-ad-layout="in-article" data-ad-format="fluid" data-ad-client="ca-pub-8768832575723394" data-ad-slot="4218709518"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />Now, a fascinating Slovenian-Upenn-SmallerBoston-Asian hep-th stringy paper. In <br /><blockquote><a href="https://arxiv.org/abs/1903.00009">A Quadrillion Standard Models from F-theory</a><br /></blockquote>Mirjam Cvetič along with Halverson, Lin, Liu, and Tian discover a simple recipe to construct about one quadrillion F-theory (a formally 12-dimensional, deeply geometric description of type IIB string theory) compactifications that automatically produce the three-generation Standard Models with the right spectrum.<br /><br />It seems that, in order not to be beaten by savages on the street – a category that recently welcomed Sheldon Lee Glashow again, who's been hibernating as a string theory hater for over 20 years – Cvetič and collaborators say "exact chiral spectrum of the Standard Model" when they actually mean the "exact chiral spectrum of the Minimal Supersymmetric Standard Model". Well, I trust my fists so I still think it's right to <em>brag</em> about the supersymmetric adjective. 
And yes, they find a way to construct one quadrillion F-theory models, a natural class that automatically produces the right spectrum of the MSSM at low energies.<br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:inline-block;width:336px;height:280px" data-ad-client="ca-pub-8768832575723394" data-ad-slot="0363397257"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />Recall that in the 1980s, people found five seemingly separate superstring theories in 10 spacetime dimensions. Type I, IIA, IIB, heterotic \(E_8\times E_8\), and heterotic \(SO(32)\). In the mid-1990s, it was realized that all of them are connected with each other, and their strong-coupling limits, previously considered to be a mysterious empire full of dragons, became fully understood.<br /><br />In particular, type I and \(SO(32)\) heterotic strings were found to be S-dual to each other. The strong coupling dual of one produces the weak coupling limit of the other. On the other hand, type IIA string theory and \(E_8\times E_8\) heterotic strings produce something new in the strong coupling limit – an eleven-dimensional theory. In the two cases, the new 11th dimension looks like a circle or a line interval with two domain walls carrying the \(E_8\), respectively.<br /><br />The strong coupling limit of type IIB string theory, the last, fifth one, turned out to be the same type IIB string theory. In fact, this type IIB string theory may be visualized as a 12-dimensional theory, F-theory, with two tiny dimensions compactified on the torus \(T^2\). The shape (ratio of sides and the tilting angle) is called the complex structure \(\tau\) and values of \(\tau\) related by the \(SL(2,\ZZ)\) group are geometrically equivalent. 
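Explicitly, the \(SL(2,\ZZ)\) equivalence acts on the complex structure by fractional linear transformations – standard material, recalled here for convenience:\[

\tau\to\frac{a\tau+b}{c\tau+d},\qquad a,b,c,d\in\ZZ,\quad ad-bc=1.

\] The group is generated by \(T:\tau\to\tau+1\), a shift of the RR axion, and \(S:\tau\to-1/\tau\); at zero axion, \(\tau=i/g_s\) and \(S\) maps \(g_s\to 1/g_s\), which is precisely the type IIB self-S-duality.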
In particular, the exchange of the "two sides" of the two-torus or the \(\tau\to-1/\tau\) operation is responsible for the self-S-duality, the equivalence between the weak and strong limits from the previous paragraph. Because \(\tau\) is a complex scalar field in type IIB string theory (the dilaton plus the RR-axion), the shape of the two-torus may be a variable function of the 10 spacetime dimensions.<br /><br />In effect, type IIB vacua with the variable dilaton and axion may be geometrized as "more geometric", 12-dimensional geometries. The 12-dimensional spacetime has toroidal two-dimensional fibers and a 10-dimensional base. Out of these 10 dimensions, 3+1 remain large and most people know them. The remaining 6 are compactified. So F-theory compactifications – a more geometric way to describe type IIB string vacua – are mostly given by an 8-dimensional geometry that may be described as a 2-toroidal fibration over a 6-dimensional base.<br /><br />Note that F-theory stands for Father Theory. If you are a progressive who believes that men and fathers are politically incorrect, please believe me that F-theory actually stands for a Fudged-up Frigid Feminist instead. <br /><br />OK, the full 8-dimensional space – the fibration – has to be a Calabi-Yau manifold (a four-fold where four counts the complex dimensions, which is appropriate because all the geometers doing this stuff are truly complex ones). Fine. Sometimes the 8-dimensional topologies produce the MSSM spectrum. So far, the "large families" found with this intriguing property had below one million elements or so.<br /><br />Cvetič et al. increase this record by a factor of billions. They suddenly construct a quadrillion MSSMs. And I really mean three-generation models with the gauge group \([SU(3)\times SU(2)\times U(1)] / \ZZ_6\).
And yes, this F-theory research is quite a precise and rigorous branch of physics close to mathematics, which is why the folks generally don't overlook the \(\ZZ_6\) quotient. Things must work and do work really precisely here. If the likes of Sheldon Glashow got one million dollars for a single Standard Model, the laws of proportionality imply that Cvetič et al. deserve sextillions of dollars. What are their F-theory geometries? They basically say that all fibrations are good enough and only the base has to obey some condition.<br /><br />The three-fold base \(B_3\) must have a non-rigid anti-canonical irreducible divisor(s) obeying their equation (16), a D3-brane tadpole cancellation condition of a sort\[<br /><br />n_{\rm D3} = 12+ \frac 58 \overline {\mathcal K}^3 -\frac{45}{2 \overline {\mathcal K}^3 }\in \{0,1,2,\dots\}<br /><br />\] It's not even a very constraining equation. It just says that some number must be a non-negative integer. With this single condition, you get the exact spectrum of the MSSM. Isn't it amazing? A subset of these bases are "weak Fano toric threefolds" – please don't assume that I am as familiar with those as e.g. with "beer" – and those are encoded by 3D reflexive polytopes \(\Delta\). 4319 good enough polytopes are known for this construction.<br /><br />Great. Isn't it amazing? The spectrum of the Standard Model or MSSM looks rather ugly, artificial, and we normally describe it by many isolated sentences or many conditions. Cvetič et al. may derive them more or less from a single condition constraining an 8-dimensional geometry. I think it's obvious that it may be rather important – and it could be called progress.<br /><br />The crackpot movement recently attacked Ptolemy, his friends (including Nima Arkani-Hamed), and the epicycles again. I think that only the brainwashed laymen describe "epicycles" as something that should be hated.
Epicycles were an old version of the "Fourier analysis" that described astronomy in the precise, state-of-the-art way from ancient Greece to the era of Kepler. They worked great and in some sense, they were "right" because they only claimed to produce a good enough description, not an "explanation" why the planetary orbits are what they are.<br /><br /><iframe align="left" scrolling="no" frameborder="0" style="width:140px;height:245px;" marginheight="0" src="//ws-na.amazon-adsystem.com/widgets/q?ServiceVersion=20070822&OneJS=1&Operation=GetAdHtml&MarketPlace=US&source=ac&ref=tf_til&ad_type=product_link&tracking_id=lubosmotlsref-20&marketplace=amazon&region=US&placement=0000000000&asins=B00KJLUTQU&show_border=false&link_opens_in_new_window=false&price_color=BBBBBB&title_color=FFAA44&bg_color=002211" marginwidth="0"/></iframe>One might argue that because there's something "more instinctive" about the epicycles relative to the ellipses constrained by Kepler's laws, the epicycles and epicycles on epicycles were an unavoidable chapter in the history of science. All people who have drawn patterns with the <a href="https://www.google.cz/search?q=inspiro+obrazce&um=1&ie=UTF-8&hl=en&tbm=isch&source=og&sa=N&tab=wi&biw=1317&bih=708" rel="nofollow">Inspiro</a>/<a href="https://en.wikipedia.org/wiki/Spirograph">Spirograph</a> as kids (I did) must understand where I am coming from. So yes, I think that people who just mock them are brainwashed morons who don't get it.<br /><br />Nevertheless, Kepler's description with ellipses was progress because it made a real explanation "why those orbits are what they are", an explanation given by Newton's laws, more accessible.<br /><br />But I want to say one more thing. The crackpots love to present the "epicycles and epicycles on epicycles" as something terrible – and I just discussed why this demonization is a sign of one's misunderstanding.
But these organized crackpots also love to say that "epicycles and epicycles on epicycles" are analogous to string theory.<br /><br />Both statements are wrong and they are just totally indefensible insults that don't work at all. In fact, the more sensible analogy would be exactly the opposite one. Epicycles are analogous to quantum field theories with (fields and) operators manually added to the Lagrangian. On the other hand, epicycles were replaced with... Kepler's ellipses. And indeed, the elaborate packages of the Standard Model's operators may also be replaced with... elliptically fibered 8-dimensional manifolds.<br /><br />Even the word "ellipses" appears in both cases. (In the complex geometry jargon, two-tori are called elliptic curves because the elliptic functions appear there, and those had previously appeared in some analyses of ellipses.) Ellipses normally represent progress because they elevate us from some man-made ad hoc description that just happens to be precise enough to a deeper, more geometrically natural, understanding of what's going on – to a deeper construction that has fewer independently moving parts, so to say. The transition from Ptolemy to Kepler seems exactly analogous to the transition from the effective quantum field theories to the geometric structures in F-theory or string theory in general!
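The epicycle–Fourier claim above can be made concrete with a minimal sketch (my own toy code, not any historical model): an "epicycle model" is exactly a truncated complex Fourier series, and stacking more circles on circles systematically shrinks the error with which a non-circular orbit is reproduced.

```python
import cmath, math

# An epicycle model is a truncated complex Fourier series:
# z(t) = sum_n c_n * exp(i*n*t) -- each term is one uniformly rotating circle
# (the deferent, an epicycle, an epicycle on an epicycle, ...).

SAMPLES = 2048

def fourier_coefficient(curve, n):
    """c_n = (1/2pi) * integral_0^{2pi} curve(t) exp(-i n t) dt, as a Riemann sum."""
    return sum(curve(2 * math.pi * k / SAMPLES) * cmath.exp(-2j * math.pi * n * k / SAMPLES)
               for k in range(SAMPLES)) / SAMPLES

def epicycle_position(coeffs, t):
    """Position of the planet carried by the whole stack of rotating circles."""
    return sum(c * cmath.exp(1j * n * t) for n, c in coeffs.items())

# A toy non-circular "planetary path" (an eccentric orbit seen from the center):
orbit = lambda t: cmath.exp(1j * t) / (1 + 0.5 * math.cos(t))

errors = []
for n_circles in (1, 2, 3):
    coeffs = {n: fourier_coefficient(orbit, n) for n in range(-n_circles, n_circles + 1)}
    err = max(abs(epicycle_position(coeffs, 2 * math.pi * k / 100) - orbit(2 * math.pi * k / 100))
              for k in range(100))
    errors.append(err)

print(errors)  # adding circles monotonically shrinks the worst-case error
```

In this sense the ancients were doing legitimate approximation theory: the scheme converges, it just doesn't explain why the coefficients are what they are.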
That's why the intelligent people work hard to think in terms of string theory – and why people saying bad things about string theory are just stupid scum.<br /><br />And that's the memo.Luboš Motlhttp://www.blogger.com/profile/17487263983247488359noreply@blogger.com0tag:blogger.com,1999:blog-8666091.post-43043685463703259102019-02-23T09:22:00.001+01:002019-02-23T15:21:49.722+01:00Rumors of the WIMP miracle's death have been greatly exaggeratedEthan Siegel has shown us another example of the profound difference between careful scientists on one side and zealous activists on the other side (the side where he sadly belongs) when he wrote<br /><blockquote><a href="https://www.forbes.com/sites/startswithabang/2019/02/22/the-wimp-miracle-is-dead-as-dark-matter-experiments-come-up-empty-again/amp/" rel="nofollow">The 'WIMP Miracle' Hope For Dark Matter Is Dead</a><br /></blockquote>The bold statement from his title is repeated very many times in his text:<br /><blockquote>[...] The big hope was for a WIMP miracle, a great prediction of supersymmetry. It’s 2019, and that hope is now dashed. Direct detection experiments have thoroughly ruled out the WIMPs we were hoping for. [...] Theorists can always tweak their models, and have done so many times, pushing the anticipated cross-section down and down as null result after null result rolls in. That’s the worst kind of science you can do, however: simply shifting the goalposts for no physical reason other than your experimental constraints have become more severe. There is no longer any motivation, other than preferring a conclusion that the data rules out, in doing so. 
[...]<br /></blockquote>And to make sure that you won't overlook it, he repeats the thesis that the "WIMP miracle is dead" at several other places.<a name='more'></a><br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:block; text-align:center;" data-ad-layout="in-article" data-ad-format="fluid" data-ad-client="ca-pub-8768832575723394" data-ad-slot="4218709518"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />Well, the theories incorporating the WIMP miracle are more constrained than a decade ago but they're not dead. You know, clickbait writers and activists such as Ethan Siegel love oversimplified statements. They particularly love the notion that they "totally killed" an idea. That's how they become attractive to simple-minded readers.<br /><br />However, in real science, you may only kill an idea by actually falsifying it. And that hasn't been the case for the WIMP miracle.<br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:inline-block;width:336px;height:280px" data-ad-client="ca-pub-8768832575723394" data-ad-slot="0363397257"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />The energy density of the Universe is composed of <br /><ul><li>(almost 70%) dark energy, some non-localized substance that has the negative pressure \(p=-\rho\) which makes its stress energy tensor \(T_{\mu\nu}\) proportional to the metric tensor \(g_{\mu\nu}\), a choice that preserves the local Lorentz symmetry (the cosmological constant is dark energy for which this form of the stress-energy tensor is precise and constant all over the Cosmos and its history)</li><li>(about 25%) dark matter, something we only see through its gravitational effects, e.g.
on the galactic rotation curves, but not electromagnetically; theories claiming that this matter doesn't exist and that the curves are due to modifications of the gravitational force are known as MOND but those theories have faced much bigger trouble than dark matter in recent years</li><li>(about 5%) of the visible, baryonic matter – most of this mass is composed of protons and neutrons and they form atoms which send us electromagnetic radiation so that we see them</li></ul>If we talk about dark matter, the second component exists. It may hypothetically be composed of some small black holes or MACHOs or RAMBOs or SIMPs etc. but the option that still looks like the most likely single choice to most cosmologists is that dark matter is composed of a new type of a particle. One of the possible types of a new particle species that makes up dark matter is an axion; however, I think it is fair to say that the most motivated one is still the WIMP, the Weakly Interacting Massive Particle (the acronym shows the physicists' characteristic playfulness in inventing terminology).<br /><br />A nice "coincidence" that has been known for decades – the WIMP miracle – is that if this particle has a mass around \(100\GeV\) to \(1\TeV\) or so, then the thermal evolution in cosmology leads to the prediction that the current abundance of the particle should be almost exactly high enough to account for the observed 25% of the energy density of the Universe. What's nice about this mass is that it is also the right mass for the particle to explain the lightness of the Higgs boson – the mass is close to the Higgs mass itself. So this new particle could explain both the lightness of the Higgs as well as the abundance of dark matter. When "one product" is capable of solving "two problems" at the same moment, it looks like a good product, you surely understand why. The mass of the hypothetical particle may be estimated in two different ways that approximately agree.
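The order-of-magnitude content of the "miracle" can be sketched with a textbook back-of-the-envelope calculation (entirely mine, not tied to any specific model): the standard freeze-out relation \(\Omega h^2 \sim 3\times 10^{-27}\,{\rm cm^3 s^{-1}}/\langle\sigma v\rangle\), fed with a generic weak-scale annihilation cross section \(\langle\sigma v\rangle \sim \alpha^2/m^2\), where the coupling and mass values below are illustrative choices.

```python
# Back-of-the-envelope "WIMP miracle" check (my own textbook-level sketch).
# Freeze-out relation:  Omega h^2 ~ 3e-27 cm^3/s / <sigma v>,
# weak-scale guess:     <sigma v> ~ alpha^2 / m^2.

HBARC = 1.97327e-14            # cm * GeV
C = 2.99792e10                 # cm / s
GEV_M2_TO_CM3_S = HBARC**2 * C  # converts <sigma v> from GeV^-2 to cm^3/s

alpha = 1e-2                   # a generic weak-ish coupling (illustrative)
m_wimp = 100.0                 # GeV, the mass scale suggested by the Higgs hierarchy

sigma_v = alpha**2 / m_wimp**2 * GEV_M2_TO_CM3_S  # roughly 1e-25 cm^3/s
omega_h2 = 3e-27 / sigma_v

print(omega_h2)  # lands within an order of magnitude of the observed ~0.12
```

The point is that nothing about the weak scale "knew" about cosmology, yet the crude estimate lands in the right ballpark of the observed dark matter density.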
And this nontrivial agreement is evidence supporting the hypothesis.<br /><br />The single most attractive subtype of WIMP is the LSP, the lightest superpartner of a Standard Model particle in theories with supersymmetry. LSPs may be neutralinos, the superpartners of the Higgs, photon, and Z-boson (a superposition of the photino, higgsino, and zino that happens to be the lightest mass eigenstate). Because the number of superpartners is conserved modulo two (R-parity) in many supersymmetric models, the single lightest superpartner can't decay to anything lighter while preserving the R-parity, and is therefore stable. (You may generalize this clever idea and invent other new particles independent of SUSY with symmetries or charges that are only analogous to the R-parity but that also make the lightest particle carrying the new charge stable.)<br /><br />In the recent decade, two classes of experiments have made the picture marginally less likely than before. The LHC hasn't found any new physics, which shows that the most straightforward or naive assumptions about naturalness don't work (and some naturalness needs to be assumed for the WIMP miracle to exist); and the most expensive direct search experiments for the dark matter (XENON, LUX, and others) haven't found a WIMP, either, although they have already reached a sensitivity that excludes quite a fraction of the simplest vanilla models involving the WIMP miracle.<br /><br />The spaces of possible "theories including the numbers" – the parameter spaces – have been constrained. It is impossible to tell you any precise percentage of the spaces that have been eliminated.
It may be 10%, 50%, 90%, or something else, depending on your chosen measure on the spaces, assumptions about the allowed degree of fine-tuning, about the deviations from the thermal history of the Universe, and lots of other things.<br /><br />But even if you said that 95% of the parameter space is excluded, it would simply not allow you to defend the statement that the WIMP miracle is "dead" – because an experimental "proof" at the 95% confidence level is still extremely weak (we also call it "two-sigma evidence") and can't be reasonably called a "proof" at all.<br /><br />Now, if you're reading this blog post seriously, I want you to do some "research" in the sense of "comparative literature". Try to read excerpts from a dozen actual technical physicists' articles (the PDF files are found after one or two clicks) that contain "WIMP" or "WIMP miracle", especially the recent ones, and decide whether Ethan Siegel's assertions agree with the literature:<br /><ul><li><a href="http://inspirehep.net/search?ln=en&ln=en&p=find+title+wimp+miracle&of=hb&action_search=Search&sf=&so=d&rm=&rg=25&sc=0">INSPIRE: wimp miracle in the title</a></li><li><a href="http://inspirehep.net/search?ln=en&ln=en&p=find+title+wimp&of=hb&action_search=Search&sf=&so=d&rm=&rg=25&sc=0">INSPIRE: wimp in the title</a></li><li><a href="https://www.google.com/search?q=site:arxiv.org+%22wimp+miracle%22&source=lnt&tbs=qdr:y&sa=X&ved=0ahUKEwiq8arKptHgAhWPwcQBHWAVDpgQpwUIJQ&biw=1359&bih=635" rel="nofollow">arXiv: wimp miracle in the title</a>, 12 recent months</li><li><a href="https://www.google.com/search?sourceid=chrome&ie=UTF-8&q=site%3Aarxiv.org+%22wimp+miracle%22" rel="nofollow">arXiv: wimp miracle in the title</a>, all times</li></ul>You may try many other searches and combinations. Open some PDF files. You will see that they say what I say. Experiments looking for WIMPs are ongoing and new ones are being planned.
Theorists keep on discussing both the WIMP and the WIMP miracle because it's still one of the most intriguing observations that indicate what the dark matter could be composed of – and how heavy it could be.<br /><br />The theories are under pressure, as most of the authors of those papers are aware, but the general concept simply isn't dead.<br /><br />Let me mention some data about the "WIMP miracle" in the title. INSPIRE finds 10 papers with such titles and they were posted in 2009, 2010, 2010, 2011, 2012, 2012, 2012, 2013, 2015, and 2017. The last two papers include modifiers of the "WIMP miracle" – "fraternal" and "of the second kind", respectively. You might say that the frequency of the "WIMP miracle" decreased but physicists generally don't see the question about the validity of the "WIMP miracle" as a settled one simply because it is not settled.<br /><br />When new experiments achieve a greater sensitivity than their predecessors, it's always possible that they will make a discovery – and some physicists are genuinely expecting the new results. Siegel's and other people's assumption that they can't ever find anything is pure faith; it has nothing whatsoever to do with scientific arguments. The experiments may continue to find null results for 100 years but it just isn't known <em>now</em>.<br /><br />Like many other laymen, Siegel just totally misunderstands what great theoretical physics is when he says:<br /><blockquote>Theorists can always tweak their models, and have done so many times, pushing the anticipated cross-section down and down as null result after null result rolls in. That’s the worst kind of science you can do, however: simply shifting the goalposts for no physical reason other than your experimental constraints have become more severe.<br /></blockquote>On the contrary, it's the most <em>correct</em> thing that theorists should do when experiments gradually tighten the limits. The reason is known as "Bayesian inference".
When new experimental data arrive, theorists – and all scientists – must <em>adjust</em> their opinions about the probabilities of various theories, scenarios, and statements. This <em>adjustment</em> is no heresy; it is not unethical. On the contrary, it's absolutely essential because this is what underlies the statement that the truth in science ultimately boils down to empirical facts. "Shifting the goalposts" is just a silly negative synonym for "taking the empirical facts into account" – which is actually a good, essential thing. The new experimental evidence may be "just one physical reason" but it's a very important one (and Siegel's attempt to mock it is silly) because all of science is ultimately governed by the empirical facts.<br /><br />But when the theory is "fuzzy", like the WIMP miracle or naturalness of the Higgs mass, and the experiments are "gradual", then the adjustments of the beliefs are and must be "gradual", too. Anything else would be irrational and unrepresentative of the actual experimental facts!<br /><br /><iframe width="407" height="277" src="https://www.youtube.com/embed/zrzMhU_4m-g" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe><br /><br />In this documentary movie, Ethan Siegel is one of the simpletons who want to "burn the WIMP" because she's made of wood etc.<br /><br />It's exciting for Siegel and his friends to burn the witches and WIMPs but it's not really science. The lady should have been allowed to live – and scientists must be allowed to do research into theories they see as intriguing and/or likely. And theories with some sort of a WIMP miracle are still among the top ones. And believe me, I have absolutely no horse in this race. Even on this blog, you find "WIMP miracle" at roughly five places and you may see that "I don't really have a strong belief". And axion dark matter is almost as OK for me as a WIMP.
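To return to the Bayesian point a few paragraphs above: the gradual adjustment may be sketched as a toy calculation (entirely my own illustration, with made-up priors and a crudely idealized likelihood). Spread half the prior over WIMP models labeled by their direct-detection cross section and half on "no WIMP at all"; each null result then chops off the models above the new limit, and the probability of "some WIMP" decreases gradually but never reaches zero.

```python
# Toy Bayesian updating after null results (my illustration, not a real analysis).
# Models: WIMPs labeled by log10(sigma/cm^2), plus a discrete "no wimp" alternative.

log_sigmas = [x / 10.0 for x in range(-470, -400)]  # log10 sigma from -47.0 to -40.1
prior_wimp = 0.5                                    # half the prior mass on "some WIMP"
weights = {s: prior_wimp / len(log_sigmas) for s in log_sigmas}
weights["no wimp"] = 1.0 - prior_wimp

def update(weights, limit_log_sigma):
    """A null result kills (to first approximation) models above the new limit."""
    posterior = {}
    for model, w in weights.items():
        alive = (model == "no wimp") or (model <= limit_log_sigma)
        posterior[model] = w if alive else 0.0
    norm = sum(posterior.values())
    return {m: w / norm for m, w in posterior.items()}

# Tighter and tighter experimental limits:
for limit in (-43.0, -44.0, -45.0):
    weights = update(weights, limit)
    p_wimp = 1.0 - weights["no wimp"]
    print(limit, round(p_wimp, 3))  # P(some WIMP) falls gradually, never to zero
```

Each update is forced by the data, but none of them – nor all of them together – justifies rounding the surviving probability down to zero.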
But I also know that Siegel's black-and-white statements are untrue.<br /><br />Siegel may be the kind of guy who loves simple statements himself – and simple-minded conclusions. But not all insights in physics are simple – if it were the case, a <em>simple</em>ton would be the ideal person to make the discovery. Instead, many discoveries that physics has to make are really new – and the ideal person who may do them is a Newton, not Simpleton. ;-)<br /><br />Unfortunately, the gap between activists pretending to be the "voices of science" or "communicators between the public and science" – such as Siegel – and the actual science is growing deeper. Many of these people have turned into nothing less than full-blown haters of the scientific research whose own methodology is much closer to the most self-evident examples of fraud than to proper research based on scientific integrity.<br /><br />Be ashamed, Siegel.<br /><br />I would also like to point out that these frantic activists' efforts to "burn the witches" or "declare the WIMP miracle dead" are seen in many other contexts. The movement to ban the fossil fuels is an example. As a source of energy, fossil fuels have faced some competitors whose importance has been growing in recent years (and for the same reason, the relative importance of fossil fuels has been decreasing, although the statement must be interpreted very carefully, especially in the wake of the fracking revolution) but they still remain an important source of energy, almost certainly the most important one for mankind, just like the WIMP miracle probably remains the most important "numerical clue" in all of the research of dark matter. Needless to say, there's a funny difference between the people who want to "burn the witch" and those who fight against fossil fuels. The haters of the fossil fuels actually don't want to "burn the fossil fuels" at all!
;-) Their hatred is manifested by their desire to <em>ban</em> the burning of the fossil fuels. Too bad that the witches didn't face similar haters; they could have survived. ;-)<br /><br />The opponents may perhaps (and in some cases) correctly observe some "negative trend" for the single most important paradigm or technology – fossil fuels or the WIMP – and they're doing "momentum trading", effectively assuming that the value of fossil fuels or the WIMP (and there are many such examples) is zero. But it is not zero; the financial markets couldn't work well just with "momentum traders" (who are really just a destabilizing force), and healthy dark matter science can't do without WIMPs just like the modern industrial economy would be devastated without the fossil fuels right now.<br /><br />At any rate, the main point of this blog post is that you just can't replace the scientific qualities, careful analyses, and integrity with premature conclusions or with your decision to be very radical and straightforward. Premature conclusions and radical prejudices are among the greatest enemies of science.Luboš Motlhttp://www.blogger.com/profile/17487263983247488359noreply@blogger.com0tag:blogger.com,1999:blog-8666091.post-39870031087929003872019-02-19T12:39:00.000+01:002019-02-19T12:55:08.746+01:00Henry Tye & pals: fermion masses from anti-naturalness of string theoryBefore I discuss the cute new paper by Tye et al., I must mention a SUSY paper<br /><blockquote><a href="https://arxiv.org/abs/1902.06650">Low-energy lepton physics in the MRSSM: \((g−2)_\mu\), \(\mu\to e\gamma\), and \(\mu\to e\) conversion</a><br /></blockquote>by Kotlarski, Stöckinger, and Stöckinger-Kim – a Polish, German, and German-Korean ;-) trio. They revisit the 2007 Kribs-Poppitz-Weiner model known as MRSSM.
The acronym stands for the same thing as MSSM with the extra "R-symmetric" inserted in between.<br /><br />There's an extra \(U(1)\) R-symmetry in the model: the Standard Model particles are neutral under it but the superpartners are charged. The model also contains some special new superpartner fields. In this scenario, compressed spectra are assumed so that the LHC bounds aren't violated even though some superpartner masses are below \(200\GeV\). Instead of the LHC, they predict new phenomena to be seen at experiments searching for the direct conversion of muons to electrons such as COMET.<br /><hr><br />Now, the main paper I want to discuss is <br /><blockquote><a href="https://arxiv.org/abs/1902.06608">String Landscape and Fermion Masses</a><br /></blockquote>by Andriolo, Yan Li, and Tye. Henry Tye is of course a brilliant playful man and this paper – building on some previous papers by a similar group – shows that.<a name='more'></a><br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:block; text-align:center;" data-ad-layout="in-article" data-ad-format="fluid" data-ad-client="ca-pub-8768832575723394" data-ad-slot="4218709518"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />I have repeatedly discussed that a scientist who tries to go beyond the well-established theories, at least a little bit, <em>always</em> needs "something like naturalness", a mental framework that tells him what values of parameters labeling the latest established theory may be considered natural. The word "natural" really means "likely" in general, which means that there must be a statistical distribution. No special patterns that are "very unlikely" according to the distribution should appear in Nature. <br /><br />The phenomenologists like the usual naturalness that is based on uniform or nearly uniform distributions.
So if an angle may lie between \(0\) and \(2\pi\), for example, they assume a uniform distribution on that interval. If that's the case, it just shouldn't happen that the angle, such as the QCD theta-angle, is below \(10^{-8}\) or something like that. If it is this small, you had better have an explanation.<br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:inline-block;width:336px;height:280px" data-ad-client="ca-pub-8768832575723394" data-ad-slot="0363397257"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />Tye et al. also have a distribution but in many important cases, it is not uniform at all. Instead, they claim to derive some distributions from a picture of string theory. Such distributions are typically non-uniform, they claim. In fact, the probability distribution \(P(\Lambda)\) for quantities such as the cosmological constant \(\Lambda\) is peaked near \(\Lambda=0\). Well, if that argument is generally true, and I haven't really understood it too well so far, it means that whenever we observe some unnaturally small values of parameters, they're evidence in favor of string theory!<br /><br />That would imply a trade-off. If you were e.g. a nasty bitch who hysterically attacks naturalness, you would strengthen string theory whenever you do it, and vice versa. It's neat – with a rather big "If", of course.<br /><br />OK, I want to encourage some people to read the papers more carefully – and I don't want to reproduce all the cool ideas from the papers. Instead, I want to throw in some motivating excerpts that could make the reading of their papers more intriguing.
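The contrast between the two kinds of priors is easy to quantify with a toy sketch (my own illustration; the specific \(k,l\) values are made up just to exhibit a zero-peaked shape of the kind being discussed). Under a uniform prior on \([0,2\pi]\), an angle below \(10^{-8}\) is a billion-to-one fluke; under a distribution peaked at zero with a heavy spike, the same observation is utterly unremarkable.

```python
import math

# How surprising is a theta-angle below 1e-8? It depends entirely on the prior.
# Compare a uniform prior on [0, 2*pi] with a zero-peaked prior -- here a
# Weibull CDF F(x) = 1 - exp(-(x/l)^k) with k < 1, purely as an example of the
# peaked-at-zero shape (the values of k and l are my own, for illustration).

theta_bound = 1e-8

# Uniform prior: the probability is just the length ratio.
p_uniform = theta_bound / (2 * math.pi)

# Zero-peaked prior (Weibull with k = 0.25, l = 1):
k, l = 0.25, 1.0
p_peaked = 1 - math.exp(-((theta_bound / l) ** k))

print(p_uniform)             # ~1.6e-9: "unnatural", cries out for an explanation
print(p_peaked)              # ~1e-2: not surprising at all under this prior
print(p_peaked / p_uniform)  # the same observation, millions of times likelier
```

So the whole debate about whether a tiny parameter is a "problem" is really a debate about which measure on the parameter space is the right one.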
They decide to have some arguments claiming that it's natural – likely according to some distributions peaked near zero – that the cosmological constant is "exponentially small" in the Planck units, just like observed.<br /><br />The cosmological constant has been considered in previous papers by Tye et al. In the newest paper, they focus on the fermion masses. Your time may be expensive so let me extract the wonderfully simple and almost childish-looking claim they're making. They're saying that the zeroth-order distribution \(P(m)\) for the masses is also peaked near \(m=0\) but because of some statistical evaluation of the string vacua, the individual fermion masses (the eigenvalues of the mass matrices, I guess) are ultimately independently (the lack of correlation surely makes me suspicious) distributed according to a Weibull distribution, which has the form\[<br /><br />f(m; k,l) = \frac{k}{l} \zav{ \frac{m}{l} }^{k-1} \exp \zzav{ -\zav{\frac{m}{l}}^k }<br /><br />\] That's their equation (3.2). And to make you really breathless, they have determined that the right values of the parameters are \(k=0.269\) and \(l=2.29\GeV\). That's quite some high-precision numerology, indeed! They have extracted these values of the parameters from the simple list of the 6 quark masses and 3 charged lepton masses.<br /><br />What's totally wonderful about this childish distribution is that the quark and charged lepton masses very nicely cover the interval of allowed masses. You may calculate the percentile \(Y\%\) for each quark or charged lepton mass and you will get 14.5%, 17.4%, 34.7%, 57.5%, 69.1%, 95.9% for the quarks, and 19.1%, 58.9%, 85% for the charged leptons. If those aren't percentiles that quasi-uniformly cover the interval from 0% to 100%, nothing is.<br /><br />I have cheated a little bit. The values of \(k,l\) listed above produce the aforementioned percentiles for the six quark masses.
If you want to derive the percentiles for the charged lepton masses, you need to use different values of \(k,l\) for the Weibull distribution (they also propose another possible Ansatz of the distribution). You need \(l=0.164\GeV\) for the charged leptons and... \(k=0.269\). Holy cow, the optimum value of the dimensionless parameter \(k\) is the same for the six quarks and the three charged lepton masses. ;-)<br /><br />How intriguing do I find this agreement between the values of \(k\) extracted for the two groups of fermion masses? It's pretty intriguing, indeed. So while we can't produce the individual values of the 9 masses, we can describe their rather strange distribution by a functional form for the distribution that only depends on 2 parameters. All these claims sound like weird numerology but at some level, I do find it plausible that there is some truth behind these observations. The Standard Model has a relatively high number of elementary particle masses, \(N\gg 1\), which means that there could be an idealized description of their behavior that uses the \(N\to\infty\) "thermodynamic" limit. And in such limits, many things could simplify and many distributions could become peaked, indeed. Assuming some genericity of our vacuum, some statistical properties of the rather large number of fermion masses in our Universe <em>could</em> become predictable, much like things become rather precisely predictable in the thermodynamic limit of statistical physics!<br /><br />So while I still do believe the default assumption that it's silly numerology that cannot work, the possibility that it's no coincidence is attractive, indeed.<br /><br />Note that the observed fermion masses – like their Yukawa couplings – differ from each other by orders of magnitude.
The Weibull distribution of theirs is capable of producing percentiles that are neither too close to 0% nor too close to 100% because \(k\lt 1\) if not \(k\ll 1\) and the distribution therefore allows masses that span many orders of magnitude. In effect, I think that they determine the scale \(l\) roughly as the geometric average of the mass eigenvalues and \(k\) is determined by the "standard deviation" of the number of orders of magnitude by which you should deviate from the mass scale \(l\).<br /><br />Cute. 6+3 masses isn't too much but it's marginally enough to start to determine the values of the parameters \(k,l\) and maybe the question whether the eigenvalues want to repel each other etc.<br /><br />They also apply similar "methods" to the three neutrino mass eigenvalues and remarkably enough, they claim\[<br /><br />\sum_{i=1}^{3} m_{\nu_i} = 0.0592\eV<br /><br />\] with an error margin around 0.1%. Quite a prediction! The sum of the three neutrino masses is almost certainly smaller than \(0.066\eV\), they say, and all these statements pretty much mean that the lightest neutrino mass should be of order \(0.001\eV\), smaller than all the neutrino mass differences, roughly speaking.<br /><br />Again, so far, I think it's unlikely that any of these claims are more than childish numerological games. But Tye is a smart guy. Those things are intriguing.
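Readers who want to check the quark percentiles quoted above can do it in a few lines. A minimal sketch, using the Weibull CDF with the quoted \(k=0.269\), \(l=2.29\GeV\); the input quark masses are my rough PDG-style values (in \({\rm GeV}\)), not numbers taken from the paper.

```python
import math

# Reproducing the quark percentiles quoted above from the Weibull CDF
# F(m) = 1 - exp(-(m/l)^k) with the quoted k = 0.269, l = 2.29 GeV.
# The quark masses are my own rough PDG-style inputs in GeV.

def weibull_cdf(m, k, l):
    """Cumulative probability of a mass below m under the Weibull distribution."""
    return 1.0 - math.exp(-((m / l) ** k))

k, l = 0.269, 2.29
quark_masses = {"up": 0.0022, "down": 0.0047, "strange": 0.095,
                "charm": 1.27, "bottom": 4.18, "top": 173.0}

for name, m in quark_masses.items():
    print(name, round(100 * weibull_cdf(m, k, l), 1))
# This lands close to the quoted 14.5%, 17.4%, 34.7%, 57.5%, 69.1%, 95.9%.
```

The small residual differences come from the input mass values, but the quasi-uniform spread of the percentiles is immediately visible.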
It does seem plausible to me that because the number of parameters and elementary objects in the Standard Model is much greater than one, some statistical features of the distribution of these numerous parameters – such as the lepton masses – could be "calculable".<br /><br />To understand their papers or even go beyond them, you primarily need to understand<br /><ul><li>how these claims about the distributions may be justified by combining string theory with some clever statistical thoughts</li><li>how they're applied to our Universe, given the known parameters, whether the agreement is good, and whether there is anything nontrivial about the fact that you may apparently fit the elephant.</li></ul>If you find some positive arguments of at least one of these two types, they could be evidence that there is some truth behind their shocking-sounding picture.<br /><br />As I have already mentioned, one of the suspicious claims they make is that the distribution of the mass eigenvalues gets factorized – there are no correlations between the individual quark masses. They have some argument in favor of this claim but why should it be the case? Is it "exactly" the case or just in the \(N\gg 1\) limit? There are obviously many questions.<br /><br />At any rate, their paper shows that in the presence of a large landscape of solutions, one's expectations about the physics predicted by string theory depend not only on the uncontroversial calculational techniques of string theory but also on some "philosophical statistical principles or a probabilistic axiomatic system for the vacua". Tye et al. have presented a picture that may be considered a competitor both to the "uniform phenomenologists' naturalness" and to the "generic string vacuum or Douglas' naturalness" – or the really fishy anthropic methods trying to maximize the number of observers. 
All these "statistical additions" to the mental framework for how to think about string theory with its set of solutions are vastly different from each other. But one of them – or another one that hasn't been clearly proposed yet – could be true. At this moment, we don't really know which, if any, is true.Luboš Motlhttp://www.blogger.com/profile/17487263983247488359noreply@blogger.com0tag:blogger.com,1999:blog-8666091.post-53136040988553864892019-02-13T19:49:00.000+01:002019-02-13T20:11:45.472+01:00Matrix theory: objects' entanglement entropy from local curvature tensorI want to mention two papers that were released today. A Czech one and an Armenian one. In the Czech paper,<br /><blockquote><a href="https://arxiv.org/abs/1902.04470">Hierarchy and decoupling</a>,<br /></blockquote>Michal Malinský (senior co-author) and Matěj Hudec (also from a <a href="https://en.mapy.cz/turisticka?x=14.4482230&y=50.1151033&z=17&pano=1&pid=56261113&yaw=1.254&fov=1.257&pitch=-0.505&source=addr&id=8992878&q=V%20Hole%C5%A1ovi%C4%8Dk%C3%A1ch%202%2C%20Praha" rel="nofollow">building</a> where I spent a significant part of my undergrad years) exploit the new relaxed atmosphere in which everyone can write things about naturalness that would have been agreed to be very dumb just a few years ago. ;-) OK, so they don't see a problem with the unnaturalness of the Higgs potential in the Standard Model. <br /><br /><a href="https://www.hmc.edu/about-hmc/wp-content/uploads/sites/2/2013/05/visit-campus-header.jpg" rel="nofollow"><img src="https://www.hmc.edu/about-hmc/wp-content/uploads/sites/2/2013/05/visit-campus-header.jpg" width=407></a><br /><br /><em>Harvey Mudd College, CA</em><br /><br />If they nicely ban all the high-energy parameters and efforts to express physics as their functions, they may apply the perturbation theory to prove things like\[<br /><br />m_H^2 \sim \lambda v^2<br /><br />\] to all orders. 
The Higgs mass is always linked to the Higgs vev and no one can damage this relationship, assuming that you ban all the players that could damage it. ;-) OK, it's nice, I am probably missing something but their claim seems vacuous or circular. Of course if you avoid studying the dependence of the theory on the more fundamental parameters, e.g. the parameters of a quantum field theory expressed relatively to a high energy scale, you won't see a problematic unnatural dependence or fine-tuning. But such a ban of the high-energy independent parameters is tantamount to the denial of reductionism. <br /><br />I believe them that they don't have a psychological problem with naturalness of the Higgs potential but I still have one.<a name='more'></a><br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:block; text-align:center;" data-ad-layout="in-article" data-ad-format="fluid" data-ad-client="ca-pub-8768832575723394" data-ad-slot="4218709518"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />That was a hep-ph paper. On hep-th, I regularly search for the words "string", "entan", "swamp", and "matrix" (although the list is sometimes undergoing revisions), not to overlook some papers whose existence should be known to me. So today, "matrix" and "entan" converged to the same paper by an author whom I am fortunate to know, Vače Sahakian (or Vatche Սահակեան, if you find it more comprehensible):<br /><blockquote><a href="https://arxiv.org/abs/1902.04229">On a new relation between entanglement and geometry from M(atrix) theory</a><br /></blockquote>He has sent the preprint from a muddy college in California which might immediately become one of the interesting places in fundamental physics. 
;-)<br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:inline-block;width:336px;height:280px" data-ad-client="ca-pub-8768832575723394" data-ad-slot="0363397257"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />OK, Vače assumes we have two objects in our beloved BFSS matrix model which are, as the matrix paradigm dictates, described by a block diagonal matrix. The upper left block describes the structure of the first object, the lower right block describes the second object, and the off-diagonal (generally rectangular) blocks are almost zero but these degrees of freedom are responsible for the interactions between the two objects.<br /><br />Vače allows the non-center-of-mass degrees of freedom of both blocks to adjust to the situation, he sort of traces over them, and wants to calculate the entanglement entropy of the center-of-mass degrees of freedom (which are the coefficients in front of the two blocks' identity matrices). He finds out that the von Neumann entropy depends on the derivatives of the gravitational potential, \(\partial_i \partial_j V\). <br /><br />By a process of "covariantization", he translates the gravitational potential and its derivatives to the variables that are more natural in Einstein's general relativity, such as the Riemann tensor, which leads him to a somewhat hypothetical form of the entanglement entropy \[<br /><br />S_{ent} = -\gamma^2 {\rm Tr} \zav { \frac{{\mathcal R}^2}{4} \ln \frac{{\mathcal R}^2}{4} }<br /><br />\] which is finite, concise, and elegant. 
Here, the \({\mathcal R}\) object is the Riemann tensor contracted with some expressions (partly involving matrices) that are either necessary for kinematic or geometric reasons or because of the embedding into the matrix model.<br /><br />Aside from the finiteness, conciseness, and elegance, I still don't understand why this particular – not quite trivial – form of the result should make us happy or why it should look trustworthy or confirm some expectations that may be obtained by independent methods. "Something log something" is the usual form of the von Neumann entropy which has terms like \(-p_i\ln p_i\), as you know, but the probabilities should be replaced by a squared Riemann tensor. If it is true, I don't know what it means.<br /><br />In the end, if a result like that were right, it could be possible to determine some entropy of black holes or wormholes or holographic deviations from locality (from the independence of regions) or something like that in Matrix theory but I have no idea why. It may be because I don't have a sufficient intuitive understanding of the entanglement entropy in general. 
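For readers who want the "something log something" structure in their fingers: here is a toy Python computation of the ordinary von Neumann entropy \(S=-{\rm Tr}\,\rho\ln\rho\) for the reduced density matrix of an entangled two-qubit state. This is just the general definition at work, not Sahakian's matrix-model calculation:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return float(-np.sum(evals * np.log(evals)))

# Two-qubit state cos(t)|00> + sin(t)|11>; trace out the second qubit.
t = np.pi / 6
psi = np.zeros(4)
psi[0], psi[3] = np.cos(t), np.sin(t)
rho_full = np.outer(psi, psi)
# Partial trace over qubit B: reshape to (2,2,2,2) and contract B's indices.
rho_A = rho_full.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
S = von_neumann_entropy(rho_A)   # equals -sum p_i ln p_i, p = (cos^2 t, sin^2 t)
print(S)
```

In Sahakian's proposed formula, the role of the probabilities \(p_i\) inside this \(-p\ln p\) structure is played by \({\mathcal R}^2/4\), which is what makes it so unusual.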
At any rate, this is a kind of a combination of Matrix theory and the entanglement-is-glue duality that should be studied by many more people than one Vače Sahakian.<br /><br />Incidentally, after a 7-month-long hiatus, Matt Strassler wrote a <a href="https://profmattstrassler.com/2019/02/13/breaking-a-little-new-ground-at-the-large-hadron-collider/">blog post</a> about their somewhat innovative <a href="https://arxiv.org/abs/1902.04222">search for dimuon resonances</a>.Luboš Motlhttp://www.blogger.com/profile/17487263983247488359noreply@blogger.com0tag:blogger.com,1999:blog-8666091.post-73210005358766646592019-02-07T12:07:00.001+01:002019-02-07T16:13:02.377+01:00Can the FCC tunnel(s) become much thinner?Are you a hardcore theorist who sometimes loves to play the game that he (or she, Ann and Anna) is a game-changing inventor dealing with the practical life issues and construction, nevertheless? I am and I do. ;-)<br /><br />Electric cars with batteries suck because 1 kg of a battery only stores 2% of what 1 kg of petrol does. Recharging is slow and some of these parameters won't get much better. But why don't we add wires to all our highways and switch to personal trolleybuses everywhere? The electric cars could have batteries just for a few miles of being off the grid. What's your objection, grumpy reader? :-)<br /><br /><a href="https://www.skoda.cz/reference/trolejbus-32-tr/?from=prod" rel="nofollow"><img src="https://www.skoda.cz/photo-ct-4301-760-546-.jpg" width=407></a><br /><br /><em>Why don't we fill the land with personal trolleybuses? No batteries, no refueling anymore. <a href="https://www.skoda.cz/reference/" rel="nofollow">The Pilsner model</a> above is only designed for speeds up to 65 kph but it could be improved, I guess.</em><br /><br />Or why don't we have nuclear-powered aircraft? You can invent such ideas and Google search for them. 
You will usually find out that it's been discussed and there are some usual problems that are immediately presented as fatal. For example, the nuclear-powered airplanes suck because the people can't be nicely protected against the radiation.<a name='more'></a><br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:block; text-align:center;" data-ad-layout="in-article" data-ad-format="fluid" data-ad-client="ca-pub-8768832575723394" data-ad-slot="4218709518"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />When I saw the proposals to build the next \(100\TeV\) collider at CERN, the FCC, I was impressed how surprisingly cheap the project is claimed to be (although it's not guaranteed that the final price wouldn't be much higher – it often is). Well, \(100\TeV\) is more than 7 times \(13\TeV\) but €21 billion is less than 5 times $5 billion, the price of the LHC, and it does include the new tunnel which the LHC inherited from LEP for free.<br /><br />And those €21 billion are "cheaper euros" due to some 15 years of inflation – maybe by some 30% – than the "LHC euros". The cost only grows like the square root of the collision energy, it seems! Every person who has at least some relationship to science agrees that even €21 billion is peanuts for the most extreme and far-reaching science experiment that is being built just once in 20 years at the current speed.<br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:inline-block;width:336px;height:280px" data-ad-client="ca-pub-8768832575723394" data-ad-slot="0363397257"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />However, in the decomposition of the expenses to the basic parts, I was rather annoyed by the expensive tunnels. 
Well, those €21 billion are composed of:<br /><blockquote>€5 billion for the 100-kilometer-long tunnel,<br />€4 billion for the lepton collider magnets etc.,<br />€12 billion for the later upgrade, hadron collider magnets.<br /></blockquote>I think it doesn't make sense to be more precise than that because the final numbers can't be estimated too accurately. Great. €5 billion for the tunnels looks like a lot. The percentage of the price that is consumed by the tunnel, the most low-brow part of the project, seems to be going up.<br /><br />In a calculation in my <a href="https://motls.blogspot.com/2019/01/fcc-collider-tunnel-will-elon-musk-save.html?m=1">article about Musk's proposed discount</a> (which is ludicrous because his Boring Company is doing the same as competitors), I saw that by the volume and the proportionality law, using the previous colliders, the new tunnel should only cost €2 billion, not €5 billion. But a part of the increase is explained by inflation. A part may be due to a somewhat thicker tunnel. And the boring costs may grow faster than the general inflation, who knows. Maybe the rocks in the FCC area are less friendly, too.<br /><br />To get higher collision energies, you need a greater curvature radius of the tunnels to keep the particles in the pipes – well, except for the magnets' getting stronger but the improvements have their limits. That implies that the tunnel has to be long. But it could arguably be thinner and therefore cheaper because the boring costs are almost proportional to the volume of the rock (and therefore to the cross section area, assuming a fixed length of the tunnel).<br /><br />The cross section of the FCC tunnel is said to be <a href="https://www.nextbigfuture.com/2019/01/boring-company-could-lower-cost-of-europes-next-generation-particle-colliders.html">23.76 square meters</a>. By saying it equals \(\pi d^2/4\), you will get the diameter \(d=5.5\,{\rm meters}\). Wow, that's a pretty thick tunnel, indeed. 
Is that really needed?<br /><br />Why wouldn't I ask the people behind the FCC? The key people behind a €21 billion project surely don't have anything better to do than to chat with the laymen on Twitter – and I was right. ;-) So I asked:<br /><br /><blockquote class="twitter-tweet" data-lang="en"><p lang="en" dir="ltr"><a href="https://twitter.com/FCC_study?ref_src=twsrc%5Etfw">@FCC_study</a> May I ask a trivial layman's technical question? Why isn't the FCC tunnel thinner, like 1-meter diameter <a href="https://t.co/K0O2HV2Pc4">https://t.co/K0O2HV2Pc4</a> which would be much cheaper, I guess, and still allowed to bring the magnets there, e.g. with some robots?</p>— Luboš Motl (@lumidek) <a href="https://twitter.com/lumidek/status/1093385436175708160?ref_src=twsrc%5Etfw">February 7, 2019</a></blockquote><script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script><br />The hyperlink goes to the Wikipedia page about "microtunneling".<br />The answer arrived almost immediately.<br /><br /><blockquote class="twitter-tweet" data-lang="en"><p lang="en" dir="ltr"><a href="https://twitter.com/lumidek?ref_src=twsrc%5Etfw">@lumidek</a> thank you for this question. As you may suspect there is a number of reasons that we will try to summarize: 1) Magnet design already foresees a diameter of 1.20 m to which you have to add the support and alignment underneath (1/6) <a href="https://t.co/61HN4nlW93">https://t.co/61HN4nlW93</a></p>— FCC study (@FCC_study) <a href="https://twitter.com/FCC_study/status/1093450334016622593?ref_src=twsrc%5Etfw">February 7, 2019</a></blockquote><script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script><br />I surely did suspect that some usual excuses why they can't get below 24 square meters would immediately be thrown at me. But I think that the FCC folks hopefully do suspect that I won't give up this easily! 
;-)<br /><br /><blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">Moreover, the cryogenic lines (QRL) for the superconducting magnets require about one extra meter (2/6)</p>— FCC study (@FCC_study) <a href="https://twitter.com/FCC_study/status/1093450583602798593?ref_src=twsrc%5Etfw">February 7, 2019</a></blockquote><script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script><br />The FCC proponents weren't careless, of course:<br /><br /><blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">There has already been an effort to minimize these parameters in the magnet design and ensure that these magnets can fit in the existing LHC tunnel - as the FCC study also covers the HE-LHC upgrade (3/6)</p>— FCC study (@FCC_study) <a href="https://twitter.com/FCC_study/status/1093450741623255040?ref_src=twsrc%5Etfw">February 7, 2019</a></blockquote><script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script><br />And, to make things worse:<br /><br /><blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">One also needs space to transport/exchange a magnet once the machine is installed (one doesn’t want to take all magnets out in front to replace one broken in the middle) (4/6)</p>— FCC study (@FCC_study) <a href="https://twitter.com/FCC_study/status/1093450796140843008?ref_src=twsrc%5Etfw">February 7, 2019</a></blockquote><script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script><br />We mustn't forget:<br /><br /><blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">pluts other services (cables, cooling pipes)and for ventilation lines as foreseen by safety rules if someone has to go in the tunnel and intervene manually - we may not be able to do everything by robots (5/6)</p>— FCC study (@FCC_study) <a 
href="https://twitter.com/FCC_study/status/1093450965158629376?ref_src=twsrc%5Etfw">February 7, 2019</a></blockquote><script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script><br />Finally:<br /><br /><blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">One can discuss a smaller diameter for the tunnel - and the FCC CDR identifies areas for further R&D - but microtunelling doesn't seem yet to be an effective solution though indeed is an interesting approach (6/6)</p>— FCC study (@FCC_study) <a href="https://twitter.com/FCC_study/status/1093451188530499584?ref_src=twsrc%5Etfw">February 7, 2019</a></blockquote><script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script><br />And some <a href="https://twitter.com/FCC_study/status/1093451407695495170">extra niceties</a> with an offer to explain things by the e-mail.<br /><br />OK, there are clearly some extra "veins" that go through the tunnel, on top of the 1.2-meter-in-diameter cylinder with magnets. But this is a €5 billion tunnel – it might be a good idea to save some of those 23.76 square meters in the cross section, to miniaturize things a little bit, right?<br /><br />We need the main pipe with the particles and magnets; cryogenic lines with another meter of space in diameter; space through which the magnet is transported during installation (to avoid "LIFO" deconstruction of the whole collider during repairs); cables and cooling pipes plus a space for a person to get there. I omitted the extra comments unrelated to the content of the large intestine.<br /><br />What do you think my reaction should be?<br /><br />I did know that there are things on top of the main tube, of course, and one doesn't want to deconstruct the whole collider during repairs. But I think that several thin tunnels could replace the extremely thick one. 
Let's count the square meters that we really need.<br /><br />The main cylinder with the magnets and particles in the middle could be 1 square meter. These magnets could be transported there through another thin tunnel which is another 1 square meter, and these tunnels (and all other tunnels in the plan below) could be fully connected, e.g. along one hundred 50-meter-long segments, one per kilometer of the circumference. On each kilometer, all the magnets in the row would be taken out if one of them had to be repaired.<br /><br />Another 1 square meter is the cryogenic line, another 1 square meter is some wires and extra cooling, and 1 additional square meter is enough for a CERN employee to physically get there. If Elon Musk kindly allowed, the employee would be a British diver who is not a pedo and doesn't suffer from claustrophobia. He could easily climb through a tube of diameter 1 meter. Every 100 meters, there would be some small holes in between all the small tunnels so that he could look or fix the mechanisms that allow things to be moved in between all the tunnels every kilometer.<br /><br /><a href="https://mk0nextbigfuturj5ioe.kinstacdn.com/wp-content/uploads/2019/01/Screen-Shot-2019-01-18-at-9.31.49-AM-730x430.png" rel="nofollow"><img src="https://mk0nextbigfuturj5ioe.kinstacdn.com/wp-content/uploads/2019/01/Screen-Shot-2019-01-18-at-9.31.49-AM-730x430.png" width=407></a><br /><br /><em>Just by eye-balling, don't you agree that at least one-half of the area of the disk-shaped cross section is wasted?</em><br /><br />I tried to be tough and reduce the total cross section from 24 to 5 square meters. I am surely gonna be told that it's too ambitious and impossible. Maybe some merger into two tunnels of the diameter of 2 meters could be better. Maybe we could get to 12 square meters in total. But the price of the tunnel – now tunnels – could still drop by one-half or two billion Euros, I think.<br /><br />Some optimization <em>should be tried</em>. 
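The arithmetic behind this proposal is easy to redo. A sketch assuming that boring costs are roughly proportional to the excavated cross section, as argued earlier; the five 1-square-meter mini-tunnels are the guess from the text, not an engineering figure:

```python
import math

area_fcc = 23.76                       # m^2, the quoted FCC tunnel cross section
d = math.sqrt(4 * area_fcc / math.pi)  # diameter from area = pi d^2 / 4
# -> about 5.5 meters, matching the number quoted above

tunnel_cost = 5.0                      # billion EUR for the 100 km tunnel
# The post's proposed budget: five ~1 m^2 mini-tunnels (magnets, magnet
# transport, cryogenics, cables/cooling, human access):
area_proposed = 5 * 1.0
# If the boring cost scales roughly with the excavated cross section:
cost_proposed = tunnel_cost * area_proposed / area_fcc
print(round(d, 2), round(cost_proposed, 2))
```

Under the naive proportionality, the aggressive 5-square-meter version would cost only about €1 billion, and even the 12-square-meter compromise would stay near €2.5 billion, consistent with the "drop by one-half or two billion" estimate above.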
It's a lot of money. <br /><br />One of the arguments we sometimes quote as the "secondary" benefits of the collider projects is that they encourage the progress in lots of the technologies that are needed to build that huge device. We usually mean the superconducting magnets and other "hi-tech" components. But what about the damn tunnels? They're a century-old technology but some "clever tunnels for the 21st century" which minimize the cross section and allow all things to get to the right places due to some clever enough logistics should be a part of the "secondary progress" ignited by the CERN projects.<br /><br />The kind FCC folks surely feel uneasy about such proposed revisions. But I do think that they should move their aß and try to do some clever optimization of the tunnels' infrastructure because the thickness of the tunnels looks wasteful – for a 100-kilometer-long tunnel whose space isn't really enjoyed by the human inhabitants – and sort of "outdated", if you appreciate that "miniaturization" is one of the trends of the relatively modern progress. 
Maybe as soon as they make the lepton collider €3 billion or 30% cheaper, impressed sponsors will immediately approve the project and the serious work may begin.<br /><br />I also suspect that the dipole magnets themselves and many other things could be thinner than they are as well but I leave this related topic to someone else.<br /><br />And I must add a medium-term shiny accelerator physics vision: the tunnels need to get longer to achieve higher collision energies but there could be an ongoing miniaturization in the thickness of all the tubes, the cross section could keep on shrinking, and the volume of all the tunnels and magnets and therefore the price could stay fixed as the people build ever stronger colliders!<br /><br /><iframe width="407" height="277" src="https://www.youtube.com/embed/i2zIuQqQEzo" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe><br /><br />Off-topic but European and geographically close to the topic: Although Macron has ludicrously declared himself to be one of the Yellow Vests, there had to be some reasons why he didn't like the Italian deputy prime minister's meeting with one of his (Macron's) bosses, leaders of the Yellow Vest movement. <br /><br />So France has recalled its ambassador to Rome. Clearly, after decades of taking credit for the peace on our continent, the European Union isn't helpful in calming the passions. The video above compares the French and Italian forces in the looming Romance war. 
<br /><br />The foes are tied in many respects, in others they are imbalanced, and France has a slightly higher number of advantages, but the result could be uncertain for a long time, especially because France has a big disadvantage of greater internal disagreements right now, I think.Luboš Motlhttp://www.blogger.com/profile/17487263983247488359noreply@blogger.com0tag:blogger.com,1999:blog-8666091.post-6425126853709730072019-02-05T17:28:00.000+01:002019-02-05T17:59:42.323+01:00Realistic fermion masses from D6-branesThe most interesting hep-ph paper today is<br /><blockquote><a href="https://arxiv.org/abs/1902.00983">All Fermion Masses and Mixings in an Intersecting D-brane World</a><br /></blockquote>by Van Mayes of Houston. Well, it's a string phenomenology paper so it's more interesting than a dozen average hep-ph preprints combined. Since my childhood, I wanted to calculate the "constants of Nature". It took some time to understand that one may only calculate the dimensionless ones – those don't depend on a social convention, the choice of units. Mass ratios of elementary particles were the first constants I was obsessed with – even before the fine-structure constant.<br /><br />Well, at the beginning, I also failed to appreciate that the proton wasn't quite elementary so the proton-to-electron mass ratio, \(m_p/m_e\approx 1836.15\), was interesting enough. I figured out it was equal to \(6\pi^5\). Good numerology proves one's passion. 
;-) I still think that the numerical agreement between this simple formula and the measured ratio is rather impressive.<a name='more'></a><br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:block; text-align:center;" data-ad-layout="in-article" data-ad-format="fluid" data-ad-client="ca-pub-8768832575723394" data-ad-slot="4218709518"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />OK, in more adult terms, the Standard Model has some 29 parameters or so. Most of them describe the mass matrices of the quarks and leptons. Wouldn't it be great to calculate them? String theory in principle allows you to calculate all the dimensionless constants encoded in the masses – once you insert a finite number of bits that describe the string compactification, you may calculate all such constants with an unlimited accuracy, at least in principle and after you figure out a calculational framework that really allows you any accuracy in principle.<br /><br /><script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script> <ins class="adsbygoogle" style="display:inline-block;width:336px;height:280px" data-ad-client="ca-pub-8768832575723394" data-ad-slot="0363397257"></ins> <script> (adsbygoogle = window.adsbygoogle || []).push({}); </script><br /><br />String theory's realistic vacua require supersymmetry, for many reasons, and the maximum decompactified number of spacetime dimensions is 10, 11, or (if you promise to undo the decompactification of two soon) 12. 
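Incidentally, the childhood numerology above is easy to check with a few lines of Python:

```python
import math

# The childhood numerology: is 6*pi^5 close to the proton-to-electron
# mass ratio m_p/m_e ~ 1836.15?
numerology = 6 * math.pi ** 5
measured = 1836.15267            # CODATA-style value of m_p/m_e
relative_error = abs(numerology - measured) / measured
print(numerology, relative_error)
```

The agreement is at the level of \(2\times 10^{-5}\), which is exactly why this kind of coincidence is so seductive to a kid – and, of course, proves nothing.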
The realistic classes of vacua in string/M/F-theory include<br /><ul><li>\(E_8 \times E_8\) heterotic strings on a Calabi-Yau three-fold</li><li>their strongly coupled limit, Hořava-Witten M-theory on a line interval times the Calabi-Yau</li><li>M-theory on a singular manifold of \(G_2\) holonomy</li><li>type IIA braneworlds with D6-branes</li><li>F-theory on Calabi-Yau four-folds or perhaps \(Spin(7)\) holonomy manifolds, perhaps with lots of fluxes</li></ul>There are various dualities between these five groups. The second is the strong coupling limit of the first. The \(G_2\) holonomy manifolds of M-theory may be obtained from type IIA string theory with D6-branes – if those branes are replaced by Kaluza-Klein monopoles (which are ultimately smooth vacuum solutions of Einstein's equations). Also, if the \(G_2\) holonomy manifolds or the Calabi-Yau four-folds used for F-theory are written as certain fibrations, one may find a dual heterotic string theory. And there are a few more.<br /><br /><iframe align="left" scrolling="no" frameborder="0" style="width:140px;height:245px;" marginheight="0" src="//ws-na.amazon-adsystem.com/widgets/q?ServiceVersion=20070822&OneJS=1&Operation=GetAdHtml&MarketPlace=US&source=ac&ref=tf_til&ad_type=product_link&tracking_id=lubosmotlsref-20&marketplace=amazon®ion=US&placement=0000000000&asins=0521880327&show_border=false&link_opens_in_new_window=false&price_color=BBBBBB&title_color=FFAA44&bg_color=002211" marginwidth="0"/></iframe>Mayes discusses some developments in type IIA string theory with D6-branes. There's some sense in which I love the five classes above equally. The IIA braneworlds with D6-branes are a great class of semi-realistic string compactifications. 
Incidentally, you may understand those braneworlds rather well from Barton Zwiebach's undergraduate textbook of string theory!<br /><br />Just to be sure, I am not 100% sure that our world has to be described by a string vacuum from at least one of the five classes above (the number of correct classes may be higher than one due to dualities – equivalences across the groups). There may be other classes we have overlooked due to our incomplete understanding of string theory or string theory may be wrong as a theory of our Universe, in principle. But even if the confidence were just in "dozens of percent" that our Universe belongs to one of those groups, I would view it as a moral imperative for a sufficiently intelligent person to dedicate some time to get closer to such a TOE – or to find a viable alternative to string theory (which seems extremely unlikely to me).<br /><br />Mayes uses type IIA string theory on an orbifold of the six-torus, \(T^6/\ZZ_2\times \ZZ_2\). The hidden six dimensions aren't too complicated – they are flat, in fact. You just make all six flat dimensions periodic, using a lattice. That's what a six-torus is. The orbifold means the "division by the group", in this case \(\ZZ_2\times \ZZ_2\): you identify points (or physical configurations, more generally) that are related by the geometric (or generalized) transformations representing the group elements. Here the orbifold group has \(2\times 2 =4\) elements. One of them is the identity element and the other three are "analogous to each other". So although \(\ZZ_2\times \ZZ_2\) looks like a group that is "all about the number two and its powers", there is a triplet hiding underneath it.<br /><br />The group acts on the three complex coordinates labeling the six-torus, \(z_1,z_2,z_3\), by changing signs of two of the three coordinates (or by doing nothing). You may check that these operations are closed under composition and the group is isomorphic to \(\ZZ_2\times \ZZ_2\). 
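The closure check described in the previous paragraph can be done explicitly. A small sketch representing each orbifold group element by its triple of signs acting on \((z_1,z_2,z_3)\):

```python
from itertools import product

# The orbifold group acts on (z1, z2, z3) by flipping the signs of two of
# the three complex coordinates (or doing nothing).  Represent each element
# by its triple of signs:
identity = (1, 1, 1)
g1 = (1, -1, -1)   # flips z2, z3
g2 = (-1, 1, -1)   # flips z1, z3
g3 = (-1, -1, 1)   # flips z1, z2
group = {identity, g1, g2, g3}

def compose(a, b):
    """Composition of two sign-flip operations is entrywise multiplication."""
    return tuple(x * y for x, y in zip(a, b))

# Closure: composing any two elements stays inside the set of four.
assert all(compose(a, b) in group for a, b in product(group, repeat=2))
# Every element squares to the identity, so the group is Z2 x Z2 (the Klein
# four-group); the three nontrivial elements are on an equal footing --
# the "hidden triplet" mentioned above.
assert all(compose(g, g) == identity for g in group)
print(len(group))
```

Note that flipping only one or all three signs would not preserve the holomorphic 3-form, which is why precisely these pairwise flips appear.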
These orbifolds were considered interesting since the mid 1980s, the First Superstring Revolution, and with the D6-branes added, they have been known to be damn promising in phenomenology since 2000 or so. Note that there also exist interesting orbifolds of the torus \(T^6\) that involve the group \(\ZZ_3\) – but the six-torus must split to two-tori defined with the angle of 120 degrees. The angles in Mayes' tori may be arbitrary.<br /><br />Most of the key papers that Mayes uses are about one decade old – papers by Cvetič, Shiu, Uranga; Chen, Li, Mayes, Nanopoulos, and others. Type IIA string theory is great for braneworlds because the fermions and the Higgs doublet emerge really naturally from the branes.<br /><br />Note that D6-branes are filling the 3+1-dimensional spacetime and they have 3 extra dimensions along the compactified directions. Those latter 3 are exactly equal to 1/2 of the number of the compactified dimensions which means that two generic D6-branes intersect at one point of the extra dimensions. The intersection is where some extra fields may live – fields arising from open strings stretched between two different D6-branes.<br /><br />On top of that, the cubic couplings such as the Yukawa couplings may be calculated from "open world sheet instantons", triangular (=topologically a disk) fundamental world sheets stretched between the three intersections where the three fields involved in the cubic coupling live! That's wonderful because such "open world sheet instantons" effects are naturally suppressed with \(\exp(-AT)\) where \(A\) is the area and \(T\) is the string tension. That braneworld has a natural "exponential" explanation why the Yukawa couplings may be very small – and why they may differ by orders of magnitude from each other.<br /><br />In another subfield, pure phenomenology, people have been playing with the fermion mass matrices for some time. 
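The exponential suppression \(\exp(-AT)\) mentioned above deserves a numerical illustration: modest, order-one differences in the worldsheet triangle areas translate into Yukawa couplings separated by many orders of magnitude. The areas below are invented purely for illustration, not taken from Mayes' compactification:

```python
import math

# Toy illustration of worldsheet-instanton suppression: Yukawa ~ exp(-A*T).
T = 1.0                                # string tension in units where it is O(1)
areas = {"top": 0.1, "charm": 6.0, "up": 12.0}   # hypothetical triangle areas
yukawas = {q: math.exp(-a * T) for q, a in areas.items()}
# Order-one area differences yield couplings spread over ~5 orders of
# magnitude -- hierarchies come cheaply in such braneworlds.
print(yukawas)
```

This is the qualitative point: no fine-tuning of the areas is needed to get hierarchical couplings.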
The quark masses were mostly understood by the 1970s – the top quark mass, measured in the mid 1990s, was really the only late addition. On the other hand, the neutrino masses – only seen through neutrino oscillations – have only been measured, with increasing clarity, since the late 1990s or so.<br /><br />By now, the lepton masses plus the (squared) mass differences of the neutrinos and the mixing angles have been measured about as precisely and completely as their quark counterparts. So in the quark sector, you basically need to know the masses of the six quarks – the mass eigenvalues (three upper, three lower quarks) – and the CKM matrix, which depends on four parameters.<br /><br />The story in the lepton sector is almost the same except that the upper and lower quarks are replaced by charged leptons and their neutrinos; the mixing matrix is called the PMNS matrix; there is one "overall" parameter labeling the neutrino masses that is unknown (only the differences of squared masses are known, as I mentioned, because only the differences affect the oscillations, which is how the neutrino mass parameters are being measured – we haven't seen a neutrino in its rest frame yet); and there is a possibility that the neutrino masses aren't really Dirac masses but Majorana masses – in which case their fundamental origin could be inequivalent to that of the quark masses.<br /><br />The neutrino mass-and-mixing matrices have been measured. One can see that the neutrinos are much lighter than the charged leptons and all the quarks. On top of that, they are apparently much more mixed than the quarks. All the angles in the CKM matrix are "rather small". On the other hand, many angles in the PMNS matrix seem to be "very far from zero and from all the multiples of 90 degrees". That means that they're close to things like 45 degrees.<br /><br />OK, a bit quantitatively. 
The CKM matrix is a \(3\times 3\) unitary matrix in \(U(3)\) – it encodes the transformation you have to apply to the 3 upper-type quark mass eigenstates to get the upper \(SU(2)\) partners of the 3 lower-type quark mass eigenstates. Five of the phases may be thrown away by redefining the six phases of the quark mass eigenstates (one of those phases, which rotates all 6 quarks equally, doesn't affect the CKM matrix, so it's one parameter that has to be "subtracted from the subtraction"). It means that out of the 9 parameters of a \(U(3)\) matrix, four are left – basically three real angles of an \(SO(3)\) matrix and one CP-violating "complex angle".<br /><br />It's similar with the neutrinos' PMNS matrix. There's some CKM-like unitary matrix \(U\). A funny observation was that this matrix was close to\[ U_{TB} = \pmatrix{ \sqrt{2/3} & \sqrt{1/3} & 0 \\ -\sqrt{1/6} & \sqrt{1/3} & -\sqrt{1/2} \\ -\sqrt{1/6} & \sqrt{1/3} & \sqrt{1/2} }. \] All the matrix entries are (plus or minus) square roots of small integer multiples of \(1/6\). You may check that it's a unitary matrix: all pairs of rows are orthogonal to each other, all rows have length equal to one, and, as a cross-check, the same two types of conditions hold for the columns and their pairs, too.<br /><br />This Ansatz for the PMNS matrix is very close to the observed one but about a decade ago, this exact form was actually falsified, primarily by seeing that the entry "zero" isn't quite zero. That's when a new transformation involving neutrinos of the 1st and 3rd generations (because the vanishing entry is in the 1st row and 3rd column) was observed for the first time.<br /><br />The matrix \(U_{TB}\) is an intriguing piece of numerology but is there any reason why this form should be the right one (or close to the right one)? The answer is that such reasons were found in the flavor symmetries. There are three generations of quarks and leptons. 
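As for the unitarity check of \(U_{TB}\), a few lines of plain Python suffice – since the matrix is real, unitarity reduces to orthogonality of the rows and columns:

```python
import math

s = math.sqrt
# Tribimaximal mixing matrix: every entry is (plus or minus) the square root
# of a small integer multiple of 1/6.
U_TB = [
    [ s(2/3), s(1/3),  0.0    ],
    [-s(1/6), s(1/3), -s(1/2) ],
    [-s(1/6), s(1/3),  s(1/2) ],
]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Rows are orthonormal: unit length, pairwise orthogonal...
for i in range(3):
    for j in range(3):
        expected = 1.0 if i == j else 0.0
        assert abs(dot(U_TB[i], U_TB[j]) - expected) < 1e-12

# ...and so are the columns (an equivalent statement of unitarity for a
# real matrix, i.e. orthogonality).
cols = list(zip(*U_TB))
for i in range(3):
    for j in range(3):
        expected = 1.0 if i == j else 0.0
        assert abs(dot(cols[i], cols[j]) - expected) < 1e-12

print("U_TB is a real unitary (orthogonal) matrix")
```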
The generations have different masses but there may still be some symmetries acting on the three generations that constrain the form of the mass matrices – in a nontrivial but not "complete" way, so that different eigenvalues are still allowed.<br /><br />This "more serious level of neutrino matrix numerology" has led people to realize that the special form of the unitary matrix above, the "tribimaximal mixing", may be derived from the assumption of flavor symmetries, either \(A_4\) or \(\Delta(27)\), two finite groups. The first is just the group of even permutations of four elements. The second one is more complex and I discussed it in a <a href="https://motls.blogspot.com/2013/07/fermion-masses-from-27-group.html?m=1">similar blog post six years ago</a> and e.g. <a href="https://motls.blogspot.com/2011/06/seeing-d-branes-at-tevatron-and-lhc.html?m=1">this 8-year-old one</a>.<br /><br />Mayes singled out his pet D-braneworld model and argued that it produces a close-to-tribimaximal mixing matrix – which is non-trivial – and that with a suitable choice of the vevs of the many Higgses in the model, all the parameters determining the fermion masses and mixing seem to come out OK, too. He seems to assume many values of the parameters. In the end, I think that he can't calculate a single combination of them from first principles – although I am not sure, maybe he claims that he can.<br /><br />But even if the "nominal" predictive power of his construction is zero, he has done some non-trivial reverse engineering of the fermion mass parameters. The braneworld he has can apparently rather naturally – in some colloquial sense, but maybe also a technical sense – explain the hierarchy between the fermion masses and the nearly maximal mixing of some neutrino species, among other things.<br /><br />There are many qualitative choices one can make while choosing a type IIA D-braneworld. 
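Incidentally, the statement that \(A_4\) is the 12-element group of even permutations of four objects is easy to verify by brute force. The sketch below is generic group theory, nothing specific to any flavor model:

```python
from collections import Counter
from itertools import permutations

def parity(p):
    """Sign of a permutation: +1 for even, -1 for odd (by counting inversions)."""
    inversions = sum(1 for i in range(len(p))
                       for j in range(i + 1, len(p)) if p[i] > p[j])
    return 1 if inversions % 2 == 0 else -1

# A4 = the even permutations of four objects.
A4 = [p for p in permutations(range(4)) if parity(p) == 1]
assert len(A4) == 12  # |A4| = 4!/2 = 12

def compose(p, q):
    """(p o q)(i) = p(q(i))."""
    return tuple(p[q[i]] for i in range(4))

# Closure: A4 is a genuine group, not merely a set of permutations.
assert all(compose(p, q) in A4 for p in A4 for q in A4)

def order(p):
    """Smallest k with p^k = identity."""
    identity, k, q = tuple(range(4)), 1, p
    while q != identity:
        q, k = compose(p, q), k + 1
    return k

# Element structure: one identity, three double transpositions (order 2,
# forming a Klein subgroup), and eight 3-cycles (order 3) -- the triplet
# structure that A4 flavor models exploit for the three generations.
assert Counter(order(p) for p in A4) == Counter({3: 8, 2: 3, 1: 1})
print("A4 verified: 12 elements (1 identity, 3 of order 2, 8 of order 3)")
```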
Ideally, we would want the number of predictions that arise from his model to be greater than the number of choices that had to be made – imagine that both credits and debits are counted in bits or nats. But even if that comparison indicates that he hasn't produced more than he inserted, it's still true that the number of detailed microscopic theories that have a reasonable chance to explain the spectrum of the Standard Model, the approximate values of the masses and/or their hierarchies, and the approximate values of the mixing angles, is extremely limited.<br /><br />Grand unification can do something but it's always limited because grand unified theories still have some parameters. His string compactification ends up being a Pati-Salam theory which is, strictly speaking, not a grand unified theory because the gauge group isn't simple – it has several factors. But his Pati-Salam theory behaves much like a grand unified theory, exhibits gauge coupling unification, and so on. There's also the \(U(1)_{B-L}\) gauge group in it.<br /><br />It seems plausible to me that models like that are so amazingly on the right track that a few weeks or months or years of work by some folks could have a chance to "nearly prove" that the model is actually right – that it predicts something. I think it's just terribly painful for this Earth with more than 7 billion humans to only produce "several" people who work on D-braneworld string phenomenology at this moment (and similarly "several" \(G_2\) holonomy phenomenologists, and analogously with the other three classes – I guess that the F-theory researcher class is most numerous right now), a truly fascinating subfield. Individuals who would like to reduce this number and similar numbers of researchers <em>further</em> are simply animals. I will never consider them full-blown human beings.