## Saturday, February 09, 2013 ... /////

### UV/IR connection: link between God and His inverse

Whether some people like it or not, science has demonstrated that reductionism works. The laws governing the evolution of increasingly complex – and, typically, geometrically larger – objects (laws that are increasingly vague, riddled with uncertainties, errors, and exceptions) may be reduced to the laws describing ever smaller and ever more elementary building blocks (i.e. to laws that are increasingly fundamental, accurate, and universal).

In fact, one may approximately list the disciplines of science in a hierarchical tree in which the arrow ↑ means that the discipline at the beginning of the arrow may be reduced to the discipline at the end of the arrow.

The sequence may look like this:

string theory

↑

beyond-the-Standard-Model quantum field theory

↑

the Standard Model

↑

nuclear and subnuclear physics

↑

atomic physics and chemistry

↑

biochemistry and microbiology

↑

neuroscience

↑

psychology

↑

economics

↑

the science of religions and mass delusions
And so on, and so on. I could make the tree more diverse and add branches. Note that something close to God appears at the end of the list above, so He is highly non-fundamental, despite some people's beliefs. Of course, one could also make God more fundamental and place Him above string theory. But that would be a slightly different God, a more scientific and less compassionate one. ;-)

(Some other people may choose the opposite convention for the arrow; independently of that, they could also sort the entries upside down. But I will keep my arbitrary ordering and use it in many sentences.)

The entries near the top will be called "more fundamental" and the entries near the bottom will be called "less fundamental". You should note that the objects and concepts discussed by a more fundamental discipline are typically shorter – associated with smaller distance scales – than the objects and concepts in the less fundamental disciplines of science.

A conceptual breakthrough in quantum field theory has formalized this rule. In the 1970s, Ken Wilson and others realized that one may use a generalized description of some phenomena in Nature. Biology or theology never plays this role; all these descriptions are quantum field theories (or models of statistical physics that obey rules somewhat similar to those of quantum field theories, at least in this Wilsonian enterprise). But they are so-called "effective quantum field theories" that neglect all the inner structure of objects that are shorter than a distance scale $L$ associated with the effective quantum field theory. So an effective quantum field theory should be OK for a description of phenomena whose typical distance scale is of order $L$ and of those whose typical distance scale is longer than $L$ (the latter phenomena must be derived by some possibly complicated calculation). However, the effective field theory says nothing – and remains agnostic – about everything that happens at distance scales shorter than $L$. The key fact is that it's often possible to formulate laws governing phenomena at long distances that are well-defined (either completely or up to parametrically small errors) despite the uncertainty about many things at shorter distance scales.

An amusing and important fact is that you may consider two effective field theories for the same situations whose values of $L$ are almost but not quite equal: they differ infinitesimally. You may switch from $L$ to a slightly longer $L(1+\varepsilon)$. The effective field theory with the longer value of $L$, i.e. the latter one, may be derived from the former by "integrating out" the degrees of freedom (a partial integration in the Feynman path integral over fields) that are "long enough" to be described by both effective quantum field theories but too short to survive in the latter effective quantum field theory (the one with the longer distance scale).

(The opposite derivation – the derivation of the short-distance theory from the long-distance theory – isn't possible in general because the long-distance theory has "forgotten" some details, especially some new particle species that appear at one value of $L$ or another. In the most general case, "integrating out" is therefore an irreversible process, and the procedures of "integrating out" form a one-way "semigroup" rather than a group – which is a reason why the term "renormalization group" is mathematically misleading but still good enough for a physicist.)

Because we have changed the description (in particular, its characteristic distance scale $L$) infinitesimally, the theory will only change infinitesimally. Unless something special was happening exactly at the scale $L$, the qualitative spectrum (the list of fields and particle species) remains unchanged. However, the parameters of the theory – masses, values of the coupling constants, and other parameters – do change. They only change infinitesimally, by terms proportional to $\varepsilon$. But you may repeat the same step many times, which makes the ratio of the two distance scales finite. The coupling constants then typically depend on $L$ either as power laws or (the classically dimensionless ones) logarithmically. This dependence of $1/g^2$ on $\log L$ is known as the "running of the couplings" and it is very important for calculations of the gauge coupling unification in grand unified theories as well as all other analogous calculations that relate the values of parameters in the natural, short-distance-based theory to the parameters in an effective theory that is directly relevant for "low energies" such as those at the LHC.
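The running of the couplings can be sketched numerically in a few lines. The following toy script is my own illustration, not something from the text: it evolves the three Standard Model inverse fine-structure constants upward from the $Z$ mass using the standard one-loop coefficients (in the GUT normalization) and rounded input values.

```python
import math

# One-loop SM beta coefficients in GUT (SU(5)) normalization for
# U(1)_Y, SU(2)_L, SU(3)_c, and approximate inverse couplings at M_Z.
B = {"U(1)": 41.0 / 10.0, "SU(2)": -19.0 / 6.0, "SU(3)": -7.0}
ALPHA_INV_MZ = {"U(1)": 59.0, "SU(2)": 29.6, "SU(3)": 8.5}
M_Z = 91.19  # GeV

def alpha_inv(group, mu):
    """One-loop running: 1/alpha(mu) = 1/alpha(M_Z) - (b / 2 pi) ln(mu / M_Z)."""
    return ALPHA_INV_MZ[group] - B[group] / (2 * math.pi) * math.log(mu / M_Z)

for mu in (1e3, 1e10, 1e16):  # energy scales in GeV
    print(f"mu = {mu:.0e} GeV:",
          {g: round(alpha_inv(g, mu), 1) for g in B})
```

Running it shows the three inverse couplings approaching one another around $10^{14}$–$10^{16}$ GeV – the near-unification that becomes much more precise in supersymmetric extensions of the Standard Model.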

(Yes, $13\,{\rm TeV}$ at the LHC from 2015 may look like high energy to some people – and that's why particle physics is known as "high energy physics" – but in this classification, it's referred to as "low energy" because it's still closer to the mundane energies we know from our everyday lives than to the high energies such as the Planck energy whose self-evident experimental inaccessibility is sometimes criticized by idiots who just hate the fact that $10^{19}$ is a large number and want to vote this fact and most other facts out of existence.)

But let's stop these possibly complicated comments. My main point is that in quantum field theory, shorter distance scales are the more fundamental ones. The shorter distances you may experimentally probe (which requires ever higher accelerator energies) or the shorter distances you may describe by your effective theories (which requires an ever deeper and more comprehensive knowledge of physical phenomena that have already been seen and those that may only be clearly seen in the future), the closer you are to the most fundamental laws of Nature.

We have to ask: Does this rule always work?

Needless to say, the answer is No according to quantum gravity and/or string theory – which is ultimately the same thing. In quantum field theory, we needed to go an "infinite distance" on the log scale if we wanted to reach the ultimate fundamental laws. But note that the word "infinite" appears in the previous sentence and "infinities" or "singularities" mean that a particular description is incomplete. When you switch to a more well-behaved description, such infinities are replaced by finite numbers. The "infinity" in this paragraph is no exception. The final theory is a finite distance away!

Why is it so? Well, it's true because distances shorter than a particular constant with the units of one meter don't "exist" in the operational physical sense or, to say the least, the geometry at the would-be shorter distances obeys rules that are unfamiliar from our everyday experience accumulated in the world of rather long distances.

John Wheeler's "quantum foam" meme – the last part of the picture above – is a way to imagine how geometry gets modified when we're approaching the distances close to $10^{-35}$ meters, the Planck length. Note that even the topology of the curved space (and spacetime) gets variable. The characteristic size of the tunnels and handles is comparable to one Planck length, too.

However, the picture above is still way too classical. You should imagine that the true picture is more fuzzy, more probabilistic, and the very concept of "distances" becomes somewhat ill-defined and inappropriate to describe everything that is going on. Every classical "picture" you may imagine is clearly way too constraining and naive. Nevertheless, let's see that the proper distances start to "brutally fluctuate" if we focus on distance scales comparable to the Planck length.

We begin with the Einstein-Hilbert action for general relativity, $S = \int \dd^d x\,\frac{R}{16\pi G} \sim \frac{1}{G} \int \dd^d x\,[\partial(\eta+h)]^2$, in $d$ spacetime dimensions. I wrote the metric tensor $g_{\mu\nu}=\eta_{\mu\nu}+h_{\mu\nu}$ as the sum of a background metric – imagine the flat Minkowski metric for the most important example – and a variation $h_{\mu\nu}$. Also, we appreciated that the Ricci scalar $R$ is bilinear in the spacetime derivatives – at least, that's the number of derivatives in the terms that are most important at long distances (and in mildly curved spacetimes). At the end, perhaps after some integration by parts, these leading terms produce the usual bosonic kinetic terms of the form $(\partial h)^2$ that only differ from the Klein-Gordon kinetic term by subtleties involving the contraction of the Lorentz vector indices.

We may now ask: If we consider a line interval whose length is approximately $L$ in the space, how much will the proper length of this interval fluctuate because of the quantum fluctuations in between? We only want an order-of-magnitude estimate. Well, write $h=\sqrt{G}H$ to get rid of the $1/G$ prefactor in the action. The relevant Lagrangian is then simply $(\partial H)^2$ and depends on no dimensionful parameters. Take a Fourier mode of the field $H$ with wavelength $L$. What is the characteristic $\Delta H$?

Because $H$ is dimensionful – it inherited the units of ${\rm length}^{1-d/2}$ from the square root of Newton's constant – and because there's no other dimensionful parameter that $\Delta H$ could depend upon (the action is written without extra dimensionful parameters; we have even eliminated $G$), the dimensional analysis dictates that it must be proportional to the appropriate power of $L$: $\Delta H \sim L^{1-d/2}$. Note that for $d=4$, this scales as $1/L$. By our simple relationship between $h$ and $H$, this translates to $\Delta h\sim L^{1-d/2}G^{1/2}$. In particular, $\Delta h\sim\sqrt{G}/L$ in $d=4$. But $\Delta h$ is nothing else than $\Delta L/L$, the relative uncertainty of the proper length, because $h$ itself is the quantum variation of the metric tensor. And $\sqrt{G}/L$ is nothing else than $L_{\rm Planck}/L$. It means that the relative error in $L$ scales as $1/L$ and it becomes comparable to 100 percent if $L$ itself is comparable to the Planck length!
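A back-of-the-envelope script makes the $d=4$ scaling concrete. This is a toy estimate of my own that takes $\Delta L/L = L_{\rm Planck}/L$ literally, dropping all factors of order one:

```python
# Toy estimate (d = 4): the relative fluctuation of a proper length L is
# Delta L / L ~ L_Planck / L, per the dimensional analysis above.
L_PLANCK = 1.616e-35  # meters

def relative_uncertainty(L):
    """Order-of-magnitude relative uncertainty of a proper length L (meters)."""
    return L_PLANCK / L

# Proton radius, a much shorter hypothetical scale, and the Planck length itself:
for L in (1e-15, 1e-25, L_PLANCK):
    print(f"L = {L:.3e} m  ->  Delta L / L ~ {relative_uncertainty(L):.3e}")
```

For a proton-sized interval the metric fluctuation is a negligible $10^{-20}$; at $L = L_{\rm Planck}$ it reaches 100 percent, which is the punch line of the paragraph above.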

Consequently, you can't really measure distances with a better accuracy than the Planck length. The Heisenberg uncertainty principle won't allow you such a feat. In an operational sense, the proper distances (much) shorter than the Planck length don't really exist. Those comparable to the Planck length marginally exist – they partly exist, partly don't exist. (These comments only apply to proper distances in some "invariant frame" such as the center-of-mass frame. All the people who claim that coordinates – and coordinates are not necessarily proper lengths of anything – can't be expressed with better-than-Planckian precision or that the wavelength of the photon can't be shorter than the Planck length misunderstand this business completely. The wavelength of a photon may be arbitrarily short, of course: you may always make it even shorter by the Doppler shift i.e. by going into a more boosted frame.)

But can't we probe ever shorter distances by paying ever larger amounts of money for ever stronger particle accelerators? Up to a point, yes, we can. But beyond that point, we can't. If we collide two particles (in the LHC case, two protons) with the center-of-mass energy $E$, they (or the quarks inside them) may exchange momentum comparable to $E$, which means that the process is sensitive to virtual particles of mass comparable to $E$, too.

However, if you built a (cosmic-size) collider that would accelerate the particles to $E$ comparable to the Planck energy which is about $10^{19}\,{\rm GeV}$, things would change. You would expect that such a high-energy collider would test physics at distances comparable to the Planck length. However, the energy carried by the protons would be the Planck energy, the corresponding mass via $E=mc^2$ would be the Planck mass, and the Schwarzschild radius for the Planck mass is the Planck length. The Schwarzschild radius for a mass is the radius of a ball such that if you squeeze the mass to this ball (or a smaller one), you will unavoidably create a black hole.

And that's where we stand with two colliding particles whose center-of-mass energy approaches the Planck energy: you start to create Planck-length-sized black holes.

They're really the smallest black holes that may be discussed as "pretty ordinary black holes", marginally so. More precisely, proper black holes should always be larger (or much larger) than the Planck length. One reason is that the Planck-sized black hole evaporates in the Planck time, almost instantly; an even smaller black hole would evaporate in an even shorter time and the required "speeds" to evaporate would exceed the speed of light. It makes no sense to consider an object that would decay more quickly than the time that light needs to traverse it. ;-)

Can you probe distances shorter than the Planck length by increasing the energy of the two protons above the Planck energy? Nope. If you do so, you create an ever larger black hole. The black hole size $R=2GM/c^2$ is an increasing function of the black hole mass and because $E=Mc^2$, a larger energy of the proton translates to a larger mass and a larger black hole. The "inner architecture of matter" will be shielded by an event horizon of an increasing size. Instead of getting to shorter distances, you will be producing increasingly large black holes whose distance resolution will therefore start to drop again. The Planck length is the minimum proper distance you may (marginally) achieve by particle collisions.
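The qualitative shape of this argument – resolution improves with energy until black-hole formation reverses the trend – can be captured in a few lines. This is an order-of-magnitude toy model of my own; all factors of order one are ignored:

```python
import math

HBAR, C, G = 1.055e-34, 2.998e8, 6.674e-11  # SI units
E_PLANCK = math.sqrt(HBAR * C**5 / G)       # ~2e9 J, i.e. ~1.2e19 GeV

def resolution(E):
    """Best distance resolution of a collision at center-of-mass energy E (joules):
    limited by the quantum wavelength hbar*c/E at low energies and by the
    growing black-hole horizon 2*G*E/c^4 at trans-Planckian energies."""
    compton = HBAR * C / E             # shrinks as E grows
    schwarzschild = 2 * G * E / C**4   # grows as E grows
    return max(compton, schwarzschild)

for x in (1e-6, 1.0, 1e6):  # E in units of the Planck energy
    print(f"E = {x:.0e} E_Planck -> resolution ~ {resolution(x * E_PLANCK):.2e} m")
```

The two curves cross near the Planck energy, where the resolution bottoms out around $10^{-35}$ meters; on either side of that point the achievable resolution gets worse, exactly as the text describes.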

An interesting fact is that if you pump an energy that vastly exceeds the Planck energy into the colliding particles, you will create macroscopic black holes. And their curvature radius is comparable to the black hole radius. If the black holes are very large, the curvature radius becomes large and the curvature itself gets tiny. It means that general relativity – the theory that is reliable for low energies and for low curvatures – becomes arbitrarily accurate once again.

That is why the Einstein-Hilbert action (and comparable terms describing other fields and their interactions and optimized for low energies) is good not only for very low energies of the colliding protons but also for very high energies of the colliding protons – because you produce black holes that admit a general relativistic "classical" description again. Only the transition regime, the intermediate range of energies comparable to the Planck energy (and distances comparable to the Planck length, the shortest possible ones) remain "completely unknown" to the classical general relativity. Whenever the center-of-mass energy is either much lower or much higher than the Planck energy, classical general relativity becomes an arbitrarily accurate approximation!

But the intermediate regime which is not calculable by classical tools – because the influence of quantum mechanics on everything is of order 100 percent – is completely crucial because it has to solve a difficult task: it must smoothly interpolate between the two "favorite places of general relativity" where Einstein's 1915 theory loves to apply itself with an impressive accuracy. It must smoothly interpolate between the very low energies and the very high energies. It is at energies comparable to the Planck energy that the bulk of the "quantum gravity phenomena" take place in their full glory.

It's important to notice that the fact that classical general relativity (with a nearly classical background metric) should apply in both extreme regimes – very low energies and very high energies – is a highly nontrivial constraint on all candidate theories of quantum gravity you could propose. For example, you could think that you may build a theory of quantum gravity by starting with a long-distance theory and extending it to shorter distances, in some arbitrary way. However, that wouldn't give you a consistent theory because at very high, trans-Planckian energies its laws would behave according to the whims of this random theory instead of in the correct way, which is the production of ever larger black holes of the right shape.

We may compare the right theory of quantum gravity to a gravitational slingshot. A spaceship coming from the Earth (=behavior of the theory at very low energies) is approaching Jupiter and we want to exploit the largest planet to redirect the spaceship so that it continues to Mars (=behavior of the theory at very high energies). Clearly, the initial velocity has to be very nicely adjusted for the slingshot to end up with the desired outcome. Quantum gravity is analogous. Things at low energies must be very special so that if you extend them to much higher energies, they start to behave in the pre-guaranteed black-hole way rather than an arbitrary wrong way.

When we say "quantum gravity", we usually mean a description of quantum gravity that remains ambiguous or unknown exactly in the transitional regime near the Planck energy. However, whenever we use the otherwise equivalent notion "string/M-theory", we are talking about a much more specific description that can tell us not only what happens at very low energies (negligible gravity etc.) and very high energies (black hole production) but also the Planck-sized energies in between.

For example, M(atrix) theory and the AdS/CFT correspondence provide us with equations that allow us, at least in principle, to calculate the spectrum of black hole microstates for the smallest, Planck-sized black holes, exactly when they're in the process of becoming black holes worthy of Wheeler's name.

Perturbative string theory is a theory in which weakly coupled strings interact via interactions suppressed by the string coupling constant $g\equiv g_{\rm closed}$ (or, in other conventions, by an appropriate power of the string length), which is taken to be much smaller than one if the perturbative expansion is to be well-behaved. It's the first historically known consistent description of quantum gravity, except that Newton's constant goes like $G \sim g_{\rm closed}^2 \cdot l_{\rm string}^8$ in $d=10$, which means that gravity is very weak in the stringy perturbative regime $g_{\rm closed}\to 0$, too. While perturbative string theory allows us to see many new physical phenomena that are unfamiliar from ordinary non-gravitational quantum field theories, it's not quite true that we may use it to see the quantum foam or the extremely fluctuating curved geometry of near-Planckian black holes. Because the strength of gravity is suppressed by the additional small parameter $g_{\rm closed}\ll 1$, the strings in weakly coupled string theory are rather large (and therefore mutually distant) objects whose gravity is rather weak.

Just to be sure, this hierarchy goes away for $g_{\rm closed}\sim{\mathcal O}(1)$ and we have known how to understand string theory in this regime – at least in many vacua – for more than a decade, too.

However, even if you stay in the $g_{\rm closed}\ll 1$ regime where gravity is weak, string theory offers you lots of phenomena that emulate general relativity's interpolation between very low and very high energies. There is a difference, however: the intermediate distance scale is $L_{\rm string}$, which is much longer than the Planck length. They differ by a coefficient such as $L_{\rm Planck}\sim L_{\rm string}\cdot g_{\rm closed}^\alpha$ where the exponent $\alpha$ is positive, which makes the Planck length much shorter than the string length. Weakly coupled, $g_{\rm closed}\ll 1$ string theory "protects" us against the mysterious phenomena near the extreme $L\sim L_{\rm Planck}$ distance scale by surrounding this very short distance scale by a longer one, $L_{\rm string}$, and producing lots of new phenomena at this longer distance scale, the string length.
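For the ten-dimensional case one can even extract the exponent by dimensional analysis: since $G\sim g_{\rm closed}^2 L_{\rm string}^8$ and Newton's constant carries units of ${\rm length}^{d-2}$ in $\hbar=c=1$ units, so that $G\sim L_{\rm Planck}^8$, one gets $\alpha = 2/(d-2) = 1/4$ for $d=10$. A tiny sketch (the helper function name is my own):

```python
# From G ~ g**2 * L_string**8 and G ~ L_Planck**8 in d = 10 (hbar = c = 1),
# L_Planck / L_string = g**(2/(d-2)) = g**(1/4): a positive exponent, so a weak
# coupling g << 1 indeed makes the Planck length much shorter than L_string.
def planck_over_string(g_closed, d=10):
    """Ratio L_Planck / L_string implied by matching the two forms of G."""
    return g_closed ** (2.0 / (d - 2))

for g in (1e-8, 1e-4, 1.0):
    print(f"g = {g:.0e}:  L_Planck / L_string = {planck_over_string(g):.2e}")
```

Even a very small coupling like $g=10^{-8}$ only separates the two scales by two orders of magnitude, because of the mild fourth root.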

And they're very interesting and rich phenomena, indeed. Note that for $g_{\rm closed}\sim 1$, the transition near $L\sim L_{\rm string}$ is really the same thing as the quantum gravitational transition near $L\sim L_{\rm Planck}$. However, let's assume that $g_{\rm closed}\ll 1$ (weakly coupled string theory) and look at the string theory's genes that are responsible for its ability to interpolate between the very low and very high energies (relatively to the string scale).

One of these long-short relationships that are often discussed in popular literature is T-duality. If you compactify one of the spacetime dimensions in perturbative string theory on a circle of radius $R$, you get pretty much the same theory as one where the dimension was compactified on a circle of radius $\tilde R = L_{\rm string}^2 / R$. If you use the stringy units (for distance and other things) where $L_{\rm string}=1$ (you can't simultaneously set $L_{\rm Planck}=1$ because they're different distances, so you may use string units or Planck units but not both!), then you have simply $\tilde R = 1/R$. Let's work in string units in the text below.

(I am neglecting the technicality that by switching to the inverse radius, you must choose a "different kind of string theory" in general. For example, type IIA string theory on a circle of radius $R$ is equivalent to type IIB string theory on a circle of radius $\tilde R$.)

Why are the theories with a large radius $R\gg L_{\rm string}$ and a small radius $\tilde R=1/R\ll L_{\rm string}$ exactly equivalent to each other? Well, strings may have a momentum along the circular dimension of the spacetime, much like point-like particles. The momentum is quantized (for the same reason why $j_z$ is quantized etc.: the wave function has to be single-valued). You have $P=n/R$, just like if the strings were point-like particles.

However, strings may also wind around the circle $w$ times. This gives them an extra (or minimum) length $2\pi R\cdot w$ which also gives an extra/minimum mass to the string (much like the momentum does) – one that is directly proportional to $R$, instead of inversely as $P=n/R$. Well, the interchange of the integer labels $n,w$ for the momentum and the winding, composed with the inversion of the radius $R\to 1/R$, pretty much gives you the same spectrum (list of allowed masses) back. The interactions are invariant under this map, too.

When you try to visualize things, the momentum $n$ (how much the string moves) looks like something entirely different than the winding number $w$ (how many times the string is encircling the spacetime circle). However, the best picture is an equation. And the equations of string theory imply that regardless of your inner fantasies, the physical phenomena will not depend on whether you decide that $n$ is the momentum and $w$ is the winding number or vice versa!

This remarkable map means that if you try to shrink the radius of a circular spacetime dimension beneath the string length, you won't get any new choices. Every $R\lt 1$ may be inverted to get an equivalent radius with $1/R\gt 1$. A "dualities vs singularities" discussion shows that this is true for all extreme regimes in which the independent radii of tori are scaled to zero or infinity at arbitrary rates. All such extreme tori may be mapped to equivalent tori. It's always more natural to pick the equivalent description in which all the radii are large – because dimensions much larger than the string length behave much like we're used to from point-like-particle quantum field theories.

But T-duality is far from being the only example of a phenomenon by which string theory relates substringy distances to ordinary longer-than-stringy distances. An even more characteristic example is the relationship between light and heavy states running in quantum loops. That relationship is, among other things, the explanation of why string theory has no ultraviolet (short-distance) divergences.

In string theory, Feynman diagrams – which may be interpreted as a spacetime history of splitting and joining world lines of point-like particles propagating in the spacetime – are replaced by two-dimensional world sheets, tubes that split or merge much more smoothly. The simplest "loop diagram" in string theory (with closed strings only) is a torus. In a particle-like limit, the torus becomes a thin ring and then a circle – the same circle you may find in 1-loop Feynman diagrams for quantum field theories.

However, the stringy loop diagram, a torus, may be drawn as a rectangle (or a square) with the horizontal (red) edges identified with one another; and the vertical (blue) edges identified with one another. Just like in the Pac-Man PC game, if you cross the upper boundary, you reappear at the bottom and vice versa. If you cross the left boundary, you reappear on the right side and vice versa. As the Czech fans of Pepa Nos know, the Earth is round (and periodic) in all these respects. ;-)

If you want the torus to resemble a history of a point-like particle in quantum field theory, it had better be a thin torus, and the much shorter edge of this torus should be interpreted as the (negligible) spatial dimension of the string while the longer dimension should be interpreted as the (temporal) direction of the Euclideanized time.

Now, there may also be spacetime tori (histories of strings, i.e. world sheets) that are extremely thick, in the opposite limit. But you may draw them as rectangles, rotate the rectangle by 90 degrees, and exchange the interpretations of the two sides of the rectangle! A very thick torus would be the part of the path integral that could contribute UV divergences, but by this 90-degree rotation, you may reinterpret all of them as infrared divergences (which may always be there a priori but they either cancel or have an innocent physical interpretation).

Because string theory contains no new integral over the "ultraviolet" shape of the tori – because the very thick tori are really very thin, just with a rotation by 90 degrees used to reinterpret the history – string theory is automatically free of short-distance divergences. My proof may look sloppy or heuristic to you and it arguably is both sloppy and heuristic. However, it's a description of something you may actually prove rigorously – and it has been proven rigorously.

Due to the fact that the 90-degree rotation is a symmetry of the world sheet (a gauge symmetry – a "large diffeomorphism" – of the world sheet theory, in fact), the path integral over toroidal world sheets, which is a trace (a thermal partition sum), may be interpreted in two ways that have to be equal: ${\rm Tr}\,\exp(-\beta H) = {\rm Tr}\,\exp(-H/\beta)$. This really means that if you invert the temperature $T$ or the inverse temperature $\beta$, you get back the same partition sum (or, more generally, the partition sum of another string theory). So just like the radii shorter than the string length aren't new, the world sheet temperatures above a self-dual value are copies of the very low ones, too. An extremely high temperature is the same thing as an extremely low temperature.

(These are not actual temperatures relevant for the physics in spacetime; these are temperatures for world sheet physics; the inverse temperature here should really be denoted by letters such as $\tau$ and not $\beta$. I don't want to go into these technicalities here.)

A funny fact is that if $\beta$ is very high, only the states with a small value of $H$ (very light string states) contribute to the trace on the left hand side above because all other states are heavily exponentially suppressed. On the other hand, $1/\beta$ is very low in the same case and the right hand side has contributions from lots of excited, high-energy string modes. It primarily counts their number or "entropy".
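A toy model of this $\beta\to 1/\beta$ magic – not the actual string partition sum, which involves the Dedekind eta function, but the simplest sum with the same flavor – is the Jacobi theta inversion $\theta(1/\beta)=\sqrt{\beta}\,\theta(\beta)$. A handful of light terms at large $\beta$ determines the value that, at small $\beta$, counts a dense tower of contributions:

```python
import math

# Jacobi theta inversion: theta(b) = sum_n exp(-pi * n**2 * b) obeys
# theta(1/b) = sqrt(b) * theta(b).  At large b the sum is dominated by a
# few light terms; at small b it effectively counts many "excited" terms.
def theta(b, nmax=200):
    """Truncated Jacobi theta sum; nmax = 200 is plenty for b >= 0.05."""
    return sum(math.exp(-math.pi * n * n * b) for n in range(-nmax, nmax + 1))

for b in (0.1, 0.5, 2.0):
    lhs, rhs = theta(1.0 / b), math.sqrt(b) * theta(b)
    print(f"b = {b}: theta(1/b) = {lhs:.10f},  sqrt(b)*theta(b) = {rhs:.10f}")
```

The two columns agree to ten decimal places, illustrating numerically how a modular-type inversion ties the sparse low-energy data to the dense high-energy counting.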

So the seemingly trivial fact that the rectangle may be rotated by 90 degrees has a remarkable implication for any perturbative string theory: the density of high-mass states in the string spectrum is actually fully determined by the energy and other properties of a few low-mass excitations of the string!

(Similar relationships exist for the cylindrical world sheet. In that case, the two related sides of the story have a different interpretation.)

This fact – the symmetry of the toroidal path integral with respect to the 90-degree rotation – is known as "modular invariance". Classically, the invariance of the path integral over the torus "automatically holds". However, it may have quantum anomalies. They must cancel in a consistent string theory. This requirement constrains the spacetime dimension; it dictates that winding modes (and twisted strings and other boundary-condition sectors) must automatically arise whenever you try to compactify some dimensions (or quotient the spectrum or impose a GSO-like projection); and it implies that the tori on which the heterotic strings' chiral dimensions are compactified must be derived from even self-dual lattices, and so on. It's a very powerful constraint.

Clearly, I got too technical and you need to study string theory properly – e.g. with a textbook – to really understand these issues. The basic toroidal modular invariance is actually enough to prove similar relationships at all loops.

However, I want you not to overlook the fact that these relationships between the light string states and heavy string states and similar UV/IR relationships in string theory are a toy model – a very explicitly understood cousin – of the relationship between the low-energy/high-energy behavior of the theory of quantum gravity; recall the black hole production discussion at the beginning. (A similar UV/IR connection has been observed in quantum field theories on non-commutative geometries, too.)

If you try to imagine that "quantum gravity" is a concept that is much more general than string/M-theory, it must still obey the UV/IR connections in some way except that the explicit "toroidal path integral" of perturbative string theory is replaced by something much more general and perhaps "much less two-dimensional", "much less constructive", and "much less geometric". But it's still true that quantum gravity is a very rare, heavily constrained animal.

And that's the memo.

#### snail feedback (48) :

Ha ha ha, nice picture of the "RG flow" of scientific disciplines, I think I have spotted an attractive fixed point :-P ;-) :-D

I have only read the title and looked at the picture, because I have some "real world things" to do now, but I look forward to reading this whole article later.

Cheers

Very interesting.....Thanks Lubos

So, quantum foam doesn't mean that at the Planck scale, space-time has a kind of fractal structure? What does this really look like?

So this rotation of the worldsheet rectangle by 90 degrees you are talking about, is it the S-duality or the T-duality? Also, is there a relation between this UV/IR connection and gravity's IR divergences getting mapped to the dual gauge theory's UV divergences in the gauge/gravity duality?

"Of course, one could also make God more fundamental and place it above string theory. But that would be a slightly different God, a more scientific and less compassionate one. ;-)"

Yes, God belongs at the bottom of the tree -- an anthropocentric moral and historical idea, anthropomorphically conceived, which has undoubtedly been the single most influential idea in Western intellectual history, influencing our politics, economics, philosophical traditions, and everyday life for centuries.

Putting Him at the top was only a way of saying "His" values (moral commands) are (or should be) universal on earth, applying to all men everywhere, unlike the "power gods" of ancient Mesopotamia whose "jurisdiction" was limited to a particular city-state or empire. Of course this attempt to establish universal human values, while quite successful, had unfortunate consequences when taken literally and applied to the science of cosmology.

Dear Numcracker, a fractal is a self-similar structure so that its patterns at arbitrarily short length scales exist and are interesting - and analogous to those at longer scales. This sort of contradicts the Planck-scale-as-minimum-length-scale paradigm but I can't quite rule out every possible incarnation of the Planckian fractal idea.

Needless to say, the people who talk about a gravitational UV fixed point would be much closer to a Planckian fractal.

Well, it's neither S-duality nor T-duality. It's called modular invariance. But both S-duality and T-duality may be identified with rotations by 90 degrees of some other tori. Also, there are many relationships between S-dualities and T-dualities - and their incorporation into larger unified groups of so-called U-dualities - in string theory. I don't want to explain all the possible relationships between them because it's a rich subject, a non-negligible part of the whole string theory.

it must be a good article

Well done, Lubos! I am far too mathematically challenged to really get this but your presentation is superb. It seems inevitable that string/M-theory really is the theory of everything and the only remaining task is to understand it. That may take a while.

Most remarkable from my standpoint is the tie-in between QM and statistical mechanics. I have long considered both to be fundamental and unalterable, so it should not surprise me that they are intimately connected.

If G and c are not constants and vary synchronously, L_pl and T_pl lose their meaning, in contradistinction to M_pl.

Lubos

It is my version to solve problems

http://vixra.org/abs/1212.0080

Hi Lubos-

Two naive questions from a novice (warning: I'm a disciple of Max Tegmark). I frequently look at your blog, and you often come across as viewing string theory as something almost inescapable, assuming that the world obeys QM, SR, and GR. Would you go one step further and predict that string theory can some day be formally PROVEN to be mandatory using some combination of a priori arguments and math assuming the three core theories are correct? Going even further, do you think that any of QM, SR, or GR might someday be provable using a priori reasoning alone?

Fascinating ideas and questions. Yes, I think that it's possible to show that string theory is inescapable given QM, SR, and GR, and this proof may already be "sketched" today although it's far from complete and rigorous at this point.

It's hard to see how the postulates or basic structure of QM or SR or GR could be extracted out of no assumptions, by pure thought, but I do find it plausible that the ideas behind QM, SR, GR, geometry etc. are less independent of each other than we think and that a natural extension of these principles of physics actually implies all of them.

...Understanding reality only through the impossible.

I thought Amy Farrah Fowler disproved Sheldon's theory by pointing out that string theory takes place in the domain of neuroscience? ;)

Hi Lubos,

I am sorry to say it, but your first graph is obviously wrong. I agree one could in principle explain all of the Standard Model starting from a single string theory, but don't tell me you claim to explain the economic sciences from string theory... I guess you know what emergence is about?

So in general, reductionism does not work. And that's the memo ;-)

Wow, now this post has greatly expanded :-D

It is exactly such nice, interesting, and appropriately challenging TRF articles that I need to comfort me and cheer me up when I still feel sad about the loss of Physics SE as a good source of interesting physics and support when trying to extend and deepen my knowledge.

For example, I like the explanation of how the UV divergences can be "rotated away" by modular invariance, and the relationship between the high-mass and low-mass excitations; these I have not seen before :-)

In lecture 6 of his "Topics in string theory" course, Lenny Susskind explained that G = L_p^2 = g^2 x l_s^2 in quantum gravity (darn, I need LaTeX). This makes me wonder a bit where the power of 8 in G = g_closed^2 x l_string^8 comes from. Sorry if this is a stupid and unimportant detail ...

I've only read this article to the end of the paragraph below the blue chart but am already compelled to exclaim that: I LOVE IT! \$-]

Dear UBT, what emergence is all about is that there exist complex processes and mechanisms such as economy that ultimately *do* follow from more fundamental theories, ultimately string theory.

http://motls.blogspot.com/2010/07/many-faces-of-emergence.html?m=1

Hi UBT,

One purpose of renormalization is to understand how cooperative behavior "emerges" in large systems with many degrees of freedom (see the introduction in Wilson's 1974 paper).
So where do you see a contradiction to reductionism in considerations of emergent (cooperative) phenomena?

Of course it isn't really a sequence, but a (hopefully acyclic) graph.

You also have the sequence atomic physics and chemistry -> astronomy -> cosmology, which doesn't depend on biology at all.

In some sense this sequence seems to get *more* fundamental. For example, the standard model and cosmology are the two fields that care about the number of neutrino flavors. Perhaps this also reflects some sort of (qualitative) duality. Or it could just be because though the universe is bigger than stars and galaxies, it hasn't *always* been that big.

Dear Ralph, one may use the word "fundamental" in various ways - for example, for the fundamentalist believers in Islam - but in the interpretation here, caring about all details of the Universe such as the number of neutrino flavors is what makes a description of the world *more* fundamental, not less!

If one only cares about some selected features of the world, somewhat arbitrarily selected, it inevitably makes his comments about the world less accurate and this viewpoint is just less fundamental.

Now, concerning the size of the Universe, the Universe used to be smaller and indeed, it was comparable to an atom at one point, too. And it had been even smaller before that. That's why the small-scale laws of physics, including the number of neutrino flavors, had played a gigantically important role in the early stages of the life of our Universe. That's why these features affected our present, too. It's another reason why the number of neutrino species is among the fundamental things.

Yes, what a post.

Correct. This has to be a typo...

Dear Dilaton, Lenny was probably working with an (inconsistent) D=4 string theory.

The exponent 8 is, in general, D-2 where D is the spacetime dimension. It comes simply from dimensional analysis. In hbar=c=1 units, Newton's constant G has units of length^{D-2}.

That's easiest to see from black hole entropy. The entropy, A/4G, is dimensionless. A is the area of the surface of the black hole. The black hole has D-1 spatial dimensions (one time) and the surface of the volume therefore has D-2 dimensions. So A is length^{D-2}, and so must be Newton's constant to cancel it and get a dimensionless entropy S = ln(N). Here, also, k=1 (Boltzmann's constant).
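Spelling out the counting in the two comments above (with \(\hbar = c = k_B = 1\)):

```latex
S \;=\; \frac{A}{4G}\ \ \text{dimensionless},
\qquad [A] = L^{D-2}
\;\;\Longrightarrow\;\; [G] = L^{D-2},
\qquad G \;\sim\; g_s^{2}\,\ell_s^{\,D-2}.
```

So \(D = 10\) gives the exponent 8, while \(D = 4\) gives \(G \sim g^2 \ell_s^2\), the formula quoted from Susskind's lecture.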

Be sure that Lenny would agree my formulae are correct. No typos here.

Thanks for this nice clarification :-)

Yep, the topic of the lecture was black hole entropy, and Lenny Susskind used G=l_p^2 etc. So he was just considering D=4, indeed ...

Cheers

You do get to make one assumption: the universe has to permit things to happen.

Yes…G is approximately g^2 *Ls^2, in 4 dimensions when the other 6 dimensions are compact. Lubos was talking explicitly about the 10 dimensional G though...

Well, Giotis, the hidden dimensions have to be compact and their volume has to be of order l_string^6 so that g_s^2*l_string^8 / V gives Susskind's value.

With extra dimensions, it's often quite reasonable - and this has been appreciated since the late 1990s - that the size of the extra dimensions may be much larger than the fundamental scale.
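For completeness, the dimensional reduction mentioned in this exchange is just division by the compactification volume:

```latex
G_{4} \;=\; \frac{G_{10}}{V_6} \;\sim\; \frac{g_s^{2}\,\ell_s^{\,8}}{V_6},
\qquad V_6 \sim \ell_s^{\,6}
\;\;\Longrightarrow\;\;
G_4 \sim g_s^{2}\,\ell_s^{\,2},
```

which reproduces the four-dimensional formula from Susskind's lecture; a larger \(V_6\) makes the four-dimensional gravity correspondingly weaker.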

What seems puzzling is that even superstrings have to be formulated over a preexisting non-observable continuum spacetime. Strings are not excitations of spacetime but over spacetime; in such a theory, geometry becomes less real than it is in GR. In some sense ST is constructed over a God-given arena and can't say anything about its origin ... so there may be a level above the highest one already drawn in Lubos' tree which should contain the origin of quantum geometry ... have I missed something here?

Yes, you have missed something, namely everything.

Everything you wrote is a layman's misconception, it is a delusion, it is completely wrong.

In string theory, the whole spacetime geometry is entirely constructed out of the strings. In particular, in perturbative string theory, it is made out of particular massless modes of closed strings. This is easily proven by seeing that the addition of a coherent state of closed strings is exactly physically equivalent - it is the same thing as - a modification of the expectation value of the spacetime geometry.
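Schematically - this is standard textbook material - the equivalence works because shifting the background metric in the worldsheet action is the same thing as exponentiating graviton vertex operators:

```latex
S[g] \;=\; \frac{1}{4\pi\alpha'}\int d^2\sigma\; g_{\mu\nu}(X)\,\partial X^\mu \bar\partial X^\nu,
\qquad g_{\mu\nu} = \eta_{\mu\nu} + h_{\mu\nu}
```
```latex
\Longrightarrow\quad
e^{-S[g]} \;=\; e^{-S[\eta]}\,
\exp\!\Big(-\frac{1}{4\pi\alpha'}\int d^2\sigma\; h_{\mu\nu}\,\partial X^\mu \bar\partial X^\nu\Big).
```

The second factor is an exponential of graviton vertex operator insertions, i.e. a coherent state of closed strings: the "background" is literally made of strings.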

Strings surely *are* excitations of spacetime. More precisely, spacetime geometry is one of many fields linked to string excitations, among all other degrees of freedom that are unified within string theory.

The fact that every particular calculation of anything must make particular assumptions about the metric degrees of freedom is a tautology - it only means that we know what we're actually calculating. But it surely doesn't mean that there is anything non-stringy or otherwise "independent" about the spacetime geometry. If one weren't forced to talk about a particular shape of the spacetime geometry in a theory while doing a calculation, it wouldn't "just" prove that spacetime geometry isn't a pre-existing arena. Instead, it would prove that there's no spacetime geometry at all. We surely don't want that, and string theory actually *does* predict the existence of spacetime obeying the laws of Riemannian geometry - and Einstein's equations - in the limit of long distances. These equations and all the principles underlying them are derived from something more fundamental in string theory, namely from strings (perturbatively, from 2D CFTs).

I gather that you're parroting this complete rubbish about the spacetime geometry in string theory from an anti-string crackpot book, am I right? Once more and you're banned. This is *not* a discussion forum for scum that reads Smolin or Woit once he or she leaves this website. Such scum has no right to oxidate on my blog.

Actually the crackpot you mentioned that I have read is named John Wheeler; he has a massive book called Gravitation, with a special chapter on pre-geometry. Actually he was the advisor of some other cranks such as Feynman, who proved in his "Lectures on Gravitation" that any quantum theory of spin-2 particles propagating on Minkowski spacetime must be nonperturbatively bootstrapped to a fully quantum geometrodynamic incarnation of GR. Here the renormalization issues of GR emerge. In this simple respect I have just argued that ST is the kind of field theory living on a (static) spacetime (a 10d one) that bypasses renormalization problems perturbatively ... but it still is a theory unable to predict the underlying geometry/topology of this 10d arena, something that has to be put in by hand. Maybe it is just a philosophically unacceptable aspect of a theory aimed to encompass the whole of physics. Thus, this is what I mean about the existence of a higher level of reality above the top level you drew. Please, no offense; I judge that somebody asking polite questions deserves (at least) polite answers from somebody educated at the PhD level.

Dear Numcracker, I insist that your comments about the character of spacetime in string theory are 100% misconceptions.

I am also totally sure that Wheeler's book doesn't address these questions about string theory - or any string theory, for that matter, so it can't possibly be relevant for our exchange. You are just making this shit up. You intuitively know that the content you're writing is rubbish so you're at least trying to "strengthen" your position by references to authorities but it won't work because 1) authorities don't matter in science, 2) all the authorities you mentioned were active long before these questions were settled.

String theory flawlessly passes all tests about a dynamically generated spacetime obeying all principles of general relativity and beyond.

Dear Lubos, since you are also an authority in science, please just point me to the crucial paper where the 10d arena arises dynamically. It would be simpler to give an answer and convince other readers without appealing to personal qualifiers. Regards.

Dear NumCracker, this is basic textbooks stuff on string theory. If you want a discussion of "background independence" in string theory where it's manifest, see e.g.

http://arxiv.org/abs/hep-th/9208027

Here is another very readable answer to the issue of background independence of string theory (and how gravity works):

http://physics.stackexchange.com/a/44738

Lubos, I have been reading about the connections between solid state physics, the Higgs model, and QCD in relation to AdS/CFT. Do you think that supercold quantum physics experiments could mimic string/M-theory near the Planck energy?

Dear Physics Junkie, one may mimic various aspects of the string dynamics, e.g. create materials that have the same number of local excitations, both bosons and fermions, as the Green-Schwarz superstring. There have been fun condensed matter papers about that. But including all the interactions, perhaps branes and all that, I think that the answer is No.

Well, any "ordinary physics" system dual to an AdS space is a form of string theory but they're usually contrived vacua (or superselection sectors, I should say) on the string theory side, if they're exact vacua at all, so in this sense the relationship is very indirect.

http://arxiv.org/abs/0809.3962

Hi Giotis,

this pinged me instead of NumCracker, but I like the link too ;-)

Thanks indeed Giotis, it is a really illuminating paper! In its own language I would reformulate my question as: do you know if there is a "boundary independent" formulation of ST, or are all of them restricted to a specific "superselection sector"?

Hi,

as I said, I am not talking about Physics but Science in general, including human/social/etc. Science. I am not claiming any contradiction; I just observe that we are, in practice, unable to reduce all the laws of Science to a single framework. And that is why we need to study Nature at different levels. So maybe I was just misunderstood?

Dear UBT, what about the possibility that you were perfectly well understood but you are simply wrong?

The fact that science studies Nature at many levels does *not* imply that the levels can't be reduced to more fundamental levels. Indeed, they *can* be reduced.

RG methods are applied and useful in sociology, economics, neuroscience, etc. too; for example, to model the net effects on "larger scales" of the behaviour of single persons, households, (smaller groups of) neurons, and so on...

Has someone already given a proof of what you are stating?

Interesting... but I am not sure of the connection between "using similar methods" and "reducing to the same framework". These are two different things to me.


The smallest building blocks or the most fundamental level may certainly depend on the field (physics, biology, economic systems, etc.) you want to apply renormalization methods to.

But a renormalization step always consists roughly of

1. coarse graining (integrate or sum over small scales)

2. derive the new effective Hamiltonian, Lagrangian, etc

3. rescale couplings, fields, etc

This is nicely described in the introduction of the PhD thesis of Dirk Homeier: