Sunday, January 17, 2010

Answers to 24 questions by Sean Carroll

In contemporary physics, there are many questions that are too deep to be sensibly asked: we don't have the right tools and language to constructively think about them. There are many unanswered questions that are deep but that can already be asked.

Off-topic: Isaac Newton's story about the apple - the original hand-written 183-page manuscript of William Stukeley's memoir of Newton - is available online in Silverlight. Thanks to Tom Weidig and the BBC
But there are also questions that have been answered, that are tautological, that are too shallow or too vague, that make some incorrect assumptions, or that have other reasons not to be interesting.

Sean Carroll's 24 questions mostly belong to the latter category.

 1. What breaks electroweak symmetry?

The electroweak symmetry is broken by the Higgs field's vacuum expectation value.

The unitarity of WW scattering implies that a new term with a scalar exchange has to contribute below a TeV: a contribution from the exchange of a Higgs particle. This is not speculative physics: Steven Weinberg received much of his 1979 Nobel prize for this insight.

The corresponding particle has to be relatively stable and its mass must be in an accessible interval to make it work. It will be seen at the LHC. Somewhat more speculatively, there may be several such Higgs fields (like in SUSY) or this Higgs field may be composite (like in technicolor) but these are technical additions that are not strictly necessary to answer the question above. The bulk of the question was answered by the first sentence.
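
For the record, the unitarity argument can be quantified: the classic Lee-Quigg-Thacker bound says a Higgs-like scalar must appear below roughly a TeV. A minimal Python sketch of the arithmetic (the value of G_F is the standard one):

```python
# Back-of-the-envelope check: the Lee-Quigg-Thacker unitarity bound.
# Tree-level scattering of longitudinal W bosons violates unitarity unless
# a Higgs-like scalar lighter than ~sqrt(8*pi*sqrt(2)/(3*G_F)) contributes.
import math

G_F = 1.1664e-5          # Fermi constant in GeV^-2
m_H_max = math.sqrt(8 * math.pi * math.sqrt(2) / (3 * G_F))

print(f"Unitarity bound on the Higgs mass: ~{m_H_max / 1000:.1f} TeV")
# -> ~1.0 TeV, i.e. a Higgs (or substitute) must show up below a TeV
```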

 2. What is the ultraviolet extrapolation of the Standard Model?

This question is amusing and the probable reason why it was asked is that the author didn't, and doesn't, understand the meaning of the word "extrapolation". The answer to the question in this form is, of course, "the Standard Model".

By a definition of "extrapolation", the formulae from the Standard Model are taken to be valid in all regimes, regardless of the energy. In fact, the Standard Model may really be extrapolated up to the Planck scale - as long as the Higgs mass belongs to a realistic range.
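
To see what "extrapolating the Standard Model" means in practice, here is a hedged one-loop sketch of the running of the three gauge couplings up to the Planck scale (the inputs are standard textbook values; two-loop and threshold effects are ignored):

```python
# One-loop running of the SM gauge couplings from M_Z to the Planck scale:
# alpha_i^-1(mu) = alpha_i^-1(M_Z) - b_i/(2*pi) * ln(mu/M_Z).
import math

M_Z, M_Planck = 91.19, 1.22e19             # GeV
alpha_inv_MZ = [59.0, 29.6, 8.45]          # U(1)_Y (GUT-normalized), SU(2), SU(3)
b = [41/10, -19/6, -7]                     # one-loop SM beta coefficients

t = math.log(M_Planck / M_Z)
for name, a_inv, b_i in zip(["alpha1", "alpha2", "alpha3"], alpha_inv_MZ, b):
    print(f"{name}^-1(M_Planck) = {a_inv - b_i / (2 * math.pi) * t:.1f}")
# All three inverse couplings stay large (couplings stay perturbative) all
# the way up, which is what makes the extrapolation sensible.
```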

What Carroll probably wanted to ask is what the ultraviolet "completion" of the Standard Model is, i.e. what theory replaces it when its extrapolation breaks down (just the opposite of what he asked). This is way too general a question because it essentially says "tell me everything about new physics". Supersymmetry, grand unified theory, Kaluza-Klein theory etc. are all likely to be a part of the answer.

At any rate, this more ambitious question - at this limited level of detail - can also be answered and the most correct answer is string theory. However, it is even more certain that the question in the form that was asked is trivial and the answer is, once again, "the Standard Model".

 3. Why is there a large hierarchy between the Planck scale, the weak scale, and the vacuum energy?

These are, of course, the two most famous hierarchy problems of current physics.

The Planck-weak hierarchy is most likely stabilized by supersymmetry. The stabilization is a necessary but not sufficient condition for the hierarchy to occur. Supersymmetry probably plays some role in the smallness of the cosmological constant in the Planck units - the other problem included in this question.
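
For concreteness, the sizes of the two hierarchies are easy arithmetic (a sketch; the "tuning" figure below is just the naive squared ratio of scales, not a precise loop calculation):

```python
# The two hierarchies in rough numbers. The vacuum-energy scale ~2.4 meV
# comes from the observed dark-energy density ~ (2.4 meV)^4.
weak, planck, vac = 246.0, 1.22e19, 2.4e-12     # all in GeV

print(f"weak/Planck ratio:        {weak / planck:.1e}")           # ~2.0e-17
print(f"naive Higgs-mass tuning:  {(weak / planck) ** 2:.1e}")    # ~4.1e-34
print(f"CC in Planck units:       {(vac / planck) ** 4:.1e}")     # ~1.5e-123
```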

However, the "truly tiny" observed value of the vacuum energy can't be derived at this moment. It is unclear whether a "canonical" dynamical explanation exists: it is plausible but not guaranteed that the anthropic explanation is everything one can obtain. It is surely true that if the cosmological constant were vastly different, life similar to ours couldn't exist.

Individual vacua allow one to calculate all these values: some of the vacua give answers that are vastly different from the observed hierarchies, some of them may give answers that are close or exactly equal to the observed figures.

 4. How do strongly-interacting degrees of freedom resolve into weakly-interacting ones?

In quantum field theory, the number of particles is not conserved. So particles of any kind can "transmute" into particles of other kinds as long as the strict conservation laws are obeyed.

The "character" of the final particles doesn't have to coincide with the "character" of the initial ones. For example, a strongly interacting pion may decay into two photons - and/or various combinations of leptons that only interact via the electroweak interactions. There's nothing unusual about it: they decay via a virtual W boson or similar channels. This has been understood for more than 70 years (recall e.g. Fermi's theory of beta-decay).

 5. Is there a pattern/explanation behind the family structure and parameters of the Standard Model?

Yes, of course.

Obviously, the multiplicity of lepton and quark families may only be derived from a "deeper" principle in the framework of string theory. Whoever is hoping that a non-stringy framework could ever shed light on any of these big questions is fighting a lost battle: it's simply not possible to avoid string theory in answering any of these questions.

The number of families may be calculated in various stringy constructions by well-understood mathematical algorithms. In the most classical case of heterotic strings on Calabi-Yau manifolds (with the spin and gauge connections identified), the number of families equals one half of the (absolute value of the) Euler characteristic of the Calabi-Yau.
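
A sketch of the counting (the quintic's Hodge numbers are the standard ones; the second pair of Hodge numbers is hypothetical, chosen only to illustrate |chi| = 6):

```python
# Generation counting in the classic heterotic compactification: with the
# spin and gauge connections identified, n_gen = |chi| / 2, where
# chi = 2*(h11 - h21) for a Calabi-Yau threefold.
def generations(h11, h21):
    chi = 2 * (h11 - h21)        # Euler characteristic of the threefold
    return abs(chi) // 2

# The quintic in P^4: h11 = 1, h21 = 101 -> chi = -200 -> 100 generations.
print(generations(1, 101))       # 100: far too many, hence quotients/other vacua
# A three-generation model needs |chi| = 6; hypothetical Hodge numbers:
print(generations(3, 6))         # -> 3
```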

Analogous but different formulae exist in other frameworks and more complicated vacua - braneworlds, vacua with fluxes, M-theory, F-theory. Also, Yukawa couplings, gauge couplings, masses, and other parameters may in principle be calculated although the calculation depends on the scenario. The right question that summarizes these unknown things is: Which limit of string theory (heterotic, IIA, M-theory, F-theory) is most useful (weakly coupled) to describe the reality?

 6. What is the phenomenology of the dark sector?

Dark matter has the well-known gravitational effects on the galaxies etc. that forced physicists to discover it a few decades ago. Besides that, various decays and annihilations may occur with some frequency.

And the dark matter particles - such as neutralinos - can participate in a limited number of additional types of interactions. Assuming the standard MSSM neutralino realizations (or other scenarios, for that matter), these things are mostly understood. Dark matter accounts for a higher percentage of the mass of the Universe than the visible matter but that doesn't mean that it has a more interesting phenomenology.

Even if the MSSM were wrong, it's pretty obvious that the dark sector's phenomenology is the poorer one. Besides the basic gravitational impact and some decays and perhaps pairwise annihilation, dark matter probably doesn't exhibit too much interesting behavior. At least, that's what we should conclude - in a preliminary way - based on our knowledge and basic principles such as Occam's razor.

 7. What symmetries appear in useful descriptions of nature?

One must be careful what types of symmetries we are talking about. Only global unbroken symmetries are "really objective" features of the reality. It's very likely that we have found the full list and it includes the CPT-symmetry, Poincaré symmetry (including Lorentz, translational, and rotational symmetries), and the U(1) from the conservation of the electric charge.

Additional symmetries may exist but they are broken, i.e. non-linearly realized - SUSY and the electroweak symmetry. SU(3) is confined and there may exist additional confined groups. But the presence of gauge groups really depends on the description and it is never a sharply defined physics question whether a symmetry in a description is "useful".

Asking whether something is "useful" doesn't belong to physics - it's a "meta" question related to our strategy whose purpose is for us to be more capable to ask and answer other, more objective questions. Different dual descriptions of the same physics usually have different gauge symmetries and there's no contradiction here.

Also, perturbative string field theory may be usefully formulated with the help of an infinite-dimensional gauge symmetry principle (at each point). Such gauge symmetries may be pretty in formulations of physical theories but they're not really necessary for physics - they're not physical because physical states must be singlets under all gauge groups (so physical objects know nothing about the representation theory of these groups).

It's conceivable but far from guaranteed that a generalization of our knowledge about symmetries will lead to further progress in the understanding of the fundamental laws of physics.

 8. Are there surprises at low masses/energies?

Probably not. One can ask this kind of question in any context. For example, are there surprises when the number of baryons becomes comparable to the number in the stars?

Surprises only occur when they occur. The most likely answer to the question above - and any other question about surprises - is No. If that weren't the case, the new things couldn't count as surprises under a rational evaluation of their surprising power.

Various hints indicating that there could be such surprises - e.g. alternative dark-matter-free explanations of dark-matter effects, or MOND alternatives to dark matter and dark energy, and so on - have been heavily disfavored by recent observations. The most likely possible new physics at low ("officially understood") mass scales may be axions or some light gravitinos etc., but they wouldn't be "real" surprises because they've been studied as legitimate possibilities.

We can't be quite sure but I would bet 10:1 against completely new surprises at low masses that don't fall among the names above and similar particles, fields, and effects that have been intensely studied in the seriously considered literature.

The question of the type above is manifestly not useful because it is, by its very design, trying to go beyond any level of understanding we would achieve at any point. But interesting questions only reside at the boundary of our understanding, not far beyond it.

 9. How does the observable universe evolve?

Thanks for asking, very well.

The standard model of cosmology, including dark matter and dark energy, is almost certainly enough to account for every major cosmological observation we have made or are making. We pretty much know the evolution from a tiny fraction of a second after the Big Bang - and maybe even from the GUT-scale era right after it - until the present. It's very likely that the Universe will continue in the accelerating expansion and come closer to an empty de Sitter space.
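
As a sanity check of this standard picture, the age of such a Universe follows from a one-line integration of the flat LambdaCDM Friedmann equation. A sketch with roughly WMAP-era parameters:

```python
# Age of a flat LambdaCDM Universe: H(a) = H0*sqrt(Om/a^3 + OL),
# t0 = integral of da / (a*H(a)) from a = 0 to a = 1.
import math

H0 = 71.0e3 / 3.0857e22            # 71 km/s/Mpc converted to 1/s
Om, OL = 0.27, 0.73                # matter and dark-energy fractions

def age_seconds(steps=100_000):
    total, da = 0.0, 1.0 / steps
    for i in range(steps):
        a = (i + 0.5) * da         # midpoint rule on a in (0, 1]
        H = H0 * math.sqrt(Om / a**3 + OL)
        total += da / (a * H)      # dt = da / (a*H)
    return total

print(f"age ~ {age_seconds() / 3.156e16:.1f} Gyr")   # ~13.7 Gyr
```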

10. How does gravity work on macroscopic scales?

It's described by general relativity. For many purposes in astrophysics and cosmology, we may replace Einstein's theory by its Newtonian approximation.

Obviously, the harder question is how gravity works at microscopic and ultramicroscopic scales. Changes to Newton's law at accessible microscopic scales would suggest additional large or curved spacetime dimensions - but we know from the newest experiments that they shouldn't be larger than 10 microns or so.
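
These experiments are conventionally phrased as bounds on a Yukawa-type correction to Newton's potential; a small sketch of the parametrization (the alpha-lambda form is the standard one in this literature; the masses below are arbitrary illustrative numbers):

```python
# Short-distance gravity tests bound a Yukawa correction to Newton:
# V(r) = -(G*m1*m2/r) * (1 + alpha * exp(-r/lam)).
# Large extra dimensions would roughly mimic alpha ~ O(1) at r ~ lam,
# which is why lam is now pushed below ~10 microns.
import math

G = 6.674e-11                      # m^3 kg^-1 s^-2

def V(r, m1, m2, alpha=0.0, lam=1e-5):
    """Newtonian potential with an optional Yukawa-type modification."""
    return -G * m1 * m2 / r * (1.0 + alpha * math.exp(-r / lam))

# At r = 1 micron an O(1) Yukawa term with lam = 10 microns nearly doubles
# the potential; at r = 1 mm it is utterly negligible:
for r in (1e-6, 1e-3):
    print(r, V(r, 1e-3, 1e-3, alpha=1.0, lam=1e-5) / V(r, 1e-3, 1e-3))
```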

Modifications of gravity at the ultramicroscopic, Planckian scales are what quantum gravity is all about. String theory tells us that gravity can't be separated from other forces (and forms of matter) in this extreme regime. However, it also tells us that gravity works according to the older established approximate theory - Einstein's theory - at microscopic scales. Otherwise we wouldn't consider it a theory of quantum gravity.

11. What is the topology and geometry of spacetime and dynamical degrees of freedom on small scales?

When this question is separated from the things that are known, it becomes fully equivalent to the question asked at the end of the answer 5: What is the most weakly coupled description of string theory to describe the reality?

It's important to notice that the geometry only becomes "objective" if it is large and classical. The geometry and topology of small - especially Planckian - manifolds is a question that doesn't have a unique answer because of dualities. There exist lots of dualities that imply that string theories on entirely different geometric backgrounds are completely equivalent to each other.

Also, the additional "Planckian" degrees of freedom may be - and are somewhat likely to be - non-geometric: they don't allow a useful reformulation in terms of a finite number of fields on a classical geometry. The question above is incorrectly asked because it makes a somewhat unlikely assumption that the classical geometry is guaranteed to be useful at the scales where new degrees of freedom resembling "new dimensions" emerge.

12. How does quantum gravity work in the real world?

Very well, thank you for asking.

One can show that quantum gravity - i.e. string theory - fully reduces to the conventional effective theories - especially effective quantum field theories - in the context of the "real world" i.e. relatively long-distance and low-energy phenomena. As we have already mentioned in answer 10, that was the first test that made us de facto sure that we actually know what the right theory of quantum gravity is.

13. Why was the early universe hot, dense, and very smooth but not perfectly smooth?

Again, the answer to this question is well-known in the (by far) most likely picture of the early Universe we know.

Flatness and relative smoothness of the Universe today require cosmic inflation. In inflationary cosmology, the smoothness arises from the gigantic exponential expansion of the Universe during the inflationary era. The inflation is terminated by reheating and the structures start to form later, by gravitational clustering and collapse.
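
The quantitative core of the mechanism is simple: every e-fold of expansion dilutes the curvature term, |Omega - 1|, by a factor of exp(-2). A toy sketch:

```python
# Inflationary dilution of curvature: |Omega - 1| shrinks as exp(-2N)
# after N e-folds, so the canonical N ~ 60 turns an O(1) initial
# curvature into something absurdly small.
import math

def flatness_after(N_efolds, omega_minus_one_initial=1.0):
    return omega_minus_one_initial * math.exp(-2.0 * N_efolds)

for N in (10, 30, 60):
    print(f"N = {N:2d} e-folds: |Omega - 1| ~ {flatness_after(N):.1e}")
# N = 60 gives ~8e-53, vastly flatter than observations require,
# independently of the pre-inflationary state.
```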

It's important to appreciate that the cosmic inflation achieves the right outcome independently of the detailed state of the Universe that existed before the inflation. And we actually don't know what the state of the Universe right before the inflation was. It's pretty likely that the inflationary regime is directly adjacent to the Planckian regime of the Universe.

And in the Planckian regime, we actually can't say that the Universe was "very smooth but not perfectly smooth". There's no sensible definition of the word "smooth" that would allow us to say that the pre-inflationary Universe was smooth. Otherwise, the hotness and high density of the Planckian world is a trivial result of dimensional analysis: things had to be Planckian - and both the Planckian density and the Planckian temperature are extremely high relative to the densities and temperatures that we know from our lives.

14. What is beyond the observable universe?

The same kind of Universe as the visible one.

It's extremely likely that what we see does qualitatively continue a huge distance beyond the cosmic horizon we can observe today - and the appearance of the cosmic horizon is just a "technicality" caused by our Universe's finite age (so far). This answers the question above.

We could try to go even further - beyond the Universe that looks similar to ours. There could be some non-uniformities or domain walls separating us from other patches of the eternally inflating Universe. And there could exist segments of the Universe or multiverse that are completely causally disconnected from our region of spacetime.

If that's so, these questions won't ever become a part of empirical science. Still, when we talk about the "whole" Universe or multiverse including the completely disconnected patches that we can never observe and will never observe, I find it obvious that we must say that all the vacua (and their excitations) connected in all conceivable ways that satisfy the equations of string theory do exist "somewhere".

Clearly, such an extended Universe is too big, and to get closer to what was known as physics, we must refocus on those segments of the multiverse and/or the landscape that are closer to our world (either in the spacetime, or in the configuration space).

15. Why is there a low-entropy boundary condition in the past but not the future?

The second law of thermodynamics readily answers both questions: these issues have been discussed on this blog very many times. After all, the question above is incorrectly formulated, too.

There are no "boundary conditions in the future". By its very definition, the future is whatever will evolve from the past (through the present) - and nothing else. If you first decide what the future has to be, and then calculate the present or the past (like in various doomsday scenarios), you're a victim of wishful thinking or a bigot with an agenda, not a rationally thinking person.

The boundary conditions (usually known as the "initial conditions") can only be defined in the past. Boltzmann's H-theorem and equally strong derivations then imply that the future is guaranteed to have a higher entropy than the past - i.e. that the past is guaranteed to have a lower entropy than the future.

The fact that the past has a low - and essentially vanishing - entropy is a tautology. We simply want to find the "oldest" or "most fundamental" explanation of our origin, so our theories are obliged to go as far into the past as they can. Because the entropy is a non-decreasing and mostly increasing function of time, this criterion amounts to reducing the initial entropy to the lowest allowed value - essentially zero.
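
A toy model makes this monotonic growth vivid. The sketch below is the classic Ehrenfest urn model (nothing specific to cosmology): particles hop randomly between the two halves of a box and the coarse-grained entropy climbs from zero to its maximum and stays near it:

```python
# Ehrenfest urn model: N particles hop between the two halves of a box;
# the coarse-grained entropy S = ln C(N, n_left) (Boltzmann's constant
# set to 1) rises from a low-entropy start and then fluctuates near its
# maximum, ~ N*ln(2) ~ 693 for N = 1000.
import math, random

N = 1000
n_left = N                                  # low-entropy start: all on the left

def S(n):
    return math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)

for step in range(5001):
    # pick a random particle and move it to the other half
    if random.random() < n_left / N:
        n_left -= 1
    else:
        n_left += 1
    if step % 1000 == 0:
        print(f"step {step:5d}: n_left = {n_left:4d}, S = {S(n_left):6.1f}")
```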

While it's trivial to see that the entropy of the initial state is low or zero - by the definition of the term "initial state of the Universe" - it's much less clear what the relevant state actually was. It is even mysterious what the right degrees of freedom are in which the initial state should be described: it would be much more scientifically interesting to ask this question than Carroll's tautological question.

But once again, this question is equivalent to the other questions above about the nature of the Planckian regime. Once the right degrees of freedom are known and the full calculational framework is understood, the question about the identity of the initial state may be answered by a form of the Hartle-Hawking state but we obviously don't know how to calculate the full answer in the right stringy framework today (as opposed to some minisuperspace approximations).

16. Why aren’t we fluctuations in de Sitter space?

Because to create our brains, one needs to arrange something like 10^{26} molecules into pretty organized configurations. As understood by Boltzmann in the 19th century, the probability that such an organized object (whose entropy is 10^{26} times Boltzmann's constant) occurs as a random fluctuation is comparable to exp(-10^{26}), which is completely negligible relative to the probability of our origin that actually has a non-random reason (and a detailed, sensible evolution).
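
Because exp(-10^{26}) underflows any floating-point type, the comparison has to be done in log space. A sketch, with admittedly made-up (hypothetical) numbers standing in for the "evolutionary" alternative:

```python
# Compare a fluctuation suppressed by exp(-S/k) with an "evolutionary"
# history composed of many only modestly unlikely steps, in log10 space.
import math

log10_p_fluctuation = -1e26 / math.log(10)   # log10 of exp(-1e26)

# hypothetical toy numbers: a billion evolutionary "steps", each with
# probability ~1e-6, still give log10(p) ~ -6e9 -- incomparably larger
n_steps, log10_p_step = 1_000_000_000, -6.0
log10_p_evolution = n_steps * log10_p_step

print(f"log10 P(random brain) ~ {log10_p_fluctuation:.2e}")   # ~ -4.3e25
print(f"log10 P(evolution)    ~ {log10_p_evolution:.1e}")     # ~ -6.0e9
# No spacetime-volume factor can close a gap of ~10^(4e25) between them.
```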

The creation of a macroscopic organized object out of chaos is so unlikely that the probability is, for all practical purposes, zero anywhere and everywhere. Any calculation that ends up with a "reasonably finite" probability for such a crazy event is a result of trivial mistakes, especially an incorrect multiplication of the tiny probability by the (infinite) volume of the spacetime etc.

Probabilities can't be artificially "inflated" by multiplying them with incorrect infinite factors, especially if you don't do the same thing with the probabilities of the "alternative", namely right explanations. Probabilities of events must always be normalized together with their alternatives so that the sum equals one. And a spontaneous creation of a macroscopic organized system is always much less likely than the appearance of evolution composed out of a few steps whose probabilities are comparable to one.

Also, probabilities can't ever be proportional to the (expected) spacetime volume in the future because the future is unknown at this moment. By the basic rules of causality, it can't affect the present - or a rational calculation of probabilities at the present. They are always calculated as functions of our observations of the past, using theories which were also induced from observations of the past. The knowledge about the future can't ever be independently used in any rational argument because this knowledge doesn't exist (and the future will actually depend on random events - we may only predict their probabilities by the rules of quantum mechanics). That's also why it's wrong to multiply probabilities by spacetime volumes (or numbers of observers) in the future.

We only "directly" know something about the past and there is only one finite past - 13.7 billion years of it. There was one long process of evolution that ended with people like us, among other things, and this evolution is surely much much more likely than a spontaneous creation of a macroscopic organized brain - by a factor of exp(10^{26}) or so (not far from a googolplex).

If you want to read a whole book filled with similar silliness, a book that denies the very basic and self-evident facts mentioned above, please buy Sean Carroll's From Eternity to Here using the link on the left. ;-)

17. How do we compare probabilities for different classes of observers?

Probabilities are numbers between 0 and 1, so they are compared in the same way as any other pair of real numbers. For example, 0.6 is greater than 0.5. It's questionable whether it's useful to compare probabilities of entirely different things, but there's no doubt that we know how to do it. I may have misunderstood the question but in its current form, it is genuinely idiotic, right?

I suspect that this silly question originated from some misconceptions described in answer 16: Sean Carroll may have wanted to artificially inflate probabilities by "acknowledging" some properties of the observers etc. But rational observers calculate probabilities of physical phenomena according to objective algorithms that don't depend on the observers or their class at all.

So if the question was supposed to be "how should we imprint our identity - the class of observers of our kind - into our calculations of probabilities of events", the obviously right answer is that it shouldn't be imprinted at all. Proper laws of physics - those that can hopefully be obtained by a legitimate scientific process - work for the people in China, the Vatican, as well as the extraterrestrial aliens. Nothing that depends on the citizenship of the observer belongs to physics or science, for that matter.

18. What rules govern the evolution of complex structures?

The detailed evolution of all complex structures is governed by the microscopic laws that govern the elementary building blocks, applied to a large number of ingredients.

Various kinds of behavior may be described approximately but there's no universal answer to the question "which approximations are both accurate enough and useful". Generally, we are often interested in the evolution of "collective coordinates" and approximate degrees of freedom that are constructed out of the full ensemble of the degrees of freedom.

Such "simplified descriptions" of systems often approximately follow the most general type of behavior that is compatible with the existing symmetries and other qualitative constraints. Most typically, effective quantum field theories and other "critical phenomena" may govern such behaviors.

One can systematically expand the full equations governing the subset of the degrees of freedom in a derivative expansion around the strict long-distance limit. There are many kinds of such behaviors - because this question really covers all of sciences. It's way too inclusive a question to be useful. It's likely that only "scaling limits" of various kinds may give us effective descriptions that can become "arbitrarily accurate". All other approximate descriptions of complex systems are vague and their inaccuracies are inherently finite.

19. Is quantum mechanics correct?

Yes, it is, thanks for asking.

All imaginable attempts to show that a non-quantum theory could replace quantum mechanics as an explanation of the observed phenomena have been de facto ruled out. All kinds of hidden-variable theories have been shown impossible - incompatible either with direct observations of entanglement, or indirectly with observations of locality and Lorentz invariance, or with basic consistency rules such as the rule that all probabilities sum to one (unitarity).
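
The cleanest quantitative statement of this impossibility is the CHSH inequality: any local hidden-variable theory obeys |S| <= 2, while the spin singlet reaches 2*sqrt(2). A short check of the quantum value (E(a, b) = -cos(a - b) is the standard singlet correlation):

```python
# CHSH check for the spin-singlet state with the standard optimal angles.
import math

def E(a, b):
    return -math.cos(a - b)                 # singlet-state correlation

a, ap = 0.0, math.pi / 2                    # Alice's two settings
b, bp = math.pi / 4, 3 * math.pi / 4        # Bob's two settings

S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(f"CHSH value: {abs(S):.3f} (local hidden variables: <= 2)")   # 2.828
```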

Also, all existing motivations to look for theories that violate the postulates of quantum mechanics have been shown to be unsubstantiated, even though this wasn't clear from the beginning.

To present a major example, it's been shown that the string-theoretical dynamics of black holes in the "quantum regime" - including black hole evaporation and the flow of information - is fully compatible with all the universal postulates of quantum mechanics. In 2010, the search for a non-quantum explanation of the phenomena that need quantum mechanics is equivalent to the full denial of 100 years of all the key evidence in physics.

20. What happens when wave functions collapse?

Nothing objective happens.

A wave function is nothing else than a tool to predict probabilities; it is no real wave. When such an object "collapses", the only thing that it means is that we learned something about the random outcomes of some measurements, so we may eliminate the possibilities that - as we know - can no longer happen. For our further predictions, we only keep the probabilities of the possibilities that can still happen i.e. those that are compatible with the facts about the events that have already taken place.
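
In other words, the collapse works exactly like conditioning in probability theory. A minimal sketch with a single qubit (the amplitudes are arbitrary illustrative numbers):

```python
# "Collapse" as conditioning: measure a qubit a|0> + b|1>; upon outcome 0,
# discard the now-impossible branch and renormalize what remains.
import math

a, b = 3 / 5, 4 / 5                 # amplitudes of |0> and |1>
p0, p1 = a * a, b * b               # Born-rule probabilities: 0.36, 0.64

# outcome "0" observed: keep only the |0> branch, renormalized to unity
post = (a / math.sqrt(p0), 0.0)     # -> (1.0, 0.0), i.e. the state |0>
print(f"P(0) = {p0}, P(1) = {p1}, post-measurement state = {post}")
```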

Everyone who thinks that "something real happens" when a wave function collapses - that a wave function is on par with classical waves (such as electromagnetic waves) - has misunderstood the basic meaning of quantum mechanics.

21. How do we go from the quantum Hamiltonian to a quasiclassical configuration space?

A classical configuration space is only useful when we can describe the Hilbert space using a basis of simultaneous eigenstates of a collection of observables.

If the "configuration space" is a good concept, the spectrum of these observables must be continuous and the Hamiltonian must be equal to a function of these observables, up to small terms of order Planck's constant. That's when a classical limit occurs, and the derivation of classical physics from the full quantum laws is a well-known procedure in this case.

Semiclassical physics is a standard undergraduate material and decoherence covers all the issues about the "conceptual" transition from the quantum framework (including interference and quantum probability amplitudes) to the classical framework (without interference, with at most classical probabilities, and with emergent determinism).
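
For readers who want to see the classical limit emerge numerically, here is a small sketch: a free Gaussian wave packet is evolved exactly in momentum space and its position expectation value tracks the classical trajectory, as Ehrenfest's theorem requires (pure NumPy; units with hbar = m = 1):

```python
# Free-particle packet: kinetic evolution is exact in momentum space,
# psi(k, t) = exp(-i k^2 t / 2) psi(k, 0). Check <x>(t) = x0 + p0*t.
import numpy as np

N, L = 2048, 200.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)

x0, p0, sigma = -20.0, 2.0, 2.0
psi = np.exp(-(x - x0) ** 2 / (4 * sigma**2) + 1j * p0 * x)
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * (L / N))   # normalize

for t in (0.0, 5.0, 10.0):
    psi_t = np.fft.ifft(np.exp(-1j * k**2 * t / 2) * np.fft.fft(psi))
    mean_x = np.sum(x * np.abs(psi_t) ** 2) * (L / N)
    print(f"t = {t:4.1f}: <x> = {mean_x:6.2f}, classical x = {x0 + p0 * t:6.2f}")
```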

Most of the well-known quantum theories have actually been constructed in the opposite way - as "quantizations" of a classical starting point. However, one must realize that such a classical theory doesn't have to exist for a given quantum theory. There are many quantum theories that don't have any classical limit, at least not in their most interesting regimes. The real world is a quantum system and the existence of classical limits is just a "bonus" and all classical theories we may find useful are always just approximations.

22. Is physics deterministic?

No, classical determinism has been proven invalid in our Universe.

Once again, our Universe follows the principles of quantum mechanics. Questions 19, 20, 22 are clearly equivalent, and they're not real questions. The purpose of these sentences ending with a question mark is to promote mystifications about fundamental science and fog surrounding the most solid pillars of modern physics - fog that is based on no rational evidence, neither empirical nor theoretical, whatsoever.

23. How many bits are required to describe the universe?

Currently around 10^{100}.

This entropy is dominated by the large black hole entropies. Before we knew about them we thought that the number was closer to 10^{90}, dominated primarily by the cosmic microwave background. Still, a description in terms of 10^{100} bits is inevitably an effective, approximate one. To describe the phenomena in our de Sitter space "exactly", one needs those 10^{120} bits that holographically live on the de Sitter cosmic horizon (it's approximately the area of the cosmic horizon in the Planck units).
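
The order-of-magnitude estimate is straightforward. A sketch (taking R ~ c/H as the horizon radius, which is only approximate because our Universe is not yet pure de Sitter):

```python
# Horizon entropy of (approximately) de Sitter space:
# S = A / (4 l_P^2) = pi * (R / l_P)^2 in nats, with R ~ c/H.
import math

c = 2.998e8                          # m/s
l_P = 1.616e-35                      # Planck length, m
H = 70.0e3 / 3.0857e22               # ~70 km/s/Mpc in 1/s
R = c / H                            # ~1.3e26 m

S_nats = math.pi * (R / l_P) ** 2
print(f"S ~ 10^{math.log10(S_nats):.0f} nats "
      f"(~10^{math.log10(S_nats / math.log(2)):.0f} bits)")
# -> ~10^122, consistent with the 10^{120}-ish figure quoted above
```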

As the entropy of our Universe is increasing, the answer is clearly getting closer from 10^{100} to 10^{120}. The evolution is seemingly non-unitary - because a Hilbert space can't grow under unitary evolution - but this only corresponds to the emergence of previously unknown information from the de Sitter horizon.

However, it's likely that a finite number of bits - such as 10^{120} - is inadequate to describe the de Sitter space fully in the quantum framework. The full theory almost certainly needs to work with infinitely many bits because there's no preferred finite number of bits: 10^{120} is just a truncated subset that is good enough for effective estimates of information but not for the detailed evolution.

However, it is questionable whether such a complete theory makes any sense in de Sitter space: the phenomena in de Sitter space can have an inherent unpredictability due to the unpredictable thermal noise coming from the horizon. One must realize that the time spent with the research of "overly accurate" questions about de Sitter space could be wasted time because no solid interesting answers may exist. It's plausible that solid formulae only emerge in the limit when the space becomes effectively infinite.

24. Will “elementary physics” ultimately be finished?

Yes.

It will either be finished when a complete formulation of the elementary laws is found by the scientists, or by the death of mankind (or other life forms in the Universe), or by its de facto death when people's intellectual qualities deteriorate to the point that they no longer work on it.
