## Wednesday, October 13, 2004

### Parameters of Nature

This blog is still rather new, and I am tempted to write a lot. :-)

Well, let me tell you something about the parameters of Nature.

This is a topic that recently emerged from a discussion on the sci.physics.research newsgroup about the predictive power of string theory. These discussions often end up as completely hopeless exchanges because often more than 95% of the participants believe totally weird things about very elementary questions in physics, and they are convinced that anything they misunderstand about particle and fundamental physics must be the particle physicists' fault.

You might say that this interpretation of the disagreement is just my point of view, but it may be more important that it is the correct point of view. :-)

Of course, they are repeatedly mistaken. Don't get me wrong: I don't claim that all other participants of the sci.physics.research newsgroup are absolutely dumb. But the percentage of almost absolute morons on that newsgroup is sufficiently large to destroy a decent debate.

What is the issue now? To emphasize that I am far from alone in my opinion, let me start with a citation of an authority. At the end of 2000, David Gross, the winner of the 2004 physics Nobel prize, formulated the 10 most serious questions in physics in his article (or interview?) in the New York Times. He often repeats - for example at the recent "Future of Physics" conference in Santa Barbara - that the most important question in physics (one that he would ask God) is the following:

Can all fundamental dimensionless continuous parameters of Nature be calculated from theoretical principles, without any input from experiments?

Let me say that I subscribe to the view that this is the biggest question in physics for me, as well. It does not have to be the biggest question for everyone else, but I guess that all particle physicists and string theorists realize why it is big. This biggest question originally comes from Einstein, but I am sure that it has become a part of our knowledge (and our culture).

Let's look at various details of the question. We only talk about dimensionless parameters - such as the fine-structure constant or the proton/electron mass ratio - because the dimensionful constants depend on our choice of units, and these units are more or less arbitrary (results of coincidences in the history of human civilization). Such numbers of course cannot be explained scientifically (unless you explain why a guy several centuries ago had a thick inch). It's only the dimensionless numbers (those without any units) that all civilizations would have to agree upon.
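As a quick sanity check of the unit-independence claim, one can assemble the fine-structure constant out of dimensionful SI constants and watch the units cancel into a pure number (a sketch using CODATA-style values that I supply; they are not part of the original post):

```python
# Compute alpha = e^2 / (4*pi*eps0*hbar*c) from SI constants and check
# that it gives the familiar dimensionless ~1/137. Also check the
# proton/electron mass ratio, the other example in the text.
import math

e = 1.602176634e-19        # elementary charge [C]
eps0 = 8.8541878128e-12    # vacuum permittivity [F/m]
hbar = 1.054571817e-34     # reduced Planck constant [J*s]
c = 2.99792458e8           # speed of light [m/s]

alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(alpha, 1 / alpha)    # ~0.0072973..., ~137.036

m_p = 1.67262192e-27       # proton mass [kg]
m_e = 9.1093837e-31        # electron mass [kg]
print(m_p / m_e)           # ~1836.15, another pure number
```

Any civilization, whatever its units, would arrive at the same two pure numbers.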

The sentence also talks about continuous parameters only because the discrete parameters are qualitative, in some sense, and they can be determined absolutely exactly, even with very inaccurate experiments. However, the continuous numbers must be measured totally precisely if we want to predict physics completely accurately. The continuous parameters are the main problem.

How far has physics gotten in this dream to explain and calculate all measurable numbers in Nature? Instead of zillions of seemingly arbitrary parameters describing the properties (spectral lines, permittivity, density, ...) of each element (and perhaps each compound) and each nucleus and so on, we now have a theory where everything follows from roughly 19+10 parameters - namely the Standard Model. The parameters are roughly three couplings, plus a lot of elementary particle masses and mixings between them.
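For concreteness, here is the conventional counting of the ~19 Standard Model parameters as a small sketch (my own tabulation of the textbook grouping; the "+10" presumably refers to the neutrino sector, which is my reading, not a statement from the post):

```python
# The textbook counting of the continuous Standard Model parameters.
standard_model_parameters = {
    "gauge couplings (g1, g2, g3)": 3,
    "Higgs sector (mass, self-coupling)": 2,
    "quark masses (Yukawa couplings)": 6,
    "charged lepton masses (Yukawa couplings)": 3,
    "CKM mixing (3 angles + 1 CP phase)": 4,
    "QCD theta angle": 1,
}
total = sum(standard_model_parameters.values())
print(total)  # 19
```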

String theory, by its very nature, does not admit any continuous dimensionless parameters whatsoever, so once you determine all possible discrete choices (there is a countable number of them), you can decide whether string theory is right or wrong. We don't know which answer is correct yet, at least not with certainty, but the basic feature of string theory - that all apparent parameters are either fixed by consistency, or they are dynamical fields whose values are determined dynamically at the end - is more or less well-established.

Note that even in the anthropic catastrophic scenarios in which we would have to deal with e.g. 10^{300} vacua, string theory is amazingly predictive. Using the usual anthropic counting (and the exponents below are sort of random guesses which nevertheless describe how the experts are thinking about this problem), the roughly correct cosmological constant can only be found in 10^{200} of them, and roughly 10^{100} of them look like the Standard Model. We only have 100 decimal digits to adjust when we discretely choose a vacuum, and therefore 30 parameters can only be accidentally fine-tuned with an accuracy of about 3 decimal digits each if string theory is wrong and the agreement is a pure coincidence. It's pretty clear that we can measure more accurately, and therefore it is likely that no vacuum from this ensemble will match the measurements within the verifiable precision if string theory is not quite correct. Even though 10^{300} looks like a large number, it is really much less than infinity.
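The back-of-envelope counting above can be written out explicitly (the exponents are the illustrative guesses from the paragraph, not data):

```python
# Anthropic counting sketch: if 10^100 vacua look like the Standard
# Model, the discrete choice of vacuum offers ~100 decimal digits of
# accidental "tuning". Shared among ~30 parameters, that is only ~3
# digits of accidental agreement per parameter - less than what
# experiments can measure.
log10_vacua_total = 300      # ~10^300 vacua in the catastrophic scenario
log10_vacua_sm_like = 100    # ~10^100 of them resemble the Standard Model
n_parameters = 30            # parameters that would have to match by luck

digits_per_parameter = log10_vacua_sm_like / n_parameters
print(digits_per_parameter)  # ~3.3 decimal digits of accidental accuracy
```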

If one of the vacua matches reality, it would be a spectacular success, probably the biggest achievement in the history of science. It would allow us to calculate anything with arbitrary precision.

(Let me now avoid the question whether we should believe that our Universe has "average" properties, because this question is controversial among the string theorists.)

OK, 10^{300} is large for some purposes, but small for others. It is certainly true that the Standard Model can pragmatically be a more useful model to describe reality - even though the Standard Model can never be determined "exactly" since it depends on continuous parameters (unlike string theory, which is completely rigid). But there are some things that should certainly not be controversial:

Realistic theories in physics depend on a finite number of continuous parameters. The dependence of the results on all parameters is smooth almost everywhere. The space of parameters is therefore a manifold that is differentiable almost everywhere, and one can talk about its dimension - the number of parameters. The theories with a few parameters are more constrained, satisfactory, and predictive than the theories with many parameters. The theories with infinitely many parameters - such as the nonrenormalizable theories - are unpredictive and unacceptable as complete theories of anything because one needs infinite input to determine the theory fully.
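A toy illustration of the last point (my own example, not from the discussion): a model with n free parameters can reproduce any n data points exactly, so it risks no prediction until measurement n+1. Lagrange interpolation makes this concrete for a 10-parameter polynomial "theory":

```python
# A degree-9 polynomial has 10 free parameters. Via Lagrange
# interpolation, it fits ANY 10 data points perfectly - so fitting them
# tells us nothing. A 2-parameter line, by contrast, is already
# falsifiable at the third data point.
def lagrange_eval(points, x):
    """Evaluate the unique degree-(n-1) interpolating polynomial at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = yi
        for j, (xj, _) in enumerate(points):
            if i != j:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# Ten completely arbitrary "measurements":
data = [(float(k), (-1.0) ** k * k**3) for k in range(10)]

# The 10-parameter model reproduces all of them exactly - zero content:
assert all(abs(lagrange_eval(data, x) - y) < 1e-6 for x, y in data)
print("10 parameters fit any 10 points; no prediction was risked.")
```

A nonrenormalizable theory is the limiting case: infinitely many parameters, so it can absorb any finite set of measurements without ever making a sharp prediction.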

OK, these statements were not the only things I gave them. One also had to explain away all the misleading examples they raised - which includes an explanation of effective theory (where things are approximate, and the results only depend on a finite number of parameters "up to a certain scale", whatever exactly "scale" means).

You would think that these statements are so obvious that no one who is interested in physics should have problems with them. 't Hooft and Veltman received a Nobel prize for 't Hooft's proof of renormalizability of the electroweak theory, and you would therefore think that the importance of renormalizability of a quantum field theory - proposed as a fundamental description of anything - is something that must be obvious to all those interested in theoretical physics. You would also think that all of them understand that renormalizability means that there is just a finite number of parameters that must be determined from experiments, and everything else can be predicted at least to all orders in perturbation theory.

Well, you would be wrong. These obvious facts constitute a barrier that is impenetrable for many people - and some of them are even rather well-known scientists.

One of the less famous ones starts to argue that one parameter is the same as 30 parameters and the theories with one or 30 parameters are equally predictive and satisfactory. In order to prove that he is not just a moron, but rather a sophisticated moron, he offers the proof that the one-dimensional continuum can be mapped to a higher-dimensional (or infinite-dimensional) continuum, even by continuous (but highly pathological and non-differentiable, and therefore physically irrelevant) functions. He repeats that one number as well as 30 numbers constitute an infinite amount of information, and therefore there is no difference. That's a hopeless case, of course.
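For the record, the "sophisticated" map he has in mind is easy to exhibit (a sketch of my own): interleaving decimal digits packs two real parameters into a single one, but the construction is wildly non-smooth, which is exactly why it is physically irrelevant:

```python
# Pack two parameters into one by interleaving fixed-precision decimal
# digit strings, and unpack them again. The map is invertible, but
# nearby packed numbers correspond to wildly separated parameter pairs,
# so no smooth physical theory could ever depend on parameters this way.
def interleave(a: str, b: str) -> str:
    """Interleave two equal-length digit strings: '123','456' -> '142536'."""
    return "".join(x + y for x, y in zip(a, b))

def deinterleave(s: str) -> tuple:
    """Undo interleave by taking even and odd positions."""
    return s[0::2], s[1::2]

packed = interleave("1415926", "7182818")   # digits of pi and e
print(packed)
print(deinterleave(packed))                  # both originals recovered
```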

But what would you think if you met someone else, someone nice, someone who is around, namely someone who can be described as a leader of quantum computation, who argues that there is really no difference between renormalizable and non-renormalizable theories as far as predictivity goes (he immediately and explicitly gives you the Standard Model and quantized General Relativity with all counterterms up to five loops as examples) - and he even states that drawing a graph of a function (which is a part of the input of a theory) is giving you a more predictive theory than if you know the function analytically, as long as the analytical function looks too complicated to you?

I don't know what you would think, but I am totally stunned and scared. What sort of physics is being taught at the high schools and colleges? Is it really necessary that people - even the famous people - don't understand what it means to "understand" or "explain" something in physics? Does he really believe that if we draw a curve describing a black body radiation experiment, so that we intuitively feel that our drawing agrees with our experiments, we have a more predictive theory than Planck, who finds an analytical prescription for the function? Does he really think that Kepler's and Newton's laws did not mean any progress in understanding - and predicting - the paths of celestial bodies, as compared to the phenomenological observations before Kepler (or before epicycles)? What do they consider progress in theoretical physics if it is not obtaining increasingly analytical formulae describing increasingly large classes of phenomena increasingly accurately with a decreasing number of arbitrary and independent assumptions and parameters?
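Since Planck's black body formula is the example here, a few lines show what "having the analytical function" buys you - the whole curve, its peak, its tails, at any temperature (a sketch of my own; the constants are standard SI values):

```python
# Planck's spectral radiance B(lambda, T) - the analytical function, as
# opposed to a drawn curve. With it, the spectrum at ANY temperature is
# predicted; here we check that the solar-temperature peak lands where
# Wien's displacement law says it should.
import math

h = 6.62607015e-34   # Planck constant [J*s]
c = 2.99792458e8     # speed of light [m/s]
kB = 1.380649e-23    # Boltzmann constant [J/K]

def planck(lam: float, T: float) -> float:
    """Spectral radiance per unit wavelength [W * m^-3 * sr^-1]."""
    return (2 * h * c**2 / lam**5) / (math.exp(h * c / (lam * kB * T)) - 1)

T = 5778.0  # approximate solar surface temperature [K]
lams = [i * 1e-9 for i in range(100, 3000)]        # 100 nm .. 3000 nm grid
lam_peak = max(lams, key=lambda lam: planck(lam, T))
print(lam_peak)   # ~5.0e-7 m, i.e. ~500 nm, matching Wien's law b/T
```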

It's just very hard to discuss physics if people can't agree even about the fact that the renormalizable theories are more predictive than the non-renormalizable ones - and the fact that a correct analytical form of a function is always better (and more predictive) than just a numerical knowledge of this function.

1. There is an alternative philosophy to the one Lubos advocates here. Mathematician Dan Freed constructed in his thesis the Kaehler geometry of loop spaces, and found that it is unique and possesses an infinite-dimensional isometry group defined by the loop group itself. The basic reason is that the Riemann connection leads outside of the tangent space in the infinite-dimensional context for a generic metric.

The generalization of this idea in my own approach to unification looks as follows. The configuration space is the space of 3-surfaces in H=M^4xCP_2, the product of Minkowski space and complex projective space - the space of the "classical worlds". If the loop space has a unique Kaehler geometry, then this space has even better reasons to have a unique Kaehler geometry (assuming it has any!) than loop spaces do. Symmetries imply a constant-curvature space structure, while finiteness requires Ricci flatness, the Einstein equations, and potentially a hyper-Kaehler manifold structure.

"Infinite-dimensional existence is unique" would be the
alternative to the philosophy what M-theory people are

The guess inspired by the elementary particle spectrum is that this requirement fixes the imbedding space to be H=M^4xCP_2 and that the dimension of the fundamental objects is 3. There are good algebraic reasons for this. For instance, there is support for the conjecture that space-time surfaces could correspond to quaternionic sub-manifolds of H with octonion structure. This would mean that the tangent space at each point spans a quaternionic subalgebra. This conjecture also allows an Abelian version in terms of generalized 8-D hypercomplex numbers generated by the units 1, e_1, e_2, e_3, e_i^2=1, represented in terms of mutually commuting covariantly constant sigma matrices of imbedding space spinors. One easily finds that only M^4xCP_2 allows one to realize the algebra, and that this algebra allows 4-D subalgebras, say the one defined by 1, e_1, e_2, e_1e_2.

The principle can be made concrete by noticing that Kaehler geometry is determined by the Kaehler function, and that general coordinate invariance in the 4-D sense requires that it must be possible to assign a unique 4-D surface X^4(X^3) to a given 3-surface X^3: classical physics is determined by a generalized Bohr orbit. On physical grounds the Kaehler function should relate to some variational principle, and the simplest guess is that it corresponds to a minimum of the Maxwell action for the Kaehler form of CP_2 induced on the space-time surface, which defines a classical Maxwell field.

The Kaehler coupling strength is the only parameter appearing in the exponent of the Kaehler function defining the functional integral measure (not a path integral, and free of local divergences since it is a non-local functional of X^3!). Its possible values are completely analogous to critical temperatures and determine all other coupling constants.

Coupling constant evolution would be discretized in this
approach and replaced with what I call p-adic coupling
constant evolution. Each discrete value would represent a fixed point coupling for an underlying conformal field
theory.

For an overall summary of the resulting approach, see

http://www.physics.helsinki.fi/~matpitka/tgd.html#tgdevo

and

for details the chapters

http://www.physics.helsinki.fi/~matpitka/tgd.html#X, X=
kahler, compl1, compl2, cspin.

Matti Pitkanen

2. I have some problem understanding this:

"Can all fundamental dimensionless continuous parameters of Nature be calculated from theoretical principles, without any input from experiments?"

As Gross et al taught us, the coupling constant runs (or equivalently, dimensional transmutation). So QED's value of 1/137 is not intrinsically 'fundamental': at the EW scale it is 1/128 (I haven't done QFT in a while, so I may be off :)).

Now given the QCD Lagrangian, how can one EVEN IN PRINCIPLE hope to find what \alpha_{QCD} is in terms of mathematical constants? (How can the QCD Lagrangian pick a \Lambda_{QCD} - isn't that an experimental input?)

Or does the statement mean that given \Lambda_{QCD}, in principle one should express all hadron masses in terms of alpha_{QCD} and mathematical constants?

"It's just very hard to discuss physics if people can't agree even about the fact that the renormalizable theories are more predictive than the non-renormalizable ones"

Finally, someone saying that the emperor has no clothes! According to Weinberg in his QFT book (Chap 12), this change in perspective arose because there is no renormalizable QFT of gravity (it is interesting to read his 1979 Nobel lecture, where he waxes eloquent about renormalizability). But string theory resolves the renormalizability issue (if correct, of course :)). So I suppose that to a string theorist, all QFTs are effective and hence nonrenormalizable.

But for those not believing in string theory, I think that is a cop-out. Of course, one can write down all possible terms consistent with relativity, given the fields, with undetermined constants and 'explain nature', but that is hardly explaining ANYTHING - it is content-free.

But I do think that even they would agree that inconsistent theories (ANOMALIES) are definitely not meaningful, even if non-renormalizable.

3. Your points don't really change the essence, and some of your statements sound really weird.

The fine-structure constant is not the most fundamental and natural parameter of Nature, nevertheless it is a completely well-defined parameter of Nature that waits for an explanation. ;-)

It's not the most fundamental because of at least two things: we now explain electromagnetism as a part of the electroweak theory, and the couplings of SU(2) x U(1) in the electroweak theory are "more fundamental" while the electromagnetic U(1) is derived.

Second, the couplings run (depend on the energy scale). But the fine-structure constant does not really run below the scale of the mass of the electron. It's because the electron (and positron) is the lightest charged particle, and if you're at energies (much) below its mass, all charged particles can be forgotten (as irrelevant high-energy physics) and the contribution from the electron running in the loop to the beta function goes to zero.

It is certainly possible to calculate the values of all these constants in a well-defined predictive theory. I have no idea what's unclear about it. The fine-structure constant is really a constant at very low energies, well, it is 1/137.03604... - a number waiting for an explanation. Even if you consider the more general running couplings, they are some functions of the scale. Most of the facts about these functions are already known from QFT - and the only remaining thing is *one number*, for example its value at some specific scale (which can be taken to be the zero energy scale in the case of electromagnetism), which can then be used to reconstruct the whole function exactly and analytically.
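A one-loop sketch of this point (my own illustration; the effective quark masses are rough constituent values I chose, and the real determination of alpha(M_Z) treats hadronic effects far more carefully): starting from the single measured number 1/alpha ~ 137.036 at low energy, the leading logarithms of the charged fermions reconstruct the coupling at any higher scale, reproducing the famous ~1/128 at the Z mass.

```python
# One-loop QED running: the whole function alpha(mu) is fixed by QFT
# once one number, alpha at one scale, is measured. Each charged fermion
# lighter than M_Z contributes (2/3pi) * Q^2 * N_c * ln(M_Z/m) to the
# change of 1/alpha.
import math

M_Z = 91.19  # GeV
# (effective mass [GeV], charge^2 * color factor) for fermions below M_Z:
fermions = [
    (0.000511, 1.0), (0.1057, 1.0), (1.777, 1.0),           # e, mu, tau
    (0.3, 3 * 4 / 9), (0.3, 3 * 1 / 9), (0.5, 3 * 1 / 9),   # u, d, s
    (1.5, 3 * 4 / 9), (4.5, 3 * 1 / 9),                     # c, b
]

inv_alpha_0 = 137.036  # the one measured input, at zero energy
delta = (2 / (3 * math.pi)) * sum(q2 * math.log(M_Z / m) for m, q2 in fermions)
inv_alpha_MZ = inv_alpha_0 - delta
print(inv_alpha_MZ)  # roughly 128, as quoted in the comment above
```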

Of course the hadron masses can be expressed in terms of Lambda_{QCD} and mathematical constants (plus the bare quark masses, which give rather small corrections). This is what QCD is good for - you don't even need string theory for that. The mathematical constants are rather complicated and not exactly expressible in terms of simple functions - but it is purely a mathematical problem. You don't need any extra information, just a powerful brain and a computer or something like that, if you want to compute the hadron masses. After hadron scattering, this is the second most important application of QCD; this is why QCD is powerful - and I have no idea what you think that Gross, Wilczek, and Politzer got their Nobel prize for, if you think that QCD does not allow one to calculate the properties of hadrons.

Could you please clarify your arguments? As you wrote them, they sound pretty stupid. What do you think QCD is if it is not a theory to calculate the properties of hadrons?

4. Thanks for your patient explanations. Let me tell you what is confusing me.

"Most of the facts about these functions are already known from QFT - and the only remaining thing is *one number*, for example its value at some specific scale (which can be taken to be the zero energy scale in the case of electromagnetism), which can then be used to reconstruct the whole function exactly and analytically."

I agree. ALL I am trying to say ---coupling "value at some specific scale" has to be determined from experiment; it cannot be predicted by staring at the Lagrangian (Or can it?).

"But the fine-structure constant does not really run below the scale of the mass of the electron. It's because the electron (and positron) is the lightest charged particle, and if you're at energies (much) below its mass, all charged particles can be forgotten (as irrelevant high-energy physics) and the contribution from the electron running in the loop to the beta function goes to zero."

I completely agree.

But the question asked by Gross is

"Can all fundamental dimensionless continuous parameters of Nature be calculated from theoretical principles, without any input from experiments?"

Now I think alpha_{QCD} is a fundamental dimensionless continuous parameter of Nature - can it really be calculated from theoretical principles, without any input from experiments?

Likewise, for any renormalizable QFT, once one knows the value of the couplings (at any scale), in principle one would expect that all physical quantities should be expressible in their terms. There is nothing profound about it (except perhaps in a pure mathematical sense, but I don't think that is what interests Gross).

Where am I being stupid here?

"I have no idea what you think that Gross, Wilczek, and Politzer got their Nobel prize for, if you think that QCD does not allow one to calculate the properties of hadrons."

Of course, QCD allows one to calculate properties of hadrons. But everyone knows that if we were smart enough, the single unknown parameter in (pure gluon) QCD is \Lambda_{QCD} and so anything in that theory should be expressible in terms of it; there is no other mass scale (save small bare quark masses)!

Obviously, I am missing something...

5. You wrote:

"ALL I am trying to say ---coupling "value at some specific scale" has to be determined from experiment; it cannot be predicted by staring at the Lagrangian (Or can it?)."

Yes, the value of the coupling at a given scale CANNOT be determined from quantum field theory - especially not from its Lagrangian - which is why quantum field theory is NOT a complete, final (or quite satisfactory) theory of the Universe. The values of all numbers such as the couplings, however, CAN be calculated from string theory, and they would have to be calculable from any other theory that would claim to be deeper than quantum field theory.

Then you wrote:

"Now I think alpha_{QCD} is a fundamental dimensionless continuous parameters of Nature---can it really be calculated from theoretical principles, without any input from experiments?"

As you said yourself, alpha_{QCD} is not a number. It is a function of the scale. You must define it more precisely if you want to call it a "number"; for example, you must say alpha_{QCD}(m_{Planck}). Then it becomes an actual dimensionless number, and Gross's question (which both of us repeat) is whether this number, and all other numbers like it, can be calculated from the theory without looking at experiments.

If string theory is correct, and we have circumstantial evidence that it should be, then all such numbers, such as alpha_{QCD}(m_{Planck}) can be calculated as soon as we identify the correct background - and there are only a countable number of choices.

Lambda_{QCD} is not a mathematical (i.e. dimensionless) parameter of QCD. Pure QCD has no dimensionless parameters whatsoever; it is completely determined and unique. In reality, QCD is not pure because quarks have extra bare masses. I am not sure what the two of us are missing, exactly. You questioned whether QCD can calculate the properties of hadrons, and I explained why the answer is yes, didn't I?
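Dimensional transmutation can be made explicit at one loop (an illustration of my own; the Lambda below is an illustrative one-loop value, not a precision determination): the dimensionless alpha_s at any chosen scale and the dimensionful Lambda_{QCD} carry the same single piece of information.

```python
# One-loop running strong coupling:
#   alpha_s(mu) = 12*pi / ((33 - 2*n_f) * ln(mu^2 / Lambda^2)).
# Given Lambda, alpha_s at every scale follows; given alpha_s at one
# scale, Lambda follows. Neither is an extra dimensionless parameter.
import math

def alpha_s(mu_gev: float, lam_gev: float = 0.09, n_f: int = 5) -> float:
    """One-loop running coupling of QCD with n_f active flavors."""
    return 12 * math.pi / ((33 - 2 * n_f) * math.log(mu_gev**2 / lam_gev**2))

print(alpha_s(91.19))   # ~0.12 at the Z mass
print(alpha_s(1000.0))  # smaller at higher scales: asymptotic freedom
```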

My opinion is that I have written a transparent explanation many times, and the only problem is that you and others don't want to understand it (or believe it?) or whatever. I just don't know how to break this barrier because the answer looks absolutely obvious to me. QFTs such as the Standard Model do not explain their parameters, deeper theories such as string theory can obviously do it, and the question is whether these theories are correct and will one day predict all these parameters. I just don't know how else I should express this idea so that it becomes comprehensible; in my opinion it is a totally trivial idea which should not lead to this huge waste of time.

6. No, personally I don't have an axe to grind; I am not doing theoretical physics anymore. I am just trying to understand the statement of the problem and greatly appreciate your valuable explanations.

I reread your explanations and I agree that way too many words are being wasted for something quite simple. Please don't waste much more time :) I think I am looking for a simple, precise statement (non-NY Times or even non-Physics Today kind) of the following form:

The dimensionless parameters in the Standard Model of weak, EM and strong interactions are the Yukawa couplings (that lead to fermion masses and CKM mixing) and theta_{QCD}. Can string theory (or whatever) tell us the values of ALL these parameters (at, say, M_{Planck})?

Am I correct? (Hopefully, only a Yes will be needed from you:) )

7. Hey! I think that you got it. ;-) Sorry for my limited patience, but it is mostly caused by the limited amount of time that I can dedicate to more transparent, more friendly, and more time-consuming debates.

The Standard Model is an effective field theory that approximates possible deeper physics - the substructure of particles and the underlying mechanisms of forces - and replaces these simple things by a theory that needs parameters, a theory made of effective fields and interactions.

However, if one reveals the structure behind it, those arbitrary parameters may be explained, much like quantum mechanics explained the properties of individual atoms.