## Sunday, October 28, 2012

### Preons probably can't exist

Don Lincoln is the star of several cute Fermilab videos in which he explains various issues in particle physics. He's also authored several related texts for Fermilab Today.

He chose a much more controversial topic, namely preons, for his fresh article in Scientific American:

The Inner Life of Quarks
Preons are hypothetical particles, smaller than leptons and quarks, out of which leptons and quarks would be composed. But can there be such particles?

At first sight, the proposal seems natural and may be described by the word "compositeness". Atoms were not indivisible, as the Greek word indicated, but they had smaller building blocks – the nucleus and the electron. The nuclei weren't indivisible, either – they had protons and neutrons inside. The protons and neutrons weren't indivisible – they have quarks inside.

Why shouldn't this process continue? Why shouldn't there be smaller particles inside quarks? Or inside the electron and other leptons?

Many people who pose this question consider it a rhetorical question and they don't expect any answer. Instead, they overwhelm you with detailed speculations about the possible composition of quarks and leptons, relying on a lot of wishful thinking when they assume that all the problems they encounter are mere details that can be overcome.

(Pati and Salam introduced preons for the first time in 1974. One of the other early particular realizations of preons was the "rishon" model by Harari, Shupe, and a young Seiberg; "rishon" means "primary" in Hebrew. I guess that prazdrojs and urquells would be the Czech counterparts. The terminology describing preons has been much more diverse than the actual number of promising ideas coming from this research. The names for "almost the same thing" have included prequarks, subquarks, maons, alphons, quinks, rishons, tweedles, helons, haplons, Y-particles, and primons.)

However, the question above is a very good, serious question, and it actually has a very good answer – one that explains why the sequence of ever smaller building blocks has to stop.

**Mass scales and length scales**

Since the mid 1920s and realizations due to Louis de Broglie, Werner Heisenberg, and a few others, we've known about a fundamental relationship between the momentum of a particle and the wavelength of a wave that is secretly associated with it:

$$\lambda = \frac{2\pi \hbar}{p}.$$

You may use the units of mature particle physicists in which $\hbar=1$. In those units, you may omit all factors of $\hbar$ because they're equal to one, and the momentum has the same dimension as the inverse length. Note that adult physicists also tend to set $c=1$ because the speed of light is such a natural "conversion factor" between distances and times, one that has been appreciated since Einstein's discovery of special relativity in 1905.

In those $\hbar=c=1$ units, energy and momentum (and the mass) have the same units, and space and time have the same units, too. The first group is inverse to the second group. Particle physicists love to use $1\GeV$ for the energy (and therefore also momentum and mass); the inverse $1\GeV^{-1}$ is therefore a unit for distances and times. One gigaelectronvolt is approximately the rest mass of the proton, slightly larger than the kinetic and potential energies of the quarks inside the proton; the inverse gigaelectronvolt interpreted as a distance is relatively close to the radius of the proton.
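As a quick numerical sanity check of these conversions (an illustrative back-of-the-envelope script; the constants are the standard SI values), $1\GeV^{-1}$ translated via $\hbar c$ comes out to about a fifth of a femtometer, indeed the same ballpark as the proton's radius:

```python
# SI constants (standard values)
hbar = 1.054571817e-34           # J*s
c = 2.99792458e8                 # m/s
GeV_in_joules = 1.602176634e-10  # 1 GeV expressed in joules

# hbar*c in GeV*fm: this is the conversion factor between
# inverse energies and lengths in natural units
hbar_c_GeV_fm = hbar * c / GeV_in_joules * 1e15   # ~0.197 GeV*fm

# The distance corresponding to 1 GeV^{-1}
length_fm = hbar_c_GeV_fm                          # ~0.197 fm

proton_charge_radius_fm = 0.84  # measured value, roughly

print(f"1 GeV^-1 ~ {length_fm:.3f} fm")
print(f"proton radius ~ {proton_charge_radius_fm} fm")
```

So "one inverse gigaelectronvolt" and "the proton's size" differ only by an order-one factor, as claimed above.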

At any rate, the de Broglie relationship above says that the greater momentum a particle has, the shorter the wave associated with it is. Similarly, the periodicity of the wave obeys

$$\Delta t = \frac{2\pi\hbar}{E}$$

where $E$ is the energy. The phase of the wave returns to the original value after a period of time that is inversely proportional to the energy. Now, it is sort of up to you whether $E$ is the total energy that contains the latent energy $E=mc^2$ or whether these terms are removed. If you want a fully relativistic description and you're ready to create and annihilate particles, you obviously need to include all the terms such as $E=mc^2$.

On the other hand, if you study a non-relativistic system, it may be OK to remove $E=mc^2$ from the total energy and consider $mv^2/2$ to be the leading kinetic contribution to the energy. That's how we do it in non-relativistic quantum mechanics. These two conventions differ by a time-dependent redefinition of the phase of the wave function (which isn't observable),

$$\psi_\text{relativistic}(\vec x,t) = \psi_\text{non-relativistic}(\vec x,t) \cdot \exp(-i\cdot Mc^2\cdot t/ \hbar)$$

where $M$ is the total rest mass of all the particles. The relativistic wave function's phase is just rotating around much more quickly than the non-relativistic one's.
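To get a feeling for how fast that extra phase spins, here is a tiny illustrative computation (taking an electron, $Mc^2\approx 0.511\,\mathrm{MeV}$, as the assumed example): the factor $\exp(-iMc^2t/\hbar)$ rotates at almost $10^{21}$ radians per second, yet it has unit modulus, so it can never affect any probabilities:

```python
import cmath

hbar = 1.054571817e-34                            # J*s
electron_mc2_J = 0.51099895e6 * 1.602176634e-19   # 0.511 MeV in joules

# Angular frequency of the relativistic phase factor exp(-i*M*c^2*t/hbar)
omega = electron_mc2_J / hbar                     # rad/s, roughly 7.8e20

t = 1e-21                                         # an arbitrary instant, seconds
phase = cmath.exp(-1j * omega * t)

# The factor is a pure rotation in the complex plane: modulus exactly one,
# so |psi_rel|^2 == |psi_nonrel|^2 at every point and time.
print(f"omega = {omega:.3e} rad/s, |phase| = {abs(phase):.12f}")
```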

**Preons don't explain any patterns**

Fine. Let's return to compositeness and preons. When you conjecture that leptons and quarks have a substructure, you want this idea to lead to exciting consequences. For example, you want to explain why there are many types (flavors) of leptons and quarks out of a more economic basic list of preonic building blocks. It's not a necessary condition for preons to exist but it would be nice and sort of needed for the idea to be attractive.

This goal doesn't really work with preons. Note that it did work with quarks; that's how Gell-Mann discovered or invented quarks. There were many hadrons and the idea that all these particles were composed of quarks was actually able to explain a whole zoo of hadrons – particles related to the proton and neutron, including these two – out of a more economic list of types of quarks.
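Just to illustrate how economical Gell-Mann's bookkeeping was: ignoring spin and the detailed group theory (so this is only a rough counting sketch, not the full Eightfold Way analysis), a toy enumeration with the three light flavors $u,d,s$ already reproduces the sizes of the famous multiplets – $3\times 3=9$ quark–antiquark mesons (octet plus singlet) and $10$ flavor-symmetric three-quark states (the baryon decuplet):

```python
from itertools import product, combinations_with_replacement

flavors = ["u", "d", "s"]

# Mesons: one quark plus one antiquark, 3 x 3 = 9 combinations
mesons = list(product(flavors, flavors))

# Baryons: three quarks in a flavor-symmetric combination,
# "3 multichoose 3" = 10 states (the famous decuplet)
baryons = list(combinations_with_replacement(flavors, 3))

print(len(mesons), len(baryons))  # 9 10
```

A short list of quark flavors thus generates a much longer list of hadrons – exactly the kind of "compression" that preon models fail to achieve for quarks and leptons.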

Gell-Mann's success can't really be repeated with the preons. The list of known leptons and quarks is far from "minimal" but it is not sufficiently complicated, either. Quarks have three colors under $SU(3)_c$. And both leptons and quarks are typically $SU(2)_W$ doublets. And both leptons and quarks come in three generations.

These are three ways in which there seems to be a "pattern" in the list of types of quarks and leptons; three directions in which the lists of quarks and leptons seem to be "extended". But none of them may be nicely explained by preons. First, you can't really explain why there are $SU(2)_W$ doublets or $SU(3)_c$ triplets. Whatever elementary particles you choose, they must ultimately carry some nonzero $SU(2)_W$ and $SU(3)_c$ charges – and the charges of the doublets and triplets are really the minimal ones (the simplest representations) so whatever the preons are, they can't really be simpler than quarks or leptons.

(Here I am assuming that the gauge bosons and gauge fields aren't "composite". The possibility of their compositeness is related to preons and the discussion why it's problematic would be similar to this one but it would differ in some important details. The conclusion is that composite gauge bosons are even more problematic than preons.)

Also, you won't be able to produce three families out of a "simpler list of preons". To produce exactly three families, you need something that comes in three flavors, i.e. a particle of "pure flavor" that has three subtypes and that binds to other particles to make them first- or second- or third-generation quarks or leptons. But there must still be other particles that carry the weak and strong charges so the result just can't be simpler.

The comments above were really way too optimistic. The actual problems with the "diversity of the bound states" that you get out of preons are much worse. Much like there are hundreds of hadron species, you typically predict hundreds of bound states of preons. Moreover, they should allow multiple arrangements of the preons' spins, they should be ready to be excited, and they should produce much more structured bound states. None of these things is observed and the predicted structure just doesn't seem to have anything to do with the observed, rather simple, list of quark and lepton species.

But there exists a problem with preons that is even more serious: their mass.

If it makes any sense to talk about them as new particles, they must have some intrinsic rest mass, much like quarks and leptons. What can the mass be? We may divide the possibilities into two groups. The masses may either be smaller than $1\TeV$ or greater than $1\TeV$. I chose this energy because it's slightly smaller than the energy of the LHC beams and it is already "pretty nicely accessible" to the LHC collider. Maybe I should have said $100\GeV$ but let's not be too picky.

If the new hypothetical preons are lighter than $1\TeV$, then the new hypothetical particles are so light that the LHC collider must be producing them rather routinely. If that were so, they would add extra bumps and resonances and corrections and dilution to various charts coming from the LHC. Those graphs would be incompatible with the Standard Model that assumes that there are no preons, of course. But it's not happening. The Standard Model works even though it shouldn't work if the preons were real and light.

So we're left with the other possibility, namely that preons are heavier than $1\TeV$ or $100\GeV$ or whatever energy similar to the cutting-edge energies probed by the LHC these days. But that's even worse because the very purpose of preons is to explain quarks and leptons as bound states of preons – and the known quarks and leptons are much lighter than $1\TeV$.

To get a $100\MeV$ strange quark, to pick a random "mediocre mass" example, the rest mass of the preon(s) inside the quark, several $\TeV$, would have to be almost precisely cancelled by other contributions to the mass and energy, with an accuracy better than 1 in 10,000. Clearly, the extra terms can't be kinetic energy, which is positive definite: the compensating terms would have to be types of negative (binding) potential energy.
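To put a number on that tuning (the $3\TeV$ preon mass scale below is just an assumed illustration of "several TeV"): producing a $100\MeV$ bound state out of a few TeV of constituent rest mass requires the binding energy to cancel all but a few parts in a hundred thousand:

```python
# Hypothetical numbers, for illustration only:
m_preons_GeV = 3000.0   # assumed total rest mass of the preons, ~3 TeV
m_bound_GeV = 0.1       # strange-quark-like bound state, ~100 MeV

# Fraction of the rest mass that may survive the cancellation
surviving_fraction = m_bound_GeV / m_preons_GeV      # ~3.3e-5

# The negative potential energy must cancel the rest of it
required_cancellation = 1.0 - surviving_fraction     # ~0.99997

print(f"tuning: 1 part in {1 / surviving_fraction:,.0f}")
```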

But it's extremely unlikely for the energy to be canceled this accurately, especially if you expect that the cancellation holds for many different bound states of preons (because many quarks and leptons are light).

Note that the virial theorem tells us that in non-relativistic physics, it's normal for the kinetic energy and the potential energy to be of the same order. For example, for the harmonic oscillator with the $kx^2/2$ potential energy, the average kinetic energy and the average potential energy are the same. For the Kepler/Coulomb problem, $V\sim - k/r$, and the kinetic energy is $(-1/2)$ times the (negative) potential energy. More generally,

$$2\langle E_{\rm kin}\rangle = -\sum_{m=1}^N \langle \vec F_m\cdot \vec r_m\rangle$$

and if the potential goes like $V\sim k r^n$, then

$$\langle E_{\rm kin} \rangle =\frac{n}{2} \langle V\rangle.$$

If you need the potential energy to cancel the kinetic energy, you have to assume $n=-2$. But the attractive potentials $-1/r^2$ are extremely unnatural in 3+1 dimensions where $-1/r$ is the only natural solution to the Poisson-like equations you typically derive from quantum field theories. You won't be able to derive them from any meaningful theory. Moreover, relativistic corrections will destroy the agreement even if you reached one.

I was assuming that the motion of preons may be represented by non-relativistic physics – because the preons are pretty heavy and at relativistic speeds, they would be superheavy. If you assume that they're heavy and relativistic (near the speed of light), you will face an even tougher task to compensate their relativistically enhanced kinetic energy.
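The virial relation $\langle E_{\rm kin}\rangle = \frac{n}{2}\langle V\rangle$ is easiest to verify for classical circular orbits in a power-law potential $V=kr^n$, where the "averages" are trivial; the quick numerical sketch below (arbitrary illustrative values of $k$, $r$, $m$) also shows that the total energy $T+V=(\frac{n}{2}+1)V$ vanishes only for $n=-2$:

```python
def virial_check(n, k, r, m=1.0):
    """For a circular orbit in V(r) = k*r**n, return (T, V, T + V).

    Force balance: m*v**2/r = |dV/dr| = |n*k| * r**(n-1),
    so the kinetic energy T = m*v**2/2 = (n/2) * V  (the virial theorem).
    An attractive force requires n*k > 0.
    """
    v_squared = abs(n * k) * r**n / m
    T = 0.5 * m * v_squared
    V = k * r**n
    return T, V, T + V

# Harmonic oscillator (n=2), Coulomb/Kepler (n=-1), and the fine-tuner's
# dream potential (n=-2):
for n, k in [(2, 1.0), (-1, -1.0), (-2, -1.0)]:
    T, V, E = virial_check(n, k, r=1.7)
    print(f"n={n:+d}: T={T:+.4f}, V={V:+.4f}, E=T+V={E:+.4f}")
```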

Even if you fine-tuned some parameters to get a cancellation, it will probably not work for other preon bound states. The degree of fine-tuning needed to obtain many light bound states is probably amazingly high. And we're just imposing a few conditions – the existence of light bound states that may be called "leptons and quarks". We should also impose all other known conditions – e.g. the non-existence of all the other bound states that the preon model could predict and the right interactions of the bound states with each other and with other particles – and if we do so, we find out that our problems are worse than just a huge amount of fine-tuning. We simply won't find any working model at all even if we're eager to insert arbitrarily fine-tuned parameters.

If you think about the arguments above, you are essentially learning that you shouldn't even attempt to explain light elementary particles – those that are lighter than the energy frontier, i.e. the energy scale being probed by the current collider – as composites. It can never really work. Quarks and leptons are much lighter than the LHC beam energy and because no sign of compositeness (involving new point-like particles) has been found, it really means that there can't be any.

**Compositeness has done everything for us**

So while the idea of compositeness is responsible for many advances in the history of physics, nothing guarantees that such "easy steps" may be repeated indefinitely. In fact, it seems likely that there won't be another step of this sort, although some bold proposals – e.g. that the top quark could still be composite – exist and are marginally compatible with the known facts.

After all, wouldn't you find it painful if the progress in physics were reduced to repeating the same step "our particles are composed of even smaller ones" that you would repeatedly and increasingly more mechanically apply to the current list of particles? The creativity in physics would be evaporating.

There exists a sense in which quarks and leptons are composite and the counter-arguments above are circumvented. In string theory, a lepton or a quark is a string. That means that you may interpret each such elementary particle as a bound state of "many string bits", pearls or beads along the string. If the number of conjectured "smaller building blocks" becomes infinite, like it is in the case of the stringy shape of an elementary particle, the cancellation between the kinetic and potential energy may become totally natural.

Despite the inner structure of elementary particles, string theory has an explanation of why there are massless (or approximately massless, in various approximations) particles in the stringy spectrum. To some extent, this masslessness is guaranteed by having the "critical spacetime dimension" $D=10$ or $D=26$ in the superstring and bosonic string cases, respectively. Well, string theory circumvents another problem we mentioned, too. We said that the kinetic energy is positive and the sum of all such positive terms must be positive, too. However, string theory uses the important fact that the sum of all positive integers equals $-1/12$, which provides us with a very natural opportunity to cancel infinitely many terms although all of them seem to be positive.
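The "$1+2+3+\dots=-1/12$" statement is, of course, shorthand for zeta-function regularization, $\zeta(-1)=-1/12$. One elementary way to see the finite piece numerically is to smooth the sum with a regulator $e^{-n\varepsilon}$: the regulated sum equals $1/\varepsilon^2 - 1/12 + O(\varepsilon^2)$, so subtracting the divergent $1/\varepsilon^2$ leaves $-1/12$. A sketch (the cutoff values are illustrative):

```python
import math

def regulated_sum(eps, n_max=200_000):
    """Sum of n * exp(-n*eps), a smoothed version of 1 + 2 + 3 + ...

    Exactly: e^{-eps} / (1 - e^{-eps})**2 = 1/eps**2 - 1/12 + eps**2/240 - ...
    """
    return math.fsum(n * math.exp(-n * eps) for n in range(1, n_max))

eps = 0.01
# Subtract the divergent piece; what survives is the universal finite part.
finite_part = regulated_sum(eps) - 1.0 / eps**2

print(f"finite part = {finite_part:.7f}  (compare -1/12 = {-1 / 12:.7f})")
```

The divergent $1/\varepsilon^2$ depends on the regulator while the $-1/12$ does not; it is this regulator-independent piece that string theory uses.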

**Comparing preons and superpartners**

The LHC hasn't found traces of any new particles beyond those postulated by the Standard Model of particle physics yet. However, that doesn't mean that all proposals for new physics are in the same trouble. In particular, I think it's important to explicitly compare preons with the superpartners predicted by supersymmetry.

At some point in the discussion above, I mentioned that preons could be either lighter or heavier than $1\TeV$. The case of "light new particles" is generally excluded by the LHC (and previous experiments) because we would have already produced these new particles if they existed and if they were light.

The case of preons heavier than $1\TeV$ was problematic because their "already high mass" must have been accurately cancelled by some negative contributions to the total energy/mass of the bound states and the negative potential energy required to do so seemed impossible, fine-tuned, and generally hopeless.

But the case of superpartners heavier than $1\TeV$ doesn't have any problems of this sort. No supersymmetry phenomenologist really has any "rock solid" argument that would imply that the gluino is lighter than $1\TeV$ or heavier than $1\TeV$. We just don't know; these new particles may be discovered at any moment, and even at several $\TeV$ or so, they would still immensely improve the situation with the fine-tuning of the Higgs mass etc.

So while preons are pretty much completely dead – because you just can't construct light particles out of heavy ones, if I oversimplify just a tiny bit – superpartners remain immensely viable and well-motivated. The superpartners may still be rather light – the lower bounds on their masses are often significantly lower than the lower bounds on other particles' masses in models of new physics – but there's nothing wrong with their being much heavier, either.

As in many other contexts, it's important not to become a dogmatic advocate of some ideas you decide to "love" in the first five minutes of your research. You could fall in love with preons. Except that if you impartially study them in much more detail, you find out that this paradigm doesn't really agree well with the known features of the world of particles, and some clever enough arguments may actually exclude rather vast and almost universal classes of such models. You should never become an advocate of a theory who is blind to arguments of a certain type, e.g. the negative ones that unmask a general disease of your pet theory.

Preons are pretty much hopeless while other models of new physics remain extremely well motivated and promising.

And that's the memo.

P.S.: There will be a Hadron Collider Physics HCP 2012 conference in Kyoto in two weeks; see some of the ATLAS talks under HCP-2012. The detectors should update some of their analyses from 5 to 12+ inverse femtobarns of the 2012 data, which means from 10 to 17 inverse femtobarns of total data. It's just a 30% improvement in the statistical accuracy. Expect much more in March 2013 at Moriond.
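The "30%" follows from the fact that statistical accuracy scales like the square root of the integrated luminosity; with the numbers quoted above:

```python
import math

lumi_before_ifb = 10.0   # inverse femtobarns in the current analyses
lumi_after_ifb = 17.0    # inverse femtobarns expected at HCP 2012

# The statistical error shrinks like 1/sqrt(L),
# so the accuracy improves by a factor of sqrt(L_after / L_before).
improvement = math.sqrt(lumi_after_ifb / lumi_before_ifb) - 1.0

print(f"~{improvement:.0%} improvement in statistical accuracy")
```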

Also, Czechia celebrates the main national holiday today, the anniversary of the 1918 birth of Czechoslovakia.

#### snail feedback (29):

So deep down, after quarks, we have strings, right?

Yup, I think that it's pretty much inevitable that there's no substructure in quarks all the way up to the near-Planck scale, i.e. to strings or branes or whatever degrees of freedom could be relevant at the scales of quantum gravity. All other options are so constrained and unmotivated that they're pretty much impossible.

For a possible counterexample, you may see this paper by Lisa et al. with composite top quarks

http://arxiv.org/abs/1201.1293

There are other proposals of this sort but as far as I know, all of them agree and have to agree that all the light quarks have to be elementary up to very high energies, not far from the Planck scale.

Awesome choir on the video... in a cathedral, I guess? Czech kids are so beautiful!

Dear Lumo,

congratulations on your national holiday ;-), I hope you enjoy it even if it is probably too bad that it falls on a Sunday (?)

I like your nice and very accessible explanations of why preon models would need way too many fine-tuned fudge factors to give meaningful results.
Is technicolor not among such composite models, or a version of them...?

Cheers

Thanks for the wish, Dilaton.

As I said, I think you can't construct realistic preon models even if you are eager to brutally fine-tune them. The problem is worse than just fine-tuning.

Technicolor tries to make the Higgs boson (or related particles playing the Higgs boson's role) composite. Preons are trying to make leptons and quarks composite. It's a similar effort but not quite the same thing.

Ah ok, thanks for the clarification Lumo :-)

[Laymen question]
Will it be possible for String Theory to completely settle the number of particles in nature one day? Or is this a naive question because the answer is "that number is infinite"?

OT story – Frank Wilczek started talking to Murray Gell-Mann about partons. According to Wilczek he said, "Partons? Those are Feynman's put-ons. You must mean quarks." :)

LOL, yes, I know the story, maybe even from one of these two men.

And then you have my proposal that the preons of the superpartners are again the quarks (udscb) themselves.

Dear Lubos, You present well the case against preons and as I am prejudiced for string theory I am willing to accept that if they were light we would have seen them already and if they are heavy they cannot fit in a quark.

My curiosity though made me check on Google "preons as asymptotically free", i.e. whether the nesting in smaller components follows the quark behavior, and I found some links playing with the idea. Would you say this is also excluded by not seeing non-Standard-Model resonances up to now?

I saw the Scientific American article at a newsstand two days ago and said to myself, "I'll bet Lubos jumps on this one!"

You’ve been kinder than I expected, actually. From my relatively primitive point of view the electron is a point particle and cannot have any substructure. If it had components that are held together by a force of attraction it would occupy space and exhibit rotational excited states, which it does not. I just assumed that all elementary particles including neutrinos and quarks are the same sort of thing. Obviously, if a particle is composite it is not elementary.

Does Lisa’s article really suggest that the top quark is composite?

Dear Anna, tx. I think it was one of my main points that because the preons must be heavy - they haven't been produced - they must also be strongly interacting at short distances so that the negative potential energy compensates their huge positive mass to yield light bound states as a result.

So they can't be asymptotically free. Or at least they can't be free or weakly coupled at the energies comparable to their mass.

Dear Gene, maybe I was too kind, indeed, but I don't think your argument can work in this simple way.

Of course no one is proposing a chaotic, high-entropy composite electron that would have many distinguishable rotating states, like the spinning Earth. ;-)

Instead, the analogy is supposed to be with the proton which has spin j=1/2 just like the electron and you can't really make it spin faster because the centrifugal force for the next level, j=3/2, would already lead to a highly excited and therefore unstable state.

Still, the compositeness of the proton may be seen, in the deep inelastic scattering, and even at very low energies, one may see that the proton's magnetic moment heavily deviates from the prediction of the Dirac equation. So it is composite. The exact same tests for the electrons show that the electron isn't composite.
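To make that contrast quantitative (the $g$-factors below are the standard measured values): the Dirac equation predicts $g=2$ for a pointlike spin-1/2 particle. The proton misses that by almost a factor of three, while the electron agrees to about one part in a thousand – and even that tiny "anomaly" is explained by QED loops, roughly Schwinger's $\alpha/2\pi$, not by compositeness:

```python
import math

g_dirac = 2.0
g_proton = 5.5856946      # measured proton g-factor
g_electron = 2.0023193    # measured electron g-factor

# Fractional deviation from the pointlike (Dirac) prediction g = 2
dev_proton = (g_proton - g_dirac) / g_dirac      # ~1.79, i.e. 179%
dev_electron = (g_electron - g_dirac) / g_dirac  # ~0.00116

# Schwinger's one-loop QED prediction for the electron anomaly, alpha/(2*pi)
alpha = 1 / 137.035999
schwinger = alpha / (2 * math.pi)                # ~0.00116

print(f"proton deviation: {dev_proton:.3f}")
print(f"electron deviation: {dev_electron:.6f} vs QED one-loop {schwinger:.6f}")
```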

Right, all elementary particles such as leptons and quarks are qualitatively the same. This is a hypothesis which should be tested and may be challenged, however, and it's incorporated into the Standard Model – and also guaranteed by grand unification because leptons and quarks are related by symmetries in grand unified theories.

Actually the preon paradigm doesn't violate this assumption because all the particles are still qualitatively the same - they are composite. The only problem is that theories of this sort don't work - can't be compatible with the observed facts such as the lightness of the hypothesized bound states and their pointlike character up to very short distances.

I think there are some issues with definitions when it comes to the question of compositeness. While the preon model might not be viable, we have to accept a certain type of compositeness anytime we appeal to some sort of expansion to explain particle states. So I think it's a good habit to be very specific when one says there is no compositeness in quarks, by saying there are no additional component fields that we could decompose the quark into (except possibly at very high energies). However, this says nothing about forbidding an expansion of quark states.

Interesting but could you please be a bit more quantitative what you mean by an expansion of quark states that doesn't involve new component fields?

Do you mean the number of "copies" of particles in the Universe or the number of species?

The Standard Model has a finite number of elementary "species" of particles but it has infinitely many excited particle states - as we go to the less stable ones, they become less real and less relevant - but near the quantum gravity regime, the difference between elementary and composite particles breaks down and above the Planck mass, one may say that all black hole microstates are just "new species of elementary particles" - so the total number of species is infinite and any separation to a finite number of "elementary" ones is an artifact of approximation.

Concerning the number of copies in the Universe, it's infinite in the whole Universe if the Universe is spatially infinite. If it is finite or if we talk about the visible Universe only, the number of physical particles is finite although each particle such as a proton may be imagined to have a potentially infinite number of virtual partons inside, and during each acceleration, an infinite number of soft (very low energy) photons is emitted, too.

So with some approximations, the useful number is always finite, but if you insist on making no approximations, the number of particles - and the number of particle species - is infinite for various reasons. But that doesn't mean that it's unknowable. The spectrum of particle species may be fully calculated from string theory even though the list is infinite.

Here is what I mean by definitions. If one looks at the parton model proposed by Feynman (e.g. quark model) then we see the derivation of momentum distributions for different types of partons (quarks). We have to assume again that seeing a quark at some momentum carries uncertainty, so calculationally we have to expand out the potential momentum states. In my mind this is a type of compositeness, even if it isn't defined as such from other points of view.


Oh I see. But this type of parton-like compositeness is also excluded by the mass-comparison argument for quarks and leptons as bound states, isn't it?

Indeed I was referring to the number of elementary particle species. Thanks for the answer. So from what I understood, the Standard Model particles are just the low-energy side of the spectrum of particle species, and string theory may calculate the full spectrum. So now I understand why some folks seem so eager that no more particles are discovered at the LHC... :)

Dear Cesar, it's because most of the "new particle species that are pretty much obliged to exist" have energies of order 10^{16} - 10^{19} GeV. The LHC probes energies at most at 14,000 GeV, i.e. 10^{4} GeV. There may easily be a big desert between the Standard Model and the GUT/Planck high energies, and particles needed to solve the hierarchy problem (explain why the Higgs is so much lighter than the GUT/Planck scale) such as superpartners in particular are the only truly well motivated new particle species that may be argued to be probably light enough to be within the LHC's reach.

I think the question of mass really depends on the energies we are able to probe. The paper below discusses the difference between valence quarks and "sea" partons, which are quark-antiquark pairs that appear at high energies. Again, I don't disagree on the points about preons, but I think caution is warranted at high energies in terms of what we call particles. The conclusion of the paper discusses potential implications of discovering deviations from parton/quark models in terms of identifying additional quark substructure (which would be interpreted as new physics).

http://www-zeus.desy.de/~liuc/physics/parton.pdf

Well, caution is a good thing but it is not enough to construct a model with bizarre properties you are proposing.

I think that the main question here is whether a particle that seems to be free of substructure up to the distances L, based on scattering, may still have component particles whose mass is lower than M=1/L. I don't think it's possible, whether one uses the visualizations as with quarks or with partons.

For the substructure to be able to hide itself up to short distances L, the characteristic energy scale of new physics has to be equal to or higher than 1/L, at least for regular point-like particle-like building blocks. I think that the validity of this claim doesn't depend on whether one thinks in terms of quarks and particular composite fields or partons and their phenomenological distribution functions.

Fair enough

"Fine tuning" is a problem that shows up a lot in attempting to unify physics. To simplify the expected complicated mass bound states, I think you need to consider the limit as the masses of the preons go to infinity. This makes them impossible to observe directly and it means that the observable particles are restricted to only be the bound states that have, to lowest order, zero mass. This makes sense when you consider how cold and low temp normal matter is, compared to Planck energies / temperatures.

Also, I'm glad to see that the yeast cleared up. I ended up in the hospital (with difficulty swallowing) and got diagnosed with eosinophilic esophagitis, which may not be diagnosed yet in Europe. It seems to be allergic reactions in the throat.