Stefan Antusch, Christian Gross, Vinzenz Maurer, and Constantin Sluka of Basel, Switzerland (Antusch is also affiliated with the Werner Heisenberg Institute, a part of the Max Planck Institute in Munich) released an extremely intriguing preprint:

A flavour GUT model with \[\Large\theta_{13}^{PMNS} = \frac{\theta_{\rm Cabibbo}}{\sqrt 2}\]Their model – or class of models – combines the constraints of supersymmetry, grand unification, and the \(A_4\) family symmetry to fit 20 observables related to the fermion masses using only 14 input parameters whose values they optimize. Of the 6 quantities they're able to predict without assuming them, 4 seem to match the experimental values very well and 2 predictions are completely new, waiting to be falsified or confirmed (a Majorana phase and the Dirac CP phase).

That's quite something. Look at Table 3 on page 10 of the paper to see those amazingly accurate predictions for the masses, mixing angles, CP-violating phases, and neutrinos' squared-mass differences. I am impressed, especially because four of the confirmed quantities (a rather large number) seem to arise as nontrivial predictions of their model.

I won't discuss supersymmetry or grand unification here because they're widely discussed topics on TRF and they're complicated, anyway. (See e.g. an article on neutrinos in grand unification.) Instead, let me focus on some "more special features" of their model, especially on the \(A_4\) flavor symmetry.

Recently, the last real angle in the neutrinos' mixing matrix (the PMNS matrix) was pinned down; the NuFIT global fit gives\[

\theta_{13}^{PMNS} = 8.75^\circ\pm 0.43^\circ

\] Physicists had tended to assume that the angle was either zero or extremely tiny. Well, as some string-inspired models (F-theory phenomenology...) had been predicting for quite some time, it's not small at all. (TRF readers could have suspected the figure was large since June 2011.) If you multiply this angle by \(\sqrt{2}\), you pretty much get the Cabibbo angle. This looks like some crazy numerology except that the relation may be rather naturally explained by some group-theoretical assumptions. And that's exactly what the authors do.
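To see how close the relation actually is, here is a short sanity check in Python. The Cabibbo angle value \(\theta_C \approx 13.02^\circ\) is my own assumption (the text only quotes the NuFIT \(\theta_{13}\)), so treat this as an illustration, not as a number from the paper:

```python
import math

# NuFIT global-fit value quoted above; theta_C ~ 13.02 deg is an assumed input
theta13 = 8.75          # degrees
theta13_err = 0.43      # degrees
theta_cabibbo = 13.02   # degrees (assumption, not from the text)

# the conjectured relation: theta13 = theta_C / sqrt(2)
prediction = theta_cabibbo / math.sqrt(2)
pull = (theta13 - prediction) / theta13_err

print(f"theta_C/sqrt(2) = {prediction:.2f} deg, measured = {theta13} deg, "
      f"pull = {pull:+.1f} sigma")
```

With these inputs the relation holds to roughly one standard deviation – close enough that "crazy numerology" starts to deserve a group-theoretical explanation.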

The idea of the \(A_4\) flavor symmetry began around 2001 in papers such as Ma-Rajasekaran (around 500 citations). The hierarchy of the charged leptons' masses (electron, muon, tau) as well as the nearly degenerate neutrino masses (with large mixing angles) arise naturally from such a model. The symmetry has to be "softly broken".

What is \(A_4\)? It is the group of even permutations of four elements. The group \(S_4\) of all permutations (the so-called symmetric group) has \(4!=24\) elements; the alternating group \(A_4\) has \(4!/2=12\) elements. These elements come in four distinct types, the so-called "conjugacy classes": the identity (1 element), the cyclic permutations of a triangle leaving the fourth element intact (8 elements in total, but they split into two conjugacy classes of 4 elements each – the rotations by \(+120^\circ\) and those by \(-120^\circ\)), and the compositions of two transpositions (3 elements: 12-34, 13-24, 14-23).
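These class sizes are easy to verify by brute force. The following sketch (my own illustration, using only the Python standard library, not anything from the paper) enumerates the even permutations of four elements and groups them into conjugacy classes:

```python
from itertools import permutations

def parity(p):
    # even permutation iff the number of inversions is even
    inv = sum(1 for i in range(len(p)) for j in range(i + 1, len(p)) if p[i] > p[j])
    return inv % 2

def compose(p, q):
    # (p o q)(i) = p(q(i)); permutations as tuples mapping index -> value
    return tuple(p[q[i]] for i in range(len(q)))

def inverse(p):
    inv = [0] * len(p)
    for i, v in enumerate(p):
        inv[v] = i
    return tuple(inv)

A4 = [p for p in permutations(range(4)) if parity(p) == 0]
assert len(A4) == 12  # |A4| = 4!/2

# conjugacy class of g = orbit of g under g -> h g h^{-1} for all h in A4
classes = []
seen = set()
for g in A4:
    if g in seen:
        continue
    orbit = {compose(compose(h, g), inverse(h)) for h in A4}
    seen |= orbit
    classes.append(orbit)

sizes = sorted(len(c) for c in classes)
print(sizes)  # [1, 3, 4, 4]
```

The output confirms the four classes: the identity, the three double transpositions, and the two classes of four three-cycles each.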

There is a general result about finite groups that says \(K=R\): the number of conjugacy classes equals the number of inequivalent irreducible representations. Because we have four classes, we must have four irreducible representations. Moreover, the sum of their squared dimensions must equal the number of elements of the group: \[

\sum_{i=1}^R d_i^2 = 12

\] How can you write twelve as a sum of four squares of positive integers? The unique decomposition is \(1+1+1+9\). The first one-dimensional representation is the trivial one: every element is mapped to the identity operator on a one-dimensional space. The other two one-dimensional representations are complex conjugates of each other: they assign one to everything except for the rotations by \(\pm 120^\circ\) around any axis, which are represented by \(\exp(\pm 2\pi i/3)\); the phase gets inverted in the other representation.
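The uniqueness claim can be checked mechanically; this little brute-force scan (my own sketch) over all candidate dimension vectors confirms that \(1+1+1+9\) is the only way to write 12 as a sum of four squares of positive integers:

```python
from itertools import combinations_with_replacement

# each dimension d satisfies d^2 <= 12, so d in {1, 2, 3}
decomps = [c for c in combinations_with_replacement(range(1, 4), 4)
           if sum(d * d for d in c) == 12]
print(decomps)  # [(1, 1, 1, 3)] -> three singlets and one triplet
```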

Finally, the three-dimensional irreducible representation of \(A_4\) is nothing else than the space in which the tetrahedron whose 4 vertices are being permuted is embedded; it realizes \(A_4\) as a subgroup of \(SO(3)\). This 3-dimensional space is identified with the space of the three generations of leptons and quarks.
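This embedding can be made completely explicit. The sketch below (my own illustration, assuming the standard choice of generators for the rotation group of the tetrahedron) builds all 12 rotation matrices from a \(180^\circ\) rotation about an axis and a \(120^\circ\) rotation about a body diagonal, using only integer matrices:

```python
# Generators of the tetrahedral rotation group T inside SO(3):
# a = rotation by 180 deg about the x-axis, b = rotation by 120 deg about (1,1,1)
a = ((1, 0, 0), (0, -1, 0), (0, 0, -1))
b = ((0, 0, 1), (1, 0, 0), (0, 1, 0))
identity = ((1, 0, 0), (0, 1, 0), (0, 0, 1))

def matmul(m, n):
    return tuple(tuple(sum(m[i][k] * n[k][j] for k in range(3))
                       for j in range(3)) for i in range(3))

# close {a, b} under multiplication to get the whole group
elems = {identity}
frontier = [identity]
while frontier:
    new = []
    for g in frontier:
        for h in (a, b):
            gh = matmul(g, h)
            if gh not in elems:
                elems.add(gh)
                new.append(gh)
    frontier = new

traces = sorted(m[0][0] + m[1][1] + m[2][2] for m in elems)
print(len(elems), traces)
```

One obtains 12 matrices, with traces 3 (identity), \(-1\) (the three \(180^\circ\) rotations, i.e. the double transpositions) and 0 (the eight \(120^\circ\) rotations) – precisely the characters of the triplet representation on the four conjugacy classes.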

Symmetries are always legitimate principles except that a broken \(A_4\) isn't the only symmetry one could consider. At this moment, I don't know of any top-down or stringy reason why a softly broken \(A_4\) symmetry should be a part of the description of the generations (F-theory on a tetrahedron? some \(A_4\) orbifolds? centralizers inside some group?). But even in the absence of such an explanation, I find the tetrahedral symmetry to be a plausible part of Nature, vaguely justified by some bottom-up, phenomenological observations, and the accurate prediction of 4+2 quantities beyond the 14 input parameters seems like a rather strong piece of evidence that there is something to this idea.

*A Feynman diagram from 1968.*

**Two other papers**

An off-topic bonus comment. There are two other new papers I want to quickly mention. Leonardo Modesto argues he can define finite quantum gravity in any spacetime dimension. He uses some non-polynomial functions of the Riemann tensor in the action to conclude that the theory is free of both non-renormalizability and ghost problems. It seems self-evident to me that both of these lethal problems are actually still there. Non-quadratic functions of the fields inevitably create negative-residue poles, the ghosts (leading to negative probabilities), and the non-polynomial "form factors" are completely undetermined, leaving infinitely many unknown coefficients, which is the real problem with non-renormalizable theories (a complete lack of predictive power).

J.L. Chkareuli of Georgia wants to describe gauge fields as Goldstone bosons arising from spontaneously broken SUSY. He talks about some Lorentz symmetry breaking that doesn't break the actual physical Lorentz symmetry, and he doesn't really write down any supersymmetric Lagrangians of a recognizable type.

I am sure that for all experts, it is very painful or impossible to read similar papers because these papers not only look self-evidently wrong but their authors also seem to be unaware of the elementary reasons why the papers are wrong. They just don't seem to interact with credible physicists, so it seems they haven't even been told why their papers are wrong. Or they have been told but they have misunderstood it. Or they understood the reason but pretend that it doesn't exist.

At any rate, Mr or Ms Chkareuli and Modesto, you can't really make credible physicists read your papers unless you address the obvious reasons why your papers are wrong at a very visible place – ideally in the abstract. When one reads the abstract and the first page and determines that you don't seem to be aware of the basic knowledge, arguments, and principles, he just throws your paper into the trash bin before he gets to the second page, because acting otherwise would mean wasting time. I am not 100% certain, and I can't be 100% certain, that every paper that seems wrong after this quick reading is indeed wrong and empty of any valuable stuff, but if there are too many papers like that and the estimated probability that they're wrong is too high, it's simply sensible to discard them as soon as possible. Their authors don't seem to care.

## snail feedback (15) :

Lubos, I think you might like this preprint

http://arxiv.org/abs/1210.0194, entitled

"If no information gain implies no disturbance, then any discrete physical theory is classical"

Since the converse is known to be true, it seems to me that this shows that LQG/cellular automata/"the universe is running on a computer"/etc. can't work as physical theories. Of course, we already have plenty of other reasons to believe those ideas are totally hopeless, but this seems like a particularly strong argument against them.

*the converse of the hypothesis is known to be true.

Lubos, did you see Cumrun's paper on M-strings?

http://arxiv.org/abs/1305.6322

Wow! That's impressive, but what about proton decay in this type of GUT?

Dear OON, a good question. Proton decay is typically slowed down by flavor symmetries - I don't know whether A4 and/or this exact model does so well but here is an example with the group Q6:

http://arxiv.org/pdf/hep-ph/0511268.pdf

Surely looks ugly! Even 1/2 of the page 16 is pretty ugly.

Fitting 18 parameters with 17 input and predicting two others doesn't seem that great. On the other hand, the idea that A4 was thought too constraining gives a bit of hope this could work.

Looking ahead, does measuring the still unknown neutrino properties give hope of locking down a GUT in general? Apart from the masses, is there anything else we don't yet know?

For some time now I have shown in my work that the angle theta 13 of neutrinos depends on extra dimensions.

Its exact value is 9.43353127 º

My papers are available at these links:

1) "On the Possibility of a New Principle of Equivalence and Its Relation With a String Theory Based on the Foundations of Quantum Mechanics" by Angel Garcés Doz

http://www.fqxi.org/community/forum/topic/essay-download/1373/__details/Doz_FQXi_Essay_Angel_Garces.pdf

2) "The God Particle: the Higgs Boson, Extra Dimensions and the Particle in a Box" by A. Garcés Doz

http://vixra.org/pdf/1204.0038v1.pdf

3) "Simple Formulas that Generates the Quarks Masses" by A. Garcés Doz

http://vixra.org/pdf/1301.0015v1.pdf

4) "Quantum Information and Cosmology: the Connections" by A. Garcés Doz

http://vixra.org/pdf/1305.0029v1.pdf

Thanks

minor point, but you don't mention that the 8 3-cycles are 2 conjugacy classes of 4 elements each.

Thanks, these groups of 4 are equivalent to each other but only through an outer automorphism of the group...

Here's the paper as I understand it, and it does seem to show that almost any theory with discretized space (besides classical mechanics) contradicts quantum mechanics. The authors examine a class of probabilistic models over finite-dimensional vector spaces. Reference [1] of the paper claims that the discretization of Hilbert space follows from discretized space, in the sense that if a state is expanded in a basis as \sum_n a_n |n>, the a_n are drawn from a finite set. This implies that there are finitely many pure states. Next, the authors note that, in this class of theories, the state spaces of classical physics are simplices, and they take as a postulate a statement about information gain that is true in quantum mechanics. The authors then show that any model in this class either has continuously many pure states (i.e. takes place in a continuous state space, which implies physical space is continuous), has a simplex as its state space (is classical physics), or violates the information postulate. Therefore any non-classical model of this type contradicts quantum mechanics.

Now, that being said, the argument for the finiteness of the set of coefficients given in reference [1], on which the whole argument rests, seems very vague and hand-wavy. I am not sure why one couldn't simply assert that the a_n are continuous, even in discretized physical spaces, since the reference itself admits that no experiment can answer the question.

[1] http://arxiv.org/abs/hep-th/0508039

I don't understand why you say that a discretized 3D space implies a discretized Hilbert space. It's clearly wrong, isn't it? A lattice QCD has a discrete space but it still has continuous complex probability amplitudes for everything.

In a similar fashion to your last paragraph or so these two papers were actually back to back in my daily feed in the astrophysics section:

http://arxiv.org/abs/1305.6847

http://arxiv.org/abs/1305.6859

The first patently ignores things like the Bullet cluster and BAO while the opening sentence of the abstract of the second one reads:

"The evidence for the dark matter of the hot big bang cosmology is about as good as it gets in natural science."

I found this juxtaposition humorous.

Wow, that's obvious. Now I feel silly; I'm not sure why I didn't think of that myself while reading the paper. I'm a lower division undergrad who's had 1 QM class, so my ignorance can be understood (although probably not forgiven ;-)). But how do referees that are experts in this field miss such an obvious and fatal flaw in a paper during review, before it was published? Is it common for such things to get published? Anyways, I guess the lessons of this blog have yet to sink in well enough.

(Apologies if this is duplicated - I'm unsure how to search comments.)

John Baez has been running a great math series on symmetries, very good at explaining what A4 is:

http://johncarlosbaez.wordpress.com/2013/06/02/symmetry-and-the-fourth-dimension-part-9/

PS - you might, however, totally ignore his politics! Happily he keeps them distinctly apart.
