Sunday, November 02, 2014

How a paper about dark matter interactions got misrepresented

A potential deviation from ΛCDM described via "eating"

At least sixteen news outlets ran stories about "dark energy that is devouring dark matter" in the last two days. I think that the journalists started with a University of Portsmouth press release describing the recent publication of a British-Italian paper in Physical Review Letters.

The article by Salvatelli and 4 co-authors has been available since June:

Indications of a late-time interaction in the dark sector (arXiv)
It has 6 pages, 6 figures. As of today, it has only collected 2 citations but it's an OK paper, I think.




What is the paper about? They try to reconcile the cosmic microwave background (CMB) data from WMAP, Planck, and others, and to design a model that correctly incorporates cold dark matter (CDM) as well as the growth rate of large-scale structure (LSS) – a rate that currently looks lower than previously expected.




And when they construct their model, which combines these things slightly differently than the existing papers in the literature, they find some deviation slightly greater than 2 standard deviations somewhere. So the next question is how this deviation can be cured if it is real. And they effectively propose two possibilities for changing the successful ΛCDM model (cosmological constant plus cold dark matter): either to allow the neutrinos to be massive – that's represented by adding \(m_\nu\) in front of ΛCDM – or to add "vacuum interactions", i.e. to switch to iVCDM (interacting vacuum cold dark matter).

I would say that the neutrino masses should be favored in that case but all the celebrations are directed at the second possibility which is described by the phenomenological equations\[

\eq{
\dot \rho_c + 3 H \rho_c &= -Q,\\
\dot V &= +Q.
}

\] Here, \(V\) is the dark energy (density) – it equals \(\Lambda/(8\pi G)\) in the ordinary cosmological-constant case – and \(\rho_c\) is the cold dark matter (density). The terms \(\mp Q\) have to be equal and opposite due to energy conservation in the dark sector, and their nonzero value is what adds the "iV" to the iVCDM model. This phenomenological adjustment was first proposed in the mid-1970s, so it's very old. A nonzero \(Q\) is supposed to arise from some interactions in the dark sector, but I think that no convincing particle-physics-based model of this kind exists as of now.
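
To see the cancellation explicitly – a minimal check, assuming dust-like CDM with \(p_c=0\) and a vacuum-like component with \(p_V=-V\) – just add the two equations:\[

\frac{d}{dt}\,(\rho_c+V) + 3H\left[(\rho_c+V)+p_{\rm tot}\right] = 0,
\qquad p_{\rm tot}=p_c+p_V=0+(-V),

\] i.e. the combined dark fluid obeys the ordinary continuity equation and \(Q\) only shuffles energy between its two components.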

You know, the whole concept that \(V\), the dark energy, is non-constant is highly problematic. A cosmological constant has to be constant in time (that's why it's called a constant) and it's a far more convincing explanation to keep it constant and add something else (like the neutrino masses). If you want to make \(V\) variable, you have to add at least "some degrees of freedom", and if you add too many, so that the new stuff resembles ordinary particle species, the pressure will be much closer to zero than to the \(p=-\rho\) value that you approximately need. However, it's a possibility that \(V\) is (and especially was) changing, and the authors perform various Bayesian and other exercises to quantify how strongly they think this option is favored over ΛCDM.
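
To see the worry quantitatively – a minimal illustration with a single scalar field, which is not a model the purely phenomenological paper commits to: if the variable \(V\) were carried by a homogeneous scalar field \(\phi\) with potential \(U(\phi)\), its equation of state would be\[

w_\phi = \frac{p_\phi}{\rho_\phi} = \frac{\frac12\dot\phi^2 - U(\phi)}{\frac12\dot\phi^2 + U(\phi)},

\] which stays near \(-1\) only as long as the kinetic energy is negligible. The more the added degrees of freedom behave like ordinary particles, i.e. carry appreciable kinetic energy, the more \(w\) is dragged toward \(0\) (non-relativistic matter) or \(+1/3\) (radiation), away from the vacuum-like behavior the data require.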

OK, the content of the paper is rather clear. It makes some sense but it's not conclusive and there's no revolution yet (or around the corner). But let's have a look at what the media have made out of this technical paper on cosmology.

The Daily Mail's title is
Is dark energy turning the universe into a 'big, empty, boring' place? Mysterious force may be swallowing up galaxies
As you can see, the title has pretty much nothing to do with the paper. First of all, the positive cosmological constant (dark energy) has been known to turn the Universe into a big, empty, boring place for almost a century, and this scenario has been expected to occur in our actual Universe since the experimental discovery of the cosmological constant in the late 1990s. So the new paper they are trying to describe – but utterly fail to – has surely not discovered that this is what the future of the Universe is going to look like.

Are galaxies being swallowed by a mysterious force? First, the sign of \(Q\) may be positive or negative, which would convert some dark matter into dark energy or vice versa. The sign is really just a technicality and there shouldn't be too much ado about nothing. If you care about the sign, they prefer \(Q=-q_V HV\) with \(q_V\approx -0.15\), so the minus signs cancel and \(Q\) is positive. The displayed equations above then indeed say that the dark energy goes up and the dark matter goes down, so the press is right and Richard Mitnick is wrong.
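
Explicitly, with the quoted central value,\[

Q = -q_V\,HV \approx -(-0.15)\,HV = +0.15\,HV > 0,

\] so the \(-Q\) term slowly drains \(\rho_c\) while \(\dot V=+Q\) makes the vacuum energy density grow.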

And is it right to suggest that the galaxies are being swallowed now? Swallowed is surely a very strong word here – if something is reducing the amount of dark matter, it is very slow today and it occurs rather uniformly everywhere. Equally importantly, most of the paper is about past cosmological epochs. They divide the life of the Universe into four redshift bins with interaction strengths \(q_1,q_2,q_3,q_4\), and it's the fourth bin that contains the present.
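
For readers who want to see the bookkeeping, here is a minimal numerical sketch of such a binned interaction – an illustration, not the authors' code. Working in e-folds \(N=\ln a\), the Hubble rate drops out of both equations because \(Q\propto H\), so no Friedmann equation is needed. The redshift bin edges and the single value \(q_V=-0.15\) reused in every bin are just round illustrative numbers, not the fitted \(q_1,\dots,q_4\):

import numpy as np
from scipy.integrate import solve_ivp

# Piecewise-constant interaction strength q(z); the bin edges are hypothetical.
BINS = [(0.0, 0.3, -0.15), (0.3, 0.9, -0.15),
        (0.9, 2.5, -0.15), (2.5, 9.0, -0.15)]   # (z_min, z_max, q_i)

def q_of_z(z):
    for z_min, z_max, q in BINS:
        if z_min <= z < z_max:
            return q
    return 0.0   # no interaction outside the bins

def rhs(N, y):
    # d/dN of (rho_c, V) in units of today's critical density:
    #   d(rho_c)/dN = -3*rho_c + q*V   (dilution plus the -Q exchange),
    #   dV/dN       = -q*V             (the vacuum gains what the CDM loses).
    rho_c, V = y
    z = np.exp(-N) - 1.0             # N = ln a = -ln(1+z)
    q = q_of_z(z)
    return [-3.0 * rho_c + q * V, -q * V]

# Start at z = 9 with values that a non-interacting (Q = 0) run would
# dilute to roughly Omega_c ~ 0.27 and Omega_V ~ 0.68 today.
z_start = 9.0
N_start = -np.log(1.0 + z_start)
sol = solve_ivp(rhs, (N_start, 0.0),
                [0.27 * (1.0 + z_start) ** 3, 0.68],
                rtol=1e-8, atol=1e-10)
rho_c_now, V_now = sol.y[:, -1]
print(f"rho_c(z=0) = {rho_c_now:.3f}, V(z=0) = {V_now:.3f} (vs 0.27, 0.68 for Q=0)")

The output is just the pair of present-day densities in units of today's critical density; the point is that the exchange is a smooth, spatially uniform reshuffling of homogeneous energy densities over billions of years, with nothing in it that singles out galaxies to be "swallowed".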

They see some small but potentially tantalizing deviations in all the bins. But because this is based on cosmological observations that basically study the distant past only, it is very problematic to suggest that the finding, even if it were real, primarily tells us something about what is happening today.

The other popular articles mainly differ in the choice of verbs. Sometimes the dark matter is being "swallowed up", sometimes it is being "gobbled up". But whatever the wording is, I think it is fair to say that most cosmologists wouldn't be able to reconstruct the point of the paper – even approximately – from these popular stories in the media.

And make no mistake about it. It is very important whether there exists a functioning, particle-physics-based model (ideally embedded in string theory) that describes the microscopic origin of the required extra term – otherwise the extra term looks like a nearly indefensible fudge factor. For these reasons, I think that the neutrino masses would be preferred as an explanation if the deviation were real (which I find very uncertain). There may be other explanations, too. We just need some correction at a generic place in the equations for the energy densities. Perhaps some cosmic strings or cosmic domain walls could help, too.

Whatever the right calculation of these processes and the explanation of any possible deviations may be, don't imagine that a big goblin is walking through the Universe and gobbling up the galaxies (or the dark matter in them). Even if this term existed, it would be just another boring term that laymen (and perhaps even most experts) would be annoyed by, and most likely the microscopic explanation would be in terms of some basic particle reactions similar to the hundreds of particle reactions that are already known.

The journalists' desire to present even the most mundane and inconclusive suggestions as mysterious discoveries that change absolutely everything is unfortunate. It may be needed to get the readers' attention – but this is largely the fault of previous overhyped stories. The dynamics is similar to that of a p@rn consumer who demands ever more hardc@re p@rn. Where does this trend lead?



By the way, one more piece of news about theoretical work on the identity of dark matter. Three days ago, Science promoted SIMP (strongly interacting massive particle) models as a replacement for WIMPs. SIMPs could account for the observed multi-keV line and offer their own version of a miracle, a SIMP miracle matching the WIMP miracle. See e.g. this paper by Feng and others or this paper in PRL for extra ideas.



On Tuesday, SciAm and others promoted a paper arguing that Hooper-like gamma rays from the center of the Milky Way may be created by dark matter explosions instead.



Guth, Linde, Dijkgraaf, and some other well-known characters were hired as instructors at the World Science U(niversity). Register now.



A new team has created an app to detect cosmic rays with people's (and your) smartphones – with the apparent goal of making the Auger experiment obsolete. ;-)


snail feedback (20) :


reader Leo Vuyk said...

IMHO, dark matter is connected to black holes, growing by eating the surrounding Higgs field vacuum and leading to a big crunch as the start of a new explosion of the BB black hole into galaxy-creating splinter black holes. See image.


reader kashyap vasavada said...

Very interesting blog. If I understand, ΛCDM has been found inadequate at the level of 2 STD. Since most people believe in neutrino masses anyway for other reasons, my feeling is that, if neutrino masses can explain this deviation, there is no need to bring in dark energy-dark matter conversion. Then the question is: has anybody shown that neutrino masses can explain this deviation?


reader Luboš Motl said...

Hi Kashyap! Neutrino masses are surely nonzero but what has been measured - by neutrino oscillations - are just the differences of the (squared) masses of different neutrino eigenstates. The absolute masses - the additive shift in "mass squared" - are unknown, and I guess that only large enough values are useful for the removal of the discrepancy. The paper should say somewhere how large the required masses are.

There are some other experiments - e.g. POLARBEAR, which I discussed a week ago

http://motls.blogspot.com/2014/10/polarbear-announces-detection-of-b-modes.html?m=1



that promise to determine the neutrino masses from the analysis of the CMB.


reader Tony said...

I find the CIB and CXB terminology a bit confusing. These are clearly not parts of the CMB thermal spectrum at 3 K (right?), but residual IR and X-ray radiation that they see after they subtract all that they know of and can subtract.


reader kashyap vasavada said...

Thanks. I see your point.


reader Leo Vuyk said...

Tony: infrared = CIB and X-ray = CXB.

see: NASA Chandra, Spitzer Study Suggests Black Holes Abundant Among The Earliest Stars.

http://www.nasa.gov/topics/universe/features/abundant-black-holes.html


reader Tony said...

Heh, I should create a Web page like this:

http://libertesphilosophica.info/blog/disproof-of-bells-theorem-book/

and then ask for donations for my new bubbles theory, which is: what we are really seeing is low-temperature electron zitterbewegung. The electron just jumps between the bubbles, at the speed of light, before they can collapse.

Just kidding, of course.


reader Leo Vuyk said...

Tony: The early Universe seems to show primordial galaxies with clear dumbbell structures of dual black holes.

see also:https://www.flickr.com/photos/93308747@N05/page8/?details=1


reader Lino said...

Lubos:


I understand the entire history of the controversy, and the fact is that QM works splendidly just the way it is. It leaves you unsatisfied. The measurement problem makes you scratch your head. And, of course, Feynman's advice to graduate students comes to mind.


However, since I'm not a professional physicist I have the luxury to spend time, and effort, pursuing these things. I'm convinced---this is what intuition makes possible---that when physics abandoned the notion of the ether, it left behind something very important. The fact that after almost 100 years of GR there is still no unification of GR and QM (yes, I know that this is one of the chief benefits of string theory, though, as yet, the LHC has not confirmed superparticles) should tell us that something is wrong.


Current thought is that the world at the macro-level is continuous (classical mechanics) and that at the smallest of levels it is discrete (quantum mechanics). I would turn it around and say that the universe is continuous at the smallest of dimensions and then becomes discrete at larger ones. IOW, the ether will remain beyond our direct detection, but it contains the seeds of the unification of GR and QM.


These are my ideas. We're free to have our own ideas.
And ideas are cheap.


But, as I said, not being a professional physicist, not having a reputation to protect, I can enjoy the freedom to indulge my intuitions.


As I posted before, the Copenhagen interpretation has provided fantastic experimental results. If it isn't broken, then why fix it?


But, of course, I've enjoyed fixing things my whole life.


reader Lino said...

Lubos:


More technically, are you saying that the electric charge of the electron can be broken up into fragments within the liquid helium?


If the charge cannot be broken up, then it would seem that the electric-charge portion of the energy and the surface tension of the helium would both remain constant, and so we're left with understanding the source of the "pressure" within the fragmented 'bubbles.'


Could you clarify for me?


reader Luboš Motl said...

Hi, you are free to look for your satisfaction anywhere. I am just telling you that as I know the world around me, it seems virtually certain to me that the satisfaction you get from envisioning an "aether" or some "realist mechanism" underlying the wave function is not a satisfaction caused by a proper deep understanding of Nature.


reader Luboš Motl said...

Dear Lino, the electric charge in a region is an operator whose only allowed eigenvalues - the values that can be measured - are integer multiples of "e", the positron charge, or "e/3", the down antiquark charge - the latter if one allows regions that cut hadrons into pieces.

In 2 dimensions, it is also possible to have other fractions of the elementary charge in the fractional quantum Hall effect

http://en.wikipedia.org/wiki/Fractional_quantum_Hall_effect



String theory may allow some very heavy new particles which carry fractional electric charges, too, but it is virtually certain that they haven't appeared in any experiment done as of today.


Otherwise the charge is surely a discrete observable, and the question whether the electron is somewhere or not is similarly discrete. However, different possibilities for "what is out there" are generally combined into linear superpositions (wave functions) which encode the probabilities of one version or another.


That's also true for the size of a bubble and the pressure inside it, and *every* other observable in Nature. All those things have some spectrum - the set of allowed values - and the most general state in which the physical system may be found is a complex linear superposition of eigenstates.


reader Lino said...

Lubos, I certainly know that and respect that.


reader NikFromNYC said...

It's taken me about two YEARS to get you to reply to the TACTILE reality of molecular orbitals.

So I'm guilty of a technicality!

I admit how chemistry is alchemy. But! We are MADE of chemistry. We do have good intuition about it, thus. Depending on how the brain works. And it DOES work.


reader Pat said...

On the surface the paper looks fine, except that it has used MCMC over a parameter space with a uniform prior that has the q=0 (non-interacting) case at one end of the distribution, hence the MCMC walks in a biased manner near that edge. Their prior should have included q>0 values for the analysis to be taken seriously.
Secondly, they rule out ΛCDM with 99% confidence only after they constrain their model parameters using redshift-space distortion (RSD) data. With the other reliable data sets alone (Planck, supernovae), they do not obtain this result.


reader Steve R said...

Dr Motl:
I have been wondering whether physicists feel quite sure that a free antineutron shows the same instability as a free neutron. If it could exist, would a stable, free antineutron be a candidate for dark matter? And if so, could some of the primordial matter/antimatter asymmetry be explained as well?

Are there any examples of a particle's antiparticle having slightly different properties (like a half-life of 15 billion years rather than 15 minutes)?

Happy holidays from Florida
Steve R


reader Luboš Motl said...

Dear Steve, a particle and its antiparticle have the same lifetime - among many other "sign-free" properties - thanks to the CPT theorem.


reader Steve R said...

Interesting, thank you. What about quarks and antiquarks existing in the same nucleon? Could two up quarks exist next to an anti-down quark, for instance? Or would that lead to annihilation?


reader Luboš Motl said...

Dear Steve,


one quark and one antiquark may always annihilate - the process still carries this name even though something is left over if the two particles have different flavors, like up vs. anti-down.



Mesons are composed (mainly) of one quark and one antiquark, and because of the previous paragraph, they quickly decay to quark-free particles like photons or muons etc.


2 quarks plus 1 antiquark can't exist in isolation because there's no way to make this collection of 3 particles "color neutral", and all bound states of quarks and antiquarks have to be color-neutral.


reader Steve R said...

Thanks,
I am trying to catch up on modern physics lately after having been diverted by a civil engineering career for the past 25 years. I read a lot and watch online lectures but I don't have anyone who can answer questions.
Regarding dark matter, should we expect that it is right here in our solar system in the same ratio as at other locations in the galaxy? If so, how could Newton's law have been so precise in calculating planetary motions and mass? Is it just because the solar system is far denser than the galactic average?