Saturday, March 20, 2021

Hiroshima supernovae and the journalists' selection of science



Maxim Turbulenc's disco version of the "Dwarfs' Wedding", a fun Czech song, originally a country music piece by Rangers/Swimmers (and/or by Zdeněk Svěrák [lyrics] and Jaroslav Uhlíř [music]? That was a bit confusing!). Numerous stories about Snow White's seven dwarfs attending Dopey's (Shmoodla's) wedding in the forest, after Dopey returned from the military, are told with cute diminutives everywhere; so e.g. someone played a little Mendelssohn on the little organ, and they were cooking meat in a little Papin's pressure cooker. When I translate it from the ultimately playful Czech language into the much less poetic English, it sounds so dull!

Yesterday, several people asked me about a catchy preprint

Actinide crystallization and fission reactions in cooling white dwarf stars
... by C.J. Horowitz, M.E. Caplan ...
because it was accepted to PRL, a widely admired journal, and lots of media outlets have covered the story. The popular headlines obviously sound much cooler than the boring technical title above, e.g. "New Paper: Space Is Full of Naturally Occurring Atomic Bombs". And I would say that in this rare case, these titles are eye-catching but fair summaries of the proposal.



After some time spent looking at it and thinking about it, I do think that this is much more serious science than I originally thought – my first impression was partly shaped by the terrible pseudoscience that the fake news media are promoting all the time (including cold fusion and warp drives). But even if this proposal is promising and passes some consistency tests, I still feel that the coverage clearly shows the media's tendency to write sensationalist things obsessively and prematurely, a tendency that greatly wins over the scientific evaluation of the propositions, despite the persistent claims that journalists like science.



OK, first, I think that most readers of similar catchy popular articles don't have – but should get – a basic introduction to the players. First, what is a white dwarf? It is the leftover from the central parts of a star that is no longer capable of shining (because the hydrogen and similar fuel has been burned out), so it died and collapsed into a very special form of matter, the so-called "electron-degenerate matter". About 97% of the stars in the Milky Way will become white dwarfs later in their life (yes, it is the 97% consensus: neutron stars and black holes don't exist according to the consensus science LOL) – which is not necessarily the last stage of their life-and-death. This covers almost all stars lighter than about 10 solar masses. Heavier stars die into neutron stars (instead of white dwarfs) or black holes.

The Sun itself will first become a red giant approximately 7 billion years from now (stars between 0.3 and 8 solar masses like to do that). The Sun will be so widespread and diluted that it will engulf the Earth's orbit. If you're worried about global warming, you also need to pick the right sunshield to live inside the Sun, which is what awaits our planet, too. Meanwhile, the central part of the Sun will remain intact and I think that it will become a white dwarf, too.

Why do the cores of stars eventually shrink? Before they shrink, they are being kept unnaturally large by the heat and radiation moving in all directions while the stars are burning their thermonuclear fuel (mainly hydrogen). The pressure pushes the atoms further away from each other. You may also say that the plasma (ionized gas) occupies a large volume because the temperature is high, and the temperature is only high as long as the reaction is burning. Tens of millions of degrees Celsius can be found inside Sun-like stars (on Earth, we need hundreds of millions), while the surface temperature is just some 6,000 °C.
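Just to estimate that number (my own back-of-the-envelope estimate, using nothing beyond the virial theorem and solar parameters):

\[
k_B T_{\rm core} \sim \frac{G M_\odot m_p}{R_\odot} \approx \frac{(6.7\times 10^{-11})(2\times 10^{30})(1.7\times 10^{-27})}{7\times 10^{8}}\,{\rm J} \approx 3\times 10^{-16}\,{\rm J} \quad\Rightarrow\quad T_{\rm core}\sim 2\times 10^{7}\,{\rm K},
\]

i.e. tens of millions of degrees, as promised.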

Once the star runs out of the fuel, the pressure and temperature decrease and the core starts to become denser (the outer layers don't have to because there is a lot of instability and inhomogeneity there, which is enough to shoot parts of the mass to huge distances). The cores of stars below 10 solar masses become electron-degenerate matter which is supported against further shrinking by the electron degeneracy pressure.

OK, what is happening is that this material no longer consists of nice atoms (balanced small families with a nucleus and electrons) but of a "morally but not geometrically segregated" electron gas and a proton+neutron gas (a nuclear gas). The Pauli exclusion principle makes it impossible to squeeze the electrons into a tiny volume: there is at most one electron per spin state and per cell of volume \((2\pi\hbar)^3\) in the 6D phase space.
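In formulas (the standard phase-space counting, with a factor of two for the spin): filling all momentum states up to the Fermi momentum \(p_F\) gives the electron number density

\[
n_e = \frac{2}{(2\pi\hbar)^3}\cdot\frac{4\pi}{3}\,p_F^3 = \frac{p_F^3}{3\pi^2\hbar^3},
\]

so a higher density directly forces the electrons to occupy higher momenta.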

Many electrons fill the momentum states up to the Fermi energy which dictates the size of the electron cloud in the momentum space. The volume in the position space is (by the uncertainty principle and the counting above) roughly \((2\pi\hbar)^3\) times the number of electrons divided by the occupied volume in the momentum space. You can't make \(V_x\) too small; that would increase \(V_p\) and therefore the total energy that scales with the Fermi energy. But gravity does want to make it small. So there is a contest. The Pauli exclusion principle plus the uncertainty principle try to make the volume \(V_x\) large, to reduce the kinetic energy coming from the Fermi energy etc.; but gravity tries to shrink \(V_x\) because that makes the gravitational potential energy more negative (the system is more bound). As in most cases, there is a balance (at the local minimum of the total energy) which the white dwarf ultimately chooses. A Rutgers PhD course taught me to calculate these things very accurately although I have already forgotten whether it was our condensed matter course or our astrophysics course (both were extremely useful for me although I found the subjects boring to start with). But I still think that both Jerry and Piers could do this calculation if you woke them up at 3 am. ;-)
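Just to flesh out the contest (my own order-of-magnitude sketch, not something taken from the paper; all constants of order one are dropped): for \(N\approx M/m_p\) electrons in a star of mass \(M\) and radius \(R\),

\[
E(R) \sim \frac{\hbar^2 N^{5/3}}{m_e R^2} - \frac{G M^2}{R}, \qquad \frac{dE}{dR}=0 \;\Rightarrow\; R \sim \frac{\hbar^2}{G\, m_e\, m_p^{5/3}\, M^{1/3}},
\]

where the first term is the total Fermi kinetic energy and the second is gravity; heavier white dwarfs are therefore smaller, \(R\propto M^{-1/3}\).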

Note that the electron is the lightest among the particles inside a white dwarf, which means that the same \(\Delta x\), translating to \(\Delta p \sim \hbar / \Delta x\), gets translated to the highest Fermi energy, of the order \((\Delta p)^2 / 2m_e\) (a low mass gets mapped to a high energy because the mass sits in the denominator of the kinetic energy). That is why the geometric size of the white dwarf is dictated by the electrons; the protons and neutrons are there just to accompany the "small" electrons while the electrons actually decide about the size etc. It may be counterintuitive for a person who "thinks in terms of classical physics" but quantum mechanics often guarantees that the lightest objects are the most important ones (electrons are also the key particles in all of chemistry including biochemistry). Here, we find out that the electrons decide that the density of the electron-degenerate matter (i.e. of the white dwarf) is between 10 thousand and 10 million times that of water.
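A tiny numerical sketch of mine in Python (the same caveats apply: all order-one factors are dropped, so only the orders of magnitude should be trusted) confirms that the estimate above lands in that window for a roughly solar-mass white dwarf:

    # Order-of-magnitude white dwarf radius and density from electron degeneracy
    # (order-one constants dropped; an illustration, not the paper's calculation)
    hbar = 1.055e-34   # J*s
    G    = 6.674e-11   # m^3 kg^-1 s^-2
    m_e  = 9.109e-31   # kg, electron mass
    m_p  = 1.673e-27   # kg, proton mass
    M    = 2.0e30      # kg, roughly one solar mass

    R   = hbar**2 / (G * m_e * m_p**(5.0/3.0) * M**(1.0/3.0))  # radius from minimizing E(R)
    rho = M / (4.0/3.0 * 3.1416 * R**3)                        # mean density

    print(f"R   ~ {R/1e3:.0f} km")           # ~ a few thousand km, i.e. Earth-sized
    print(f"rho ~ {rho/1e3:.1e} x water")    # ~ 10^6 times the density of water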

Chandrasekhar (a Nobel Prize winner who was more important than most co-laureates) played with this stuff and showed that below his limit, 1.44 solar masses, the pressure prevents a further collapse. Heavier white dwarfs are likely to collapse further, into neutron stars and/or black holes. Note that the "next small step" for a white dwarf is to become a neutron star which is degenerate matter as well – but one that has (almost) no electrons. How can you avoid electrons? Well, they have merged with the protons and been transmuted to neutrons. A neutron is heavier than one proton plus one electron combined ("an excessively heavy neutron") so you pay for this merger by an increase of the energy. But for heavy enough objects, the gravitational potential energy becomes so negative and so significant that it beats the fine from the excessively heavy neutrons; and that is why Nature chooses to reorganize the electrons and protons into neutrons in a neutron star.
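For completeness, the parametric origin of the limit (again, just my order-of-magnitude sketch that ignores the order-one factors and the mean molecular weight): in the relativistic regime the kinetic energy scales as \(\hbar c\, N^{4/3}/R\), which has the same \(1/R\) dependence as gravity, so the balance singles out a critical number of baryons rather than a radius,

\[
\frac{\hbar c\, N^{4/3}}{R} \sim \frac{G N^2 m_p^2}{R} \;\Rightarrow\; N_{\rm max} \sim \left(\frac{\hbar c}{G m_p^2}\right)^{3/2} \approx 2\times 10^{57}, \qquad M_{\rm max} \sim N_{\rm max}\, m_p \sim 1\text{–}2\, M_\odot,
\]

which is the parametric form of Chandrasekhar's \(1.44\, M_\odot\).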

Neutrons still obey the Pauli exclusion principle but because they are heavier than electrons, the kinetic energy \(p^2 / 2m_n\) ends up being smaller than for electrons at the same momentum, and that is why even higher densities may be reached. However, the rough calculation of the conditions in the neutron-degenerate matter is (in the Newtonian approximation) equivalent to that of the white dwarf (just the particle mass is different, with a corresponding scaling of all other quantities by powers of 2,000, roughly the neutron-electron mass ratio). That is why neutron stars work just like white dwarfs – but they are even denser because the decisive particles are neutrons instead of electrons (which are almost completely absent in a neutron star). The typical density of a neutron star is \(10^{17}\) times that of water (pronounce "one hundred quadrillion"). (Most of you are surely playing with white-dwarf and neutron-star tennis balls all the time; but just in case you aren't, I should already tell you that the electrons in an ordinary metal are a tangible example of degenerate matter, too.) If you add enough mass to a neutron star, it has to collapse into a black hole which is "no material at all" and is the most extreme and final form of existence of heavy masses that the laws of Nature allow (which is why only black holes are really exciting for fundamental physicists; the lighter, material-like objects are a domain of condensed matter physicists and astrophysicists).
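The scaling with the particle mass may be made explicit (my sketch, with the same caveats as above): replacing \(m_e\to m_n\) in the nonrelativistic estimate shrinks the radius by roughly the mass ratio and boosts the density by its cube,

\[
R_{\rm NS} \sim \frac{m_e}{m_n}\, R_{\rm WD}, \qquad \rho_{\rm NS} \sim \left(\frac{m_n}{m_e}\right)^3 \rho_{\rm WD} \sim (2{,}000)^3 \times 10^{6}\,\rho_{\rm water} \sim 10^{16}\text{–}10^{17}\,\rho_{\rm water},
\]

in agreement with the figure quoted above.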

That is enough for a basic introduction to prerequisites.

Horowitz and Caplan proposed that some Type Ia supernovae, explosive events that people observe through telescopes, may be "natural Hiroshimas". As a white dwarf cools, crystals ("snowflakes") rich in uranium and other actinides may freeze out before everything else. Some neutrons may be around and they become more effective at igniting reactions when they're slow (the cross section increases when the velocity decreases). And assuming that all the important details of their story are roughly correct, the cooling leads to reaching the critical conditions in which the fission chain reaction starts to grow exponentially just like in the Little Boy (Hiroshima) bomb. And that could be what we observe as some Type Ia supernova explosions (those with sub-Chandrasekhar ejecta masses and short delay times). The underlying mechanism is the same as that in Hiroshima but many of the numbers are completely different. The density of the nuclei in the white dwarf matter is much higher than in regular terrestrial uranium. On the other hand, the relative abundance of uranium may be very low.
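To see why "many of the numbers are completely different", here is a textbook-style illustration of mine (emphatically not the paper's actual calculation, which has to worry about the composition, the temperature and the exotic state of the matter): for a bare sphere, the critical radius scales like the neutron mean free path, \(R_c\propto 1/\rho\), so the critical mass scales like \(M_c = \rho R_c^3 \propto 1/\rho^2\). Plugging in a rough terrestrial bare-sphere critical mass of 50 kg of U-235 at its normal density of 19 g/cm³ and an assumed white-dwarf-like density of \(10^6\) g/cm³:

    # Illustrative scaling of the bare-sphere critical mass with density, M_c ~ 1/rho^2
    # (textbook scaling only; it ignores composition, temperature, moderation and
    #  everything that the actual white-dwarf calculation has to worry about)
    M_c_earth = 50.0    # kg, rough bare-sphere critical mass of U-235 on Earth
    rho_earth = 19.0    # g/cm^3, normal density of uranium metal
    rho_wd    = 1.0e6   # g/cm^3, an assumed density of white dwarf matter

    M_c_wd = M_c_earth * (rho_earth / rho_wd)**2
    print(f"critical mass at white-dwarf density ~ {M_c_wd*1e9:.0f} micrograms")

So at white-dwarf densities, even microscopic amounts of actinides could in principle go critical; only the scaling matters here, the actual numbers depend on details that this toy estimate ignores.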

I am surely sympathetic to the idea that there is a tipping point – followed by a transition to a dramatic process – that actually starts after the temperature drops below a certain value (an apocalyptic process is supposed to start when things cool down to a magic level; in this sense, it is the opposite of the global warming apocalyptic orthodoxy). The Universe may contain numerous such tipping points – moments when a dramatic process suddenly starts because a quantity reaches a seemingly innocent value.

There are also many details that could prevent these natural supernova Hiroshimas from exploding such as
  1. an insufficient local concentration of the uranium (actinides), i.e. too much uniformity
  2. the nuclei's having been split into smaller ones already during some earlier stage of the stellar evolution
  3. the possible ability of the surrounding white dwarf matter to act as a neutron moderator or absorber; and the possibility that this hugely dense electron-degenerate matter somehow absorbs all the vibrations from the seemingly less extreme uranium fission processes
OK, the first two points are about the question whether you may have enough fuel to ignite the Hiroshima explosion; my last question is whether the subsequent evolution would really be that of Hiroshima. It could also be just a Fukushima (guess why these two words rhyme!), a nicely behaving nuclear power plant (one that may slightly leak after a starquake and a tsunami but it probably won't be observable from Earth LOL). Some local chain reaction could modify the conditions in a region of the white dwarf in such a way that the reaction slows down again (also, the locally elevated pressure could lead to the birth of neutron-degenerate matter). It is possible that they have really considered all the processes, the approximate magnitudes of everything, and the correct signs of all the responses to the elevated chain reaction, but I still think that theirs is a story which requires many things to be correct simultaneously and one of them may easily be wrong or be stopped by an overlooked process.

So after some hours, I do think that it is a fun paper and since I couldn't find a clear and demonstrable problem with it (and I really couldn't), I would have recommended it for publication in PRL, too (because I wouldn't want to be at risk of censoring a potentially big discovery). And it may be a genuine promotion of scientific curiosity when the science media discuss these possible "cosmic Hiroshimas" (note that it's believed that some places on Earth have also been natural Fukushimas, such as the natural fission reactor at Oklo). This is a potentially exciting flow of ideas between the "natural" and the "artificial". With our bombs, we may have plagiarized some processes that Nature has done many times before us (and our thermonuclear power plants are trying to emulate the regular burning in the stars: here we realize that we are plagiarizing Mother Nature but so far we are not quite successful).

But I still think that the media convey a very distorted picture of the status of the paper. You may check Google Scholar to see that after the more than two weeks it has spent on the arXiv, the paper still has no citations. Two weeks is not a long period but zero is not a terribly high number, either. ;-) So this is an interesting proposal, like hundreds or thousands of similar proposals in this field or related fields that are published every year. I think that most readers of this stuff automatically conclude that "this natural Hiroshima is already a fact" and "this paper is what most astrophysicists are excited about" – and I believe that these statements are completely untrue. Readers should be informed about the rough number of papers that appear and about how small the subset is that excites a comparably large number of scientists to a comparable extent.

In this case and many others, the big promotion of the paper in the media was kickstarted by the decision to publish the paper in PRL, which is a respectable journal. But I still think that such a publication is way too little for the media to present the paper in this way. It's enough for a single referee (or two) to be too lazy to look for problems and he or she or they may just write "go ahead, publish it". Such a verdict may be extremely far from a scientific verification of a new idea because when non-stupid authors present a seemingly complex theory, finding a problem that may be hiding there often takes a time comparable to the (long enough) time that the authors have spent, and the referees usually spend a vastly shorter time with someone else's paper. In mathematics, the proofs are supposed to be completely self-sufficient and a verification may be a complete validation, but in physics and the natural sciences, this is not really possible because some "possibly overlooked things" may implicitly hide almost anywhere and no paper can be "quite a rigorous proof" that nothing was overlooked.

With PRL, it happens often. I have seen dozens of preprints with zero or nearly zero citations that have been hyped as the "new great facts in science" because they were accepted to PRL (or another journal, in some cases). It is easy to see what is going on. The journalists don't actually impartially look for the scientific truth; and they are not really born or incentivized to be very honest, scientist-like journalists, either. They want sensations and the many readers that come with them – and their relationship to science is just an accidental context that they were often randomly pushed into. And when they are marketing themselves as "science journalists", they have certain restrictions that (at least sometimes) prevent them from hyping just anything. OK, so the paper they hype should be printed in a credible enough journal, a rule says. But the problem is that once this is treated as a sufficient condition, and if and when the sensationalism is still the main thing that they actually care about, the popular science media are still basically guaranteed to be dominated by things that aren't right – and that may be downright stupid in almost all cases. Why? Because the amount of garbage that is accidentally published in credible journals is still high enough. If the journalists are incentivized to publish things that may be almost accurately described as "garbage", a single condition – like the paper's having been published in a credible journal – is totally insufficient to stop the dominance of the garbage over the popular science media, because there's still plenty of "approved garbage" that they may choose to amplify. The journalists simply need higher standards and because they are unlikely to impose them on themselves, there should be someone else who tries to enforce them (because the brainwashed readers don't do such things, either).

It is paradoxical that I wrote these things about "the journalists who deliberately pick garbage" in the context of an astrophysics paper that is actually very interesting and may even be true and important. But maybe it's even safer for me and for everybody, because I believe that the authors of good science – and these two authors may very well be examples – are more likely to agree with my claims about the risk (which is apparently being realized) that the science media end up publishing garbage almost all the time. If they are really good scientists, and there are many hints that this is the case, they probably realize that their paper was adopted by the popular science media way too quickly and uncritically. The real problem is the authors of self-evident junk science – and they are almost universally guaranteed to team up with the most dishonest journalists they can find and to claim that, in this way, things are just fine.
