Wednesday, October 28, 2015

Basic science vs Matt Ridley the bumpkin

Modern technology couldn't exist without science

Just a week or so ago, Bill Zajc sent me an e-mail saying that he liked some texts by Matt Ridley, a British conservative journalist and a member of the House of Lords, who may be classified as a climate lukewarmer of a sort. You know, this "purple" political flavor is something I share with him to the extent that I don't subscribe to every point believed by some conservative mass movements such as the U.S. evangelicals.

Well, I wouldn't count myself as a lukewarmer – I sympathize with Dick Lindzen's view that it's much more accurate for us to proudly call ourselves deniers (because we don't have any significant doubt that the climate fearmongering is nonsensical) – but given Ridley's pragmatic, common-sense attitude to many issues, you could think that I find his ideas close to my heart, too. But sometimes, there comes a shock. Most recently, Ridley wrote a rant titled

The Myth of Basic Science
in the Wall Street Journal five days ago. He argued that basic science – or the funding of any science – isn't needed because technological innovation is the only thing that matters and it occurs mostly automatically. It's done by regular people, sometimes workers, who are driven by purely practical motives. And Edison, Newton, Darwin, and all the famous people of science and technology had lots of competitors who would have done exactly the same thing anyway, so Ridley's big conclusion is that discoverers don't actually find their discoveries. Instead, the discoveries automatically find their discoverers, and the progress is becoming even more automatic in this sense as the computer epoch advances.

That's a great story that is right – except for the cases when it is completely wrong, of course, and these "exceptions" actually represent the overwhelming (and increasingly overwhelming) majority of the progress of mankind.




First of all, you must notice that Ridley's value system is that of just another primitive populist rural bumpkin who doesn't see any value in science outside the practical implications of the scientific discoveries. Many speeches have been given to point out how fundamentally misguided this world view is. For example, I recommend the eloquent 1955 National Academy of Sciences speech by Richard Feynman, The Value of Science (full text), which was reprinted both in What Do You Care What Other People Think and in The Pleasure of Finding Things Out by RPF.




Feynman explains why it's silly for scientists to refocus from their basic science problems to practical issues. First, scientists may be as dumb as other random guys when it comes to topics in which they're not experts. Second, the scientific knowledge (and the feeling that scientists enjoy when they expand it) is an important commodity by itself, one that helps to determine the value of a society.

It may be unfashionable among Matt Ridley's fans to point out that a society composed of those people but no theoretical physicists would be far less valuable, if not worthless, but this unpopularity doesn't make the fact any less true. Matt Ridley isn't refined enough to appreciate science as a value of its own, so there's nothing more to discuss. He didn't mention the intrinsic value of scientific understanding at all and he wouldn't get it, anyway.

Instead, I want to talk about the same topic that he focused on, the implications of research for technological innovation. He mentions examples of inventions in technology and discoveries in science that were made by several people. Light bulbs were constructed by 23 people before Thomas Alva Edison, Alexander Graham Bell and Elisha Gray filed for the telephone patent on the same day, Boyle's law is the same thing as what the French call Mariotte's law, Isaac Newton and Gottfried Leibniz co-invented calculus, and Charles Darwin and Alfred Russel Wallace co-discovered the evolutionary theory of the origin of species, Ridley thinks.

It's surely possible to find examples like that. Indeed, lots of progress has been virtually unavoidable and the individual inventors and discoverers were largely redundant. But the problem is that there are also counterexamples, and these counterexamples are exactly the ones that are much more essential for the long-term evolution of mankind. In these counterexamples, big hurdles had to be overcome and decades, centuries, or millennia could easily have been wasted if some important people hadn't lived.

Pretty much by definition of importance, the discoveries that could have been delayed by a long time in the absence of the "right people" – the discoveries that Ridley chose to completely hide – are much more important than the examples of incremental advances that Ridley has highlighted. And let me tell you something: if the number of people who invented or discovered something at about the same time was about 2, that's such a low number that it could easily have fluctuated down to 0, and the advance would simply not have materialized. Relative to the world population – currently above 7 billion – 2 people is a very small number that you shouldn't try to reduce even further.
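
To put a toy number on that fluctuation argument: if independent co-discoverers of a given advance appear according to a Poisson distribution whose mean happens to be 2 – an assumption made purely for illustration – the chance that nobody makes the discovery in that window is \(e^{-2}\), roughly one in seven:

```python
from math import exp, factorial

mean_discoverers = 2.0  # assumed average number of independent co-discoverers

def poisson_probability(k, mean):
    """Probability of exactly k events for a Poisson distribution with the given mean."""
    return mean**k * exp(-mean) / factorial(k)

p_zero = poisson_probability(0, mean_discoverers)
print(f"P(no discoverer at all) = {p_zero:.3f}")  # about 0.135, i.e. roughly 1 in 7
```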

There are complicated questions about the effectiveness of the funding of applied science or research and development. Obviously, lots of research institutions do almost no useful work. Also, I believe that if something is studied for practical and economically beneficial applications that are expected or promised to emerge within a decade or less, such research shouldn't get the taxpayers' money at all because there exist profit-driven motives for people to fund such things and there is no reason to bend the invisible hand of the free markets. For example, I think it's absolutely outrageous for Tesla – a company that demonstrably brings nothing more than incremental technological progress – to receive subsidies or be allowed to evade taxes.

On the other hand, I do think that science that isn't studied for its practical implications at all – pure science – and applied science that needs several decades or more to (possibly or hopefully) bring practical "fruits" might need some funding by governments, because one may rather successfully argue that the free markets break down for those things.

Just to be sure, I don't think that this argument is a rock-solid argument showing that a government is needed. Centuries ago, even very basic research and advances occurred without any government-like sponsors. Rich enough individual sponsors (or family assets) were often enough to support pure scientists throughout their careers – and they could be enough in the modern era, too. The scientist, if honest and allowed to be driven by his curiosity, is ultimately doing the same thing so the source of the funding shouldn't matter much.

Ridley says that Portuguese mapmaking kept getting better through the constant application of trial and error by rather ordinary sailors with common knowledge and unremarkable skills; no systematic scientific education, no abstract scientific research in the lab, let alone abstract scientific theories were needed. In this case, fair enough. He mentions a few more examples of this kind and implicitly presents the whole progress of mankind throughout its history as this inevitable practical fight of ordinary people who don't need any special knowledge.

While all his examples might be OK, and there are many other examples in which ordinary enough practical people were sufficient and the evolution was almost automatic, his picture as a picture of the progress of the whole of mankind is breathtakingly idiotic.

There were 23 inventors of a light bulb before Edison, but that only says that the revolutionary content of Edison's contribution was largely sociologically driven hype (Edison's particular moves had far-reaching commercial implications, however). But there are lots of breakthroughs that were not made by 23 people, and not even by 2. Ridley tries to present Isaac Newton as an angry guy who didn't like that calculus was also independently invented by Leibniz. Well, I am open-minded about whether this is the right historical description of these mathematical realizations.

But what I am not open-minded about is that calculus is just a tiny percentage of the actual revolution that Isaac Newton brought us – and pretty much nothing else that Newton contributed was "shared" with other people. Newton discovered the first laws of physics, the differential equations of classical mechanics, the universal law of gravity, and lots of other things. No one else did those things during his lifetime. Ridley's article shows a picture of Isaac Newton and his roommate in a lab where a prism decomposes light into colors. If Ridley really meant to pick Newton's lab as something that wasn't important for the history of mankind, then he is a totally clueless savage. Most fellow members of the House of Lords will agree that, politely speaking, using the English aristocracy's most refined jargon, he is a wild f*cking animal. Newton may have been the most ingenious and consequential man in the whole of history.

Similar comments apply not only to Newton but to Galileo, Einstein, and other giants. Galileo was the only guy of his era who clearly defined the scientific method as the elimination of some of the competing hypotheses. And it is no coincidence that it was the same guy who created very useful telescopes, thermometers, and lots of other things. No one other than Einstein really completed the special theory of relativity, and if you appreciate that David Hilbert wasn't working quite independently of Einstein's intermediate papers, the same may be said about general relativity, too. No one else independently explained the photoelectric effect using photons. And I could continue with hundreds of other discoveries by dozens of "singular" giants of science.

Most of the truly essential discoveries in pure physics were actually found by individuals who had no clear replacement among their contemporaries. I find this point so obvious that it's a waste of time to even discuss it.

But even when you talk about the pillars of technological progress, a huge fraction of them depended on the results of pure science. Let's generously attribute the steam engine and similar mechanical inventions to the work of practitioners who had nothing to do with Newton's main revolution – the birth of physics. I don't think that's right (or that the post-Newton timing of the industrial inventions is a coincidence), but let's do so. Once we get to the 19th century, the dependence of cutting-edge technological progress on "special scientific knowledge" becomes so obvious that only a complete lunatic could deny it.

We are using radios, TVs, GPS signals, smartphones etc. and they send signals through electromagnetic waves, right? Without full-fledged physicists, mankind would have no idea about the generalized light that we call "electromagnetic waves". James Clerk Maxwell – not one of hundreds of random engineers but a top figure of 19th century physics (who also did important work in statistical physics and other subfields) – unified the laws of electricity and magnetism, added an extra correction term to complete his equations of electrodynamics, and predicted the existence of electromagnetic waves (while realizing that light was a special case of those waves). Heinrich Hertz decided to experimentally validate Maxwell's theory as pure science and created the first man-made electromagnetic waves.
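
In modern notation, the completed vacuum equations and their punch line read
\[
\nabla\cdot\vec E = 0,\quad \nabla\cdot\vec B = 0,\quad \nabla\times\vec E = -\frac{\partial\vec B}{\partial t},\quad \nabla\times\vec B = \mu_0\varepsilon_0\,\frac{\partial\vec E}{\partial t},
\]
and taking the curl of the last two equations yields the wave equation \(\nabla^2\vec E = \mu_0\varepsilon_0\,\partial^2\vec E/\partial t^2\), i.e. waves moving at \(c = 1/\sqrt{\mu_0\varepsilon_0}\approx 3\times 10^8\,{\rm m/s}\) – the known speed of light. Maxwell's added correction term is exactly what makes these wave solutions possible.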

Hertz was no engineer, either. He was a great experimental physicist who also observed the photoelectric effect – he studied the illumination of materials for purely scientific reasons (knowledge, not applications). Just tell me how we could possibly be sending and detecting electromagnetic waves without full-fledged pure scientists such as Maxwell and Hertz. Maybe a random Portuguese sailor would accidentally send some electromagnetic wave while playing with frog muscles, which involve electromagnetic induction and emit waves when they oscillate. But would there be another sailor who would detect the electromagnetic wave (with another frog? Or what would the sailors' first antenna look like?) so that the discovery of his friend wouldn't remain unknown to mankind?

Please, don't be silly. We wouldn't have electromagnetic waves – all the TV, radio, GPS, cell phone etc. communication – if people hadn't studied physics much like the 19th century physicists did. And while there are other examples in the 19th century as well, the role of physics became even more extreme in the 20th century.

In your computer, smartphone etc., there are lots of things called transistors; you must have heard about them, right? Transistors are essential for all the smart electronic devices around us to be able to "think". Modern transistors are really "printed" and no longer visually resemble the first big transistors, but the physical mechanism is still the same. Were transistors invented by Portuguese workers or sailors by trial and error while solving their everyday problems? Please, Mr Ridley, you surely know that this view is absurd, don't you?

Transistors were developed by John Bardeen, Walter Brattain, and William Shockley in 1947. These were not janitors, sailors, or engineers. They were full-fledged physicists and they had to be. To demonstrate that John Bardeen was a physicist, let me mention that he is still the only person who has won two Nobel prizes in physics: for transistors in 1956 and for the BCS theory of superconductivity in 1972. The second Nobel prize makes it clear that Bardeen was a stellar theoretical physicist, too. You know, the BCS theory envisions the (Leon, not Sheldon) Cooper pairs of electrons (held together by the attractive force resulting from the exchange of phonons) that get Bose-condensed.
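
If you want to see why Cooper pairing was never going to be guessed by tinkering, recall the weak-coupling BCS results for the gap and the critical temperature, which are non-analytic in the interaction strength \(V\):
\[
\Delta \approx 2\hbar\omega_D\, e^{-1/(N(0)V)}, \qquad k_B T_c \approx 1.13\,\hbar\omega_D\, e^{-1/(N(0)V)},
\]
where \(\omega_D\) is the Debye frequency and \(N(0)\) the density of states at the Fermi surface. No Taylor expansion in small \(V\) ever sees these terms – the effect had to be derived, not stumbled upon.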

The whole explanation of why transistors actually work is all about quantum mechanics. Try e.g. this chapter of Feynman's Lectures on Physics and the previous one (13). How could ordinary people driven by practical everyday problems have ever invented the transistor and/or combined lots of them so that they would get a working computer out of them? Don't be silly.
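
To get a feeling for why the quantum band structure is the heart of the matter: the number of charge carriers available in a semiconductor is governed by the band gap \(E_g\) – a quantity that only quantum mechanics can explain – through an exponential factor. A minimal numerical sketch (the effective densities of states below are rough textbook values for silicon, used purely for illustration):

```python
from math import exp, sqrt

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def intrinsic_carrier_density(band_gap_ev, nc, nv, temperature=300.0):
    """Intrinsic carrier density n_i = sqrt(Nc*Nv) * exp(-Eg / 2kT), in cm^-3."""
    return sqrt(nc * nv) * exp(-band_gap_ev / (2 * K_BOLTZMANN_EV * temperature))

# Rough room-temperature effective densities of states for silicon (cm^-3)
n_i = intrinsic_carrier_density(band_gap_ev=1.12, nc=2.8e19, nv=1.04e19)
print(f"n_i(silicon, 300 K) ~ {n_i:.2e} per cm^3")  # of order 10^10, near the textbook value
```

The exponential sensitivity to \(E_g\) is exactly what doping and junctions exploit – and exactly what no amount of blind trial and error would have isolated.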

We would have no wireless/radio communication and no thinking electronic devices (at least none smaller than a room, which need transistors; but vacuum tubes wouldn't exist without physicists, either, and similarly relays were pioneered by Joseph Henry, who was both an inventor and a Princeton science professor – the SI unit of inductance is named after him) without physicists' research. And that's just the 19th century physics of electromagnetic waves and the 1947 discovery of the transistor effect. The need for experts' physics knowledge is getting more extreme every day. The newer components of future machines that may replace the transistors or the existing RAM or flash memories depend on physical effects that require an even greater expertise in (especially condensed matter) physics. Consider effects such as giant magnetoresistance.

And there are dozens of big things like that. Could practically driven people without formal training in proper science and without the scientific rigor and integrity ever discover magnetoresistance? Without a proper scientific approach, the "inventors" couldn't even isolate the effect from lots of other effects. If you can't isolate it because you don't really know what's physically going on, you can't use the effect as a building block for greater devices.

If you are not crazy, you can see the critical role of physics research everywhere around you. Take displays – CRTs, LCDs, LEDs. The cathode rays and all the semiconductor inventions depended on physics research that was driven by the desire to understand the laws of Nature that govern matter.

Or take lasers. Einstein had to make some truly fundamental insights about the existence of stimulated emission. You know, the absorption rate is proportional to the number of photons around you, \(N\), which drops to \(N-1\) once one photon is absorbed. The time reversal of absorption is emission, in which the number of photons around you jumps from \(N-1\) to \(N\), so the probability of emission must be proportional to the final number of photons, here denoted as \(N\), too. But normally, we use \(N-1\to N\) for the initial number, so the final number after emission is \(N\to N+1\). With this normal notation, the probability of emission has to be proportional to \(N+1\), Einstein basically argued. The term \(N\) is the stimulated emission proportional to the number of photons that are already out there; the term \(1\) is the spontaneous emission that occurs even when there are \(N=0\) photons around.
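
Schematically, as rate equations (with the common proportionality constant suppressed), this is
\[
\Gamma_{\text{absorption}} \propto N, \qquad \Gamma_{\text{emission}} \propto N+1 = \underbrace{N}_{\text{stimulated}} + \underbrace{1}_{\text{spontaneous}},
\]
which is Einstein's 1917 argument in a nutshell: the stimulated term grows with the photons already present, which is what makes the laser's amplification possible in the first place.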

This is just the "most theoretical" part of the physics behind lasers – the reason why emission may be stimulated at all. You have to combine it with an understanding of atomic transitions; with the Bose-Einstein behavior of photons; and with lots of detailed physics of pumping and other things. People who were able to construct such things were doing physics and had to be doing physics, not just some engineering and random discoveries by trial and error.

By the way, this discussion reminds me of the creationists' debates about irreducible complexity. They like to say that evolution couldn't have developed things like the eye because it's composed of many parts that don't increase the survival odds separately and it's very unlikely for the numerous separately useless components to meet in one place. That would be a great argument against evolution if it were right. Except that it's not. You may look at the evolution of eyes over tens of millions of years and you will see how the predecessors of the human eye were always helpful for the animal and how the usefulness kept increasing. Even without a lens, some detection of the light intensity is useful for the animal, even one pixel. Many pixels make the detection more accurate. Lenses help to increase the precision of the picture, and so on.

But sometimes, you find "engines" in the biological system that look surprising. Why are they there at all? I forgot the best example but there are examples of organs that evolved for completely different reasons but happened to be useful in a new context later.

Not just in the examples above: lots of technological advances couldn't have taken place without curiosity-driven physics research. Ridley's main thesis is that technological progress is analogous to the evolution of life. I agree with that. But the problem with his story is that he's wrong about the evolution of life, too. Even in the evolution of life, there had to be moments at which the evolution depended on big jumps, big mutations, something that went beyond the business-as-usual and that turned out to be a game-changer. And such mutations or moments were more important than millions of years of business-as-usual everyday evolution!

And exactly the same thing holds for technological progress. You can obviously say that the accelerated progress of the recent 6,000 or 500 or 100 years is "natural" (it doesn't contradict the laws of Nature!), but "natural" evolution that is this fast unavoidably depends on science (something that no other known species on Earth has mastered), much like our progress depended on the brain or the human eye (and some other "natural" things). You can even say that our usage of science is also progress by "trial and error". After Galileo's accidental discovery of science, mankind has tested whether it's possible to live without science again and, by "trial and error", people have figured out that the answer is No! To think that an equally fast progress could occur without science means to misunderstand the modern world completely. Mammals have been around for tens of millions of years but they haven't invented much in all that time. Even human beings – with an intrinsic IQ comparable to ours – have been around for millions of years. But it seems that true civilizations only began 6,000 years ago, and technologies only exploded a few centuries ago etc.

Why so recently?

Science is a key reason. Science may be said to be a part of evolution and the everyday struggle to survive as well, except that science represents a new strategy, one that focuses on much grander goals, ideas, and long-term questions, and it turns out that these novelties have actually helped humans become the technological masters of the planet and hugely accelerate the technological progress. People (and raccoons) always wanted to live better, but they didn't have the right know-how – especially the recipe to be curious, to do science, and to do it carefully – and that's why they weren't making much progress for thousands (or millions) of years.

I've mentioned electromagnetic waves, lasers, and transistors as top examples of omnipresent technological components that wouldn't have existed without physics. I could give you a dozen similarly paramount examples and a hundred examples from the next, just a little bit less groundbreaking, category. And I could add dozens of additional ones in which the progress depended on both engineers and physicists and one wouldn't be certain which group was more important.

But if I wanted an example of a technological advance that exposes the laughable character of Ridley's (mis)understanding of technological progress most obviously, I would probably pick nuclear energy. Do you really believe that Portuguese sailors, workers in a textile company, or miners would construct a nuclear power plant without any interest in physics – Madame Curie's radioactivity, the nuclear physics that evolved from those findings, Albert Einstein's \(E=mc^2\), up to Fermi's impressive physics experiments in Chicago (and the project in Los Alamos)?

Just imagine these ordinary people making lots of experiments by trial and error to find out that if they place a lot of uranium into a conventional explosive, they can make a really big explosion. Well, if they were doing such experiments by trial and error, they would either end up beneath the critical mass and see nothing; or they would die. (Well, they would probably die long before that, due to the radiation exposure, but let me ignore those things.) Their countrymates would probably never learn what the last steps before the big explosion were. Would they try to investigate similar things if they knew that it had killed their friends but didn't know how big their explosion was going to be? You must know in advance that something like that may happen and what the impact is approximately going to be.
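
A back-of-the-envelope model shows how unforgiving the physics is to trial and error. If each fission triggers \(k\) further fissions on average and a generation lasts roughly 10 nanoseconds, the neutron population after \(g\) generations is \(k^g\): slightly below \(k=1\), nothing visible ever happens; slightly above it, you get a runaway within microseconds. A toy sketch, not real reactor physics:

```python
def neutron_population(k, generations, initial=1.0):
    """Neutron count after the given number of ~10 ns fission generations."""
    return initial * k**generations

for k in (0.95, 1.0, 1.05):
    print(f"k = {k}: after 1000 generations (~10 us), "
          f"population ~ {neutron_population(k, 1000):.3g}")
# k = 0.95 -> ~5e-23 (a fizzle); k = 1.05 -> ~1.5e+21 (a runaway explosion)
```

There is essentially no window between "nothing happens" and "you are dead" that a blind experimenter could learn from.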

It's ludicrous to imagine any similar progress without science. Mankind wouldn't learn how to build a working nuclear bomb without a scientifically rigorous attitude. But even if you imagine that it would, could it invent the peaceful use of nuclear energy, the nuclear power plants? It's a similar mechanism to the one that would have killed millions of people in the trial-and-error approach to nuclear energy. Now, some people would propose to use a "similar effect" to produce electricity. Would it be sensible for the public to allow trial-and-error research of something that is "similar" to the deadly weapon that has killed millions? Even if they allowed it, it wouldn't work. There are just so many things you need to calculate before it even becomes safe enough to play with large enough amounts of these radioactive substances!

The idea of having electromagnetic waves, transistors, and lasers without physicists is laughable, but a science-hating layman could adopt wishful thinking and say "yes we can". But when it comes to research on nuclear energy without scientific rigor, even this layman would say "no, we shouldn't!".

I will stop with these examples because Ridley's general assertion that (almost?) all the technological progress could have occurred without actual physicists is ludicrous beyond imagination. But even if you consider technological advances that only occurred as "random side effects" of physics research, it seems very likely that concentrated physics research was largely indispensable for dramatic enough advances.

The Internet – in the sense of the global network that sends computer-readable data – was primarily developed by the U.S. military. You can imagine why. The military needs to send some reliable data that affect the planning and moves. But as you know, the World Wide Web – the "subset" of the Internet that exchanges the data you may see in your web browser – was developed by the folks at CERN.

Could it have occurred elsewhere, at some completely mundane place? It could, but it was much less likely. At CERN, they needed to transmit not just information but many kinds of data types. Text in different fonts. Pictures. And they arguably had enough time to play as well – with the multimedia etc. So Tim Berners-Lee, who had experimented with hypertext at CERN since around 1980, invented both the HyperText Markup Language (HTML) and the Hypertext Transfer Protocol (HTTP) to transfer such HTML files over the Internet by the end of 1990 – basically the full system you need to have browsers and servers on the pre-existing Pentagon-style Internet wires.
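
Part of why the web spread is that the protocol was kept almost trivially simple: the earliest requests amounted to opening a TCP connection and sending a single line. A minimal sketch of such a request using only Python's standard library (the host name is just a placeholder):

```python
import socket

def http_get(host, path="/", port=80):
    """Send a bare-bones HTTP/1.0 GET request and return the raw response."""
    with socket.create_connection((host, port)) as sock:
        request = f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n"
        sock.sendall(request.encode("ascii"))
        chunks = []
        while chunk := sock.recv(4096):
            chunks.append(chunk)
    return b"".join(chunks)

print(http_get("example.com")[:200])  # the status line and headers, then the HTML
```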

Someone like Tim Berners-Lee could have worked at the Pentagon or PepsiCo, or been a Commodore 64 hobbyist at home, too. But you know, at home in the 1980s, you didn't really need computer files to share pictures with your family. And the 64 kilobytes of a Commodore 64 weren't enough to invent elaborate schemes and protocols to encode web pages with pictures etc. You were grateful for being able to squeeze a few low-resolution pictures into the memory at all. And you didn't have the Pentagon-style Internet wires at home. And at the Pentagon, they would have fired Tim Berners-Lee for playing with a system that allows the employees to look at pictures in web pages, because pictures aren't serious work for the army. And so on. Obviously, I can't rigorously prove that the web couldn't have been invented outside the Big Science environment. But the actual history of the web is an argument. And there are many arguments like that. Too many similar inventions are connected with people who worked at CERN or Princeton or at least Bell Labs in New Jersey.

Again, to summarize, Ridley's picture of the technological progress as some "easy business that occurs automatically, without scientific rigor, integrity, training, and expertise, without the concentration of exceptionally talented brains, without curiosity and desire to understand Nature, without the careful sharing of accessible enough insights, and without special research funding" is valid for the part of the technological progress that is easy and that occurs automatically, without the scientific rigor, integrity, training, and expertise, without the concentration of talented brains, without curiosity and desire to understand Nature, without the careful sharing of accessible enough insights, and without special research funding.

But the key point is that the most important developments in technology can't be squeezed into this straitjacket. Technology couldn't be where it is without science, without its rigor, integrity, and all the other things I have already written twice in the previous paragraph. :-) Those things are needed for the big breakthroughs and they will be increasingly needed if the progress is supposed to continue. And the world of innovation needs exceptional brains – and their concentration – for similar reasons why business requires great managers and the concentration of the capital.

With his populist claim that ordinary people and their purely practical motives and everyday work are enough for all the progress, Ridley may earn lots of cheap political points, but he can't hide the fact that his understanding of the whole history of civilizations is a pile of stinky garbage.
