Update: Ross McKitrick has pointed out a fascinating and related 2003 economics article about persuasion bias. The authors not only argue that people are unable to properly realize that some sources of information are repetitive and fail to be independent; they also derive mathematical consequences from their Markov chain model: it turns out that these dynamics naturally polarize opinions into a one-dimensional (left-right) continuum even if the questions are multi-dimensional. This statement essentially boils down to the "+1" eigenvalue of any stochastic matrix, which encodes a fixed point (the opinion difference between the main groups). The bulk of the paper is actually hardcore mathematics, with a lot of matrices and theorems. Enjoy!

Imagine that the opinions of the scientific community are representative of the actual likelihood that a certain assertion is true, according to the best methods and data available to human civilization. Such an assumption may look more or less realistic in the case of many sciences, even though you should realize that this statement has no eternal value. Eventually, the probability of any well-defined assertion goes to 0% or 100% because we learn what the correct answer is. Once we learn the right answer, the previous probability between 0% and 100% becomes falsified.
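The "+1" eigenvalue claim can be illustrated with a minimal numerical sketch. This is a DeGroot-style toy model, not the actual construction from the persuasion-bias paper, and the listening matrix `T` below is invented for illustration:

```python
import numpy as np

# Hypothetical listening matrix: T[i][j] is the weight agent i puts on
# agent j's opinion. Each row sums to 1, so T is row-stochastic.
T = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.1, 0.4, 0.5],
])

# Any row-stochastic matrix has eigenvalue 1, because T @ ones = ones.
eigenvalues = np.linalg.eigvals(T)
assert np.isclose(eigenvalues, 1.0).any()

# Repeatedly averaging opinions (x <- T x) drives even multi-dimensional
# opinions to the fixed point: all agents end up holding the same weighted
# average, so long-run disagreement collapses onto a single direction.
x = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])  # 3 agents, 2 issues
for _ in range(200):
    x = T @ x

print(x)  # all three rows are nearly identical: a consensus opinion
```

The failure to discount repeated information corresponds to applying `T` again and again to opinions that are no longer independent, which is exactly what drives the convergence.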
Many of us think that because of various political and economic pressures and because of the group-think discussed below, this assumption is very far from being true in the case of climate science. But let us assume, for the sake of simplicity, that it is true even in the case of climatology.
The percentage of the public that thinks that global warming is either not man-made or not dangerous is around 40 percent. Among the scientific community, it may be estimated at around 20 percent. The remaining 80 percent are not necessarily alarmists because most of that group is composed of the silent majority that dominates such debates.
But among the speakers - politicians - at Monday's high-level event at the United Nations, the percentage was around 0.5 percent. Czech President Václav Klaus was the real reason why the number was nonzero. It is not hard to see that 0.5 percent is much less than 20 or 40 percent. The participants of the climate summit are representative neither of the opinions of the public nor of the opinions of the scientific community.
Where does the discrepancy come from?
Well, politicians usually think that it is a good idea for them to represent a majority because they feel that it implies that they will enjoy a greater political support: they will be more likely to win elections and they will have a greater influence. Because the people who believe in man-made global warming (or who don't openly disagree with it) seem to be a majority both in the public as well as the scientific community right now, a "rational" politician may find it natural to modify his own opinions to be compatible with such a majority.
In the scientific community, we estimated the percentage of climate skeptics at 20%. Let's not argue about the exact number: the real point is different, and we only need to agree that the percentage is much higher than 0.5%. If you trust the scientific community, you might say that the probability that the skeptical hypotheses are correct is comparable to 20%. So is it OK that this number becomes 0.5% in the United Nations?
Needless to say, I think it is very bad. If the percentage of people who happen to endorse a particular conjecture exceeds 50%, it surely doesn't mean that the conjecture is correct. Only imbeciles could think otherwise. The brutal decrease of the number should be counted as nothing other than an example of the political distortion of science.
The fact that there were many fewer than 20% skeptical speakers in the United Nations means that the institution is failing as the voice of the people of this planet. Equally importantly, it means that this international institution exerts illegitimate pressure on scientists to push their research and conclusions in a particular direction. All these things are very bad.
Can these mechanisms be fought?
In the previous paragraphs, we mentioned that the expansion of majorities is not an exception but a result of a behavior that is, in some sense, rational. Many politicians are spineless jerks who do politics to maximize their own benefits - much like most people in many other occupations, after all. And when they evaluate expected costs and benefits, almost all of them simply conclude that it is a better idea for them to side with the majority.
Two obvious questions should be asked:
- Is their behavior truly rational?
- If it is truly rational, should we design policies that would prevent such an amplification of majority opinions?
Concerning the first question, I think that their behavior is only rational because of the bad atmosphere in society and because of undemanding voters. Indeed, many people prefer politicians who agree with them right now and who defend their interests: more general moral values are secondary. Whether a politician can actually be trusted - whether he or she builds on honesty and other moral foundations - is not too important. If honesty were viewed as an important value expected from politicians, the "amplification of opinions" would obviously diminish and the percentage of skeptical politicians would be much higher, i.e. much closer to the percentage in the general public or the scientific community.
I actually think that Václav Klaus is not losing any political capital in the Czech Republic with his "unpopular" opinions but I tend to agree that if you look at the whole global political scene, an average politician loses whenever he offers "unpopular" opinions. Incidentally, unlike the global press, the Czech press dedicated a lot of room to Klaus' speech and praised it. People in the U.S. should also understand that no foreign journalist would ever criticize a leader's imperfect English, especially if the leader's English is better than that of most other leaders and virtually all journalists. ;-)
The unusually rational approach of the Czech media contrasts with the scientific (!) magazine Nature, which just called Klaus a "renegade". Sorry, guys from Nature, this is not scientific terminology - it's the language of religious cranks. Moreover, you are using the term incorrectly because renegades are people who abandoned their (originally Christian) faith. Klaus has never believed this kind of politically driven pseudoscientific silliness, so he can't be a renegade.
Fine. So let us accept that honesty is not a value in the present world. With this assumption in mind, we still want to ask whether the politicians' behavior is rational.
Amplification of opinions as a bubble
To answer this question, I would like to propose an analogy between the amplification of opinions i.e. group-think dynamics on one side and financial bubbles on the other side. Whenever virtually all politicians decide to agree with a majority about a question that only influences their life by the perceived agreement with others and not directly, they are participating in an ideological counterpart of a financial bubble.
Once a spineless politician concludes that a certain opinion is likely to get stronger, he may want to jump on the bandwagon. This desire to jump on the bandwagon will be getting increasingly strong because all politicians know that other politicians will be jumping on the same bandwagon because of the same reason. The result is that virtually all politicians join the bandwagon. The analogy with the bubble is hopefully manifest. In sociology, we talk about group-think.
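The bandwagon feedback described above can be sketched as a toy simulation. All the parameters below are invented for illustration; this is not a calibrated model of any real political body:

```python
import random

random.seed(0)

def simulate(n=1000, initial_share_a=0.55, p_follow=0.9, steps=20000):
    """Toy bandwagon model: each politician holds opinion 'A' or 'B'.
    At every step a random politician re-evaluates; with probability
    p_follow he simply copies whatever the current majority says,
    otherwise he keeps his own view."""
    n_a = int(n * initial_share_a)
    opinions = ['A'] * n_a + ['B'] * (n - n_a)
    for _ in range(steps):
        i = random.randrange(n)
        if random.random() < p_follow:
            share_a = opinions.count('A') / n
            opinions[i] = 'A' if share_a >= 0.5 else 'B'
    return opinions.count('A') / n

# A modest 55:45 split inflates toward near-unanimity - the analogue of
# a 20% skeptical minority shrinking to 0.5% among the UN speakers.
print(simulate())
```

The point of the sketch is that near-unanimity emerges from the copying rule alone, without any change in the underlying evidence - which is exactly why the resulting "consensus" carries so little information.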
Group-think is the most typical reason why the probability that a majority is wrong is often much higher than the percentage of the minority which is why our first assumption was incorrect anyway. ;-) Just to be sure: the probability can also be much lower but the most typical situation when it's much lower involves a minority that is intellectually insufficient to analyze the question rather than group-think.
If the analogy really works, you may want to ask whether this bubble can burst, much like the financial bubbles. The answer is, of course, affirmative. And it is affirmative not only on paper: we can list a lot of examples from history.
Virtually identical dynamics to the current global warming hysteria have appeared in many countries, societies, and communities during many eras. But let us choose Germany of the 1930s. An ever growing percentage of the public and the politicians would support the views of the NSDAP. This societal group-think is another example of the bubbles we are talking about. When did this particular bubble burst? Well, people had to wait until 1945 or so for the bubble to fully burst. ;-) But it did burst, after all. Bubbles can't last forever if they're only filled with hot air.
In the case of the opinion bubbles, the finite life expectancy is even more obvious than in the case of the financial bubbles. When the percentage of the people who endorse a certain opinion approaches 100%, their position loses any advantage because most of their competitors are advocating the same opinion anyway. Because the relative benefits of such a majority position converge to zero, the original motivation to act in this way fades away. People inevitably return to other, usually more rational ways to decide what they should think and say about a certain question. In the case of climate change, that means to return from 99.5% to 80%. Meanwhile, the figure of 80% may really converge to 0% because of some objective evolution - for example when your empire faces setbacks against Stalin and the Allies or if the temperatures start to drop again, to mention a particular nightmare of the alarmists.
My main point is thus the following: even if you're a spineless, greedy politician - such as most of those we have seen in the United Nations yesterday - your group-think might only reflect your poor ability to quantify the risk. When bubbles burst, it can be pretty painful. So I discourage you from threatening scientists and encourage you to choose your opinions about climate change and related issues by a careful appraisal of the evidence that is available to you rather than by calculating which position will bring you the highest political profit in the short term. If you act wisely or if the voters force you to act wisely, no special policies to fight against group-think in politics are needed.
And that's the memo except that I want to write a few more paragraphs about another analogy that may have come to your mind.
Amplification of opinions and proportional vs majority systems
A reader could simply point out that 99.5% of the United Nations speakers were alarmists because in each country, the alarmists represent a kind of majority - or at least a majority among the activists - which is why the speakers don't reflect the proportional composition of the society. This mechanism does contribute but you can't explain the data without the group-think dynamics: it is not just about a selection of speakers. For example, most prime ministers are alarmists.
But it is true that majority systems will naturally amplify the opinions of majorities, especially if the composition is something like the 80:20 split we discussed above. Is that a good thing or a bad thing? Above, I have mentioned that it is surely a bad thing if this political dynamics distorts the information about the likelihood of various answers as understood by the scientific community. But more generally, are majority elections worse than proportional representation?
I don't think that there exists a universal answer to this question. But one can obviously say the following thing: if there exist good rational reasons to think that the minorities are really bad or incompetent people, a system that suppresses them - majority elections - may be superior. And vice versa: if there are reasons to think that minorities bring something to a business that is unique and essential or at least equally valuable, proportional representation may be a better way to go. I don't want to discuss specific examples and where they belong because it could be too controversial and off-topic.