When I talked about the Sleeping Beauty Problem a week ago, I wanted to convey two points.

One of them – the woman's visit to Guantánamo Bay – was that the belief in \(P=1/3\) is equivalent to a major flaw in thinking that turns many people into conspiracy theorists: they think that even if some scenario is very unlikely (e.g. that the woman is transferred to the facility in Cuba), the low value of the probability may be "beaten" and made irrelevant by inventing "huge implications" such as a very long sequence of torture and interviews.

But it isn't the case. If the probability of her transfer to Cuba is just \(1/5,000\) per week, the 50,000 interviews on Wednesday only share the tiny probability that she's transferred to Cuba in the first place. They can't be added up because they're not mutually exclusive. The message is that if some evolution is so insanely unlikely that it happens much less frequently than once per lifetime of the Universe, you may just assume that this event is impossible, regardless of the potentially huge hypothesized consequences of the unlikely scenario.

A second, related point I wanted to make is that Bayesian inference implies \(P_{\rm tails}=1/2\) if you do it right, even if you include the "hypotheses about your state" among the competing hypotheses. "Doing it right" involves realizing that "Monday heads" and "Tuesday heads" aren't really mutually exclusive possibilities. You may have been confused about why I divided some probabilities by two at some point, but I think that Bob Walters (see also his bonus text), with some help from Nicoletta Sabadini, makes all these points clearer than I did.

First, he enumerates a couple of people at very famous places who give the invalid \(P_{\rm tails}=1/3\) answer to the elementary Sleeping Beauty Problem. He says that when she wakes up, she is getting no nontrivial information that would reduce or increase the odds of "heads" or "tails" – because both hypotheses imply that she would be woken up at least once, and that's the only thing she is observing. Because she is learning nothing new, her subjective probabilities of "heads" and "tails" have to be the same as they were on Sunday, namely \(1/2\) and \(1/2\).

But he also has a nice new discussion of the "probabilities of different states" using the machinery of the Markov chains.

The Markov chain looks like this. The state \(0\) is Sunday and it evolves either (with probability \(1/2\)) to \(1\), "tails Monday", which then evolves to \(2\), "tails Tuesday" (she's not woken up on Tuesday in the "tails" case). The closed loops indicate later days that we identify with each other. Or from \(0\) "Sunday", it may evolve to \(3\), "heads Monday", which then certainly goes to \(4\), "heads Tuesday" – the second interview in the case of "heads".
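This chain can be written down explicitly. Below is a minimal sketch (the state numbering 0–4 follows the paragraph above; looping the final states back to themselves is my way of encoding the "closed loops" that identify the later days):

```python
import numpy as np

# States, following the numbering in the text:
#   0 = Sunday, 1 = tails Monday, 2 = tails Tuesday (asleep),
#   3 = heads Monday, 4 = heads Tuesday.
# Each row lists the transition probabilities out of a state; rows sum to 1.
# The self-loops on states 2 and 4 encode the later days identified with each other.
T = np.array([
    [0.0, 0.5, 0.0, 0.5, 0.0],  # Sunday -> tails Monday or heads Monday, 1/2 each
    [0.0, 0.0, 1.0, 0.0, 0.0],  # tails Monday -> tails Tuesday
    [0.0, 0.0, 1.0, 0.0, 0.0],  # tails Tuesday loops
    [0.0, 0.0, 0.0, 0.0, 1.0],  # heads Monday -> heads Tuesday
    [0.0, 0.0, 0.0, 0.0, 1.0],  # heads Tuesday loops
])

# The n-step transition probabilities from Sunday are the first row of T^n:
after_1 = np.linalg.matrix_power(T, 1)[0]  # [0, 0.5, 0, 0.5, 0]
after_2 = np.linalg.matrix_power(T, 2)[0]  # [0, 0, 0.5, 0, 0.5]
print(after_1, after_2)
```

Note that for every *fixed* \(n\), the row of \(T^n\) is a genuine probability distribution summing to one; that observation is exactly what the argument below turns on.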

**The egregious error**

What Walters identifies as the key mistake in the thirders' reasoning is that they don't appreciate the following fact:

> In a Markov process one cannot in general talk about the probability of being in a state. One must talk about the probability of passing from state \(x\) to state \(y\) in \(n\) steps. (If the process is ergodic then in a limiting distribution there is some sense of talking about the probability of being in a state, but that is not the case here.)

In a Markov process, we may only talk about the probabilities of transitions, and we must specify the number of steps, he points out. This refinement has two points that I am going to discuss in more detail.

(Walters also says that the initial state and the number of steps may become irrelevant after a process of thermalization when the ergodic theorem implies that the "probabilities of a state at a random moment" may be identified with a "probability distribution on the phase space". But without this identification i.e. without thermalization, the notion of a "probability at an unspecified moment" isn't well-defined.)

First, the probabilities in a Markov process have to specify the whole transition, including the initial state. There's no such thing as a "probability of a final state" without specifying any information about the initial state. It's sort of obvious – one can't derive any probabilities if one isn't told anything – but the necessity to have some specified initial state is the kind of insight that the likes of Carroll love to overlook because they hate the idea of a beginning of a process in Nature (especially a low-entropy beginning of the evolution of any physical system, whether it's the Universe or anything else).

But in the case of the Sleeping Beauty Problem, the need to specify the initial state \(x\) doesn't really make a difference because everyone knows that we may take \(x=\)"Sunday", with the coin toss, as the initial state, so assuming it doesn't hurt.

Second, more importantly, the probabilities in the context of a Markov chain have to specify the "time" which, in this case, means the number \(n\) of steps that the evolution from \(x\) to \(y\) takes. The probabilities for different values of \(n\) have to be treated separately. Why? Walters wrote the explanation remarkably clearly so that, I believe, it should erase all your doubts.

The probabilities\[
P_{x\to y,n}
\] of the mutually exclusive transitions from \(x\) (Sunday) to various final states \(y\) in \(n\) steps obey\[
\sum_y P_{x\to y,n} = 1
\] because after \(n\) steps, the system has inevitably reached one of the allowed states \(y\). However, the thirders' fallacy is to treat the transitions \(x\to y\) with different values of \(n\) as mutually exclusive, so they effectively assume\[
\sum_{n=1}^{D_{\rm max}}\sum_y P_{x\to y,n} = 1\quad \text{(wrong!)}
\] However, this assumption of "mutual exclusiveness" of the evolutions – in our case, the mutual exclusiveness of "Monday tails", "Monday heads", "Tuesday heads" – is clearly wrong. We may calculate the right value of the sum above and **this sum is clearly greater than one**. Well, the summand is equal to \(1\) for each value of \(n\) – after \(n\) steps, one is guaranteed to get *somewhere* – so the "total probability" envisioned by the thirders is equal to \(D_{\max}\), the maximum allowed number of steps (or days). It may even be infinite. But for a set of options to be considered mutually exclusive, the sum of probabilities has to be one!
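One can check this arithmetic directly. A minimal sketch, using the state labels of the chain above with \(D_{\max}=2\), shows that the thirders' "total probability" comes out equal to \(D_{\max}\), not one:

```python
from fractions import Fraction

half = Fraction(1, 2)

# One-step probabilities from Sunday: tails Monday or heads Monday.
step1 = {"tails Monday": half, "heads Monday": half}
# Two-step probabilities from Sunday: tails Tuesday (asleep) or heads Tuesday.
step2 = {"tails Tuesday": half, "heads Tuesday": half}

# Each fixed-n distribution is a genuine probability distribution:
assert sum(step1.values()) == 1 and sum(step2.values()) == 1

# The thirders' "total probability": summing over n as if the options
# at different times were mutually exclusive.
thirders_total = sum(step1.values()) + sum(step2.values())
print(thirders_total)  # 2 = D_max, not 1
```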

So what's the probability that "Sunday" evolves to "Monday heads" or "Tuesday heads"? The thirders would incorrectly say that this is simply the sum of the probabilities over these two options and over all possible values of the number of steps \(n\). In practice, it means the sum of the probability of "Sunday to Monday heads in one step" and the probability of "Sunday to Tuesday heads in two steps".\[
P_{{\rm Sun}\to \text{any day, heads}} = P_{{\rm Sun}\to {\rm Mon,heads},1} + P_{{\rm Sun}\to {\rm Tue,heads},2}\quad \text{(wrong!)}
\] Both terms on the right-hand side are equal to \(1/2\) because each of these transitions occurs if and only if the coin lands "heads", which has probability \(1/2\). You see that the sum of these two probabilities – and the thirders had to use the sum because they incorrectly assumed "Monday heads" and "Tuesday heads" to be mutually exclusive – is already \(2\times 1/2 = 1\), so it should mean that there is a certainty that one gets heads. For the evolution to "Monday tails" (in one step), there is only one term that is again equal to \(1/2\). That's half of the sum \(P=1\) we got for "heads", which is why thirders ultimately incorrectly assign the probability \(P_{\rm tails}=1/3\) to "tails" and the remaining \(P_{\rm heads}=2/3\) to "heads".

However, you see that the intermediate results giving this answer really involve the assumption that the probability of the evolution from "Sunday" to "any heads" is equal to one – although there's surely no certainty that you get "heads". This is obviously a manifestation of the fact that their probabilities do not sum to one; their sum exceeds one.

I have mentioned that the thirders' invalid assumption of the mutual exclusiveness of different "days" is mathematically expressed by the wrong equation\[
\sum_{n=1}^{D_{\rm max}}\sum_y P_{x\to y,n} = 1\quad \text{(wrong!)}
\] The actual sum over both \(n\) and the final states \(y\) is greater than one. Is there a way to assign probabilities to different states in the Markov chain, assuming an initial \(x\), in a way that pays no attention to \(n\)?

Yes, there is, but we are not allowed to *sum* over all values of \(n\), the number of steps, because this sum yields the "total probability" for all options to be greater than one, as we have repeatedly said. Instead, the right way to calculate \(n\)-blind probabilities of different states is to compute a *weighted average* of the probabilities assuming a certain value of \(n\). The weights aren't determined in general but if there is no difference between the "Monday heads" and "Tuesday heads" interviews, we may choose the weights to be equal to each other i.e. equal to \(1/2\). It means that the thirders' wrong equation\[
P_{{\rm Sun}\to \text{any day, heads}} = P_{{\rm Sun}\to {\rm Mon,heads},1} + P_{{\rm Sun}\to {\rm Tue,heads},2}\quad \text{(wrong!)}
\] has to be replaced by\[
P_{{\rm Sun}\to \text{any day, heads}} = \frac{ P_{{\rm Sun}\to {\rm Mon,heads},1} + P_{{\rm Sun}\to {\rm Tue,heads},2} }{2} \quad \text{(OK!)}
\] Even with different weights, it would still be true that the probability of the evolution from "Sunday" to "heads during an interview on any day after an unspecified number of steps" is a weighted average of two values that are equal to \(1/2\), and the weighted average is equal to \(1/2\), too. That's the same as the probability for "Sunday" to evolve to "tails at an interview on any day after an unspecified number of days", which means that "heads" and "tails" continue to be equally likely, with \(P=1/2\).
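The contrast between the wrong sum and the correct average can be spelled out with exact fractions — a small sketch of the arithmetic just described (variable names are mine):

```python
from fractions import Fraction

p_Sun_to_MonHeads_1 = Fraction(1, 2)  # Sunday -> Monday heads in 1 step
p_Sun_to_TueHeads_2 = Fraction(1, 2)  # Sunday -> Tuesday heads in 2 steps
p_Sun_to_MonTails_1 = Fraction(1, 2)  # Sunday -> Monday tails in 1 step

# Thirders' sum: treats the two heads options as mutually exclusive.
thirders_heads = p_Sun_to_MonHeads_1 + p_Sun_to_TueHeads_2
print(thirders_heads)  # 1 -- "certainty" of heads, an absurdity

# Weighted average with equal weights 1/2 instead:
halfers_heads = (p_Sun_to_MonHeads_1 + p_Sun_to_TueHeads_2) / 2
print(halfers_heads)  # 1/2, the same as the tails probability
assert halfers_heads == p_Sun_to_MonTails_1
```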

You may see that the probabilities of \(A\) "or" \(B\) are sometimes the sums and sometimes the (weighted) averages. This is very analogous to (and in fact, a special case of) the point made in many blog posts about the second law of thermodynamics: probabilities for the evolution of "one ensemble" to "another ensemble" are summed over final microstates but averaged over the initial microstates. Because \(n\), the number of steps in a Markov chain that have led to the present state, is a part of the information about the past, it follows the rules of the past so we must average over different values of \(n\) and not sum! More generally, thirders are the people who would always like to sum probabilities and interpret the sum as a probability – while not caring whether the sum exceeds one.

If you use Bob Walters' perspective, you may see that the thirders implicitly assume that the sum of probabilities of all mutually exclusive options is greater than one (e.g. two). That's quite an egregious error, indeed. It's an error leading (or at least helping to lead) various people to say lots of dumb things about "problems with the second law of thermodynamics", "many worlds of quantum mechanics", "anthropic principle", or – in the case of the most unhinged whackadoodles – "Boltzmann Brains".

## snail feedback (46):

What is wrong with physics education? How do all these charlatans get into influential positions? Even sanitary engineers get Markov chains. It's a first semester probability topic.

During my career I witnessed many examples of incompetent technical people being hired for positions requiring technical expertise. Non-technical people cannot judge technical competence although the best managers are able to spot the frauds because they are such good judges of human character.

I have long held the view that every technical position should require passage of a proper written examination in the candidate's field of expertise. I have seen this put into practice in a company which I co-founded. It works.

The price of bad hires is always the same: failure of the project.

The note below was intended to be a reply to BobSykes.

Speaking of Cuba, here's some interesting news, "Obama Administration Program Secretly Sent Young Latin Americans To Cuba To Gin Up Rebellion":

"Beginning as early as October 2009, a project overseen by the US Agency for International Development (USAID) sent Venezuelan, Costa Rican and Peruvian young people to Cuba in hopes of ginning up rebellion. The travelers worked undercover, often posing as tourists, and traveled around the island scouting for people they could turn into political activists.

In one case, the workers formed an HIV-prevention workshop that memos called “the perfect excuse” for the program’s political goals — a gambit that could undermine America’s efforts to improve health globally."

http://washington.cbslocal.com/2014/08/04/ap-obama-administration-program-secretly-sent-young-latin-americans-to-cuba-to-gin-up-rebellion/

Getting USAID is like a mob boss kissing you on the lips. It is NOT an act of love. ;~)

The SB problem seems to be a failed thought experiment. What it should say is that two people fall asleep and then a coin is tossed. If the coin is heads, a computer chooses one person randomly to wake up. If tails, the two are woken up. Now, for a person who wakes up in such an experiment, what is the probability that the coin is heads? 1/3
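This modified experiment really does give 1/3, which a quick Monte Carlo sketch can confirm (my own variable names; I simulate one fixed person of the two):

```python
import random

random.seed(42)  # reproducibility

awakenings = 0
heads_awakenings = 0
for _ in range(200_000):
    heads = random.random() < 0.5
    if heads:
        # Heads: the computer wakes one of the two people at random.
        i_am_woken = random.random() < 0.5
    else:
        # Tails: both people are woken.
        i_am_woken = True
    if i_am_woken:
        awakenings += 1
        heads_awakenings += heads

print(heads_awakenings / awakenings)  # close to 1/3
```

Here waking up is genuinely informative — you might not have been woken at all — which is exactly why, as the reply below notes, it is a different problem from the original one.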

It amazes me how the "thirders" just arrogantly assume that the "halfers" are wrong without analyzing their arguments... It also reminds me of Lewis Carroll's "Bellman's Rule of Three"... "Anything I say three times is true."

The problem seems simple and unambiguous, and the probability is 1/2.

Here is another variant of the sleeping beauty fable:

https://www.youtube.com/watch?v=uW68CWvhy3U

I would agree with this result but it's a different problem.

Another terrible case of the 'uniform prior'. It's really depressing how much sway this nonsense holds over scientists in all fields and mathematicians alike. God either exists or not, therefore the probability is 50%? Please. Some probability distributions are uniform by construction (dice with rotational symmetries, marbles drawn from bags constructed to be identical objects), but there is no general principle that says the universe or creatures in it should prefer uniform distributions, ceteris paribus. The coin is fair; that we know (assuming it does have one symmetry). Dividing the probability evenly over 3 possibilities, on the other hand, is just a clueless stab in the dark.

Dear Johan, I agree with what you write - if you don't overinterpret it. The thirders (and anthropic people) indeed generally violate causality and think that the future evolution may affect the probabilities of their causes in the past. They cannot.

On the other hand, *learning* new things - in the relative future - may affect one's knowledge about the past! The subjective probabilities may in general change. That's not the case here because the woman only observes that "the number of interviews is at least one", and both hypotheses, tails and heads, predict this outcome with 100% certainty, so the outcome can't be used to discriminate between the two hypotheses!

Finally, if you're the same Johan who just posted on Carroll's blog, I totally disagree with your claims that each of the three claims you listed has probability 1/3. As this very blog post was meant to reveal, it just isn't the case. The option "1" is nothing else than "tails" - it is completely equivalent, so if it had probability 1/3, then tails would have probability 1/3!

Instead, the probability of 1) is 1/2 and the probability of 2) and 3) is 1/4 each!

Yes, I'm the same Johan. Concerning your 2nd paragraph, after some further thought, I agree with you. In both instances (H or T) she knows for certain that she's going to be interviewed (at least once), so as you correctly state, it doesn't allow her to discriminate. But then again, who knows how the mind of a (blonde) woman works :)

The Markov chain reasoning was a bit harder to understand. Following a similar but simpler (less formalistic) line of reasoning, I can see now why the probabilities would have to be 1/2 for 1) and 1/4 for 2) and 3) each. In fact, if I understand correctly, the probability for 1) would always be 1/2, whereas the other probabilities with \(n\) awakenings (when heads) would each be \(1/(2n)\).

I admit that the problem is far more subtle than I first thought.

What a wind-up this whole business is!

Right, sit up and pay attention everyone. I'm going to give you the definitive solution to the Sleeping Beauty problem.

You all know the conditions by now — if the coin lands heads she's woken twice; if it lands tails she's woken once ... yadda yadda ....

At this point it suits my purpose to chose Sean Carroll's formulation of the question put to her, namely:

Each time she is asked a question: "What is the probability you would assign that the coin came up tails?"

Anyway, to cut to the chase, they toss the coin, wake her up and ask her the question.

She replies, "Extra-hold hairspray."

Now that response might make no sense to you, but it is of course the correct answer. It's correct because we know that Sleeping Beauty always tells the truth.

Yes, folks — she actually believes the answer to the question she has just been asked is "Raspberry-flavoured lip gloss". No wait! It wasn't that — I remember now, it was "Extra-hold hairspray". So that is the correct answer. Gottit?

OK, she might have said, "Extra-hold hairspray and lily-white titty cream," but she didn't since we already know the correct answer, and it doesn't involve dairy produce of any kind.

Of course, *had* she said, "Extra-hold hairspray and lily-white titty cream," then that would have been the correct answer too. But she didn't, so it isn't.

Now, those of you blessed with no flies about your person will have already worked out from this incisive analysis of mine that it is possible to formulate a rather general result, namely:

However she answers the question, her answer is correct.

And from this a neat corollary immediately follows:

Since any answer she gives is as good as any other, then there is no point in asking her the question in the first place. And since there's no point in asking her the question in the first place, then the whole thing is a *non-f#cking-problem*.

OK, we'll take a break now.

Oh yeah, I nearly forgot! Don't bother coming back — we're done.

:)

Thanks, Johan! Yup, it's subtle - but it's still elementary. I would say that it's the simplest possible exercise on probabilities after the calculations where you have \(N\) options in a row at the same time, where you say that they're equally likely, and you ask what is the probability (\(N_1/N\)) that you get something from a subset of \(N_1\) options. ;-)

In the Sleeping Beauty case, all the would-be clever claims that the probability is 1/2 but if one asked it a bit differently, the probability would be 1/3 or 2/3, are just wrong.

There is no way to formulate this problem that would make her subjective probability of any combination of the features 1/3 or 2/3.

For example, when she wakes up, the probability that it is Monday isn't 2/3, it is 3/4. Well, the conditional probability that it's Monday assuming that the coin is tails is 100% - because Monday is the only wakeup day. And the conditional probability of Monday assuming heads is 50% because there are 2 equally likely days in that case.

Because heads and tails are equally likely by the symmetry of the coin, the overall probability that it's Monday is a (fair arithmetic) average of the numbers 100% and 50% from the previous paragraph, i.e. 75% = 1/2 + 1/4.
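The computation in the last two paragraphs can be spelled out with exact fractions — a trivial sketch:

```python
from fractions import Fraction

p_tails = p_heads = Fraction(1, 2)    # fair coin
p_monday_if_tails = Fraction(1)       # Monday is the only wakeup day for tails
p_monday_if_heads = Fraction(1, 2)    # two equally weighted interview days for heads

# Average the conditional probabilities, weighted by the coin:
p_monday = p_tails * p_monday_if_tails + p_heads * p_monday_if_heads
print(p_monday)  # 3/4
```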

One may only get 1/3 or 2/3 by summing and dividing probabilities at different moments in incoherent ways - where one should take averages etc. - and none of these mathematical expressions may be interpreted as her subjective probability of anything.

This sort of "thought experiment" seems to me to do little but confirm my doubts about the use/misuse of subjective probabilities. I personally think the blond in question would be too busy screaming for help, because her subjective belief of the probability that she had fallen into the hands of a bunch of lunatics would approach one. It is just too hard to believe any rational person would accept the terms and conditions of this "experiment." So it seems to me that (however you meant it) to a subjective Bayesian whatever she answered would be correct (encode her state of knowledge), and have little to do with what side of the coin came up on top. I can't even tell if the subjective probability we're talking about is the blonde's or mine.

I'm wondering if anyone can give me an example of the kind of information the subject could receive after the coin toss that would make heads (or tails) more likely without making it certain, other than information about the coin itself (i.e., that it might have been weighted).

http://en.wikipedia.org/wiki/Carbonic_acid

and keep in mind that the table has #s for pure water, ocean is buffered so effects would be smaller and start from pH ~ 8.2

The article you reference is pure fantasy. Nobody can measure the pH to 3 decimal points (as in "Present levels ~8.069"), not really reliably even at 2 decimal points, and the variability is much bigger. As for "Pre-industrial (18th century) 8.179": ["The concept of p[H] was first introduced by Danish chemist Søren Peder Lauritz Sørensen at the Carlsberg Laboratory in 1909 and revised to the modern pH in 1924 to accommodate definitions and measurements in terms of electrochemical cells."] So again, nobody has data from the 18th century. It is very similar to the temperature residue manipulations.

Huh? Suppose that after the toss, the powers in charge of the experiment make another toss and write down a 1-bit value HH->1, HT->1, TH->1, TT->0. If you get 1, you know that with probability 2/3 the first toss was heads.
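A sketch enumerating the four equally likely outcomes confirms this comment's arithmetic:

```python
from fractions import Fraction
from itertools import product

outcomes = list(product("HT", repeat=2))         # HH, HT, TH, TT, equally likely
ones = [o for o in outcomes if o != ("T", "T")]  # the recorded bit is 1 unless TT

# Conditional probability that the first toss was heads, given bit = 1:
p_first_heads_given_one = Fraction(sum(o[0] == "H" for o in ones), len(ones))
print(p_first_heads_given_one)  # 2/3
```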

We know that without the CO2 factor, what we know about physics seems to indicate our planet should be colder. CO2, when used in certain calculations, gave us the desired answer to match "reality". We do not know that CO2 is the actual cause because there is much evidence that we do not understand all the components of climate. Without knowing all the factors, we don't know CO2 is actually the explanation. In reality, had global warming not become a political force, research might have shown a very different outcome.

I know I will not add anything that nobody said before, because what I am adding is really elementary, so sorry for that.

I am going to the extreme case, in which SB is immortal. On Sunday she goes to sleep. The only difference is that now she will not be awakened only on Tuesday but... forever. In the case of heads, she will not be awakened on Tuesday but will be saved from the nightmare on Wednesday.

According to the 1/3 guys, in this case the result would be 0. This does not make sense. What would happen is that each time, SB will pray it was heads, to be saved next Wednesday, but the probability is always the same, 1/2. After all, she doesn't know she was awakened before, so there is always a reboot.

So I really do not understand the 1/3, or 1/4, or 1/100000 or any infinitesimal result for this problem. It doesn't matter how many times SB is awakened; for her it will always be 1/2 and she has exactly a 1/2 chance of being saved.

I do not think there is anything wrong in my reasoning, and I also think anyone will give this answer. Unless that person is someone expecting the unexpected, just because he or she doesn't like how things are. I know a lot of smart guys like that, who by using sophisms are capable of spreading confusion to other people too. The only way to deal with them is just to ignore them... and tell them to do only what they know (in other words, shut up and calculate).

Exactly, Cesar.

I won't add anything new, either. But this "suppression" of the options represented by one or few repetitions is also behind the wrong anthropic reasoning, the irrational fear of the Boltzmann Brains that become the majority and therefore the only ones, and the invalid claims that a low-entropy initial state is "infinitely unlikely".

Only tangentially related to this post (apologies for the obnoxious hijacking):

What is your view on extended probabilities, and does it turn the consistent histories interpretation into a "hidden variable but local" interpretation of sorts?

Below is a curious paper by Hartle and Gell-Mann: "Decoherent Histories Quantum Mechanics with One Real Fine-Grained History"

http://journals.aps.org/pra/abstract/10.1103/PhysRevA.85.062120

arxiv: http://arxiv.org/abs/1106.0767

Hartle wasn't the first one to realize that some quantum mechanical surprises/novelties may be attributed to ordinary probability calculus as long as some "not final" propositions are allowed to have negative probabilities.

As Jim realizes very well, the most famous example is the Wigner distribution on the phase space.

However, I don't think that the 2011 paper is right. There can't be a preferred fine-grained history, due to the uncertainty principle. If we try to ask as fine questions about the history as possible, we inevitably run into tension with the uncertainty principle that forces us to only ask about something and not something else, or only with some accuracy, and these choices and trade-offs may be done in many ways.

Quite generally, while I think that Gell-Mann and Hartle have said many right things, I am totally puzzled by these smart men's - and others' - feeling that the research into these foundations should continue. There is nothing else waiting to be discovered here. Even the most sensible people who force themselves to say too much about these new but in principle simple ideas are ultimately guaranteed to write mostly rubbish or repeat themselves, and Gell-Mann and Hartle are no exceptions.

I am now a bit free from the trivialities of Company creation and Management : https://drive.google.com/folderview?id=0BwF_sLVSf9QsX1ZmeVZ6c3F2NGM&usp=sharing

I will resume full participation in extreme mathematical descriptions of reality. I have been meaning to describe this in the language of differential entropy and stochastic correlations. Anyways , I am going to destroy(Read) a text on real analysis tonight just out of spite(prep for plausible coursework) . Will resume stackexchange duties and kaggle competition things right after I fire one of my employees via text :~) .

"doubling of CO2 gives a 'bare' forcing of about 1.2 K"? -- Greenhouse gases Cool, only a little. Simple gedanken negates warming claims.

Below the clouds, effect is saturated.

Above the clouds, extra radiation to space, cools more and lowers the clouds. Per adiabatic lapse, lower clouds imply lower surface temperature.

For additional entertainment, net warming of the atmosphere is from the top, where it is warmer than the adiabatic.

Arrhenius warming does not exist. But propaganda is needed, to scare the dunderheads, into volunteering to be increasingly victimized, by Carbon Taxes.

1. If SB is not making a decision, then whether the probability is 1/2 or 1/3 is irrelevant.

2. If SB is making a decision, then the correct probability to use depends on the nature of that decision. This is because although SB herself might not remember whether she was woken previously, the nature of the decision and its resulting outcomes may have a "memory".

You can not determine the probability in this case without knowing why it matters.

Probabilities measure the degree of truth of a statement. This in no way depends on the type of "application" of the knowledge. It doesn't depend on the "existence" of an application, either.

I never mentioned money at all. If anything, I am saying that knowing the truth is useless unless you plan to use that information.

Or, alternatively, that probabilities do not exist unless there are decisions to "observe" them.

I used the word "money" because the money is the most accurate measure of one's "utility" and I was assuming that you want to talk quantitatively and not just be bullšitting.

At any rate, whether a statement is true or false - and/or what's the probability that it's true - has nothing whatever to do with whether it is useful and/or what it is useful for if anything.

By taking this line, you move from subjective to objective probability, so of course the answer becomes 1/2.

But why does the application matter? Does the application also need an application?

Murray Gell-Mann has also an opinion on climate change: http://www.webofstories.com/play/murray.gell-mann/183

That's a very disappointing, shallow monologue that misrepresents what skeptics argue about it etc.

When it comes to an "action on climate change", ignoring that this phrase is totally idiotic, I surely don't want to wait for an "unmistakable proof that there's something wrong". I only want to wait for the first "convincing argument that the expectation value of the costs of such an action would be lower than the expectation values of the benefits".

This is ludicrous today because the costs of the proposed self-described "actions on climate change" are at least 3 orders of magnitude higher than any proposed benefits - and they're indeed hugely speculative benefits - of such an action.

So I - and other skeptics - agree with the general cliches about using the probability calculus etc. but he is using it upside down. It's not climate change itself but climate alarmism that is the real threat and that must be confronted. Climate alarmism must be confronted well before there is "unmistakable proof" that it's dangerous for the civilization but I think that we're unfortunately already in the stage when we do have a pretty unmistakable proof, anyway.

Yeah, F=MA.

This reply has "physics in it" too, and yet I agree with Naomi Oreskes that models are useful only in very limited circumstances, can tend to reinforce one's own preconceived biases, and in fact can have the appearance of validating incorrect inferences. Maybe you should read her paper, since you fancy yourself a bit of a dabbler in climate models.

http://www.likbez.com/AV/CS/Pre01-oreskes.pdf

Bust out SkS and you show yourself to be an ignorant troll. If you are interested, I can point you to a discussion that they thought was private where they admit that the hockey stick was based on some very questionable premises, and yet they publicly support it to this day.

Asked and answered.

Thanks for the link. Of course models are not 'truth' and contain parameters that are not fully known. That is why they get updated with new information and are presented with error bars. I'm happy to see them as 'best guesses'. The problem with guessing without a model is that you are far more likely to reinforce your own bias!

I accept 'ignorant', but am certainly not a troll. I've read a fair chunk of the hockey-schtick web-site, and enjoyed https://www.youtube.com/watch?v=Uif1NwcUgMU&feature=player_embedded yet it is still clear that CO2 is a greenhouse gas and warms the planet. The main question is what is the present 'climate sensitivity'.

OK, I like scienceofdoom, at least he has a 'bare' doubling of CO2 giving a 1.1 K temperature increase, and you won't hear another word from me on Skeptical Science. Happy? :)

'Asked and answered'

Where? And why do you trust one prediction over another? Maybe it's true, but it is not clear.

Acidification refers to lowering of pH. The speed mainly refers to the fact that buffering can't act that fast to stabilise it, and the fact that it is due to human activities rather than any natural process.

You seem to have missed this section and the accompanying figure: http://en.wikipedia.org/wiki/Ocean_acidification#Mechanism

It would not matter in the case of an objective probability. But here we are considering a subjective probability, and we need to be clear about what the "subject" of that subjective probability is. In particular, we need to be careful about what we consider the boundaries of the entity we are referring to as "Sleeping Beauty" to be. It is given that this entity has no memory, but it is still the case that an action performed on "heads Monday" is not erased, and history still shows that it happened when "heads Tuesday" comes to pass. Consequently, the greater, decision-making Sleeping Beauty may have a "memory" even if that is not biologically encoded within the workings of SB's brain.

"1. If SB is not making a decision, then whether the probability is 1/2 or 1/3 is irrelevant."

That's irrelevant.

"Greenhouse gases Cool, only a little"

well, you can take that up with Lubos. Try here for a start:

http://motls.blogspot.ie/2008/01/why-is-greenhouse-effect-logarithmic.html

If you want a more indepth discussion, try http://scienceofdoom.com/2014/08/12/the-atmosphere-cools-to-space-by-co2-and-water-vapor-so-more-ghgs-more-cooling/

(and the rest of that site), but be sure to at least read to the end of the page! It will take more than a gedanken experiment to prove your case - do you have any (links to) calculations?

