Comments on The Reference Frame: "Hire your CERN and click at your discoveries"

Ramanujan (2014-08-15, 21:13):
I don't think you are applying the definition. Yes, she is learning new information. She is learning B, so by definition the old P(A|B), agreed to be 1/3, becomes the new P(A).

Luboš Motl (http://motls.blogspot.com/, 2014-08-15, 20:39):
No, I agree with this definition.

And as I wrote, I even agree that P(tails | not told Tuesday-tails) = 1/3 holds before the possible "put to sleep" moment.

But that doesn't imply that P(tails) = 1/3 *after* the possible put-to-sleep moment, because at the moment when she could have been put to sleep but wasn't, she learns some new information about the day. Assuming the coin shows tails, she is actually learning about the day! The conditional probability becomes meaningless because it assumes things that are known to be wrong.

She is, however, learning no information about the state of the coin, so the overall probability P(tails) stays the same, at 1/2.

Ramanujan (2014-08-15, 19:55):
The definition of P(A|B) that I am using is "the probability that one would assign to A, given all information currently known plus the additional information that B is true." It seems that you are using a different definition; can you say what it is? Thanks.

Luboš Motl (2014-08-15, 09:50):
Of course the probability P(A|B) is in general different before and after we learn whether B is true. It is only constant if A is a predictable consequence of B, i.e. in the future of B. But here the causal relationship goes the other way: P(A|B) is the reconstructed probability of a hypothesis, and of course it changes once we actually learn some evidence. That's what Bayesian inference is all about.

Ramanujan (2014-08-15, 09:43):
Typo in my comment: P(t | not T/t) = P(t AND not T/t)/P(not T/t) = 1/3
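The arithmetic behind the corrected formula P(t | not T/t) = P(t AND not T/t)/P(not T/t) = 1/3 can be checked mechanically. Here is a minimal sketch, assuming the "halfer" prior over Sleeping Beauty's awakening states (heads/Monday = 1/2, tails/Monday = tails/Tuesday = 1/4), which is the assignment under which the quoted value 1/3 comes out; the helper function and state names are illustrative, not anyone's official notation:

```python
from fractions import Fraction

# Assumed "halfer" prior over (coin, day) awakening states:
# heads -> woken only on Monday; tails -> woken Monday and Tuesday.
prior = {
    ("heads", "Mon"): Fraction(1, 2),
    ("tails", "Mon"): Fraction(1, 4),
    ("tails", "Tue"): Fraction(1, 4),
}

def conditional(event, given):
    """P(event | given) = P(event AND given) / P(given)."""
    p_given = sum(p for s, p in prior.items() if given(s))
    p_both = sum(p for s, p in prior.items() if given(s) and event(s))
    return p_both / p_given

# P(tails | not Tuesday-tails) = (1/4) / (1/2 + 1/4) = 1/3
p = conditional(
    event=lambda s: s[0] == "tails",
    given=lambda s: s != ("tails", "Tue"),
)
print(p)  # 1/3
```

Under this prior the conditioning event "not Tuesday-tails" has probability 3/4 and its overlap with "tails" is 1/4, giving 1/3, matching the value both commenters agree on before the possible put-to-sleep moment; the disagreement in the thread is about whether that conditional probability should become the new P(tails) afterwards.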