Friday, August 15, 2014

Hire your CERN and click your way to discoveries

If you want to assure yourself that you're capable of doing all the work done by the people at CERN – from the Director General to the PhD advisers, detector technicians, statisticians, etc. – you may simply open this CERN game:

CERN particle clicker
You must click buttons to discover CP violation and do many other things.

Heuer and the folks at ATLAS, CMS, and the other collaborations aren't doing anything very different.

If you click vigorously enough, you will receive packages of beer, coffee, graduate students, postdocs, research fellows, reputation, media hype, grants, and other commodities.

The first TRF reader who discovers some beyond-the-Standard-Model physics (or hires the board of Google as graduate students) and proves that she's better than all the CERN employees combined :-) should proudly boast about her achievements.

The reputation, funding, etc. grew roughly exponentially, with the new units acquiring new names, and after two hours of letting it run in a window, mostly without clicking ;-), I had 50 out of 105 achievements.


snail feedback (5) :

reader Ramanujan said...

Typo in my comment: P(t | not T/t) = P(t AND not T/t) / P(not T/t) = 1/3
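Here t stands for tails and T/t for the Tuesday-and-tails combination; assuming, as the rest of the thread suggests, a uniform prior of 1/4 on each of the four coin-day combinations (the setup isn't fully spelled out in this excerpt), the arithmetic behind the 1/3 would be

\[
P(t \mid \lnot(T\wedge t)) = \frac{P(t \wedge \lnot(T\wedge t))}{P(\lnot(T\wedge t))} = \frac{P(\mathrm{Monday}\wedge t)}{1-P(T\wedge t)} = \frac{1/4}{3/4} = \frac{1}{3}.
\]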

reader Luboš Motl said...

Of course the probability P(A|B) is in general different before and after we actually learn whether B is true. It is only constant if "A" is a predictable consequence of B, i.e. lies in the future of B. But here, the causal relationship goes the other way. P(A|B) is the reconstructed probability of a hypothesis, and of course it changes once we actually learn some evidence. That's what Bayesian inference is all about.
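For reference, the standard update rule behind this kind of inference – stated here only as the textbook formula, not as a claim about which side of the dispute it favors – revises the prior P(A) into a posterior once the evidence B is actually observed:

\[
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)},
\]

and this posterior generally differs from the prior P(A).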

reader Ramanujan said...

The definition of P(A|B) that I am using is "the probability that one would assign to A, given all information currently known plus the additional information that B is true." It seems that you are using a different definition; can you say what it is? Thanks.

reader Luboš Motl said...

No, I agree with this definition.

And as I wrote, I even agree that P(tails | not told Tuesday-tails) = 1/3 holds before the possible "put to sleep" moment.

But that doesn't imply that P(tails) = 1/3 *after* the possible "put to sleep" moment, because at the moment when she could be put to sleep but wasn't, she is learning some new information, namely about the day. Assuming that the coin is showing tails, she is actually learning about the day! The conditional probability becomes meaningless because it assumes things that are known to be wrong.

She is, however, learning no information about the state of the coin, so the overall probability P(tails) stays the same, at 1/2.

reader Ramanujan said...

I don't think you are applying the definition. Yes, she is learning new information. She is learning B, so by definition the old P(A|B), agreed to be 1/3, becomes the new P(A).
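A minimal Monte Carlo sketch of the two numbers the thread keeps juggling, assuming the coin and the day are independent and uniformly distributed (an assumption about the setup, which isn't fully specified here): it estimates both the unconditional probability of tails and the probability of tails among the coin-day pairs that are not Tuesday-and-tails. It reproduces the 1/2 and 1/3 figures; which of the two is the right credence after the "put to sleep" moment is exactly what the commenters dispute, and the simulation doesn't settle that.

import random

random.seed(0)
N = 1_000_000

tails_total = 0      # tails over all trials
survived = 0         # trials that are NOT (Tuesday AND tails)
tails_survived = 0   # tails among those surviving trials

for _ in range(N):
    coin = random.choice(["heads", "tails"])    # fair coin
    day = random.choice(["Monday", "Tuesday"])  # day assumed uniform and independent of the coin
    if coin == "tails":
        tails_total += 1
    if not (day == "Tuesday" and coin == "tails"):
        survived += 1
        if coin == "tails":
            tails_survived += 1

print("P(tails), unconditional:         ", tails_total / N)           # ~ 0.50
print("P(tails | not Tuesday-and-tails):", tails_survived / survived)  # ~ 0.33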