## Friday, February 10, 2006 ... /////

### Hartle-Hawking-Susskind unification

Interesting authors always attract extra attention. Stephen Hawking and Thomas Hertog have submitted a new paper

that proposes a certain unification of the Hartle-Hawking ideas about the wavefunction of the Universe with the concepts of the landscape. I have not yet analyzed the paper sufficiently deeply to describe it here, but some readers may be ahead of me. So far I struggle with statements like "the histories of the Universe depend on the question we ask" and "the observations are determined by final boundary conditions".

The first statement violates my assumption that all current observations should be describable and explainable by the same set of "consistent histories", if you allow me to use the Gell-Mann-Hartle et al. interpretation of quantum mechanics. The second statement disagrees with my basic assumptions about causality.

I had almost forgotten what Hawking's attitude to the anthropic principle was. Then I realized that Hawking has said something wise about it that was meant to sound neutral - but that was eventually identified by David Gross as the "extreme anthropic principle": "All parameters of the visible Universe are assumptions - and we should only consider the conditional probabilities assuming that the parameters such as the particle masses are what they apparently are."

(Conditional probabilities play an important role in the new paper, too, and the ideas could actually be closely related.)

"Consequently, the message for the experimentalists is: Please don't measure anything else. Every new number you measure will have to be added to our awkward anthropic list of assumptions," David Gross interpreted Hawking's approach to the vacuum selection issues. ;-)

#### snail feedback (7) :

Off topic - is there any particular reason that the link to Peter Woit's blog has disappeared from the list of links "Blogs led by science"???

Macromause

Yes, there is. ;-)

"All parameters of the visible Universe are assumptions - and we should only consider the conditional probabilities assuming that the parameters such as the particle masses are what they apparently are"

To clarify this, my Bayesian reading of it is as follows:

Pr(x | new data, old data) = Σ_params Pr(x | new data, params) · Pr(params | old data)

The graph representing this is

old data → params → x ← new data

where

1. “old data” is the (large) archive of data collected in the past.
2. “params” is the values of the physical constants (or whatever) that are used to explain the “old data”. “params” is used to summarise (or compress) all of the “old data” as Pr(params|old data).
3. “new data” is recently collected data that has not been used in step (1) or (2) above. “new data” is combined with “params” to obtain any prediction “x” as Pr(x|new data, params).
4. All of the alternative possible “params” are then summed over using Σparams. This means that you don't use a single sharp value for "params", but you consider all of the alternative possibilities. The Pr(params|old data) factor gives you a probability weighting for this summation, which may (or may not) be sharply peaked around a single value.

This sort of approach is standard in Bayesian inference, where you have so much historical data (i.e. “old data”) that you have to summarise it using Pr(params|old data), which is a much more convenient object to compute with. Of course, Pr(params|old data) needs a prior Pr(params).

This approach assumes that “params” is a sufficient set of statistics to capture all of the information in “old data”. In physics applications this is like assuming that you have not overlooked any of the physical constants.
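The four steps above can be sketched numerically. This is a minimal toy illustration of the marginalization, not anything from the paper: it uses a hypothetical one-parameter model (a coin's bias on a discrete grid) in place of the physical constants, with "old data" being 70 heads in 100 flips and the prediction x being "the next flip is heads" (so "new data" is empty for brevity).

```python
import math

def binom_pmf(k, n, p):
    """Binomial likelihood Pr(k heads | n flips, bias p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# Discrete grid of candidate "params" (the coin's bias).
grid = [i / 100 for i in range(1, 100)]

# Steps 1-2: summarise the old data as Pr(params | old data),
# starting from a uniform prior Pr(params) over the grid.
heads, flips = 70, 100
unnormalised = [binom_pmf(heads, flips, p) for p in grid]
total = sum(unnormalised)
posterior = [w / total for w in unnormalised]  # Pr(params | old data)

# Steps 3-4: predict x by summing over ALL alternative params,
# weighted by the posterior -- not by picking one sharp value.
# Here Pr(x | params) is simply the bias p itself.
pr_x = sum(p * w for p, w in zip(grid, posterior))

print(f"Pr(params | old data) sums to {sum(posterior):.3f}")
print(f"Pr(next flip is heads) = {pr_x:.3f}")
```

The posterior here happens to be sharply peaked near 0.7, so the sum over params barely differs from plugging in the single best value - but when the old data are sparse, the weighting over alternative params matters.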

... I see ... ehm, just curious, what might it be??? Support for terrorists???