Because of this very general definition, it should be clear that physics cannot be done without observables. Any theory that is to be evaluated scientifically must have some observables, and these observables are compared, more or less directly, with the outcomes of experiments.
There are only two known frameworks in physics for how observables are computed, how they evolve in time, and how they are interpreted and compared with measurements. The two frameworks are
- classical physics
- quantum mechanics
The first one arises as a special case of the second one. And the second one, quantum mechanics, is how the real world works. Everything indicates that there exists no other interpretation of the observables that could be even remotely compatible with the qualitative features of the real, quantum world.
The word "observable" only became popular when quantum mechanics started to dominate.
In classical physics, observables are the basic degrees of freedom - such as coordinates and momenta of particles and/or values of the electric and magnetic fields at various points - or any functions of these variables - such as the angular momentum or the energy density integrated over the Solar system.
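As a trivially concrete sketch (made-up numbers, units suppressed), a classical observable is nothing but a function of the basic phase-space variables:

```python
import numpy as np

# Hypothetical single particle: position r, momentum p, mass m.
# These are the basic degrees of freedom of the classical theory.
r = np.array([1.0, 0.0, 0.0])
p = np.array([0.0, 2.0, 0.0])
m = 1.5

# Derived observables are just functions of those variables.
angular_momentum = np.cross(r, p)      # L = r x p = (0, 0, 2)
kinetic_energy = p @ p / (2.0 * m)     # another function of the same data

print(angular_momentum)
print(kinetic_energy)
```

Anything you can build this way — the energy density integrated over a region, a component of the angular momentum — is equally legitimate as an observable.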
One of the postulates of quantum mechanics is that these observables are upgraded to operators on a Hilbert space. It is the eigenvalues of these operators that are measured, and the probability of various outcomes is determined as the absolute value of a certain complex amplitude squared. In ordinary non-gravitational quantum systems, we can typically define a concept of time and the complex amplitudes (the wavefunction) evolve according to the Schrödinger equation. The wavefunction itself is not an observable; it is not an operator and moreover its values are not directly measurable. The previous sentence is partially deep and partially an irrelevant piece of terminology.
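A minimal numerical illustration of this postulate, with a made-up Hermitian matrix standing in for an observable on a three-dimensional Hilbert space:

```python
import numpy as np

# A made-up Hermitian "observable" and an arbitrary normalized state.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, -2.0]])
psi = np.array([1.0, 1.0j, 0.5])
psi = psi / np.linalg.norm(psi)

# The measurable outcomes are the eigenvalues of the operator,
eigenvalues, eigenvectors = np.linalg.eigh(A)

# and each outcome occurs with probability |<a_n|psi>|^2: the squared
# absolute value of the complex amplitude (the Born rule).
amplitudes = eigenvectors.conj().T @ psi
probabilities = np.abs(amplitudes) ** 2

print(probabilities)         # non-negative, and they sum to one
```

The expectation value assembled from these probabilities, `sum_n p_n a_n`, reproduces `<psi|A|psi>`, which is one quick consistency check on the rule.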
Equivalently, the complex amplitudes may be directly calculated using Feynman's path integral. Equivalently, the wavefunction may be kept constant and the operators evolve in time - this is how it works in the Heisenberg picture. It differs from the Schrödinger picture much like two differently rotating systems of coordinates differ from one another.
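The equivalence of the two pictures can be checked numerically in a toy two-level system (made-up Hamiltonian, units with hbar = 1):

```python
import numpy as np

# Toy two-level system with a made-up Hamiltonian.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])          # the observable we measure (sigma_x)
psi0 = np.array([1.0, 0.0], dtype=complex)
t = 0.7

# Build the evolution operator U = exp(-i H t) by diagonalizing H.
w, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T

# Schroedinger picture: the state evolves, the operator is fixed.
psi_t = U @ psi0
schrodinger = np.real(psi_t.conj() @ A @ psi_t)

# Heisenberg picture: the state is fixed, the operator evolves.
A_t = U.conj().T @ A @ U
heisenberg = np.real(psi0.conj() @ A_t @ psi0)

print(schrodinger, heisenberg)      # the two pictures agree
```

All expectation values — i.e. everything measurable — come out identical, which is why the choice between the pictures is a matter of convenience, like the choice between two rotating coordinate systems.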
In field theory, the number of independent observables becomes infinite: at each point of spacetime, we find an observable. This fact is true both in classical field theory and in quantum field theory. Nevertheless, the principles are qualitatively unchanged. Although the number of observables is infinite, they follow very much the same rules as in ordinary classical or quantum mechanics. There are some technical issues involving distributions, divergences, regularization, renormalization etc., but they do not change the basic setup.
Observables in quantum gravity
Things are very different in general relativity because there is no longer a preferred definition of time. You may say that the same thing was already true in systems that respect special relativity. Yes, these systems have many notions of time related by Lorentz transformations, but each of them may be adopted to give us the same kind of time evolution as we had in non-relativistic physics.
However, this is not the case in general relativity. Why? It is because the set of different choices of the time coordinate forms an infinite-dimensional space. The symmetry that relates these different descriptions is a local symmetry - analogous to gauge symmetries in Yang-Mills theory - namely the group of diffeomorphisms. In classical physics, we know exactly what we are doing: general relativity allows us to calculate the metric as long as we realize that the coordinates are not directly physical because they can be redefined in an arbitrary way.
In quantum mechanics, general relativity becomes more subtle. It is because the physical states must be invariant under gauge symmetries. The diffeomorphism transformations mentioned above also include time translations. This seems to imply that the wavefunction is time-independent, as long as we compute the time dependence with all the required terms. The fact that the overall generator of time translations "H" annihilates the full wavefunction is known as the Wheeler-DeWitt equation.
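Schematically, the Wheeler-DeWitt equation states that the total Hamiltonian constraint annihilates the wavefunctional of the spatial metric and the matter fields:

```latex
\hat{H}\,\Psi[g_{ij},\phi] = 0
```

Compare this with the ordinary Schrödinger equation $i\hbar\,\partial_t\Psi = \hat{H}\Psi$: because the right-hand side vanishes, the physical wavefunction carries no dependence on the coordinate time.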
You may think that this is a complete disaster because the state of the Universe is surely not time-independent. Well, it is not a complete disaster. It is because the diffeomorphisms that change the asymptotic structure of spacetime do not have to keep the wavefunction invariant. Only the diffeomorphisms that mostly act "inside the spacetime" have to keep the wavefunction invariant. Because the other diffeomorphisms may change the wavefunction, the wavefunction is allowed to have a nonzero ADM energy and similar features. These observables are calculated from the behavior of the metric and other fields at infinity.
Well, if there is some infinity.
In Minkowski space, we can safely define the energies according to the strategy above, and we may also determine the time evolution, but only from the infinite past to the infinite future. If these infinities really appear in the far past and the far future, we call the evolution operator the "S-matrix". String theory allows us to calculate the S-matrix (another example of what we do call an "observable") for all particles in the spectrum, which includes the scattering of gravitons. We don't have to insert our knowledge about the problematic "bulk" observables: string theory automatically tells us not only the right answers but also the right questions. "It is the S-matrix you should calculate, silly," she says. It also tells us what the corresponding evolution observables for anti de Sitter space are.
Someone may therefore convince you that the S-matrix is the only observable that has any physical meaning in a quantum theory of gravity. This sentence is both deep, if an appropriate interpretation is adopted, and discouraging.
It is deep because we know that the local Green's functions and other local objects that are calculable in quantum field theories do not have a well-defined meaning in quantum gravity, at least not if we insist that they be nicely "covariant". And it is also deep because the S-matrix actually contains a lot. If we study particle scattering, the interaction region is typically so much smaller than our detectors that the interior of the detector may be approximated by an infinite spacetime, and the S-matrix contains all measurable information about the system. Even if we study messier systems, the S-matrix actually knows a lot about physics, including the low-energy effective theory, and these insights may be combined with other pictures to determine the behavior of "less empty" configurations of matter, too.
On the other hand, such an S-matrix picture is almost certainly inappropriate for early cosmology. The very beginning of the Universe is not described by an S-matrix, at least not the type of S-matrix we know. When the Universe was small, the S-matrix was useless, too. How do we deal with these problems?
One of the obvious remedies to get rid of the negative consequences of the local diffeomorphism group is to gauge-fix it. For example, you can go to the light-cone gauge. String theory in the Minkowski backgrounds may also be written in the light-cone gauge. The result is that after such a gauge-fixing, the theory effectively becomes an ordinary theory analogous to the non-gravitational quantum field theories, and one can ask the local questions as well as questions about what happens when the system evolves by a finite time interval. In reality, we rarely ask such questions - and even in the light-cone gauge, we continue to compute primarily the S-matrix - but in principle, you could ask them.
However, the light-cone gauge is also highly inappropriate for the context of cosmology, the creation of the Universe, and its early moments - because it relies on the existence of a null Killing vector that definitely disappears near the Big Bang. What can you do? Maybe we should look for other types of gauge-fixing that are more appropriate for approximately homogeneous and isotropic backgrounds that change rapidly with time. It's almost guaranteed that these other types of gauge-fixing will be messier.
The loop quantum gravity people would offer you entirely different observables, such as the areas of two-dimensional surfaces in space. I am convinced that all these things are completely unphysical. Some speculations about the discrete spectrum of the areas, even if they were right - which I am equally sure is not the case - would be completely disconnected from the cosmic microwave background, distribution of galaxies, or any other indirect observation of the early Universe that we have done - and probably any observation that we will ever do. It's just not physics. Well, so far my statement seems completely obvious because the notions of loop quantum gravity seem to be disconnected from anything that looks like space in the first place. ;-)
Should we care about the problems?
You see that there are great difficulties in going from geometries with nice behavior at infinity - such as the Minkowski and AdS spaces - to geometries without such nice behavior, such as compact topologies in cosmology. Should we care about these problems?
My answer to this question is mixed. I am sure that we should eventually describe the early cosmology in a quantum language - because these events definitely have had observable consequences. On the other hand, my feeling is that almost everything that has been speculated about these non-asymptotic observables in quantum gravity has been largely physically irrelevant.
As discussed above, a great deal of the local behavior is encoded in the S-matrix. Also, one can typically write down a coupled system involving a classical background metric and other quantum fields propagating in this background. Non-gravitational quantum field theory defined in a curved background is a pretty well-established framework that does not differ much from regular quantum field theory on flat backgrounds. And classical general relativity is well-established, too. And this combination continues to be sufficient for many questions.
However, it is not quantum gravity, and it is guaranteed to break down when the quantum effects start to play a big role for gravity. The previous approximate picture definitely breaks down when the Universe is one Planck time from the Big Bang. And it probably gives misleading predictions for GUT-scale inflation and other things, too.
Ignorance about the early Universe
There are several types of insights about the early Universe that we are still lacking.
I am talking about the early Universe because I am aware of no evidence that would support the idea of a big pre-history prior to the Big Bang, or the statements that such a pre-history added to our picture of the Universe explains something or is scientifically natural or inevitable. This is why I apply Occam's razor to these ideas. Of course, we may eventually revise them if some evidence appears.
Some of them are related to our usual problem with vacuum selection. Just like we don't know which "Standard-like Model" in the broad selection offered by string/M-theory is the right one, we also do not know which "generalized inflationary model" is the right one. These two things come in the same package. Of course, if we located the correct place in the "landscape", we would probably answer all of these questions simultaneously.
Inflation destroys most of the information about the initial conditions of the Universe at the beginning of the inflation (or before it). That's a good thing because it makes much of our ignorance irrelevant. But you may still expect that some general features of these initial conditions won't be inflated away; the choice of the compactification is an obvious example. The initial conditions are likely to remain something that influences our Universe and our current observations, and should therefore be a topic addressed by physics sometime in the future. The framework of the Hartle-Hawking wavefunction remains the only set of rules that makes some sense to me in this respect, and there remain questions about how to embed it in string theory.
There have been many speculations about "solving the problems" with the Big Bang that avoid the concept of the Hartle-Hawking wavefunction. None of these speculations makes sense to me, as far as I remember. Why? Because none of them says that the initial conditions are unique, or what they are. In fact, the word "unique" is far too strong. These speculations in fact do not impose any constraints on the initial conditions whatsoever. They typically say that they can define a "nice" theory, but this theory treats all conceivable initial conditions on a par. From a physical perspective, this is exactly equivalent to having a singularity where everything becomes unpredictable.
Occam's razor dictates that we should imagine that the Universe starts at time "T" with initial conditions that lead to a Universe matching the observations, and we should try to make "T" as small as possible given the constraint that the initial conditions remain natural. In other words, we should never try to imagine that we describe an epoch before "T" unless our new description makes the whole picture more reasonable and its predictions more likely and natural. In other words, we should not pretend that we understand something even though we can't say anything new about it.
The only reason why physicists may be interested in the very early Universe, its initial conditions, and a theory that describes them is that these conditions influence observations that we can make today and in the future - and the only role of the theory is to tell us something about these observations. This is only possible if the initial conditions are unique or at least heavily constrained; if a quantum theory of the Big Bang does not offer us any constraints of this kind - and, instead, tells us that "anything goes" - its value for physics is exactly equal to zero.
There seems to be some virtual reality in the investigation of various causally convoluted black hole geometries etc. As far as I am concerned, the four-dimensional black holes of the usual types (Schwarzschild, Reissner-Nordström, Kerr, Kerr-Newman) are the only ones that are relevant for physics. Sometimes we have trouble convincing some of our colleagues even of this modest statement, but that's a different issue.
All other black holes we theoretically encounter are toy models that are only relevant for natural science if they share their qualitative features with the observed black holes. Most of the time, it is easier to calculate with supersymmetric black holes in various spacetime dimensions etc. because they are more mathematically elegant and controllable. And of course, mathematical elegance and new relations between mathematical objects are a great thing, too. But the results are physically relevant only if they have some counterpart in the world of observable physics and if we can actually argue that they are in the same "universality class". It has been absolutely great to learn that the quantum picture of the black holes is internally consistent. But a huge amount of new insights about unphysical configurations in gravity has been developed - a much bigger amount than what we will ever be able to observe.
This fact makes it harder to argue that these insights are "physics" in the most conventional sense. They are "physics" as understood by string theory - which also includes all other configurations in all other backgrounds allowed by string theory. And in some speculative cases, it also includes configurations that are not guaranteed to be allowed by string theory once it's properly understood. Well, it's a hard task to find something that is directly relevant for some unexplained observations because there are almost no unexplained observations available.
But this situation should not obscure the difference between physics - which always needs observables - and philosophy, which can do without them. As a well-known cosmologist said, physics depends on the fine balance between cold and boring experimental data and hot and exciting theoretical speculations. Whenever this balance is lost, physics degenerates either into botany or into philosophy. ;-)
In some sense, this observation has its counterpart in high-energy theoretical physics. In the long run, high-energy theoretical physics depends on the fine balance between the bottom-up approach and the top-down approach. If the balance is broken, high-energy theoretical physics either degenerates into phenomenology (in the insulting sense) that only describes what we already know and that makes new correct predictions purely by chance; or it degenerates into philosophy or religion that takes string theory captive.