As I wrote e.g. here ten years ago, I consider Weinberg's "Asymptotic Safety" paradigm in quantum gravity to be a deep misunderstanding.

The program basically wants to ignore the non-renormalizability of Einstein's equations, as well as the special "non-local" phenomena discovered in recent decades, including holography. Instead, it wants to treat Einstein's equations as if they were on par with QCD and the theory became asymptotically free (or, more precisely, asymptotically safe: having finite couplings but vanishing beta-functions) at high energy scales – a fixed point which could determine the theory at lower energy scales, too.

There are good reasons why – I am convinced – the people who have understood the largely irreversible lessons of the string-theory-related research of quantum gravity in recent decades agree with me that Asymptotic Safety sharply contradicts some very qualitative and clear principles that have been pretty much safely established by string theorists and their proxies – especially lessons about black holes, their microstates, and their information (and things like AdS/CFT).

Among other examples, I find it extremely likely that Asymptotic Safety cannot be reconciled with the ER=EPR (wormhole-entanglement) correspondence; it doesn't seem to have the natural capacity to change the spacetime topology and/or to treat the Hilbert spaces on two topologies as equivalent. The vague general axioms of Asymptotic Safety contradict the observation that the spectrum at trans-Planckian masses, or formally sub-Planckian distances, is "very nonlocal" in quantum gravity. The Asymptotic Safety program simply wants to make gravity look much closer to the non-gravitational, local field theories than gravity actually is.

However, there may also exist more old-fashioned reasons why the Asymptotic Safety papers are simply wrong. Aside from 270+ other papers, John Donoghue of the University of Massachusetts has written some renowned papers about the renormalization of Einstein's equations as an effective field theory. So you know, he knows something about the right and wrong technical treatment of the quantum loops and divergences and what they may be attributed to.

Today, he released the hep-th paper

A Critique of the Asymptotic Safety Program

where he argues that the bulk, if not the totality, of the real-world papers on Asymptotic Safety (those build on a 1992–93 paper by Christof Wetterich; another defining man of the real-world program is Martin Reuter) suffers from some general bugs that render all of them wrong. Donoghue is open-minded about the possibility that some better, correct papers – papers from a hypothetical idealized world of Asymptotic Safety – could be written, but no one has done so.

The papers that have actually been written in recent years mainly assume some naive running of the cosmological constant and Newton's constant, \(\Lambda(k)\) and \(G_N(k)\). Donoghue argues that this running cannot be linked to actual processes in which the energy of external gravitons is changing. Instead, in the real-world Asymptotic Safety papers, this running is derived from tadpole diagrams which should be considered zero (e.g. because of the dimensional regularization) and which naturally know nothing about the running in physical processes (those that actually involve particles with variable energies). The Asymptotic Safety folks prefer dubious cutoff calculations in which they suppress the low energies, not the high energies as normal cutoffs do. The first reason for the word "dubious" is that the results should generally not depend on the technical details of the chosen regularization; the second reason is that the long distances never become "negligible". Also, the authors of the real-world papers neglect some obstacles that render the Euclidean-to-Minkowski rotation invalid; and they seem to neglect all quantum effects below the scale \(k\).
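To be concrete about what these papers claim: in the Einstein–Hilbert truncation that most of them use, one keeps only two couplings and makes them dimensionless (a standard parametrization; the schematic beta-functions below suppress the scheme-dependent quantum terms):

\[
g(k) = k^2\, G_N(k), \qquad \lambda(k) = \frac{\Lambda(k)}{k^2},
\]

\[
\beta_g = k\frac{\partial g}{\partial k} = \bigl(2 + \eta_N(g,\lambda)\bigr)\, g, \qquad \beta_\lambda = k\frac{\partial \lambda}{\partial k} = -2\lambda + (\text{quantum terms}),
\]

and the advertised non-Gaussian fixed point is a simultaneous zero, \(\beta_g(g_*,\lambda_*)=\beta_\lambda(g_*,\lambda_*)=0\) with \(g_*\neq 0\), which requires the anomalous dimension to reach \(\eta_N = -2\) there. Everything hinges on this two-coupling truncation being trustworthy.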

Donoghue is also open-minded about some "fixed", idealized-world Asymptotic Safety papers, but even such hypothetical papers are probably doomed because they either 1) make the Lorentzian-Euclidean transition impossible, or 2) contain ghosts and/or tachyons produced by any truncation.

I have repeatedly asked an elementary question: How do we correctly account for the running of Newton's constant and the cosmological constant? And I have also proposed a "solution" of the cosmological constant problem boiling down to its renormalization group running: its extremely low-energy limit is what the cosmologists observe, and it must unavoidably be comparable to the fourth power of the lightest massive particle's mass (a neutrino mass) because that is the energy below which the running should stop.
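A quick numerical sanity check of the scales involved – a rough back-of-the-envelope sketch, assuming Planck-like values for \(H_0\) and \(\Omega_\Lambda\) (the specific numbers are my illustrative inputs, not from Donoghue's paper) – confirms that the fourth root of the observed dark-energy density sits at a few meV, i.e. in the general neighborhood of the lightest neutrino mass scales:

```python
import math

# Assumed cosmological inputs (approximate Planck-like values)
H0_km_s_Mpc = 67.4        # Hubble constant, km/s/Mpc
Omega_Lambda = 0.69       # dark-energy fraction of the critical density

# Physical constants
G = 6.674e-11             # Newton's constant, m^3 kg^-1 s^-2
c = 2.998e8               # speed of light, m/s
hbar_c_eV_m = 1.97327e-7  # hbar*c in eV*m
eV_J = 1.602e-19          # joules per eV
Mpc_m = 3.086e22          # meters per megaparsec

H0 = H0_km_s_Mpc * 1e3 / Mpc_m               # Hubble constant in s^-1
rho_crit = 3 * H0**2 / (8 * math.pi * G)     # critical density, kg/m^3
rho_L = Omega_Lambda * rho_crit * c**2       # dark-energy density, J/m^3
rho_L_eV4 = (rho_L / eV_J) * hbar_c_eV_m**3  # same density in eV^4 units
scale_eV = rho_L_eV4 ** 0.25                 # fourth root: the CC scale

print(f"dark-energy scale ~ {scale_eV * 1e3:.1f} meV")  # ~2 meV
```

The output, roughly 2 meV, is indeed within an order of magnitude or two of the meV–0.1 eV window where the neutrino masses are believed to live, which is what makes the running-stops-at-the-lightest-mass proposal numerically plausible.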

Those are intriguing comments, but I think that, rather generally, it is simply inadequate to describe the quantum behavior by the running of several constants such as \(\Lambda(k)\) and \(G_N(k)\). The real reason why this truncation is wrong is that Einstein's gravity is non-renormalizable, so any kosher treatment unavoidably includes infinitely many parameters, including the coefficients of all the higher-derivative terms and similar stuff.
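Schematically, any kosher effective treatment involves the full tower of operators (the coefficient names \(a_i\) are just illustrative labels, and the sign conventions are one common choice):

\[
\Gamma_k = \int d^4x\,\sqrt{-g}\left[\frac{R - 2\Lambda(k)}{16\pi G_N(k)} + a_1(k)\,R^2 + a_2(k)\,R_{\mu\nu}R^{\mu\nu} + a_3(k)\,\frac{R^3}{M_{\rm Pl}^2} + \ldots\right],
\]

where infinitely many coefficients \(a_i(k)\) are generated by the loops, and near the Planck scale none of them is parametrically suppressed relative to the Einstein–Hilbert terms.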

All these terms could be negligible at some very weak coupling – which is assumed to emerge at very high energies in this program – but to get there, we must pass through the Planck-scale energies where the higher-derivative terms simply must have an impact comparable to the lowest-derivative terms. I personally don't see any valid evidence that there is any consistent gravity-like description of the metric field at the limiting trans-Planckian energies (with finite couplings and vanishing beta-functions; those "nice traits" seem to be pure wishful thinking in every respect). However, even if such a description existed, you can only get there from the – experimentally relevant – low energies by crossing the Planck scale where the whole infinite collection of terms is important.

More generally, the lore that the non-renormalizability of the general theory of relativity makes it impossible to calculate quantities near the Planck scale from Einstein's equations themselves – from their unique and straightforward "quantization" – seems to be correct, and all explicit enough "loopholes" that have been proposed to circumvent this lore are wrong for reasons that are as technically well-defined as Donoghue's critical paper.

The running of the gravity-related constants induced by loops is a complicated enough industry, and many people may very well fool themselves, others, and their sponsors by continuing the production of papers all of which demonstrably suffer from some well-known mistakes. I think that it is the responsibility of an honest researcher not to be employed as the producer of wrong papers; and it is the sponsors' responsibility to make sure that this isn't happening in situations in which the recipients of the grants are *not* honest.
