Natalie Wolchover wrote a good article for the Simons Foundation.
At any rate, the system of ideas known as "naturalness" seems to be in marginal conflict with the experiments, and things may be getting worse. Roughly speaking, naturalness wants dimensionful parameters (masses) to be comparable to one another unless an enhanced symmetry appears when they are not comparable. But the Higgs boson is clearly much lighter than the Planck scale, and in 2015 the LHC may show (but doesn't have to show!) that there are no light superpartners that would help to make the lightness natural.
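To put a rough number on the hierarchy (using the standard values of the Higgs mass and the reduced Planck mass, purely for orientation):

\[ \frac{m_h^2}{M_{\rm Pl}^2} \sim \frac{(125\ {\rm GeV})^2}{(2.4\times 10^{18}\ {\rm GeV})^2} \sim 10^{-33}, \]

and naturalness asks why quantum corrections don't drag the tiny numerator up toward the huge denominator.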
The "agravity" approach, if true, eliminates these naturalness problems because in its scheme of things, there is no fundamental scale in Nature. One tries to obtain all the terms in the Lagrangian that carry dimensionful couplings from terms that carry no dimensionful couplings. "Agravity" is a solution to these problems that differs from both "naturalness" and the "multiverse" – a third way, if you wish.
Similar things have been tried before, e.g. by William Bardeen in 1995, but Strumia et al. are the first ones trying to add gravity. The claim is that one may obtain the Einstein-Hilbert action by a dynamical process in a theory whose terms include only four-derivative terms such as \(R^2\).
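Schematically (in my notation, which need not match the paper's conventions), the gravitational part of such a Lagrangian contains only dimensionless couplings \(f_0, f_2\),

\[ \mathcal{L}_{\rm gravity} \sim \sqrt{|g|}\left(\frac{R^2}{f_0^2} + \frac{R_{\mu\nu}R^{\mu\nu}}{f_2^2}\right), \]

and the Einstein-Hilbert term \(\sqrt{|g|}\,M_{\rm Pl}^2 R\) is supposed to be generated dynamically – e.g. when a scalar \(s\) with a dimensionless non-minimal coupling \(\xi s^2 R\) acquires a vacuum expectation value, so that something like \(M_{\rm Pl}^2 \sim \xi \langle s\rangle^2\) emerges without any dimensionful input.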
Aside from a novel solution of the problems with the hierarchies, it is claimed that the scenario may predict inflation with a spectral index and a tensor-to-scalar ratio nicely compatible with the BICEP2 results.
The most obvious problem is the ghosts. Terms like \(R^2\) may be rewritten in terms of propagating degrees of freedom whose squared norms (signs of the kinetic terms) are indefinite – some of them lead to proper positive probabilities while others produce pathological negative probabilities.
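The origin of the ghosts may be seen in a one-line partial-fraction identity. A four-derivative kinetic term yields a propagator of the schematic form (Euclidean signature for simplicity)

\[ \frac{1}{k^2\,(k^2+m^2)} \;=\; \frac{1}{m^2}\left(\frac{1}{k^2}-\frac{1}{k^2+m^2}\right), \]

and the relative minus sign between the two poles means that one of the two degrees of freedom enters with a wrong-sign residue – which is what produces the negative probabilities.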
I remember a 2001 Santa Barbara talk by Stephen Hawking about "how he befriended ghosts", with some pretty amusing multimedia involving ghosts hugging his wheelchair, so you should be sure that Strumia et al. aren't the first folks who want to befriend ghosts.
At this moment, ghosts look like a lethal flaw. But I can imagine that by some clever technical or conceptual tricks, this flaw could perhaps be cured. The physical probabilities could become positive if one chose some better degrees of freedom, or there could be a new argument why these negative probabilities are ultimately harmless for some reason I can't quite imagine at this moment.
However, my concerns about the theory go beyond the problem with the ghosts. I do think that the Planck scale has been made extremely important by the modern "holographic" research on quantum gravity. The Planck area defines the minimal area into which nontrivial information may be squeezed. It seems to be the scale that determines the nonlocalities and the breakdown of the usual geometric concepts. The Planck scale is the minimum distance at which a dynamical, gravitating space may start to emerge.
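This role of the Planck area is quantified by the Bekenstein-Hawking entropy,

\[ S_{\rm BH} = \frac{A}{4\,\ell_{\rm Pl}^2} = \frac{A\,c^3}{4\,\hbar G}, \]

which bounds the information in a region by its surface area measured in Planck units – so there is simply no room for sub-Planckian geometric detail to carry extra information.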
So if someone envisions some smooth, ordinary spacetime at ultratiny, sub-Planckian distances, he faces exactly the same difficulties – many of them lethal, I would say – as those mentioned in the context of Weinberg's asymptotic safety, which also envisions a scale-invariant theory underlying gravity at ultrashort distances.
There could be some amazing advance that cures these serious diseases, but such a cure remains wishful thinking at this point. We shouldn't pretend that the diseases have already been cured – even though one may use that proposition as a "working hypothesis" and a "big motivator" when doing research related to agravity. That's why I find the existing proposals for scale-invariant underpinnings of quantum gravity, including the agravity meme, very unlikely. Hierarchy-like problems, including the cosmological constant problem, may look rather serious, but they're still less serious than predicting negative probabilities for physical processes.