Recall that the Lagrangian of the Einstein-Gauss-Bonnet system is
L = 1 / (16 pi G) [ R + alpha (R_{abcd} R^{abcd} - 4 R_{ab} R^{ab} + R^2) ]

Besides the Einstein-Hilbert term, you can see the topological Gauss-Bonnet term multiplied by the coupling "alpha", which has the dimension of area. Because the pair creation of black holes involves a topology change, the last term matters and enhances the nucleation rate by the factor
Gamma = Gamma_orig exp (4 pi alpha / G)

The exponential enhancement factor becomes huge if the Gauss-Bonnet coupling "alpha" is much bigger than the Planck area "G". That's expected to be the case even in perturbative string theory, where "alpha" is comparable to the squared string scale, or at least Maulik says so. When the enhancement is large, you should care about the original decay rate,
Gamma_orig = exp (-pi L^2 / 3G)

where L is the curvature radius of the de Sitter space. Without the alpha-enhancement, this rate would be negligible for any de Sitter space that is visibly bigger than the Planck scale.
However, with the alpha-enhancement, the decay rate can become significant. For an inflating Universe, the Hubble radius "1/H" has to be greater than "sqrt(12 alpha)"; otherwise the instanton creates lots of black holes, which are probably unhealthy for the inflationary mechanism. (Indeed, with L = 1/H, the two exponents above cancel exactly when L^2 = 12 alpha.) In the example above, this means that the Hubble radius must exceed the string scale (with a particular numerical prefactor). That doesn't sound like too dramatic a constraint, but because the inflation scale is often close to the string scale, it could be nontrivial.
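The interplay of the two exponents can be checked numerically. Here is a minimal sketch in Planck units (G = 1); the function names and the sample value of alpha are my own illustrative choices, not taken from any paper:

```python
import math

G = 1.0  # Planck area, in Planck units

def nucleation_rate(L, alpha, G=G):
    """Black-hole pair-nucleation rate in de Sitter space of curvature
    radius L, including the Gauss-Bonnet enhancement:
    Gamma = exp(-pi L^2 / 3G) * exp(4 pi alpha / G)."""
    log_gamma = -math.pi * L**2 / (3 * G) + 4 * math.pi * alpha / G
    return math.exp(log_gamma)

def hubble_radius_bound(alpha):
    """Inflation avoids copious black-hole production only if the
    Hubble radius 1/H exceeds sqrt(12 alpha): the radius at which
    the two exponents in nucleation_rate cancel."""
    return math.sqrt(12 * alpha)

# illustrative (made-up) value: alpha = 100 in units of G
alpha = 100.0
print(hubble_radius_bound(alpha))  # ~34.64 Planck lengths

# below the bound the rate is unsuppressed, above it it is tiny
print(nucleation_rate(30.0, alpha) > 1.0)
print(nucleation_rate(40.0, alpha) < 1.0)
```

The bound sqrt(12 alpha) is just the radius where the Gauss-Bonnet enhancement exp(4 pi alpha / G) stops winning against the de Sitter suppression exp(-pi L^2 / 3G).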
Of course, it would be even more interesting to discover that there is a new, unexpectedly huge contribution to the Gauss-Bonnet term that makes "alpha" close to the squared neutrino Compton wavelength. If this were the case, one could derive a constraint on the cosmological constant. ;-) Such a huge alpha is probably impossible but it would be fun if there were one.
There could exist similar enhancements and instabilities of this kind - and maybe their higher-dimensional counterparts - that could eliminate many kinds of compactifications with too small radii, too complicated topologies, and so on. Quantum cosmologists should try to study these possibly neglected mechanisms intensely.
By the way, this is related to one point that I dislike about the current approach of the anthropic people. For most features of the Universe, they can't find any sufficiently strong and accurate anthropic constraint. But if they can "explain" something using anthropic reasoning, they're satisfied. This is fundamentally unscientific thinking because one should always try to find "all" conceivable constraints - and the "other solutions" (such as the black hole creation) could actually be more important, more stringent, more predictive, and closer to the truth than the ones that the anthropic people "guess" by chance.
ISS with NS5-branes
By the way, the second hep-th paper is also interesting, and it is also about vacuum selection. Kutasov, Lunin, McOrist, and Royston study the landscape of vacua obtained from D4-branes (and other D-branes) stretched between NS5-branes. They end up with an Intriligator-Seiberg-Shih-like SUSY-breaking setup and argue that the early cosmology pushes the Universe towards a particular SUSY-breaking local minimum.