Abstract

Depending on the value of the Higgs mass, the Standard Model acquires an unstable region at large Higgs field values due to RG running of couplings, which we evaluate at 2-loop order. For currently favored values of the Higgs mass, this renders the electroweak vacuum only metastable with a long lifetime. We argue on statistical grounds that the Higgs field would be highly unlikely to begin in the small field metastable region in the early universe, and thus some new physics should enter in the energy range of order of, or lower than, the instability scale to remove the large field unstable region. We assume that Peccei-Quinn (PQ) dynamics enters to solve the strong CP problem and, for a PQ-scale in this energy range, may also remove the unstable region. We allow the PQ-scale to scan and argue, again on statistical grounds, that its value in our universe should be of order of the instability scale, rather than (significantly) lower. Since the Higgs mass determines the instability scale, which is argued to set the PQ-scale, and since the PQ-scale determines the axion properties, including its dark matter abundance, we are led to a correlation between the Higgs mass and the abundance of dark matter. We find the correlation to be in good agreement with current data.

1. Introduction

Recent LHC results are consistent with the predictions of the Standard Model, including the presence of a new boson that appears to be the Higgs particle with a mass $m_h \approx 125$ GeV [1, 2] (more recent measurements are summarized in [3]). With the Higgs at this mass, the Standard Model is well behaved up to very high energies if we evolve its couplings under the renormalization group (RG) equations. By no means does this imply that the Standard Model will be valid to these very high energies, and in fact there are good phenomenological reasons, such as dark matter, the strong CP problem, baryogenesis, inflation, and the hierarchy problem, to think it will be replaced by new physics at much lower energies, say $\mathcal{O}$(TeV). But it is logically possible, albeit unlikely, that the Standard Model, or at least the Higgs sector, will persist to these very high energies and the explanation of these phenomena will be connected to physics at these high, or even higher, energy scales.

So at what energy scale must the Standard Model break down? Obviously new physics must enter by the Planck scale, where quantum gravity requires the introduction of new degrees of freedom. However, the RG running of the Higgs self-coupling $\lambda$ can dictate the need for new physics at lower energies, depending on the starting value of $\lambda$. The Higgs mass is related to the self-coupling by $m_h^2 = 2\lambda v^2$, where the Higgs VEV is $v \approx 246$ GeV. For moderate to high values of the Higgs mass, the initial value of $\lambda$, defined at energies of order of the electroweak scale, is large enough that it never passes through zero upon RG evolution. On the other hand, for small enough values of the Higgs mass, the self-coupling passes through zero at a sub-Planckian energy, which we denote by $\Lambda_I$, primarily due to the negative contribution to the beta function from the top quark, and the potential acquires an unstable region at large field values [4, 5]. The latter occurs for a light Higgs, as has been observed. One finds that this renders the electroweak vacuum only metastable, with a long lifetime. However, we will argue in this paper that it is highly unlikely for the Higgs field in the early universe to begin in the metastable region, as that would require relatively small field values as initial conditions. Instead it would be much more likely to begin at larger field values, placing it in the unstable region. Hence, the energy scale $\Lambda_I$ sets the maximum energy scale for new physics beyond the Standard Model to enter.

There are many possible choices for the new physics. One appealing possibility is supersymmetry, which alters the running of the Higgs self-coupling due to the presence of many new degrees of freedom, likely entering at much lower energies, conceivably $\mathcal{O}$(TeV) or so. In addition to possibly stabilizing the Higgs potential, supersymmetry can also alleviate the hierarchy problem, improve unification of gauge couplings, and fit beautifully into fundamental physics such as string theory. So it is quite appealing from several perspectives. It is conceivable, however, that even if supersymmetry exists in nature, it is spontaneously broken at very high energies, and in such a scenario we would be forced to consider other possible means to stabilize the Higgs potential.

One intriguing possibility that we examine in this paper is to utilize dynamics associated with the solution of the strong CP problem: the problem that the CP violating term in the QCD Lagrangian is experimentally constrained to have coefficient $\theta \lesssim 10^{-10}$, which is highly unnatural. The leading solution involves new Peccei-Quinn (PQ) dynamics [7], involving a new complex scalar field and a new global symmetry that is spontaneously broken at some energy scale $f_{PQ}$. This leads to a new light scalar field known as the axion [8, 9]. Since it is bosonic, the field adds a positive contribution to the effective quartic coupling $\lambda$ of the Higgs, potentially removing the unstable region, depending on the scale $f_{PQ}$. This elegant mechanism to remove the unstable region was included in the very interesting reference [10], where this and other mechanisms were discussed, and was a source of motivation for the present work (also related are [11–13]).

In the present paper, we would like to take this elegant mechanism for vacuum stability and push it forward in several respects. Firstly, as already mentioned, we will argue on statistical grounds why the metastable vacuum requires stabilization. Secondly, we will allow the PQ-scale to scan and argue, again on statistical grounds, why it should be of order of the instability scale $\Lambda_I$, rather than orders of magnitude lower. Finally, we will furnish a correlation between the Higgs mass and the axion dark matter abundance and use the latest LHC [1, 2] and cosmological data [6] to examine the validity of this proposal. The outcome of this series of arguments and computations is presented in Figure 1, which is the primary result of this work.

The outline of our paper is as follows. In Section 2, we examine the running of the Standard Model couplings at 2-loop order. In Section 3, we examine the metastability of the Standard Model vacuum and argue that it is statistically unfavorable for the Higgs to begin in this region. In Section 4, we include Peccei-Quinn dynamics to remove the Higgs instability and argue why $f_{PQ} \sim \Lambda_I$. In Section 5, we relate the PQ-scale to the axion dark matter abundance, which furnishes a correlation between the Higgs mass and the abundance of dark matter. Finally, in Section 6, we compare the correlation to data and discuss our results.

2. Standard Model RG Evolution

We begin with a reminder of the structure of the Higgs sector of the Standard Model. The Higgs field $H$ is a complex doublet with Lagrangian
\[ \mathcal{L} = |\partial_\mu H|^2 - \lambda\left(|H|^2 - \frac{v^2}{2}\right)^2. \]
In the unitary gauge, we expand around the VEV as $H = (0, (v+h)/\sqrt{2})^T$, where in our convention $v \approx 246$ GeV. The associated Higgs mass is
\[ m_h = \sqrt{2\lambda}\, v \]
in terms of the starting value of $\lambda$, normally defined around the $Z$ boson mass. At higher energies, the self-coupling undergoes RG evolution due to vacuum fluctuations from self-interaction, fermion interactions, and gauge interactions. Defining $\beta_\lambda \equiv d\lambda/dt$ with $t \equiv \ln(\mu/m_t)$, the associated 1-loop beta function (suppressing external leg corrections for now) is
\[ 16\pi^2 \beta_\lambda = 24\lambda^2 + 12\lambda y_t^2 - 6 y_t^4 - 3\lambda\left(3g^2 + g'^2\right) + \frac{3}{8}\left[2g^4 + \left(g^2 + g'^2\right)^2\right], \]
where the only fermion Yukawa coupling we track is that of the top quark, $y_t$, since it is by far the largest. For sufficiently large Higgs mass, the positive self-interaction term is large enough to keep the beta function positive, or only slightly negative, so that $\lambda$ avoids running negative at sub-Planckian energies. For sufficiently small Higgs mass, the negative top quark contribution can dominate and cause the beta function to go negative, in turn causing $\lambda$ to pass through zero at a sub-Planckian energy, which we denote by $\Lambda_I$. The top quark Yukawa coupling itself runs towards small values at high energies, with 1-loop beta function
\[ 16\pi^2 \beta_{y_t} = y_t\left(\frac{9}{2}y_t^2 - 8g_3^2 - \frac{9}{4}g^2 - \frac{17}{12}g'^2\right), \]
which is quite sensitive to the value of the strong coupling $g_3$. To compute the evolution of couplings and the quantity $\Lambda_I$ accurately, we do the following: (i) starting with couplings defined at the $Z$ mass, we perform proper pole matching and running up to the top mass, (ii) we include external leg corrections (and the associated wave function renormalization), (iii) we simultaneously solve the 5 beta function differential equations for the 5 important couplings $\{\lambda, y_t, g_3, g, g'\}$, and (iv) we include the full 2-loop beta functions for the Standard Model; these are discussed in the Appendix (see [14, 15] for more information).
In our numerics, we use particular values of the couplings $\{\lambda, y_t, g_3, g, g'\}$ derived from the best fit electroweak values. In our final analysis, we will allow for three different values of the top mass $m_t$, namely, the central value and its 1-sigma variations, and we will explore a range of Higgs masses $m_h$.

Performing the RG evolution leads to the energy dependent renormalized coupling $\lambda(\mu)$. A plot of $\lambda(\mu)$ is given in Figure 2 for three Higgs mass values (lower, middle, and upper curves, in order of increasing $m_h$), with the top mass fixed to its central value. This shows clearly that for the lighter Higgs masses the coupling passes through zero at a sub-Planckian energy scale and then remains negative. Furthermore, since the coupling runs only logarithmically slowly with energy, the value of $\Lambda_I$ can change by orders of magnitude if the starting values of the couplings change by relatively small amounts. The domain $\lambda < 0$ involves a type of “attractive force" with negative potential energy density, as we now examine in more detail.
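To make this concrete, the following is a minimal, self-contained 1-loop sketch of the running (hand-rolled RK4, pure Python). The boundary values for $y_t$, $g_3$, $g$, $g'$ at $\mu = m_t$ are representative values quoted from the literature rather than the carefully matched inputs of our analysis, and the function name `instability_scale` is ours; since this omits pole matching and the 2-loop terms, it underestimates $\Lambda_I$ and is for illustration only.

```python
import math

# Couplings packed as y = [lam, yt, g3, g2, g1], with g1 the hypercharge
# coupling g' (non-GUT normalization); t = ln(mu / m_t).
def beta(y):
    lam, yt, g3, g2, g1 = y
    k = 1.0 / (16.0 * math.pi**2)
    b_lam = k * (24*lam**2 - 6*yt**4
                 + 0.375*(2*g2**4 + (g2**2 + g1**2)**2)
                 + lam*(12*yt**2 - 9*g2**2 - 3*g1**2))
    b_yt = k * yt * (4.5*yt**2 - 8*g3**2 - 2.25*g2**2 - (17.0/12.0)*g1**2)
    return [b_lam, b_yt,
            -7.0*k*g3**3, -(19.0/6.0)*k*g2**3, (41.0/6.0)*k*g1**3]

def rk4_step(y, h):
    shift = lambda a, b, s: [ai + s*bi for ai, bi in zip(a, b)]
    k1 = beta(y); k2 = beta(shift(y, k1, h/2))
    k3 = beta(shift(y, k2, h/2)); k4 = beta(shift(y, k3, h))
    return [yi + h/6.0*(a + 2*b + 2*c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def instability_scale(mh=125.0, v=246.0, mt=173.1):
    """Scale (GeV) where lambda(mu) first crosses zero, or None if stable."""
    # lam from mh^2 = 2 lam v^2; yt, g3, g2, g1 are representative
    # MS-bar values at mu = mt (assumed inputs, not a precision fit).
    y = [mh**2 / (2.0*v**2), 0.937, 1.166, 0.648, 0.358]
    t, h, t_max = 0.0, 0.01, math.log(1.2e19 / mt)
    while t < t_max:
        y = rk4_step(y, h)
        t += h
        if y[0] < 0.0:
            return mt * math.exp(t)
    return None
```

At 1-loop with these inputs, $\lambda$ crosses zero many orders of magnitude below the Planck scale, and raising $m_h$ pushes the crossing higher, as described in the text.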

3. Metastability and Probability

If we think of the field value $\phi$ as being the typical energy $\mu$ pushed into a scattering process, then we can translate the RG evolution of the couplings into an effective potential. Using $\mu \to \phi$ and replacing $\lambda \to \lambda(\phi)$, we obtain the (RG improved) effective potential at high energies ($\phi \gg v$) (see [16] for a precise analysis):
\[ V_{\rm eff}(\phi) \approx \frac{\lambda(\phi)}{4}\, G(\phi)^4\, \phi^4, \]
where the wave function renormalization factor is given in terms of the anomalous dimension $\gamma$ by $G(\mu) = \exp\left(-\int_{m_t}^{\mu} \gamma(\mu')\, d\ln\mu'\right)$, and we replace $\mu \to \phi$. Hence for a Higgs mass in the range observed by the LHC, the effective potential goes negative at a field value that is several orders of magnitude below the Planck scale, as can be deduced from the behavior of $\lambda(\mu)$ with $m_h \approx 125$ GeV in Figure 2.

We could plot $V_{\rm eff}$ directly; however, the factor of $\phi^4$ makes it vary by many orders of magnitude as we explore a large field range. Instead, a schematic of the resulting potential will be more illuminating for the present discussion in order to highlight the important features, as given in Figure 3. The plot is not drawn to scale; the 3 energy scales satisfy the hierarchy $v \ll \Lambda_I \ll m_{pl}$ for a Higgs mass as indicated by LHC data, $m_h \approx 125$ GeV. Note that the local maximum in the potential occurs at a field value that is necessarily very close to $\Lambda_I$ (only slightly smaller), and so we shall discuss these 2 field values interchangeably.

In this situation, the electroweak vacuum is only metastable. Its quantum mechanical tunneling rate can be estimated by Euclideanizing the action and computing the associated bounce action $S_B$. This leads to the following probability of decaying in time $T_U$ through a bubble of size $R$ [17]:
\[ P \approx \max_R \left(\frac{T_U}{R}\right)^4 \exp\left(-S_B\right), \qquad S_B = \frac{8\pi^2}{3\,|\lambda(\mu = 1/R)|}. \]
The computation of the rate is rather involved, and we shall not pursue the details here. Suffice it to say that, for the central values of the Higgs mass and top mass from LHC data, it is found that the lifetime of the electroweak vacuum is longer than the present age of the universe [18, 19].
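As a rough numerical illustration of why the lifetime is so long (this is not the full computation: the bubble scale, the age of the universe, and the value of $|\lambda|$ below are assumed, order-of-magnitude placeholders, and we do not perform the maximization over bubble sizes):

```python
import math

def bounce_action(lam_abs):
    """Bounce action for a negative quartic, S_B = 8 pi^2 / (3 |lambda|)."""
    return 8.0 * math.pi**2 / (3.0 * lam_abs)

def log10_decay_probability(lam_abs, mu=1e17, t_u_inverse=1e-42):
    """log10 of p ~ (T_U / R)^4 exp(-S_B) for a bubble of size R ~ 1/mu.

    t_u_inverse is 1/T_U in GeV: the age of the universe, ~10^10 yr,
    corresponds to roughly 1e-42 GeV (illustrative value).
    """
    prefactor = 4.0 * math.log10(mu / t_u_inverse)   # log10 of (T_U * mu)^4
    return prefactor - bounce_action(lam_abs) / math.log(10.0)

# With |lambda| ~ 0.01 near its minimum (typical of the SM running),
# S_B ~ 2600 and the decay probability is utterly negligible.
```

For $|\lambda| \sim 0.01$, the exponent $S_B \approx 2.6 \times 10^3$ overwhelms the enormous prefactor $(T_U/R)^4$, consistent with the long lifetime quoted in [18, 19].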

It is conceivable that a metastable electroweak vacuum is an acceptable situation. However, here we would like to present an argument that such a situation is statistically disfavored. We imagine that, in the very early universe, the Higgs field was randomly distributed in space. For instance, during cosmological inflation the Higgs field could have been frozen at some value as the universe rapidly expands (if there is high scale inflation) until after inflation, when the field will oscillate and its initial value could plausibly have been random and uniformly distributed. If this is the case, then what is the probability that the Higgs field began in the metastable region $\phi < \Lambda_I$, rather than the unstable region $\phi > \Lambda_I$? The answer depends on the allowed domain the Higgs can explore. Here we estimate the allowed domain to be Planckian, that is, $0 \le \phi \lesssim m_{pl}$, but our argument only depends on the upper value being much larger than $\Lambda_I$. Naively, this would lead to a probability $p \sim \Lambda_I/m_{pl}$; however, we should recall that the Higgs is a complex doublet, composed of 4 real scalars, and each one would need to satisfy $|\phi_i| \lesssim \Lambda_I$ in the early universe to be in the metastable region. Hence, we estimate the probability as
\[ p \approx \left(\frac{\Lambda_I}{m_{pl}}\right)^4. \qquad (7) \]
For example, if we describe the physics in Coulomb gauge, then we have both the modulus $\rho$ of the Higgs field and angular modes $\theta_a$, with $a = 1, 2, 3$. From this point of view, it seems most reasonable to take the probability density weighted by an appropriate Jacobian factor associated with transforming from Cartesian field coordinates to such radial plus angular coordinates. This Jacobian scales as $\rho^3$ and so again will lead to the probability growing like the fourth power of the energy. Another way to put it is to say that there is much more field space available at large Higgs field values than at small values. This seems reasonable, especially if one imagines initial conditions laid down by inflation.
The number of states in the Hilbert space whose typical Higgs value is large is much greater than the number of states in the Hilbert space whose typical Higgs value is small. One might reach a different perspective in, say, unitary gauge, where the angular modes appear as the longitudinal modes of the $W$ and $Z$ bosons. However, the unitary gauge is not a useful way of describing physics above the electroweak scale. So we consider the above point of view with multiple scalars to be more physically reasonable.

So, for instance, for $m_h \approx 125$ GeV and the central value of the top mass, we have $\Lambda_I \sim 10^{11}$ GeV, leading to a probability $p \sim 10^{-32}$, which indicates that the chance of randomly landing in the metastable region in the early universe is exceedingly small. Instead it is far more likely to land in the unstable region indicated in Figure 3. Here the effective potential is negative, leading to a catastrophic runaway instability, perhaps to a new VEV that is close to Planckian. This would in turn lead to a plethora of problems for the formation of complex structures, so we can safely assume such a regime is uninhabitable and irrelevant. This leads us to examine a scenario in which new physics enters and removes this problem.

4. Peccei-Quinn Dynamics and Distribution

One of the phenomenological reasons for new physics beyond the Standard Model is the fine tuning of the CP violating term in the QCD Lagrangian. The following dimension 4 operator is gauge invariant and Lorentz invariant and should be included in the QCD Lagrangian with a dimensionless coefficient $\theta$:
\[ \mathcal{L}_\theta = \frac{\theta\, g_3^2}{32\pi^2}\, G^a_{\mu\nu} \tilde{G}^{a\,\mu\nu}. \]
From bounds on the electric dipole moment of the neutron, this term is experimentally constrained to satisfy $|\theta| \lesssim 10^{-10}$, which requires extreme fine tuning. There appears to be no statistical explanation of this fine tuning if it were purely random, since a small but moderate value of $\theta$ would have very little consequence for the formation of complex structures. Instead this requires a dynamical explanation, which we take to be due to a new global symmetry, known as a Peccei-Quinn (PQ) symmetry [7], involving a new heavy complex scalar field $S$. This field undergoes spontaneous symmetry breaking at a scale $f_{PQ}$, and in the resulting effective field theory, the quantity $\theta$ is essentially promoted to the angular degree of freedom of $S$, a light scalar field known as the axion [8, 9]. The zero temperature Lagrangian for $S$ includes a symmetry breaking potential for $S$ and a QCD instanton generated sinusoidal potential for the axion $a = f_{PQ}\arg(S)$:
\[ \mathcal{L} = |\partial_\mu S|^2 - \lambda_S\left(|S|^2 - \frac{f_{PQ}^2}{2}\right)^2 - \Lambda_a^4\left[1 - \cos\left(\frac{a}{f_{PQ}}\right)\right]. \]
If we expand around the $S$-field’s VEV at low energies, we see that the angular component of $S$, the axion, is very light, with a mass $m_a \sim \Lambda_a^2/f_{PQ}$ (where $\Lambda_a$ is of order of the QCD-scale), and will be dynamically driven to zero, solving the strong CP problem. Since we will require $f_{PQ}$ to be a very high energy scale (compared to, say, the electroweak scale), the radial mode of $S$ is very heavy, with mass $m_\sigma \sim \sqrt{\lambda_S}\, f_{PQ}$. Hence at low energies, the radial mode is essentially irrelevant; it can be integrated out and, apart from a possible renormalization of the Standard Model couplings, can be ignored. However, at energies approaching $f_{PQ}$, the radial mode cannot be ignored.
The $S$ field couples to various Standard Model particles in any realization of the PQ symmetry, including the Higgs field through interactions of the type $\Delta\mathcal{L} = -\kappa\, |S|^2 |H|^2$ (where $\kappa$ is a dimensionless coupling). This causes an alteration in the effective coupling $\lambda$ at an energy scale of order $f_{PQ}$, where the new field becomes dynamical.

Since $S$ is bosonic, it generally leads to a positive increase in $\lambda$, either through tree-level corrections or through loop corrections, as follows: as long as $\kappa$ is not very small, it makes a significant and rapid change in the $\beta$-function for $\lambda$ of the form $\delta\beta_\lambda \propto \kappa^2/(16\pi^2)$. This is because, in the cases of interest, $\lambda$ is otherwise very small in the vicinity of this effect turning on, as seen in Figure 2. So even a small positive change in its $\beta$ function can cause a rapid change and stabilization of the effective potential, leading to a threshold boost in $\lambda$ at the scale at which this new degree of freedom becomes active, a point that was included nicely in [10]. This conclusion can only be avoided by an atypically tiny coupling between the Higgs and the PQ-field.
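Schematically, with an $\mathcal{O}(1)$ coefficient $c$ that depends on the particular PQ model's field content and on whether the shift arises at tree level or at one loop (our notation, for illustration; the qualitative point does not depend on $c$):

```latex
\beta_\lambda \;\to\; \beta_\lambda + \frac{c\,\kappa^2}{16\pi^2}
\qquad \text{for } \mu \gtrsim m_\sigma \sim \sqrt{\lambda_S}\, f_{PQ}\,,
\qquad c = \mathcal{O}(1).
```

Since $\lambda(\mu)$ is already small near $\Lambda_I$, even a modest $\kappa$ makes this shift dominate the running and halt the descent of the quartic coupling.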

In the most common case, then, this leads to a reduction or removal of the unstable region, depending on the scale $f_{PQ}$ relative to the instability scale $\Lambda_I$, assuming couplings between the Peccei-Quinn dynamics and the Higgs sector. The more precise statement is that the mass of the radial field is $m_\sigma \sim \sqrt{\lambda_S}\, f_{PQ}$; this really sets the scale at which a correction to the $\beta$ function becomes active.

We obviously require $f_{PQ}$ to be in the range $f_{PQ} \lesssim \Lambda_I$ in order for the new physics to prevent the effective potential from having a large negative regime (note that a small negative dip is statistically allowable for $f_{PQ} \approx \Lambda_I$, but not a large field dip). But since $\Lambda_I$ is very large, this leaves several orders of magnitude uncertainty in the value of $f_{PQ}$. In other words, it would be sufficient for $f_{PQ} \ll \Lambda_I$ in order to remove the unstable region. However, here we would like to present a statistical argument that $f_{PQ} \sim \Lambda_I$ is much more likely. We shall take the maximum allowed scale to be Planckian in the following discussion to illustrate the idea, though it is simple to generalize the argument. There are indications that the PQ-scale may be associated with GUT or Planckian physics, and indeed typical realizations of the QCD-axion in string theory suggest that $f_{PQ}$ is much closer to the GUT or Planck scale [20], rather than a more intermediate scale, such as $\sim 10^{11}$ GeV. In some landscape, we can imagine scanning over different $f_{PQ}$ values. For lack of more detailed knowledge, we can imagine that it scans on, say, a uniform distribution in the range $0 \le f_{PQ} \lesssim m_{pl}$. If this is the case, then $f_{PQ}$ will be as small as is required but would not be significantly smaller, as that would be even rarer in the landscape. By placing $f_{PQ}$ on a uniform distribution, the probability that it will be small enough to alleviate the instability is roughly
\[ p \approx \frac{\Lambda_I}{m_{pl}}, \qquad (11) \]
where almost all of the phase space pushes $f_{PQ} \to \Lambda_I$, rather than orders of magnitude lower. It is important to note that, for $\Lambda_I/m_{pl} \sim 10^{-8}$, as arises from the measured Standard Model couplings, the probability in (11) is small but still much greater than the probability in (7). Hence it is much more likely to have an atypically small PQ-scale and no constraint on the initial Higgs field, than an atypically small Higgs field and no constraint on the PQ-scale. We now examine the cosmological consequences of $f_{PQ} \sim \Lambda_I$.
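The comparison between the two statistical costs can be spelled out in a few lines; the Planck scale and the representative $\Lambda_I \sim 10^{11}$ GeV below are the same illustrative magnitudes used in the text:

```python
m_pl = 1.2e19     # Planck scale in GeV (field-range / scanning cutoff)
lam_I = 1e11      # representative instability scale in GeV

# Section 3: all 4 real Higgs components must start below Lambda_I,
# each uniform on [0, m_pl]  ->  probability (Lambda_I / m_pl)^4.
p_higgs = (lam_I / m_pl) ** 4

# Section 4: a uniformly scanned PQ scale lands below Lambda_I
# with probability Lambda_I / m_pl.
p_pq = lam_I / m_pl

# A small PQ scale is rare, but vastly less rare than small initial
# Higgs components: p_pq / p_higgs ~ (m_pl / lam_I)^3.
```

This makes the hierarchy between the two probabilities explicit: the PQ-scale pays the linear cost once, while the Higgs initial condition pays it four times over.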

5. Axion Dark Matter

The light scalar axion particle is neutral, is very stable, and acts as a form of dark matter. The computation of its abundance is nontrivial and has been studied in many papers, including [21, 22]. The final result for the axion abundance is essentially controlled by the scale $f_{PQ}$. Its value is normally measured by the quantity $\Omega_a \equiv \rho_a/\rho_c$, where $\rho_a$ is the energy density in axion dark matter and $\rho_c$ is the so-called “critical density" of the universe, defined through the Friedmann equation as $\rho_c = 3H^2/(8\pi G)$. Tracking the nontrivial temperature dependence of the axion potential and redshifting to late times lead to the following expression for $\Omega_a$:
\[ \Omega_a h^2 \approx 0.15\, \gamma\, \theta_i^2 \left(\frac{f_{PQ}}{10^{12}\,\mbox{GeV}}\right)^{7/6}, \]
where the Hubble parameter is represented as $H_0 = 100\,h$ km/s/Mpc and the measured CMB temperature $T_0$ has been folded into the numerical prefactor. The coefficient $\gamma$ is an $\mathcal{O}(1)$ fudge factor due to uncertainty in the detailed QCD effects that set the axion mass and its temperature dependence. In our numerics, we have taken a representative $\mathcal{O}(1)$ value for $\gamma$. It is quite possible that the true value may be smaller than this, as taken in [22], but other effects, including contributions from string-axions, can potentially push the true value to be larger [23]. Also, $\theta_i$ is the initial misalignment angle in the early universe (which later relaxes towards zero, solving the strong CP problem). Here we take $\theta_i^2 \to \langle\theta_i^2\rangle = \pi^2/3$, which comes from allowing $\theta_i$ to be uniformly distributed in the domain $[-\pi, \pi]$ and then spatially averaging. Another interesting possibility arises if inflation occurs after Peccei-Quinn symmetry breaking, allowing $\theta_i$ to be homogeneous and possibly small, as studied in [24, 25]. The latter scenario is subject to various constraints, including bounds on isocurvature fluctuations [26], and will not be our focus here.

The quantity $\Omega_a$ is slightly inconvenient for expressing the main results, for the following two reasons: (i) in a flat universe (as we are assuming) it is bounded to satisfy $\Omega_a \le 1$, which obscures the fact that a priori the dark matter abundance could be enormous, and (ii) it is manifestly time dependent (due to $H$ and $T$), which requires some choice of physical time to compare different universes. To avoid these complications, we prefer to compute the dark matter density in units of the baryon density. Fixing the baryon to photon ratio at its measured value from observation determines the baryon density $\rho_b$. From this, we define the (unbounded and time independent) measure of dark matter as
\[ \zeta \equiv \frac{\rho_{dm}}{\rho_b}. \qquad (15) \]
Observations show that the dark matter density is nonzero in our universe, although its particular particle properties (whether axion or WIMP, etc.) are still unknown. The observational evidence for dark matter comes from a range of sources, including the CMB, lensing, galaxy rotation and clustering, structure formation, and baryon acoustic oscillations, and is very compelling; for example, see [27–32]; its abundance has been measured quite accurately. Hence our prediction for the value of $\zeta$ (coming from setting $\rho_{dm} = \rho_a$ with $f_{PQ}$ determined by $\Lambda_I$) can be compared to observation; see Figure 1.
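The chain from $f_{PQ}$ to $\zeta$ can be sketched numerically as follows; the prefactor 0.15 and $\gamma = 1$ are assumed, representative values (the QCD uncertainty discussed above easily moves them by an $\mathcal{O}(1)$ factor), and $\Omega_b h^2 \approx 0.022$ is the observed baryon density:

```python
import math

def axion_zeta(f_pq_gev, gamma=1.0, omega_b_h2=0.022):
    """Axion dark-matter-to-baryon ratio zeta = rho_a / rho_b.

    Uses the misalignment estimate
        Omega_a h^2 ~ 0.15 * gamma * <theta_i^2> * (f_pq / 1e12 GeV)^(7/6),
    with <theta_i^2> = pi^2 / 3 from spatially averaging a uniform
    initial angle on [-pi, pi]; the 0.15 prefactor is an assumed,
    representative normalization with O(1) QCD uncertainty.
    """
    theta_sq = math.pi**2 / 3.0
    omega_a_h2 = 0.15 * gamma * theta_sq * (f_pq_gev / 1e12) ** (7.0 / 6.0)
    return omega_a_h2 / omega_b_h2

# zeta grows with f_PQ ~ Lambda_I, so a heavier Higgs (larger Lambda_I)
# maps to more axion dark matter relative to baryons.
```

For $f_{PQ}$ in the $10^{11}$ to $10^{12}$ GeV range this gives $\zeta$ of order a few to a few tens, in the general vicinity of the observed $\zeta \approx 5$, which is the sense in which Figure 1 shows agreement.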

6. Results and Discussion

6.1. Comparison with Data

Let us summarize our argument: holding other parameters fixed, the Higgs mass determines the instability scale $\Lambda_I$, which we evaluate at 2-loop order. We have argued on statistical grounds in Section 3 why the scale of new physics should not be larger than $\Lambda_I$ and in Section 4 why the scale of new physics should not be (significantly) smaller than $\Lambda_I$, leading to $f_{PQ} \sim \Lambda_I$. Since $f_{PQ}$ determines the dark matter abundance in (15), this establishes a correlation between $m_h$ and $\zeta$. The result was displayed earlier in the paper in Figure 1. The solid-blue curve is for the central value of the top mass, and the dashed-blue curves are for its 1-sigma variations. We compare this prediction to the latest LHC and cosmological data. Firstly, we have taken the ATLAS value of the Higgs mass [1] and the CMS value [2] and produced our own combined value, which is indicated by the red vertical lines. Secondly, we have taken the WMAP7 data, plus other observations, for the dark matter abundance and the baryon abundance [6] and combined them to obtain the measured value of $\zeta$, which is indicated by the green horizontal lines. The predicted correlation between the Higgs mass and the dark matter abundance in Figure 1 displays good agreement with current data.

6.2. Precision and Uncertainties

Improved accuracy in testing this scenario comes in several experimental directions. This includes measuring the Higgs mass to better precision, as well as the top mass and the strong coupling $g_3$ (which we set to its central value), while the current accuracy in the dark matter abundance is quite good. A theoretical uncertainty surrounds the specific choice of $f_{PQ}$ relative to $\Lambda_I$. Here we have taken $f_{PQ} \approx \Lambda_I$, due to a statistical argument that allowed the scale to scan, leading to the conclusion that it should be as small as required, but no smaller, an argument similar to that for the magnitude of the cosmological constant [33]. One might argue that a factor of a few smaller may be required to properly alleviate the instability [10], which would lead to a slight lowering of the blue curves in Figure 1, but a small negative dip is tolerable statistically, which makes $f_{PQ} \approx \Lambda_I$ plausible.

Related to this uncertainty is the particular prior distribution for $f_{PQ}$, which we assumed to be uniform. The expectation of a flat distribution is plausible for the cosmological constant if one allows both positive and negative values, making zero not special. In the case of $f_{PQ}$, it is necessarily positive, so $f_{PQ} = 0$ is arguably a special part of the distribution. This may render the true distribution nonuniform. However, as long as the distribution does not vanish in the limit $f_{PQ} \to 0$ faster than $f_{PQ}^3$, then our arguments go through. In other words, the probability of an atypically small $f_{PQ}$ and no constraint on the initial Higgs field would still be larger than the probability of an atypically small Higgs field and no constraint on $f_{PQ}$. Also, one may question whether the uniform distribution assumed for the initial values of each of the 4 components of the Higgs field is reasonable. Since we have a sufficiently limited understanding of the early universe, including a measure problem for inflation, any such assumptions could be called into question. However, since the metastable region occupies such a tiny fraction of the volume of field space, roughly $\sim(10^{-8})^4 = 10^{-32}$ or so, an alteration in prior probabilities would need to be quite drastic to change the conclusions.

6.3. Outlook

An important test of this scenario involves unravelling the nature of dark matter directly. The QCD-axion is actively being searched for in a range of experiments, including ADMX [34], with no positive detection so far. But the regime of parameter space in which the axion can be the dark matter will be explored in coming years. If an axion is discovered, it will be important to unravel its particular properties including its coupling to other fields. An explicit embedding of the discovered version (popular models include KSVZ [35, 36] and DFSZ [37, 38]) into the Higgs stability analysis would be important. Searches such as ADMX rely upon the axion being all or most of the dark matter, so a related verification would be the associated lack of discovery of WIMPs, or other dark matter candidates, in direct or indirect searches. Or at least these forms of dark matter should comprise a relatively small fraction of the total.

The discovery of the Higgs boson at the LHC is a final confirmation of the Standard Model. This leaves the scale at which the theory breaks down unclear. Here we have investigated the possibility that the theory, or at least the Higgs sector, remains intact until the scale at which the Higgs potential runs negative, which would lead to a runaway instability at large field values. By introducing Peccei-Quinn dynamics, we can potentially solve the strong CP problem, remove the unstable region, and obtain roughly the correct amount of dark matter due to a collection of statistical arguments that sets $f_{PQ} \sim \Lambda_I$. This is remarkably minimal but does still leave questions regarding unification, baryogenesis, inflation, the hierarchy problem, and so forth. It is conceivable that unification can still occur at higher energies by the introduction of new degrees of freedom, that the physics of baryogenesis and inflation is associated with such high scale physics [39, 40], and that the hierarchy problem has no dynamical explanation. Alternatively, the LHC or other experiments may discover new degrees of freedom at much lower energies, which would radically alter this picture. Currently all such issues remain largely unclear, requiring much more guidance from experiment and observation.

Appendix

Standard Model 2-Loop Beta Functions

In this Appendix, we summarize the RG equations for the couplings at energies above the top mass at 2-loop order, from [14, 15]. In each case, we write $d\lambda/dt = \beta_\lambda$, and so forth, where $t \equiv \ln(\mu/\mu_0)$, and $\mu_0$ is the starting renormalization scale, taken to be $\mu_0 = m_t$. We also performed proper pole matching for couplings defined at the $Z$ mass and running up to the top mass $m_t$, but for brevity we do not list those details here.

The 2-loop beta functions for the Higgs quartic coupling $\lambda$, the top quark Yukawa coupling $y_t$, and the 3 gauge couplings $g_3$, $g$, $g'$ are lengthy, and we refer the reader to [14, 15] for the complete expressions. By solving this set of 5 coupled differential equations, we obtain $\lambda$ as a function of the energy $\mu$, or $t$.

The wave function renormalization of the Higgs field is $G(\mu) = \exp\left(-\int_{m_t}^{\mu} \gamma(\mu')\, d\ln\mu'\right)$, where the anomalous dimension at 1-loop is $16\pi^2\gamma = 3y_t^2 - \frac{9}{4}g^2 - \frac{3}{4}g'^2$; the 2-loop expression can also be found in [14, 15].

Conflicts of Interest

The author declares that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

The author would like to thank Alan Guth and Frank Wilczek for helpful discussions and would also like to acknowledge support from the Center for Theoretical Physics at MIT and the Tufts Institute of Cosmology. This work is supported by the U.S. Department of Energy under cooperative research agreement Contract no. DE-FG02-05ER41360.