LHC Data: Where do we look after the Higgs?

In 2012, the two largest experiments at CERN, ATLAS and CMS, announced that they had discovered the Higgs boson. It was an extraordinarily exciting moment for the particle physics community—we had finally found all the particles in the Standard Model! And then it was a disturbing moment for the community: where do we look next?

How have we searched for new particles so far?

The Standard Model of particle physics is a catalog of all the known particles in our universe. It contains the fundamental particles that make up the atom (electrons, and the quarks that comprise protons and neutrons) as well as their heavier cousins, which decay too quickly to form stable things like atoms. It also contains what we call ‘bosons,’ or particles that mediate interactions. As an example: macroscopically, we see two positively charged objects repel each other. In the particle physics picture, this repulsion happens as bosons (photons in this case) are thrown from one charged object to the other, pushing them apart.

When the Large Hadron Collider (LHC) was built at CERN, we expected to find the Higgs, since the Standard Model needed it to explain how particles could have mass. Theoretical predictions told us at what energy we needed to build the collider. Lo and behold, after enough data was collected, we found the Higgs.

Image from the CMS Collaboration. The deviation of the data (black dots) from the null hypothesis (green band) at 125 GeV is due to the Higgs boson.

That being said, nothing about discovering a new particle is straightforward. Unfortunately, there is no microscope that can make the Higgs boson visible to the human eye. Instead, experimental particle physicists must be much cleverer. The Higgs decays far too quickly to be seen even if we had a means of observing it directly. Instead, we look for the decay products of the Higgs.

Imagine you plant a tomato seed in the early spring and spend the next few months traveling. When you return to check on your plant, you don’t look for a seed. You look for a long stem with several tomatoes hanging off it. Looking for particles at high energy colliders is the same idea, only instead of months it takes fractions of a second for a particle to decay and produce many new particles. If you expect to see the original particle, just like expecting to see a single seed, you simply won’t find it.

The Higgs decays into almost all massive particles. Einstein tells us that energy and mass are equivalent, and we know we can’t pull energy out of nowhere, so we just need to find the sets of particles whose energies add up to the mass of the Higgs. Simple conservation of energy, right?

Again, we need to be much cleverer. The Higgs can decay into particles that also decay, meaning we now have four particles to look for. And maybe some of those particles decay a lot and produce tens or hundreds of particles clustered together (what we call a jet). These objects are certainly more complicated than a simple two particle search, but if the detector is empty except for the decaying Higgs, it shouldn’t matter.

Again, we are reminded of how difficult the job of an experimentalist is! The detector, as you may have guessed, is far from empty. Nearly 40 proton-proton collisions occur at the same time in the detector, spaced only millimeters apart. There are thousands of particles to sort through if you want to consider any one collision. Finding the Higgs seems impossible at this point. But experimentalists did it. Using theoretical predictions of how often a Higgs boson is produced in these collisions and how often it decays into certain types of fundamental particles, experimentalists were able to pick out enough Higgs bosons to claim a discovery.

At this point it should be clear that finding a new particle is hard. It seems the only reason we could find the Higgs was that we knew exactly what we were looking for. But now that the Higgs has been discovered, particle physicists are left with the terrifying notion that we have no idea what we are looking for.

Why are we still looking?

Data and theory agree extremely well, which is why we continue to use the Standard Model as the guiding principle in particle physics. But they also disagree enough that we know it can’t be the full story. Particle physicists often try to develop new theoretical models that can be tested at the LHC, but without a beacon as bright as something like the Higgs, it is doubtful that any specific model will fix all the discrepancies we observe.

This has brought several physicists to consider a different paradigm when searching for new physics: instead of letting theory drive how we search, why not search for absolutely anything that we can, and fill out the theory details later?

First Data, then Theory

An uncomfortable truth about high-energy, high-data-output experiments like the LHC is that how we look for new particles limits what kinds of new particles we can find.

This problem starts at the trigger. Almost all the collisions that happen at the LHC are ‘uninteresting.’ They are uninteresting mostly because they occur very often and we already understand the physics. It’s processes that are uncommon that we still need to study. Because electronics physically cannot transfer the amount of data being collected for every collision fast enough, we have to be picky. In fact, we throw away over 99.999% of data produced at the LHC. The choice of what events we save and what we throw away is fixed by the trigger.
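To get a feel for the numbers, here is a quick back-of-the-envelope calculation (all rates are round, illustrative figures, not official experiment specifications):

```python
crossing_rate = 40e6      # proton bunch crossings per second (~40 MHz)
pileup = 40               # roughly 40 overlapping proton-proton collisions per crossing
recorded_rate = 1e3       # events written to permanent storage per second (~1 kHz)

collisions_per_second = crossing_rate * pileup
kept_fraction = recorded_rate / collisions_per_second
print(f"fraction kept:      {kept_fraction:.2e}")
print(f"fraction discarded: {1 - kept_fraction:.5%}")
```

With inputs of this size, well over 99.999% of collisions never reach permanent storage.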

Image from ATLAS Collaboration. A visualization of a proton-proton collision at the LHC. There are many particles in this event, including a muon (red) and an electron (blue) that might have been used for triggering.

The trigger has specifications like “keep all the data from collisions that produce an electron with energy above 20 GeV” or “keep all events with two muons and two electrons.” When the trigger condition is met, whatever it may be, the data in the detector is stored. While this helps us wade through the swamp of data, it could also be throwing away events that signal the existence of new particles. There is no way around this problem besides finding triggers that are robust enough to be unlikely to throw away interesting events. Until electronics get faster and data storage capabilities increase, we will have to use triggers.
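As a toy illustration (this is not real trigger code; the event format, particle labels, and thresholds are invented for the example), a trigger menu like the one described above might be sketched as:

```python
# Each toy event is a list of reconstructed particles: (type, energy in GeV).
def trigger(event):
    """Toy trigger: keep the event if it has an electron above 20 GeV,
    or at least two muons and two electrons (thresholds are illustrative)."""
    electrons = [e for (kind, e) in event if kind == "electron"]
    muons = [e for (kind, e) in event if kind == "muon"]
    if any(e > 20.0 for e in electrons):
        return True
    return len(electrons) >= 2 and len(muons) >= 2

events = [
    [("electron", 35.0), ("jet", 60.0)],          # kept: hard electron
    [("electron", 5.0), ("jet", 200.0)],          # discarded: electron too soft
    [("muon", 12.0), ("muon", 9.0),
     ("electron", 8.0), ("electron", 6.0)],       # kept: two muons and two electrons
]
kept = [ev for ev in events if trigger(ev)]
print(len(kept))  # 2
```

Anything that fails every condition in the menu is gone forever, which is exactly the worry raised above.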

The next problem is deciding on the signature, or what particles in the detector actually signal the existence of a new particle. The signature of a particle is the list of particles that are observable after the particle is produced and decays. For example, if a particle called a Z boson is produced, one of the signatures for the Z boson is an electron and an anti-electron. So to find the Z, we look for electron-anti-electron pairs whose combined energy adds up to the mass of the Z.
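A minimal sketch of this idea in Python (the four-momenta are made-up numbers, and the 5 GeV mass window is just an illustrative choice):

```python
import math

M_Z = 91.2  # Z boson mass in GeV

def pair_mass(p1, p2):
    """Invariant mass of two (approximately massless) leptons given as (E, px, py, pz)."""
    E, px, py, pz = (a + b for a, b in zip(p1, p2))
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# Toy electron/anti-electron pairs (E, px, py, pz) in GeV; numbers are illustrative.
pairs = [
    ((45.6, 45.6, 0.0, 0.0), (45.6, -45.6, 0.0, 0.0)),  # mass ~91.2 GeV: Z candidate
    ((30.0, 30.0, 0.0, 0.0), (20.0, -20.0, 0.0, 0.0)),  # wrong mass: background
]
z_candidates = [p for p in pairs if abs(pair_mass(*p) - M_Z) < 5.0]
print(len(z_candidates))  # 1
```

The same bookkeeping, with far more particles and far messier combinatorics, underlies real resonance searches.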

If we have no idea what kind of particle we are looking for, how massive it is, or what other particles it interacts with, we have no theoretical motivation for what signature to look for. So far, the most popular way to search for these new, mysterious particles is by first building models that predict the properties of the particle. We then assume that maybe this particle decays into something we can look for, like three jets. This kind of search is easy in one way: we know exactly what to look for! Anyone who has been to a hardware store knows that knowing the name or shape of what you want makes it infinitely easier to find. However, just because we know what we are looking for doesn’t mean it was the right thing to look for. Thus, a new trend in particle searches is rising: instead of building models that predict very specific signals, why don’t we just look at the most general things we can possibly look at?

This concept of data-first-then-theory is not a function of physicists getting lazy. Instead, one should view it as scientific environmentalism. There are several enormous, expensive experiments that have run or are still running and collecting data. Clearly the analysis done on the data they’ve already collected hasn’t led to any recent discoveries. But does that mean it’s bad data or a worthless experiment? Absolutely not! It could just mean that we blinded ourselves to new physics by looking in the wrong direction.

Imagine you’re locked out of a room and given a huge ring of keys of all shapes and sizes. First, you’d try the keys that seem most likely to fit into the lock—maybe they have the room number written on the handle, maybe the metal matches the lock. But when these don’t work, you have to start considering the keys that look a bit funny or seem too bizarre to fit in the lock. After all, the only tools you have are the keys in your hand. This is what particle physicists are trying to do at colliders like the LHC. The amount of data we have access to is overwhelmingly big, and we had to start looking somewhere. But just because we’ve looked at some data doesn’t mean we’ve completely exhausted the possibility of finding a new particle in it. It just means maybe we haven’t tried the right key.

In my next post, I’ll describe how a few particle theorists, including myself, performed an agnostic search for new physics in LHC data nearly a decade old. In our search we restricted ourselves to looking at one of the simplest triggers, and were able to look for new physics no one else had yet searched for.

A Moriond Retrospective: New Results from the LHC Experiments

Hi ParticleBiters!

In lieu of a typical HEP paper summary this month, I’m linking a comprehensive overview of the new results shown at this year’s Moriond conference, originally published in the CERN EP Department Newsletter. Since this includes the latest and greatest from all four experiments on the LHC ring (ATLAS, CMS, ALICE, and LHCb), you can take it as a sort of “state-of-the-field”. Here is a sneak preview:

“Every March, particle physicists around the world take two weeks to promote results, share opinions and do a bit of skiing in between. This is the Moriond tradition and the 52nd iteration of the conference took place this year in La Thuile, Italy. Each of the four main experiments on the LHC ring presented a variety of new and exciting results, providing an overview of the current state of the field, while shaping the discussion for future efforts.”

Read more in my article for the CERN EP Department Newsletter here!

The integrated luminosity of the LHC with proton-proton collisions in 2016 compared to previous years. Luminosity is a measure of a collider’s performance and is proportional to the number of collisions. The integrated luminosity achieved by the LHC in 2016 far surpassed expectations and is double that achieved at a lower energy in 2012.



Studying the Higgs via Top Quark Couplings

Article: “Implications of CP-violating Top-Higgs Couplings at LHC and Higgs Factories”

Authors: Archil Kobakhidze, Ning Liu, Lei Wu, and Jason Yue

Reference: arXiv:1610.06676 [hep-ph]


It has been nearly five years since scientists at the LHC first observed a new particle that looked a whole lot like the highly sought after Higgs boson. In those five years, they have poked and prodded at every possible feature of that particle, trying to determine its identity once and for all. The conclusions? If this thing is an imposter, it’s doing an incredible job.

This new particle of ours really does seem to be the classic Standard Model Higgs. It is a neutral scalar with a mass of about 125 GeV. All of its couplings to other SM particles lie within uncertainty of their expected values, which is very important. You’ve maybe heard people say that the Higgs gives particles mass. This qualitative statement translates into an expectation that the Higgs coupling to a given particle is proportional to that particle’s mass. So probing the values of these couplings is a crucial task.

Figure 1: Best-fit results for the production signal strengths for the combination of ATLAS and CMS. Also shown for completeness are the results for each experiment. The error bars indicate the 1σ intervals.

Figure 1 shows the combined ATLAS and CMS measurements of Higgs signal strengths, expressed as the ratio of measurement to SM expectation. Values close to 1 mean that experiment matches theory. Looking at this plot, you might notice that a few of these values deviate significantly from 1, where our perfect Standard Model world lives. Specifically, the ttH signal strength is running a bit high. ttH is the production of a top quark pair together with a Higgs in a single proton collision, and there are many ways to do it. Figure 2 shows some example diagrams that can produce this interesting ttH signature. Such deviations are a sign to physicists that maybe we don’t understand the whole picture.

Figure 2: Parton level Feynman diagrams of ttH at leading order.

Putting this in context with everything else we know about the Higgs, that top coupling is actually a key player in the Standard Model game. There is a popular unsolved mystery in the SM called the hierarchy problem. Given how we understand the top quark contribution to the Higgs mass, we shouldn’t be able to get such a light Higgs, or a stable vacuum. Additionally, scenarios like electroweak baryogenesis hint that there may be things about the top quark we don’t yet know.

Now that we know we want to study top-Higgs couplings, we need a way to characterize them. In the Standard Model, the coupling is purely scalar. However, in beyond the SM models, there can also be a pseudoscalar component, which violates charge-parity (CP) symmetry. Figure 3 shows a generic form for the term, where Cst is the scalar and Cpt is the pseudoscalar contribution. What we don’t know right away are the relative magnitudes of these two components. In the Standard Model, Cst = 1 and Cpt = 0. But theory suggests that there may be some non-zero value for Cpt, and that’s what we want to figure out.

Figure 3
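The term shown in Figure 3 is conventionally parametrized roughly as follows (a standard form from the literature; the exact normalization conventions vary between papers):

```latex
\mathcal{L}_{t\bar{t}h} \;=\; -\frac{m_t}{v}\,\bar{t}\left(C_{st} + i\,C_{pt}\,\gamma_5\right)t\,h
```

Setting Cst = 1 and Cpt = 0 recovers the Standard Model, while a non-zero Cpt switches on the CP-violating pseudoscalar piece.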

Using simulations along with the datasets from Run 1 and Run 2 of the LHC, the authors of this paper investigated the possible values of Cst and Cpt. Figure 4 shows the updated bound. You can see from the yellow 2σ contour that the new limits on the values are |Cpt| < 0.37 and 0.85 < Cst < 1.20, extending the exclusions from Run 1 data alone. Additionally, the authors claim that the cross section of ttH can be enhanced up to 1.41 times the SM prediction. This enhancement could either come from a scenario where Cpt = 0 and Cst > 1, or the existence of a non-zero Cpt component.

Figure 4: The signal strength µtth at 13 TeV LHC on the plane of Cst and Cpt. The yellow contour corresponds to a 2σ limit.

Further probing of these couplings could come from the HL-LHC through more studies like this one. However, examining the tH coupling at a future lepton collider would also provide valuable insights. The process e+e− → hZ contains a top quark loop, so a precision measurement of this rate would simultaneously provide a handle on the tH coupling.


References and Further Reading:

  1. “Enhanced Higgs associated production with a top quark pair in the NMSSM with light singlets”. arXiv hep-ph 02353
  2. “Measurements of the Higgs boson production and decay rates and constraints on its couplings from a combined ATLAS and CMS analysis of the LHC pp collision data at √s = 7 and 8 TeV.” ATLAS-CONF-2015-044



Inspecting the Higgs with a golden probe

Hello particle nibblers,

After recovering from a dead-diphoton-excess induced depression (see here, here, and here for summaries) I am back to tell you a little more about something that actually does exist, our old friend Monsieur Higgs boson. All of the fuss over the past few months over a potential new particle at 750 GeV has perhaps made us forget just how special and interesting the Higgs boson really is, but as more data is collected at the LHC, we will surely be reminded of this fact once again (see Fig.1).

Figure 1: Monsieur Higgs boson struggles to understand the Higgs mechanism.

Previously I discussed how one of the best and most precise ways to study the Higgs boson is just by `shining light on it’, or more specifically via its decays to pairs of photons. Today I want to expand on another fantastic and precise way to study the Higgs which I briefly mentioned previously: Higgs decays to four charged leptons (specifically electrons and muons), shown in Fig. 2. This channel is near and dear to my heart and has a long history: well before the Higgs was actually discovered at 125 GeV, it was realized to be among the best ways to find a Higgs boson over a large range of potential masses above around 100 GeV. This led to it being dubbed the “gold plated” Higgs discovery mode, or “golden channel”, and in fact it was one of the first channels (along with the diphoton channel) in which the 125 GeV Higgs boson was discovered at the LHC.

Figure 2: Higgs decays to four leptons are mediated by the various physics effects which can enter in the grey blob. Could new physics be hiding in there?

One of the characteristics that makes the golden channel so valuable as a probe of the Higgs is that it is very precisely measured by the ATLAS and CMS experiments and has a very good signal to background ratio. Furthermore, it is very well understood theoretically since most of the dominant contributions can be calculated explicitly for both the signal and background. The final feature of the golden channel that makes it valuable, and the one that I will focus on today, is that it contains a wealth of information in each event due to the large number of observables associated with the four final state leptons.

Since there are four charged leptons, each with an associated four-momentum, there are in principle 16 separate numbers which can be measured in each event. However, the masses of the charged leptons are tiny in comparison to the Higgs mass, so we can treat them as massless (see Footnote 1) to a very good approximation. The mass-shell relation E = |p| for each lepton then reduces the number of observables to 12 which, in the lab frame, are given by the transverse momentum, rapidity, and azimuthal angle of each lepton. Now, Lorentz invariance tells us that physics doesn’t care which frame of reference we pick to analyze the four lepton system. This allows us to perform a Lorentz transformation from the lab frame, where the leptons are measured but where the underlying physics can be obscured, to the much more convenient and intuitive center of mass frame of the four lepton system. By energy-momentum conservation, this is also the center of mass frame of the Higgs boson. In this frame the Higgs boson is at rest and the pairs of leptons come out back to back (see Footnote 2).
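To make the frame change concrete, here is a small self-contained Python sketch (the lepton four-momenta are invented, illustrative numbers). It boosts a set of four-vectors into the rest frame of their sum and checks that the total spatial momentum vanishes there:

```python
import math

def boost_to_rest_frame(particles):
    """Boost a list of four-vectors (E, px, py, pz) into the rest frame
    of their sum -- for four leptons from a Higgs decay, the Higgs rest frame."""
    E, px, py, pz = (sum(c) for c in zip(*particles))
    bx, by, bz = px / E, py / E, pz / E          # velocity of the whole system
    b2 = bx * bx + by * by + bz * bz
    gamma = 1.0 / math.sqrt(1.0 - b2)
    out = []
    for (e, x, y, z) in particles:
        bp = bx * x + by * y + bz * z            # beta . p
        coeff = ((gamma - 1.0) * bp / b2 - gamma * e) if b2 > 0 else 0.0
        out.append((gamma * (e - bp),
                    x + coeff * bx, y + coeff * by, z + coeff * bz))
    return out

# Toy lepton four-momenta in GeV (illustrative, not real data).
leptons = [(40.0, 10.0, 5.0, 30.0), (35.0, -20.0, 10.0, 15.0),
           (30.0, 5.0, -12.0, 8.0), (25.0, 3.0, -1.0, 7.0)]
cm = boost_to_rest_frame(leptons)
E_tot, *p_tot = (sum(c) for c in zip(*cm))
print([round(abs(p), 6) for p in p_tot])  # [0.0, 0.0, 0.0] -- the system is at rest
```

In the boosted frame the total energy equals the invariant mass of the four lepton system, which for a real event is the Higgs mass.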

In this frame the 12 observables can be divided into 4 production and 8 decay observables (see Footnote 3). The 4 production variables are the transverse momentum (which has two components), the rapidity, and the azimuthal angle of the four lepton system. The differential spectra for these variables (especially the transverse momentum and rapidity) depend very much on how the Higgs is produced and are also affected by parton distribution functions at hadron colliders like the LHC. Thus the differential spectra for these variables cannot in general be computed explicitly for Higgs production at the LHC.

The 8 decay observables are the center of mass energy of the four lepton system, which in this case is equal to the Higgs mass; the two invariant masses associated with each pair of leptons (how one picks the pairs is arbitrary); and five angles (Θ, θ1, θ2, Φ, Φ1) shown in Fig. 3 for a particular choice of lepton pairings. The angle Θ is defined as the angle between the beam axis (labeled by p or z) and the axis defined by the momentum direction of one of the lepton pair systems (labeled by Z1 or z’); this angle also defines the ‘production plane’. The angles θ1 and θ2 are the polar angles defined in the lepton pair rest frames. The angle Φ1 is the azimuthal angle between the production plane and the plane formed from the four-vectors of one of the lepton pairs (in this case the muon pair). Finally, Φ is defined as the azimuthal angle between the decay planes formed out of the two lepton pairs.
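As a concrete toy illustration of one of these angles, the following Python snippet computes Φ as the angle between the two decay planes, given 3-momenta already in the Higgs rest frame (the momenta are invented, and real analyses also fix a sign convention that this sketch ignores):

```python
import math

def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def angle_between_planes(p1, p2, p3, p4):
    """Phi: angle between the plane spanned by the first lepton pair's
    3-momenta (p1, p2) and that of the second pair (p3, p4)."""
    n1, n2 = cross(p1, p2), cross(p3, p4)        # normals to the two decay planes
    dot = sum(x * y for x, y in zip(n1, n2))
    norm = math.sqrt(sum(x*x for x in n1)) * math.sqrt(sum(x*x for x in n2))
    return math.acos(max(-1.0, min(1.0, dot / norm)))

# Toy 3-momenta in the Higgs rest frame: the two decay planes here are
# the x-z and y-z planes, so Phi comes out to 90 degrees.
phi = angle_between_planes((1.0, 0.0, 1.0), (-1.0, 0.0, 1.0),
                           (0.0, 1.0, 1.0), (0.0, -1.0, 1.0))
print(round(math.degrees(phi), 1))  # 90.0
```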

Figure 3: Angular center of mass observables (Θ, θ1, θ2, Φ, Φ1) in Higgs to four lepton decays.

To a good approximation these decay observables are independent of how the Higgs boson is produced. Furthermore, unlike the production variables, the fully differential spectra for the decay observables can be computed explicitly and even analytically. Each of them contains information about the properties of the Higgs boson as do the correlations between them. We see an example of this in Fig. 4 where we show the one dimensional (1D) spectrum for the Φ variable under various assumptions about the CP properties of the Higgs boson.

Figure 4: Here I show various examples for the Φ differential spectrum assuming different possibilities for the CP properties of the Higgs boson.

This variable has long been known to be sensitive to the CP properties of the Higgs boson. An effect like CP violation would show up as an asymmetry in this Φ distribution which we can see in curve number 5 shown in orange. Keep in mind though that although I show a 1D spectrum for Φ, the Higgs to four lepton decay is a multidimensional differential spectrum of the 8 decay observables and all of their correlations. Thus though we can already see from a 1D projection for Φ how information about the Higgs is contained in these distributions, MUCH more information is contained in the fully differential decay width of Higgs to four lepton decays. This makes the golden channel a powerful probe of the detailed properties of the Higgs boson.

OK nibblers, hopefully I have given you a flavor of the golden channel and why it is valuable as a probe of the Higgs boson. In a future post I will discuss in more detail the various types of physics effects which can enter in the grey blob in Fig. 2. Until then, keep nibbling and don’t let dead diphotons get you down!

Footnote 1: If you are feeling uneasy about the fact that the Higgs can only “talk to” particles with mass and yet can decay to four (at least approximately) massless leptons, keep in mind that they do not interact directly. The Higgs decay to four charged leptons is mediated by intermediate particles which DO talk to both the Higgs and charged leptons.

Footnote 2: More precisely, in the Higgs rest frame, the four vector formed out of the sum of the two four vectors of any pair of leptons which are chosen will be back to back with the four vector formed out of the sum of the second pair of leptons.

Footnote 3: This division into production and decay variables after transforming to the center of mass frame of the four lepton system (i.e. the Higgs rest frame) is only possible in practice because all four leptons are visible and their four-momenta can be reconstructed with very good precision at the LHC. This then allows the rest frame of the Higgs boson to be reconstructed on an event-by-event basis. For final states with missing energy or jets which cannot be reconstructed with high precision, transforming to the Higgs rest frame is in general not possible.

A New Solution to the Hierarchy Problem?

Hello particle Chompers,

Today I want to discuss a slightly more advanced topic which I will not be able to explain in much detail, but which goes by the name of the gauge hierarchy problem, or just `the Hierarchy Problem‘. My main motivation is simply to make you curious enough that you will feel inspired to investigate it further for yourself, since it is one of the outstanding problems in particle physics and one of the main motivations for the construction of the LHC. A second motivation is to bring to your attention a recent and exciting paper which proposes a potentially new solution to the hierarchy problem.

The hierarchy problem can roughly be stated as the problem of why the vacuum expectation value (VEV) of the Higgs boson, which determines the masses of the electroweak W and Z bosons, is so small compared to the highest energy scales thought to exist in the Universe. More specifically, the masses of the W and Z bosons (which define the weak scale) are roughly 10^2 GeV (see Figure 1) in particle physics units (remember, in these units mass = energy!).

The W boson finds, to its astonishment, that it has a mass of only about 100 GeV instead of 10^{19} GeV as expected.

On the other hand, the highest energy scale thought to exist in the Universe is the Planck scale at ~10^{19} GeV, which is associated with the physics of gravity. Quantum field theory tells us that the Higgs VEV should get contributions from all energy scales (see Figure 2), so the question is: why is the Higgs VEV, and thus the W and Z boson masses, a factor of roughly 10^{17} smaller than it should be?
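Plugging in the two scales from the text, the size of the puzzle is just their ratio:

```latex
\frac{M_{\text{Planck}}}{m_{W,Z}} \;\sim\; \frac{10^{19}\ \text{GeV}}{10^{2}\ \text{GeV}} \;=\; 10^{17}
```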

The Higgs vacuum expectation value receives contributions from all energy scales.

In the Standard Model (SM) there is no solution to this problem. Instead one must rely on a spectacularly miraculous numerical cancellation among the parameters of the SM Lagrangian. Miraculous numerical `coincidences’ like this make us physicists feel uncomfortable to the point that we give it the special name of `fine tuning’. The hierarchy problem is thus also known as the fine tuning problem.

A search for a solution to this problem has been at the forefront of particle physics for close to 40 years. It is the aversion to fine tuning which leads most physicists to believe there must be new physics beyond the SM whose dynamics are responsible for keeping the Higgs VEV small. Proposals include supersymmetry, composite Higgs models, extra dimensions, as well as invoking the anthropic principle in the context of a multiverse. In many cases, these solutions require a variety of new particles at energies close to the weak scale (~100–1000 GeV) and thus should be observable at the LHC. However, the lack of evidence at the LHC for any physics beyond the SM is already putting tension on many of these solutions. A solution which does not require new particles at the weak scale would thus be very attractive.

Recently a novel mechanism, which goes by the name of `cosmological relaxation of the electroweak scale‘, has been proposed which potentially offers such a solution. The details (which physicists are currently still digesting) are well beyond the scope of this blog. I will just mention that the mechanism incorporates two previously proposed mechanisms known as inflation^1 and the QCD axion^2 which solve other known problems. These are combined with the SM in a novel way such that the weak scale can arise naturally in our universe without any fine tuning and without new particles at the weak scale (or multiple universes)! And as a bonus, the axion in this mechanism (referred to as the `relaxion’) makes a good dark matter candidate!

Whether or not this mechanism turns out to be a solution to the hierarchy problem will of course require experimental tests and further theoretical scrutiny, but it’s a fascinating idea which combines aspects of quantum field theory and general relativity, so I hope it will serve as motivation for you to begin learning more about these subjects!


1. Inflation is a theorized period of exponential accelerated expansion of our Universe in the moments just after the big bang. It was proposed as a solution to the problems of why our Universe is so flat and (mostly) homogeneous while also explaining the structure we see throughout the Universe and in the cosmic microwave background.

2. Axions are particles proposed to explain why the amount of CP violation in the QCD sector in the SM is so small, which is known as the `strong CP problem‘.