Tracking the Jets to Monte Carlo

Article: Calculating Track-Based Observables for the LHC
Authors: Hsi-Ming Chang, Massimiliano Procura, Jesse Thaler, and Wouter J. Waalewijn
Reference: arXiv:1303.6637 [hep-ph]

With the LHC churning out hordes of data by the picosecond, theorists often have to work closely with experimentalists to test various theories, devise models, etc. In order to make sure they’re developing the right kinds of tools to test their ideas, theorists have to understand how experimentalists analyze their data.

One very simple example that’s relevant to the LHC involves the reconstruction of jets. Experimentalists often reconstruct jets from only the charged particles (such as \pi^{\pm}) they see in the detector, while theorists generally do calculations where jets have both charged and neutral particles (such as \pi^0; see my previous ParticleBites for a more general intro to jets). In order to compare predictions with experiment, someone is going to have to compromise. Track-based observables are an interesting way of attacking the more general problem of connecting the blackboard with the accelerator.

Before diving into the details, let’s briefly look at the basics of how theorists and experimentalists make predictions about these trouble-making collimated sprays of radiation.

Figure 1: A basic diagram of different parts of the ATLAS detector. Note, in particular, what’s being measured at the calorimetry and tracking stages.

Experiment: For experimentalists, jets are collections of clustered signatures left in the detector. These signatures generally come from calorimetry (i.e. measuring energy deposits) and/or from tracking, where the trajectories (i.e. the kinematics) of charged final state particles are measured. At the LHC, where luminosities are absurdly high, a phenomenon known as “pileup” becomes an increasingly aggravating contaminant in the data. In short, pileup is caused by the fact that the LHC isn’t just shooting a single proton at another proton in the beam, it’s shooting bunches of protons at each other. While this makes a direct proton-proton collision more likely, it also causes confusion when looking at the mess of final state particles: how do you know which particles came from a given collision? It turns out that since tracking gives you the trajectories of charged particles, you can trace them back to a particular collision vertex, which makes them much easier to deal with. Neutral particles, however, aren’t detected via tracking at the LHC, only via calorimetry. Thus, to make their lives significantly easier, experimentalists can simply do their analyses and reconstruct jets using purely the charged particles.
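To make this concrete, here is a minimal Python sketch (mine, not from the paper) of the kind of quantity a track-based analysis would build: the fraction of a jet’s energy carried by its charged constituents. The constituent list and the set of “charged” PDG codes below are purely illustrative.

```python
# Toy illustration: fraction of a jet's energy carried by charged particles,
# i.e. the particles that leave tracks. Constituent list is made up.

# PDG IDs of some common charged particles (pi+-, K+-, p/pbar, e+-, mu+-)
CHARGED_PDG_IDS = {211, -211, 321, -321, 2212, -2212, 11, -11, 13, -13}

# Each constituent: (PDG ID, energy in GeV)
jet_constituents = [
    (211, 45.2),    # pi+
    (-211, 30.1),   # pi-
    (111, 25.7),    # pi0    (neutral -> no track, calorimeter only)
    (22, 12.3),     # photon (neutral -> no track)
    (321, 8.4),     # K+
]

def charged_energy_fraction(constituents):
    """Return E(charged) / E(all) for a list of (pdg_id, energy) pairs."""
    total = sum(energy for _, energy in constituents)
    charged = sum(energy for pdg, energy in constituents if pdg in CHARGED_PDG_IDS)
    return charged / total

print(f"charged energy fraction x = {charged_energy_fraction(jet_constituents):.3f}")
```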

Theory: The notion of ignoring neutral particles in the name of convenience doesn’t immediately jibe with theorists. When using standard Quantum Field Theoretic (QFT) techniques to make predictions about jets, theorists have to turn back the clock on the zoo of final state particles that experimentalists analyze. For most sane theorists, calculations that involve 1 particle splitting into 2 are about all we generally tackle using perturbative QFT. Depending on the circumstances, a calculation at this stage is usually called a “next-to-leading order” or NLO calculation when one is using perturbation theory to describe strong interactions, because there is a factor of the strong coupling constant \alpha_s multiplying this piece of whatever expression you’re dealing with. Luckily, it turns out that in QCD this is largely all you need, most of the time, to make decent predictions. The tricky part is then turning a prediction about 1 particle splitting into 2 into a prediction about those particles splitting into 4, then 8, and so on, i.e. a fully hadronized and realistic event. For this, theorists generally employ hadronization “models” that are found by fitting to data or to events simulated by Monte Carlo event generators. But how do you do this fitting? How can you properly separate the physics into non-perturbative and perturbative pieces in a way that makes sense?
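To make the “1 particle splitting into 2” step concrete, here is the standard textbook result (not specific to this paper) for a quark emitting a gluon: in the collinear limit, the cross section picks up one power of \alpha_s times the leading-order Altarelli-Parisi splitting function,

d\sigma_{q \to qg} \approx \sigma_0 \, \frac{\alpha_s}{2\pi} \, P_{qq}(z) \, \frac{dt}{t} \, dz, \qquad P_{qq}(z) = C_F \, \frac{1+z^2}{1-z}, \qquad C_F = \frac{4}{3},

where z is the fraction of its momentum the quark keeps after the emission and t is a measure of the emission’s virtuality (or transverse momentum). Every additional splitting costs another factor of \alpha_s, which is why fixed-order perturbation theory stops after only a couple of emissions.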

Figure 2: Theorists have to take parton-level predictions that can be reasonably calculated analytically using perturbative QCD (or an effective theory) and, using a combination of tools, turn these calculations into predictions about fully hadronized events. This diagram shows an example of a single parton hadronizing into many final state particles. The 1->2 splitting is calculated in perturbation theory. The second stage, which involves various q->qg, g->gg, etc. splittings (often referred to as “showering”), is often obtained using something called “renormalization group evolution”, which is beyond the scope of this bite. The final stage involves the actual fragmentation of partons into hadrons. This is where non-perturbative physics takes over and fits must often be done to Monte Carlo event generators or data. This diagram can be found in a fantastic talk by one of the paper’s authors at BOOST 2013 (https://goo.gl/2QlkSz).

The popular way of handling the tricky task of connecting NLO calculations with fully hadronized events is by using factorization theorems. What factorization theorems do, in short, is to compartmentalize analytic calculations into pieces involving physics happening at widely separated energy scales. Using perturbative techniques, we calculate relevant processes happening in each regime separately, and then combine those pieces as dictated by the factorization theorem to come up with a full answer. This oftentimes involves splitting up an observable into a perturbatively calculable piece and a non-perturbative piece that must be obtained via a fit to data or a Monte Carlo simulation.
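Schematically, and glossing over all of the process-dependent details (this is a cartoon, not the specific factorization theorem used in the paper), a factorized cross section for some observable X takes the form

\frac{d\sigma}{dX} \sim H(\mu) \otimes J(\mu) \otimes S(\mu) \otimes F_{\rm np}(\mu) + \text{power corrections},

where H encodes the hard, short-distance scattering, J the perturbative radiation inside jets, S the soft radiation between them, and F_{\rm np} a non-perturbative function (such as a track function) that must be fit to data or simulation; \otimes denotes a convolution and \mu is the scale at which each piece is evaluated.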

Warning: These theorems are, in all but a few cases, not rigorously proven, although there are many examples where factorization theorems can be shown to hold up to small power corrections in some parameter. This is often where the power of effective field theories comes in. We’ll address this important and complex issue in future bites.

Where do track-based observables fit into all of this? Once we have a factorization theorem, we have to make sure that we encapsulate the physics in a systematic way that doesn’t have infinities popping up all over the place (as often happens when theorists aren’t careful). The authors of arXiv:1303.6637 define track functions that are combined with perturbatively calculable pieces, via a factorization theorem they discuss, to build the cross section for a track-based observable. To find the track functions, they fit their NLO analytic distributions for the fraction of the total jet energy carried by the jet’s charged particles to simulations done by PYTHIA.
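As a cartoon of how a fitted track function gets used (this is a toy Monte Carlo, not the authors’ actual factorization formula), one can combine a “perturbative” distribution for a parton-level energy fraction z with a track function T(x), the probability density that a fraction x of that parton’s energy ends up on charged tracks. The two distributions below are stand-ins chosen only so the code runs.

```python
import numpy as np

# Toy illustration (NOT the paper's formula): the observed track-based
# fraction is xbar = x * z, sampled by convolving a stand-in parton-level
# distribution for z with a stand-in "track function" for x.

rng = np.random.default_rng(0)

def perturbative_z():
    """Stand-in parton-level energy-fraction distribution (illustrative)."""
    return rng.beta(8.0, 2.0)      # peaked near z ~ 1

def track_function_x():
    """Stand-in fitted track function (illustrative): charged particles
    carry ~2/3 of the energy on average, as naive isospin counting suggests."""
    return rng.beta(6.0, 3.0)      # mean 2/3

# Monte Carlo "convolution": sample z and x independently and multiply.
xbar = np.array([perturbative_z() * track_function_x() for _ in range(100_000)])

# Crude text histogram of the resulting track-based observable.
hist, edges = np.histogram(xbar, bins=20, range=(0.0, 1.0), density=True)
for lo, hi, h in zip(edges[:-1], edges[1:], hist):
    print(f"{lo:.2f}-{hi:.2f} {'#' * int(40 * h / hist.max())}")
```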

Figure 3: The authors plot their analytic calculations combined with fitted track functions against results generated entirely from PYTHIA. Their results show that track functions can do a good job (up to smearing effects that PYTHIA’s hadronization model imposes).

As can be seen in Figure 3, the authors then make predictions for a totally different observable using track functions combined with their analytic calculations and compare them to events generated entirely in PYTHIA, achieving very good results. Let’s take a second to appreciate how powerful an idea this is. By writing down the foundation for track functions, the authors have laid the groundwork for a host of track-based observables that focus specifically on probing the charged particles within a jet. The real impact, however, is that these track functions can be applied to any quarks/gluons undergoing fragmentation in any process. Once track functions are eventually extracted from LHC data, phenomenologists will be able to perform analytic calculations and then, using track functions, make precision predictions about track-based observables that experimentalists can use to more easily extract information about the structure of jets.

Further Reading

  1. “Fragmentation and Hadronization,” by B.R. Webber of Cambridge/CERN. A concise and very clear introduction to the basics of how we model non-perturbative physics and the basic ideas behind how Monte Carlo event generators simulate fragmentation/hadronization.
  2. “Factorization of Hard Processes in QCD”, by Collins, Soper, and Sterman. The grandaddy of papers on factorization by 3 giants in the field. Outlines the few existing proofs of factorization theorems.
  3. “How a detector works.” A quick and very basic introduction to detectors used in particle physics.
Reggie Bain is a grad student in theoretical physics at Duke University. His research focuses on phenomenology in Quantum Chromodynamics, the physics governing the strong nuclear force. His current projects involve the use of “effective” field theories to study the production of heavy mesons and of highly collimated showers of particles called jets at the LHC. As an undergraduate, he worked on high energy physics research with the ATLAS project at the University of South Carolina (USC). Reggie is also passionate about science communication and outreach. He co-founded and directed Carolina Science Outreach at USC, an organization that gives fun and interactive science presentations to K-12 students across South Carolina. As a graduate student, he’s worked on organizing science communication workshops for graduate students. He co-founded the ComSciCon-Triangle workshop series in the NC Research Triangle and has been an organizer for the national ComSciCon workshops at Harvard/MIT since 2015.
