Dark Photons in Light Places

Title: “Searching for dark photon dark matter in LIGO O1 data”

Authors: Huai-Ke Guo, Keith Riles, Feng-Wei Yang, and Yue Zhao

Reference: https://www.nature.com/articles/s42005-019-0255-0

There is very little we know about dark matter save for its existence. Its mass(es), its interactions, even the proposition that it consists of particles at all are mostly up to the creativity of the theorist. For those who don’t turn to modified theories of gravity to explain the gravitational effects on galaxy rotation and clustering that suggest a massive concentration of unseen matter in the universe (among other compelling evidence), there are a few more widely accepted explanations for what dark matter might be. These include weakly interacting massive particles (WIMPs), primordial black holes, or new particles altogether, such as axions or dark photons.

In particle physics, this latter category is what’s known as the “hidden sector,” a hypothetical collection of quantum fields and their corresponding particles that theorists keep in their toolboxes to help explain phenomena such as dark matter. In order to test the validity of the hidden sector, several experimental techniques have been concocted to narrow down the vast parameter space of possibilities; these generally fall into three strategies:

  1. Direct detection: Detector experiments look for the low-energy nuclear recoils produced when dark matter particles collide with nuclei, often observed via emitted light or phonons. 
  2. Indirect detection: These searches focus on potential decay products of dark matter particles, which depend on the theory in question.
  3. Collider production: As the name implies, colliders seek to produce dark matter in order to study its properties. This is reliant on the other two methods for verification.

The first detection of gravitational waves from a black hole merger in 2015 ushered in a new era of physics, in which the cosmological range of theory-testing is no longer limited to the electromagnetic spectrum. Bringing LIGO (the Laser Interferometer Gravitational-Wave Observatory) to the table, proposals for the indirect detection of dark matter via gravitational waves began to spring up in the literature, with implications for primordial black hole detection or dark matter ensconced in neutron stars. Yet a new proposal, in a paper by Guo et al., suggests that direct dark matter detection with gravitational-wave detectors may be possible, specifically in the case of dark photons. 

Dark photons are hidden sector particles in the ultralight regime of dark matter candidates. Theorized as the gauge boson of a new U(1) gauge group, meaning the particle is a force-carrier akin to the photon of quantum electrodynamics, dark photons either do not couple at all or couple only very weakly to Standard Model particles, depending on the formulation. Unlike a regular photon, dark photons can acquire a mass, for instance via a Higgs mechanism. Since dark photons need to be non-relativistic in order to meet cosmological dark matter constraints, we can model them as a coherently oscillating background field: a plane wave whose amplitude is set by the dark matter energy density and whose oscillation frequency is set by the dark photon mass. If dark photons interact weakly with ordinary matter, this background imparts an oscillating force. This sets LIGO up as a means of direct detection, since that force would induce a tiny displacement of the LIGO mirrors.
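
To make this modeling concrete, here is a rough numerical sketch (my own illustration, with an assumed dark photon mass chosen purely for convenience, not a value from the paper): the oscillation frequency follows from the mass via f = m c^2 / h, and the field’s coherence length is roughly the de Broglie wavelength set by the typical galactic dark matter velocity of about 10^{-3} c.

  # Rough numbers for an ultralight dark photon treated as a coherently
  # oscillating classical field (the mass below is an assumed example value).
  h_planck = 6.626e-34    # Planck constant, J*s
  eV = 1.602e-19          # joules per electron-volt
  c = 3.0e8               # speed of light, m/s

  m_A_eV = 4e-13          # assumed dark photon mass in eV
  v_dm = 1e-3 * c         # typical galactic dark matter velocity, ~10^-3 c

  f_osc = m_A_eV * eV / h_planck                         # oscillation frequency, f = m c^2 / h
  lambda_coh = h_planck / ((m_A_eV * eV / c**2) * v_dm)  # coherence length ~ h / (m v)

  print(f"oscillation frequency ~ {f_osc:.0f} Hz")       # ~100 Hz, in LIGO's band
  print(f"coherence length ~ {lambda_coh:.1e} m")        # ~10^9 m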

Figure 1: The experimental setup of the Advanced LIGO interferometer. We can see that light leaves the laser and is reflected between a few power recycling mirrors (PR), split by a beam splitter (BS), and bounced between input and end test masses (ITM and ETM). The entire system is mounted on seismically-isolated platforms to reduce noise as much as possible. Source: https://arxiv.org/pdf/1411.4547.pdf

LIGO consists of a Michelson interferometer, in which a laser shines upon a beam splitter which in turn creates two perpendicular beams. The light from each beam then hits a mirror, is reflected back, and the two beams combine, producing an interference pattern. In the actual LIGO detectors, the beams are reflected back some 280 times (down a 4 km arm length) and are split to be initially out of phase so that the photodiode detector should not detect any light in the absence of a gravitational wave. A key feature of gravitational waves is their polarization, which stretches spacetime in one direction and compresses it in the perpendicular direction in an alternating fashion. This means that when a gravitational wave passes through the detector, the effective length of one of the interferometer arms is reduced while the other is increased, and the photodiode will detect an interference pattern as a result. 
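
As a toy illustration of why folding the beam helps (treating the arm as a simple delay line with the ~280 round trips described above; the displacement value below is an arbitrary placeholder, not a measured number), the phase shift accumulated at the photodiode scales with the number of bounces:

  import math

  N_bounces = 280          # round trips quoted above
  L_arm = 4000.0           # arm length, m
  wavelength = 1.064e-6    # Advanced LIGO's Nd:YAG laser wavelength, m
  delta_L = 1e-18          # assumed differential arm-length change, m (placeholder)

  effective_path = 2 * N_bounces * L_arm                # total optical path per arm
  delta_phi = 4 * math.pi * N_bounces * delta_L / wavelength

  print(f"effective optical path ~ {effective_path / 1000:.0f} km")   # ~2,240 km
  print(f"accumulated phase shift ~ {delta_phi:.1e} rad")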

LIGO has been able to reach an incredible sensitivity of one part in 10^{23} in its detectors over a 100 Hz bandwidth, meaning that its instruments can detect mirror displacements as small as 1/10,000th the size of a proton. Taking advantage of this sensitivity, Guo et al. demonstrated that the differential strain (the ratio of the relative displacement of the mirrors to the interferometer’s arm length, or h = \Delta L/L) is also sensitive to ultralight dark matter via the modeling process described above. The acceleration induced by the dark photon dark matter on the LIGO mirrors is ultimately proportional to the dark electric field and the charge-to-mass ratio of the mirrors themselves.
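
To put that sensitivity in perspective (a back-of-the-envelope conversion, not a statement about the detector’s full noise budget), a strain of about 10^{-23} across a 4 km arm corresponds to a differential mirror displacement of:

  L_arm = 4000.0          # LIGO arm length, m
  h_strain = 1e-23        # approximate strain sensitivity quoted above

  delta_L = h_strain * L_arm
  proton_size = 1.7e-15   # rough proton diameter, m

  print(f"differential displacement ~ {delta_L:.1e} m")            # ~4 x 10^-20 m
  print(f"fraction of a proton's size ~ {delta_L / proton_size:.1e}")
  # within an order of magnitude of the "1/10,000th of a proton" figure,
  # depending on the bandwidth and on what one takes as the proton's size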

Once this signal is approximated, next comes the task of estimating the background. Since the coherence length is of order 10^9 m for a dark photon field oscillating at order 100 Hz, a distance much larger than the separation between the LIGO detectors at Hanford and Livingston (in Washington and Louisiana, respectively), the signals from dark photons at both detectors should be highly correlated. This has the effect of reducing the noise in the overall signal, since the noise in each of the detectors should be statistically independent. The signal-to-noise ratio can then be computed directly using discrete Fourier transforms applied to segments of data spanning the total observation time. However, this process of breaking the data up into segments, known as “binning,” means that some signal power is lost and must be corrected for.
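
Here is a heavily simplified toy version of that correlation idea (my own numpy sketch, not the authors’ pipeline; the signal amplitude, sample rate, and segment choices are arbitrary): a common oscillation buried in independent noise at two detectors survives in the averaged cross-spectrum, while the uncorrelated noise averages down.

  import numpy as np

  rng = np.random.default_rng(0)
  fs, n_seg = 1024, 256                       # sample rate (Hz), number of data segments
  n = 4 * fs                                  # samples per 4-second segment
  t = np.arange(n) / fs
  f_sig = 100.0                               # assumed dark photon oscillation frequency (Hz)

  cross = np.zeros(n // 2 + 1, dtype=complex)
  for _ in range(n_seg):
      phase = rng.uniform(0, 2 * np.pi)       # field phase: common to both sites within a segment
      common = 3e-2 * np.cos(2 * np.pi * f_sig * t + phase)
      d1 = common + rng.normal(size=n)        # Hanford-like stream: signal plus its own noise
      d2 = common + rng.normal(size=n)        # Livingston-like stream: same signal, independent noise
      cross += np.fft.rfft(d1) * np.conj(np.fft.rfft(d2))

  freqs = np.fft.rfftfreq(n, 1 / fs)
  peak = freqs[np.argmax(np.abs(cross))]
  print(f"largest cross-spectral power at ~ {peak:.2f} Hz")   # recovers ~100 Hz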

Figure 2: The end result of the Guo et al. analysis of dark photon-induced mirror displacement in LIGO. The plot shows limits on the coupling of dark photons to baryons as a function of the dark photon oscillation frequency. Over further Advanced LIGO runs, up to O4-O5, these limits are expected to improve by several orders of magnitude. Source: https://www.nature.com/articles/s42005-019-0255-0

In applying this analysis to the strain data from the first observing run of Advanced LIGO, Guo et al. generated a plot which sets new limits on the coupling of dark photons to baryons as a function of the dark photon oscillation frequency. There are a few key subtleties in this analysis, primarily that there are many potential dark photon models relying on different gauge groups; the framework presented here, however, allows for a similar analysis of those other models. With plans for future iterations of gravitational wave detectors, further improved sensitivities, and many more data runs, there seems to be great potential to apply LIGO to direct dark matter detection. It’s exciting to see these instruments in action for discoveries that were not in mind when LIGO was first designed, and I’m looking forward to seeing what we can come up with next!

Learn More:

  1. An overview of gravitational waves and dark matter: https://www.symmetrymagazine.org/article/what-gravitational-waves-can-say-about-dark-matter
  2. A summary of dark photon experiments and results: https://physics.aps.org/articles/v7/115 
  3. Details on the hardware of Advanced LIGO: https://arxiv.org/pdf/1411.4547.pdf
  4. A similar analysis done by Pierce et al.: https://journals.aps.org/prl/pdf/10.1103/PhysRevLett.121.061102

To the Standard Model, and Beyond! with Kaon Decay

Title: “New physics implications of recent search for K_L \rightarrow \pi^0 \nu \bar{\nu} at KOTO”

Authors: Kitahara et al.

Reference: https://arxiv.org/pdf/1909.11111.pdf

The Standard Model, though remarkably accurate in its depiction of many physical processes, is incomplete. There are a few key reasons to think this: most prominently, it fails to account for gravitation, dark matter, and dark energy. There are also a host of more nuanced issues: it is plagued by “fine tuning” problems, whereby its parameters must be tweaked in order to align with observation, and “free parameter” problems, which come about since the model requires the direct insertion of parameters such as masses and charges rather than providing explanations for their values. This strongly points to the existence of as-yet undetected particles and the inevitability of a higher-energy theory. Since gravity should be a theory living at the Planck scale, at which both quantum mechanics and general relativity become relevant, this is to be expected. 

A promising strategy for probing physics beyond the Standard Model is to look at decay processes that are incredibly rare in nature, since their small theoretical uncertainties mean that only a few event detections are needed to signal new physics. A primary example of this scenario in action is the discovery of the positron via particle showers in a cloud chamber back in 1932. Since particle physics models of the time predicted zero anti-electron events during these showers, just one observation was enough to herald a new particle. 

The KOTO experiment, conducted at the Japan Proton Accelerator Research Complex (J-PARC), takes advantage of this strategy. The experiment was designed specifically to investigate a promising rare decay channel: K_L \rightarrow \pi^0 \nu \bar{\nu}, the decay of a long-lived neutral kaon into a neutral pion, a neutrino, and an antineutrino. Let’s break down this interaction and discuss its significance. The neutral kaon, a meson composed of a down quark and an anti-strange quark, comes in long-lived and short-lived varieties (K_L and K_S), named for their lifetimes relative to each other. The Standard Model predicts a branching ratio of 3 \times 10^{-11} for this particular decay process, meaning that out of all the neutral long kaons that decay, only this tiny fraction of them decay into the combination of a neutral pion, a neutrino, and an antineutrino, making it incredibly rare for this process to be observed in nature.
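
A toy calculation helps convey just how rare that is (the kaon count and signal efficiency below are hypothetical placeholders, not KOTO’s real exposure): the expected number of signal events is roughly the number of K_L decays sampled, times the branching ratio, times the detector acceptance.

  n_KL = 3e12        # hypothetical number of K_L decays sampled over a run
  BR = 3e-11         # Standard Model branching ratio quoted above
  acceptance = 1e-3  # hypothetical signal efficiency after all selection cuts

  expected = n_KL * BR * acceptance
  print(f"expected Standard Model signal events ~ {expected:.2f}")  # well below one event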

The Feynman diagram describing how a neutral pion, neutrino, and antineutrino are produced from a neutral long kaon. We note the production of two photons, a key observation for the KOTO experiment’s verification of event detection, as this differentiates this process from other neutral long kaon decay channels. Source: https://arxiv.org/pdf/1910.07585.pdf

Here’s where it gets exciting. The KOTO experiment recently reported four signal events within this decay channel where the Standard Model predicts just 0.10 \pm 0.02 events. If all four of these events are confirmed as the desired neutral long kaon decays, new physics is required to explain the enhanced signal. There are several possibilities, recently explored in a new paper by Kitahara et al., for what this new physics might be. Before we go into too much detail, let’s consider how KOTO’s observation came to be.
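
As a rough way to see why four events against an expectation of 0.10 is striking (a naive Poisson estimate that ignores systematic uncertainties; the collaboration’s actual statistical treatment is more careful):

  from math import exp, factorial

  mu = 0.10    # expected number of events quoted above
  n_obs = 4    # reported candidate events

  # probability of seeing n_obs or more events if only the expected mu were present
  p = 1.0 - sum(exp(-mu) * mu**k / factorial(k) for k in range(n_obs))
  print(f"P(N >= {n_obs} | mu = {mu}) ~ {p:.1e}")   # a few parts in a million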

The KOTO experiment is a fixed-target experiment, in which particles are accelerated and collide with something stationary. In this case, protons at an energy of 30 GeV collide with a gold target, producing a beam of kaons after other products are diverted with collimators and magnets. Observing the desired K_L \rightarrow \pi^0 \nu \bar{\nu} mode is particularly difficult for several experimental reasons. First, the initial kaon and the final decay products are all electrically neutral, making them harder to detect since they do not leave ionization trails, the primary means of detecting charged particles. Second, neutral pions are produced via several other kaon decay channels, requiring several strategies to differentiate neutral pions produced by K_L \rightarrow \pi^0 \nu \bar{\nu} from those produced in K_L \rightarrow 3 \pi^0, K_L \rightarrow 2\pi^0, and K_L \rightarrow \pi^0 \pi^+ \pi^-, among others. As we can see in the Feynman diagram above, our desired decay mode has the advantage of producing two photons, allowing KOTO to use these photons and their transverse momentum to pinpoint a K_L \rightarrow \pi^0 \nu \bar{\nu} decay. In terms of experimental construction, KOTO included charged-particle veto detectors to reject events with charged particles in the final state, and a systematic study of background events was performed to discount hadron showers originating from neutrons in the beam line. 
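
A minimal sketch of the kinematics behind that selection (my own simplified example; the photon energies, directions, and cut width are made up for illustration and are not KOTO’s reconstruction code): the two photons’ invariant mass should be consistent with the neutral pion mass, and the pair should carry appreciable transverse momentum because the neutrinos escape with the rest.

  import numpy as np

  M_PI0 = 134.977  # neutral pion mass, MeV

  def diphoton_mass_and_pt(e1, dir1, e2, dir2):
      """Invariant mass and transverse momentum (relative to the beam z-axis)
      of a photon pair, from energies in MeV and direction vectors."""
      p1 = e1 * np.asarray(dir1, float) / np.linalg.norm(dir1)
      p2 = e2 * np.asarray(dir2, float) / np.linalg.norm(dir2)
      e_tot, p_tot = e1 + e2, p1 + p2
      m = np.sqrt(max(e_tot**2 - p_tot @ p_tot, 0.0))
      return m, np.hypot(p_tot[0], p_tot[1])

  # hypothetical photon clusters, chosen so the pair reconstructs near the pi0 mass
  m, pt = diphoton_mass_and_pt(300.0, [0.40, 0.0, 1.0], 160.0, [-0.25, 0.05, 1.0])
  print(f"m_gamma_gamma ~ {m:.0f} MeV, pT ~ {pt:.0f} MeV")
  print("pi0 candidate:", abs(m - M_PI0) < 15.0)   # illustrative mass window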

This setup was in service of KOTO’s goal to explore the question of CP violation with long kaon decay. CP violation refers to the violation of charge-parity symmetry, the combination of charge-conjugation symmetry (in which a theory is unchanged when we swap a particle for its antiparticle) and parity symmetry (in which a theory is invariant when left and right directions are swapped). We seek to understand why some processes seem to preserve CP symmetry when the Standard Model allows for violation, as is the case in quantum chromodynamics (QCD), and why some processes break CP symmetry, as is seen in the quark mixing matrix (CKM matrix) and the neutrino oscillation matrix. Overall, CP violation has implications for matter-antimatter asymmetry, the question of why the universe seems to be composed predominantly of matter when particle creation and decay processes produce equal amounts of both matter and antimatter. An imbalance of matter and antimatter in the universe could be created if CP violation existed under the extreme conditions of the early universe, mere seconds after the Big Bang. Explanations for matter-antimatter asymmetry that do not involve CP violation generally require the existence of primordial matter-antimatter asymmetry, effectively dodging the fundamental question. The observation of CP violation with KOTO could provide critical evidence toward an eventual answer.  

The Kitahara paper provides three interpretations of KOTO’s observation that incorporate physics beyond the Standard Model: new heavy physics, new light physics, and new particle production. The first, new heavy physics, amplifies the expected Standard Model signal via the incorporation of new operators that couple to existing Standard Model particles. If this coupling is suppressed, it could adequately explain the observed boost in the branching ratio. New light physics involves reinterpreting the neutrino-antineutrino pair as a new light particle. Factoring in experimental constraints, this new light particle should decay with a finite lifetime on the order of 0.1-0.01 nanoseconds, making it almost completely invisible to experiment. Finally, new particles could be produced within the K_L \rightarrow \pi^0 \nu \bar{\nu} decay channel; such a particle should be light and long-lived in order to allow for its decay to two photons. The details of these new particle scenarios are subject to constraints from other particle physics processes, but each serves to increase the branching ratio through direct production of more final-state particles. On the whole, this demonstrates the potential of the K_L \rightarrow \pi^0 \nu \bar{\nu} channel to provide a window to physics beyond the Standard Model. 

Of course, this analysis presumes the accuracy of KOTO’s four signal events. Pending the confirmation of these detections, there are several exciting possibilities for physics beyond the Standard Model, so be sure to keep your eye on this experiment!

Learn More:

  1. An overview of the KOTO experiment’s data taking: https://arxiv.org/pdf/1910.07585.pdf
  2. A description of the sensitivity involved in the KOTO experiment’s search: https://j-parc.jp/en/topics/2019/press190304.html
  3. More insights into CP violation: https://www.symmetrymagazine.org/article/charge-parity-violation

The early universe in a detector: investigations with heavy-ion experiments

Title: “Probing dense baryon-rich matter with virtual photons”

Author: HADES Collaboration

Reference: https://www.nature.com/articles/s41567-019-0583-8

The quark-gluon plasma, a sea of unbound quarks and gluons moving at relativistic speeds thought to exist at extraordinarily high temperature and density, is a phase of matter critical to our understanding of the early universe and extreme stellar interiors. Within the first microseconds after the Big Bang, the matter in the universe is postulated to have been in a quark-gluon plasma phase, before the universe expanded, cooled, and formed the hadrons we observe today from constituent quarks and gluons. The study of quark matter, the range of phases formed from quarks and gluons, can provide us with insight into this evanescent early epoch, providing an intriguing focus for experimentation. Dense astrophysical objects such as neutron stars are also thought to house the necessary conditions for the formation of quark-gluon plasma at their cores. With the accumulation of new data from neutron star mergers, studies of quark matter are becoming increasingly productive and ripe for new discovery. 

Quantum chromodynamics (QCD) is the theory of quarks and the strong interaction between them. In this theory, quarks and force-carrying gluons, the aptly-named particles that “glue” quarks together, have a “color” charge analogous to electric charge in quantum electrodynamics (QED). In QCD, the gluon field is often modeled as a narrow tube between two color charges with a constant strong force between them, in contrast with the inverse-square dependence on distance for fields in QED. The pair potential energy between the quarks increases linearly with separation, eventually surpassing the creation energy for a new quark-antiquark pair. Hence, quarks cannot exist as isolated, unbound particles at low energies, a property known as color confinement. When separation is attempted between quarks, new quarks are instead produced. In particle accelerators, physicists see “jets” of new color-neutral particles (mesons and baryons) formed in the process of hadronization. At high energies, the story changes and hinges on an idea known as asymptotic freedom, in which the strength of particle interactions decreases with increased energy scale in certain gauge theories such as QCD. 
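
To put a rough number on the confinement picture (a schematic estimate using commonly quoted ballpark values, not a result from the HADES paper), one can ask at what separation the energy stored in the flux tube matches the energy needed to pop a light quark-antiquark pair out of the vacuum:

  sigma = 0.9           # string tension of the flux tube, roughly 1 GeV per femtometer
  pair_threshold = 1.1  # rough energy to create a light quark-antiquark pair, GeV

  r_break = pair_threshold / sigma      # separation where V(r) ~ sigma * r reaches the threshold
  print(f"the flux tube 'breaks' around r ~ {r_break:.1f} fm")   # roughly the size of a hadron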

A Feynman diagrammatic scheme of the production of new hadrons from electron-positron annihilation. We observe an electron-positron pair annihilating to a virtual photon, which then produces many hadrons via hadronization. Source: https://cds.cern.ch/record/317673

QCD matter is commonly probed with heavy-ion collision experiments, and quark-gluon plasma has been produced before in minute quantities at the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Lab as well as the Large Hadron Collider (LHC) at CERN. The goal of these experiments is to create conditions similar to those of the early universe or at the center of dense stars, which requires intense temperatures and an abundance of quarks. Heavy ions, such as gold or lead nuclei, fit this bill when smashed together at relativistic speeds. When these collisions occur, the resulting “fireball” of quarks and gluons is unstable and quickly decays into a barrage of new, stable hadrons via the hadronization process discussed above. 

There are several main goals of heavy-ion collision experiments around the world, revolving around the study of the phase diagram for quark matter. The first component of this is the search for the critical point: the endpoint of the line of first-order phase transitions. The phase transition between hadronic matter, in which quarks and gluons are confined, and partonic matter, in which they are dissociated in a quark-gluon plasma, is also an active area of investigation. There is additionally an ongoing search for chiral symmetry restoration at finite temperature and finite density. Chiral symmetry refers to the (approximate) invariance of QCD under independent transformations of left-handed and right-handed quarks, that is, quarks whose spins are anti-aligned or aligned with their momenta. In the QCD vacuum this symmetry is spontaneously broken, and several experiments are designed to investigate evidence of its restoration under the hot, dense conditions created in heavy-ion collisions.

The phase diagram for quark matter, typically drawn in the plane of temperature and baryon chemical potential, has many unknown points of interest. Source: https://www.sciencedirect.com/science/article/pii/S055032131630219X

The HADES (High-Acceptance DiElectron Spectrometer) collaboration is a group attempting to address such questions. In a recent experiment, HADES focused on the creation of quark matter via collisions of a beam of Au (gold) ions with a stack of Au foils. Dileptons, lepton-antilepton pairs that emerge from the decay of virtual particles, are a key element of HADES’ findings. In quantum field theory (QFT), in which particles are modeled as excitations in an underlying field, virtual particles can be thought of as transient excitations of a field whose fleeting existence is permitted by the uncertainty principle. Virtual particles are represented by internal lines in Feynman diagrams, are used as tools in calculations, and are never isolated or measured on their own; they appear only as intermediaries exchanged between ordinary particles. In the HADES experiment, virtual photons decay into dileptons, which immediately decouple from the strong force. Produced at all stages of the QCD interaction, they are ideal messengers of any modification of hadron properties. They are also thought to contain information about the thermal properties of the underlying medium. 

To actually extract this information, the HADES detector utilizes a time-of-flight chamber and ring-imaging Cherenkov (RICH) chamber, which identifies particles using the characteristics of Cherenkov radiation: electromagnetic radiation emitted when a particle travels through a dielectric medium at a velocity greater than the phase velocity of light in that particular medium. The detector is then able to measure the invariant mass, rapidity (a commonly-used substitute measure for relativistic velocity), and transverse momentum of emitted electron-positron pairs, the dilepton of choice. In accelerator experiments, there are typically a number of selection criteria in place to ensure that the machinery is detecting the desired particles and the corresponding data is recorded. When a collision event occurs within HADES, a number of checks are in place to ensure that only electron-positron events are kept, factoring in both the number of detected events and detector inefficiency, while excess and background data is thrown out. The end point of this data collection is a calculation of the four-momenta of each lepton pair, a description of its relativistic energy and momentum components. This allows for the construction of a dilepton spectrum: the distribution of the invariant masses of detected dileptons. 
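
That last step boils down to a simple relativistic formula (the four-momenta below are made-up values just to exercise it, not HADES data): for each electron-positron pair, the invariant mass is M = sqrt((E_1 + E_2)^2 - |p_1 + p_2|^2), and a histogram of M over many pairs is the dilepton spectrum.

  import math

  def invariant_mass(p4_a, p4_b):
      """Invariant mass of a pair from four-momenta (E, px, py, pz), here in GeV."""
      E = p4_a[0] + p4_b[0]
      px, py, pz = (p4_a[i] + p4_b[i] for i in (1, 2, 3))
      return math.sqrt(max(E**2 - (px**2 + py**2 + pz**2), 0.0))

  # hypothetical reconstructed electron and positron four-momenta (GeV)
  electron = (0.450, 0.10, -0.05, 0.436)
  positron = (0.300, -0.08, 0.02, 0.288)
  print(f"m_ee ~ {invariant_mass(electron, positron) * 1000:.0f} MeV")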

The main takeaway from this experiment was the observation of an excess of dilepton events whose invariant-mass distribution falls off exponentially, in contrast with the yield expected from ordinary hadronic decays. This suggests a shift in the properties of the underlying matter, with a reconstructed temperature above 70 MeV (note that particle physicists tend to quote temperatures in the more convenient units of electron volts). The kicker comes when the group compares these results to simulated neutron star mergers, with expected core temperatures of 75 MeV. This means that the bulk matter created within HADES is similar to the highly dense matter formed in such mergers, a comparison which has become accessible only recently thanks to multi-messenger observations incorporating both electromagnetic and gravitational wave data. 
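
For readers more used to temperatures in kelvin, the conversion is just T[K] = T[MeV] / k_B (a unit conversion, nothing more):

  k_B = 8.617e-11   # Boltzmann constant, MeV per kelvin
  T_MeV = 70.0      # reconstructed temperature quoted above
  print(f"{T_MeV:.0f} MeV corresponds to ~ {T_MeV / k_B:.1e} K")   # ~8 x 10^11 kelvin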

Practically, we see that HADES’ approach is quite promising for future studies of matter under extreme conditions, with the potential to reveal much about the state of the universe early on in its history as well as probe certain astrophysical objects — an exciting realization! 

Learn More:

  1. https://home.cern/science/physics/heavy-ions-and-quark-gluon-plasma
  2. https://www-hades.gsi.de/
  3. https://profmattstrassler.com/articles-and-posts/particle-physics-basics/virtual-particles-what-are-they/