The P5 Report & The Future of Particle Physics (Part 1)

Particle physics is the epitome of ‘big science’. Answering our most fundamental questions about physics requires world-class experiments that push the limits of what’s technologically possible. Such incredibly sophisticated experiments, like those at the LHC, require big facilities to make them possible, big collaborations to run them, big project planning to make dreams of new facilities a reality, and committees with big acronyms to decide what to build.

Enter the Particle Physics Project Prioritization Panel (aka P5), which is tasked with assessing the landscape of future projects and laying out a roadmap for the future of the field in the US. And because these large projects are inevitably international endeavors, the report the panel released last week has a large impact on the global direction of the field. The report lays out a vision for the next decade of neutrino physics, cosmology, dark matter searches, and future colliders.

P5 follows the community-wide brainstorming effort known as the Snowmass Process, in which researchers from all areas of particle physics laid out a vision for the future. The Snowmass process led to a particle physics ‘wish list’, consisting of all the projects and research particle physicists would be excited to work on. The P5 process is the hard part, when this incredibly exciting and diverse research program has to be made to fit within realistic budget scenarios. Advocates for different projects and research areas had to make a case for what science their project could achieve and provide a detailed estimate of the costs. The panel then took in all this input and made a set of recommendations for how the budget should be allocated: which projects should be realized and which hopes dashed. Though the panel only produces a set of recommendations, they are used quite extensively by the Department of Energy, which actually allocates funding. If your favorite project is not endorsed by the report, it’s very unlikely to be funded.

Particle physics is an incredibly diverse field, covering sub-atomic to cosmic scales, so the recommendations are divided up into several different areas. In this post I’ll cover the panel’s recommendations for neutrino physics and the cosmic frontier. Future colliders, perhaps the spiciest topic, will be covered in a follow-up post.

The Future of Neutrino Physics

For those in the neutrino physics community, all eyes were on the panel’s recommendations regarding the Deep Underground Neutrino Experiment (DUNE). DUNE is the US’s flagship particle physics experiment for the coming decade and aims to be the definitive worldwide neutrino experiment in the years to come. A high-powered beam of neutrinos will be produced at Fermilab and sent 800 miles through the earth’s crust towards several large detectors placed in a mine in South Dakota. It’s a much bigger project than previous neutrino experiments, unifying essentially the entire US community into a single collaboration.

DUNE is set up to produce world-leading measurements of neutrino oscillations, the phenomenon by which a neutrino produced in one ‘flavor state’ (e.g. an electron neutrino) gradually changes its state with sinusoidal probability (e.g. into a muon neutrino) as it propagates through space. This oscillation is made possible by a simple quantum mechanical weirdness: a neutrino’s flavor state, i.e. whether it couples to electrons, muons, or taus, is not the same as its mass state. Neutrinos of a definite mass are therefore a mixture of the different flavors, and vice versa.
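To make the ‘sinusoidal probability’ concrete: in the standard two-flavor approximation (a textbook simplification of the full three-flavor analysis DUNE will actually perform), the oscillation probability is

$$P(\nu_e \to \nu_\mu) = \sin^2(2\theta)\,\sin^2\!\left(\frac{\Delta m^2 L}{4E}\right),$$

where $\theta$ is the mixing angle between the flavor and mass states, $\Delta m^2$ is the difference of the squared masses, $L$ is the distance traveled, and $E$ is the neutrino energy. Longer baselines and lower energies push the oscillation further along, which is part of why DUNE’s 800-mile baseline matters.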

Detailed measurements of this oscillation are the best way we know to determine several key neutrino properties. DUNE aims to finally pin down two crucial ones: the ‘mass ordering’, which will solidify how the different neutrino flavors and measured mass differences all fit together, and the amount of ‘CP violation’, which specifies whether neutrinos and their antimatter counterparts behave the same or not. DUNE’s main competitor is the Hyper-Kamiokande experiment in Japan, another next-generation neutrino experiment with similar goals.

A depiction of the DUNE experiment. A high-intensity proton beam at Fermilab is used to create a concentrated beam of neutrinos, which are then sent through 800 miles of the Earth’s crust towards detectors placed deep underground in South Dakota. Source

Construction of the DUNE experiment has been ongoing for several years and unfortunately has not been going quite as well as hoped. It has faced significant schedule delays and cost overruns. DUNE is now not expected to start taking data until 2031, significantly behind Hyper-Kamiokande’s projected 2027 start. These delays may lead to Hyper-K making these definitive neutrino measurements years before DUNE, which would be a significant blow to the experiment’s impact, and they left many DUNE collaborators worried about its continued support from the community.

It came as a relief, then, when the P5 report re-affirmed the strong science case for DUNE, calling it the “ultimate long baseline” neutrino experiment. The report strongly endorsed the completion of the first phase of DUNE. However, it recommended a pared-down version of its upgrade, advocating for an earlier beam upgrade in lieu of additional detectors. This re-imagined upgrade should still achieve the core physics goals of the original proposal at a significant cost savings. The report, along with news that the beleaguered underground cavern construction in South Dakota is now 90% complete, was certainly welcome holiday news to the neutrino community. It also sets up a decade-long race between DUNE and Hyper-K to be the first to measure these key neutrino properties.

Cosmic Implications

While we normally think of particle physics as focused on the behavior of sub-atomic particles, it’s really the study of fundamental forces and laws, no matter the method. This means that telescopes to study the oldest light in the universe, the Cosmic Microwave Background (CMB), fall into the same budget category as giant accelerators studying sub-atomic particles. Though the experiments in these two areas look very different, the questions they seek to answer are cross-cutting. Understanding how particles interact at very high energies helps us understand the earliest moments of the universe, when such particles were all interacting in a hot dense plasma. Likewise, studying these early moments of the universe and its large-scale evolution can tell us what kinds of particles and forces are influencing its dynamics. When asking fundamental questions about the universe, one needs both the sharpest microscopes and the grandest panoramas possible.

The most prominent example of this blending of the smallest and largest scales in particle physics is dark matter. Some of our best evidence for dark matter comes from analyzing the cosmic microwave background to determine how the primordial plasma behaved. These studies showed that some type of ‘cold’ matter that doesn’t interact with light, aka dark matter, was necessary to form the first clumps that eventually seeded the formation of galaxies. Without it, the universe would be much more soupy and structureless than what we see today.

The “cosmic web” of galaxy clusters from the Millennium simulation. Measuring and understanding this web can tell us a lot about the fundamental constituents of the universe. Source

Determining what dark matter is therefore requires an attack on two fronts: designing experiments here on Earth to attempt to directly detect it, and further studying its cosmic implications to look for more clues as to its properties.

The panel recommended next-generation telescopes to study the CMB as a top priority. The so-called ‘Stage 4’ CMB experiment, CMB-S4, would deploy telescopes at both the South Pole and Chile’s Atacama Desert to better characterize sources of atmospheric noise. The CMB has been studied extensively before, but the increased precision of CMB-S4 could shed light on mysteries like dark energy, dark matter, inflation, and the recent Hubble tension. Given the past fruitfulness of these efforts, I think few doubted the science case for such a next-generation experiment.

A mockup of one of the CMB-S4 telescopes, which will be based in the Chilean desert. Note the person for scale on the right (source)

The P5 report recommended a suite of new dark matter experiments in the next decade, including the ‘ultimate’ liquid-xenon-based dark matter search. Such an experiment would follow in the footsteps of massive noble gas experiments like LZ and XENONnT, which have been hunting for a favored type of dark matter called WIMPs for the last few decades. These experiments essentially build giant vats of liquid xenon, carefully shielded from any sources of external radiation, and look for signs of dark matter particles bumping into any of the xenon atoms. The larger the vat of xenon, the higher the chance a dark matter particle will bump into something. Current-generation experiments hold ~7 tons of xenon, and the next-generation experiment would be even larger. It aims to reach the so-called ‘neutrino floor’, the point at which the experiment would be sensitive enough to observe astrophysical neutrinos bumping into the xenon. Such neutrino interactions would look extremely similar to those of dark matter, and thus represent an unavoidable background which signals the ultimate sensitivity of this type of experiment. WIMPs could still be hiding in a basement below this neutrino floor, but finding them would be exceedingly difficult.
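As a back-of-the-envelope picture of why bigger vats win: the expected number of signal events simply scales with exposure, i.e. target mass times live time. A minimal sketch in Python, with purely illustrative numbers (the 60-ton detector size and the interaction rate are placeholders of my own, not values from the actual proposal):

```python
# Toy illustration: direct-detection reach scales with exposure (mass x time).
# All numbers below are illustrative placeholders, not real experiment values.

AVOGADRO = 6.022e23
XE_MOLAR_MASS_G = 131.3  # grams per mole of xenon


def n_xenon_atoms(mass_tons: float) -> float:
    """Number of xenon atoms (i.e. potential targets) in a detector."""
    return mass_tons * 1e6 / XE_MOLAR_MASS_G * AVOGADRO


def expected_events(mass_tons: float, years: float, rate_per_ton_year: float) -> float:
    """Expected signal count for a given interaction rate per ton-year."""
    return rate_per_ton_year * mass_tons * years


# Compare a ~7-ton current-generation detector to a hypothetical larger one,
# assuming (for illustration only) 0.01 signal events per ton per year.
for mass in (7, 60):
    print(f"{mass:3d} tons: {n_xenon_atoms(mass):.1e} atoms, "
          f"{expected_events(mass, 5, 0.01):.2f} expected events in 5 years")
```

This linear scaling is why the field keeps building bigger detectors, right up until the neutrino floor makes the background, rather than the target mass, the limiting factor.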

A photo of the current XENONnT experiment. This pristine cavity is then filled with liquid xenon and closely monitored for signs of dark matter particles bumping into one of the xenon atoms. Credit: XENON Collaboration

WIMPs are not the only dark matter candidate in town, and recent years have also seen an explosion of interest in the broad range of dark matter possibilities, with axions being a prominent example. Other kinds of dark matter could have very different properties than WIMPs and have had far fewer dedicated experiments searching for them. There is ‘low-hanging fruit’ to pluck in the form of relatively cheap experiments which could achieve world-leading sensitivity. Previously, these ‘table top’-sized experiments had a notoriously difficult time obtaining funding, as they were often crowded out of the budgets by the massive flagship projects. However, small experiments can be crucial to ensuring our best chance of dark matter discovery, as they fill in the blind spots missed by the big projects.

The panel therefore recommended creating a new pool of funding set aside for these smaller-scale projects. Allowing such projects to flourish is important for the vibrancy and scientific diversity of the field, as the centralization of ‘big science’ projects can sometimes lead to unhealthy side effects. This specific recommendation also mirrors a broader trend of the report: an attempt to rebalance the budget portfolio so it is spread more evenly and less dominated by the large projects.

A pie chart comparing the budget portfolio in 2023 (left) versus the projected budget in 2033 (right). Currently most of the budget is taken up by the accelerator upgrades and cavern construction of DUNE, with some amount for the LHC upgrades. But by 2033 the panel recommends a much more equitable balance between the different research areas.

What Didn’t Make It

Any report like this comes with some tough choices. Budget realities mean not all projects can be funded. Besides the paring down of some of DUNE’s upgrades, one of the biggest areas recommended against was ‘accessory experiments at the LHC’. In particular, MATHUSLA and the Forward Physics Facility were two experiments that proposed to build additional detectors near already-existing LHC collision points to look for particles that may be missed by the current experiments. By building new detectors hundreds of meters away from the collision point, shielded by concrete and the earth, they could obtain unique sensitivity to ‘long-lived’ particles capable of traversing such distances. These experiments would follow in the footsteps of the current FASER experiment, which is already producing impressive results.

While FASER found success as a relatively ‘cheap’ experiment, reusing spare detector components from other experiments and situating itself in an existing tunnel, these new proposals were asking for quite a bit more. The scale of these detectors would have required new caverns to be built, significantly increasing the cost. Given the cost and specialized purpose of these detectors, the panel recommended against their construction. These collaborations may now try to find ways to pare down their proposals so they can apply to the new small-project portfolio.

Another major decision by the panel was to recommend against hosting a new Higgs factory collider in the US. But that will be discussed more in a future post.

Conclusions

The P5 panel was faced with a difficult task: the total cost of all the projects they were presented with was three times the budget. But they were able to craft a plan that continues the work of the previous decade, addresses current shortcomings, and lays out an inspiring vision for the future. So far the community seems to be strongly rallying behind it. At the time of writing, over 2700 community members, from undergraduates to senior researchers, have signed a petition endorsing the panel’s recommendations. This strong show of support will be key for turning these recommendations into actual funding, and hopefully for lobbying Congress to increase funding so that even more of this vision can be realized.

For those interested, the full report as well as executive summaries of the different areas can be found on the P5 website. Members of the US particle physics community are also encouraged to sign the petition endorsing the recommendations here.

And stay tuned for part 2 of our coverage, which will discuss the implications of the report on future colliders!

Maleficent dark matter: Part I

We might not have gotten here without dark matter. It was the gravitational pull of dark matter, which makes up most of the mass of galactic structures, that kept heavy elements — the raw material of Earth-like rocky planets — from flying away after the first round of supernovae at the advent of the stelliferous era. Without this invisible pull, all structures would have been much smaller than seen today, and stars much more rare.

Thus with knowledge of dark matter comes existential gratitude. But the microscopic identity of dark matter is one of the biggest scientific enigmas of our times, and what we don’t know could yet kill us. This two-part series is about the dangerous end of our ignorance, reviewing some inconvenient prospects sketched out in the dark matter literature. Reader discretion is advised.

[Note: The scenarios outlined here are based on theoretical speculations about dark matter’s identity. Such as they are, they are unlikely to occur, and even if they do, it would most likely be far beyond the lifetime of our species, let alone that of an individual. In other words, nobody’s sleep or actuarial tables need be disturbed.]

The dark matter wind could blow in mischief. Image source: Freese et al.

Carcinogenic dark matter

Maurice Goldhaber quipped that “you could feel it in your bones” that protons are cosmologically long-lived, as otherwise our bodies would have self-administered a lethal dose of ionizing radiation. (This observation sets a lower limit on the proton lifetime at a comforting 10^7 times the age of the universe.) Could we laugh similarly about dark matter? The Earth is probably amid a wind of particle dark matter, a wind that could trigger fatal ionization in our cells if encountered too frequently. The good news is that if dark matter is made of weakly interacting massive particles (WIMPs), K. Freese and C. Savage report safety: “Though WIMP interactions are a source of radiation in the body, the annual exposure is negligible compared to that from other natural sources (including radon and cosmic rays), and the WIMP collisions are harmless to humans.”
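Goldhaber’s logic is easy to run through with rough numbers (a crude order-of-magnitude sketch of my own; the 70 kg body mass and the assumption that each decay deposits its full ~1 GeV locally are deliberate over-simplifications). A body contains $N_p \sim 2\times 10^{28}$ protons, so at the quoted lifetime limit $\tau_p \sim 10^7 \times 1.4\times 10^{10}\,\mathrm{yr} \approx 10^{17}\,\mathrm{yr}$ the deposited power would be

$$\frac{dE}{dt} \sim \frac{N_p}{\tau_p}\,E_{\rm dep} \approx \frac{2\times 10^{28}}{10^{17}\,\mathrm{yr}} \times 1.6\times 10^{-10}\,\mathrm{J} \approx 30\,\mathrm{J/yr},$$

or a few tenths of a gray per year in a 70 kg body: roughly a hundred times natural background, and a genuinely harmful dose accumulated over a lifetime. A proton lifetime much shorter than this really would be felt in our bones.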

The bad news is that the above statement assumes dark matter is distributed smoothly in the Galactic halo. There are interesting cosmologies in which dark matter collects in high-density “clumps” (a.k.a. “subhalos”, “mini-halos”, or “mini-clusters”). According to J. I. Collar, the Earth encountering these clumps every 30–100 million years could explain why mass extinctions of life occur periodically on that timescale. During transits through the clumps, dark matter particles could undergo high rates of elastic collisions with nuclei in life forms, injecting 100–200 keV of energy per micrometer of transit, just right to “induce a non-negligible amount of radiation damage to all living tissue”. We are in no hurry for the next dark clump.

Eruptive dark matter

If your dark matter clump doesn’t wipe out life efficiently via cancer, A. Abbas and S. Abbas recommend waiting another five million years. It takes that long for the clump’s dark matter to be gravitationally captured by the Earth, settle in its core, self-annihilate, and heat the mantle, setting off planet-wide volcanic fireworks. The resulting chain of events would end, as the authors rattle off enthusiastically, in “the depletion of the ozone layer, global temperature changes, acid rain, and a decrease in surface ocean alkalinity.”
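The engine of this scenario is the interplay between capture and annihilation, written here schematically as in textbook treatments of dark matter accumulating in a body (a sketch, not the authors’ exact equations):

$$\frac{dN}{dt} = C - A N^2,$$

where $N$ is the number of dark matter particles accumulated inside the Earth, $C$ is the rate at which passing particles scatter, lose energy, and become trapped, and $A N^2$ is the annihilation rate, which grows as the captured population builds up in the core. Once the two terms balance, annihilations convert captured dark matter into heat at essentially the full capture rate, and during a transit through a dense clump $C$ would be enormously enhanced.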

Dark matter settling in the Earth’s core could spell doom. Image source: J. Bramante & A. Goodman.

Armageddon dark matter

If cancer and volcanoes are not dark matter’s preferred methods of prompting mass extinction, it could get the job done with old-fashioned meteorite impacts.

It is usually supposed that dark matter occupies a spherical halo that surrounds the visible, star-and-gas-crammed disk of the Milky Way. This baryonic pancake was formed when matter, starting out in a spinning sphere, cooled down by radiating photons and shrank along the axis of rotation; due to conservation of angular momentum, the radial extent was preserved. No such dissipative process is known to govern dark matter, thus it retains its spherical shape. However, a small component of dark matter might still have cooled by emitting some unseen radiation such as “dark photons”. That would result in a “dark disk” sub-structure co-existing in the Galactic midplane with the visible disk. Every 35 million years the Solar System crosses the Galactic midplane, and when that happens, a dark disk with surface density of 10 M_\odot/pc^2 could tidally perturb the Oort Cloud and send comets shooting toward the inner planets, causing periodic mass extinctions. So suggest L. Randall and M. Reece, whose arXiv comment “4 figures, no dinosaurs” is as much a part of particle physics lore as Randall’s book that followed the paper, Dark Matter and the Dinosaurs.

We note in passing that SNOLAB, the underground laboratory in Sudbury, ON that houses the dark matter experiments DAMIC, DEAP, and PICO, and is the future home of NEWS-G, SENSEI, Super-CDMS, and ARGO, is located in the Creighton Mine, where ore deposits were formed by a two-billion-year-old comet impact. Perhaps the dark disk nudges us to detect its parent halo.

A drift (horizontal passage) in Creighton Mine, 2.1 km underground. Around the corner is SNOLAB, where several experiments searching for dark matter are located. The mine owes its existence to a meteorite impact — perhaps triggered by a Galactic disk of dark matter. Photo: N. Raj.

——————
In the second part of the series we will look — if we’re still here — at more surprises that dark matter could have planned for us. Stay tuned.

Bibliography.

[1] Dark Matter collisions with the Human Body, K. Freese & C. Savage, Phys. Lett. B 717 (2012) 25–28.

[2] Clumpy cold dark matter and biological extinctions, J. I. Collar, Phys. Lett. B 368 (1996) 266–269.

[3] Volcanogenic dark matter and mass extinctions, S. Abbas & A. Abbas, Astropart. Phys. 8 (1998) 317–320.

[4] Dark Matter as a Trigger for Periodic Comet Impacts, L. Randall & M. Reece, Phys. Rev. Lett. 112 (2014) 161301.

[5] Dark Matter and the Dinosaurs, L. Randall, Harper Collins: Ecco Press (2015).

Hullabaloo Over The Hubble Constant

Title: The Expansion of the Universe is Faster than Expected

Author: Adam Riess

Reference: Nature, arXiv

There is a current crisis in the field of cosmology, and it may lead to our next breakthrough in understanding the universe. In the late 1990s, measurements of distant supernovae showed that, contrary to expectations at the time, the universe’s expansion was accelerating rather than slowing down. This implied the existence of a mysterious “dark energy” throughout the universe, propelling this accelerated expansion. Today, some once again think that our measurements of the current expansion rate, the Hubble constant, indicate that there is something about the universe we don’t understand.

The current cosmological standard model, called ΛCDM, is a phenomenological model describing all the contents of the universe. It includes regular visible matter, Cold Dark Matter (CDM), and dark energy. It is an extremely bare-bones model, assuming dark matter interacts only gravitationally and that dark energy is just a simple cosmological constant (Λ) which gives a constant energy density to space itself. For the last 20 years this model has been rigorously tested, but new measurements might be beginning to show that it has some holes. Measurements of the early universe, interpreted with ΛCDM and extrapolated to today, predict a different rate of expansion than what is currently being measured, and cosmologists are taking this war over the Hubble constant very seriously.

The Measurements

On one side of this Hubble controversy are measurements of the early universe. The most important of these is based on the Cosmic Microwave Background (CMB), light from the hot plasma of the Big Bang that has been traveling for billions of years directly to our telescopes. This light from the early universe is nearly uniform in temperature, but by analyzing the pattern of slightly hotter and colder spots, cosmologists can extract the 6 free parameters of ΛCDM. These parameters encode the relative amounts of energy contained in regular matter, dark matter, and dark energy. Based on these parameters, they can then infer what the current expansion rate of the universe should be. The current best measurements of the CMB come from the Planck collaboration, which can infer the Hubble constant with a precision of less than 1%.
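The rigidity of this extrapolation is easiest to see in the Friedmann equation for a flat ΛCDM universe: once the density parameters are fixed by the CMB fit, the expansion rate at every epoch, including today, is fully determined. A minimal sketch (the parameter values are illustrative, in the neighborhood of the Planck best fits; this is of course no substitute for the full CMB likelihood analysis):

```python
import numpy as np

# Flat LambdaCDM expansion history: H(z) = H0 * E(z).
H0 = 67.4          # km/s/Mpc (illustrative, near the Planck-inferred value)
OMEGA_M = 0.315    # matter density fraction (dark + ordinary)
OMEGA_R = 9e-5     # radiation density fraction
OMEGA_L = 1.0 - OMEGA_M - OMEGA_R  # dark energy, fixed by assumed flatness

def hubble_rate(z: float) -> float:
    """Expansion rate at redshift z, in km/s/Mpc, for flat LambdaCDM."""
    E_squared = OMEGA_M * (1 + z)**3 + OMEGA_R * (1 + z)**4 + OMEGA_L
    return H0 * np.sqrt(E_squared)

# The same handful of parameters fixes the expansion rate both at
# recombination (z ~ 1100, where the CMB was emitted) and today (z = 0).
for z in (0.0, 1.0, 1100.0):
    print(f"H(z = {z:6.1f}) = {hubble_rate(z):12.1f} km/s/Mpc")
```

There is no extra dial to turn: if the parameters fit the CMB, the value of H(0) comes out as a prediction, ready to be compared with local measurements.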

The Cosmic Microwave Background (CMB). Blue spots are slightly colder than average and red spots are slightly hotter. By fitting a model to this data, one can determine the energy contents of the early universe.

On the other side of the debate are the late-universe (or local) measurements of the expansion. The most famous of these is based on a ‘distance ladder’, where several stages of measurements are used to calibrate the distances of astronomical objects. First, geometric properties are used to calibrate the brightness of pulsating stars (Cepheids). Cepheids are then used to calibrate the absolute brightness of exploding supernovae. The expansion rate of the universe can then be measured by relating the redshift (the amount the light from these objects has been stretched by the universe’s expansion) to the distance of these supernovae. This is the method that was used to discover dark energy in the 1990s and earned its pioneers a Nobel prize. As more data has been collected and techniques have been refined, the measurement’s precision has improved dramatically.
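At low redshift, the final rung of the ladder amounts to fitting Hubble’s law, $v \approx H_0 d$, to the calibrated supernova distances. The sketch below does this on synthetic data (invented purely for illustration; the real analysis must handle peculiar velocities, calibration systematics, and full covariances):

```python
import numpy as np

# Toy final rung of the distance ladder: fit v = H0 * d to "calibrated" SNe.
rng = np.random.default_rng(0)

TRUE_H0 = 73.0  # km/s/Mpc, used only to generate the fake data
n_sne = 50

distances = rng.uniform(20, 400, n_sne)  # Mpc, as if from Cepheid calibration
# Recession velocities with 5% measurement scatter:
velocities = TRUE_H0 * distances * (1 + rng.normal(0.0, 0.05, n_sne))

# Least-squares slope of a line through the origin: H0 = sum(v*d) / sum(d^2)
H0_fit = np.sum(velocities * distances) / np.sum(distances**2)
print(f"fitted H0 = {H0_fit:.1f} km/s/Mpc")
```

The punchline of the controversy is that this locally fitted slope comes out systematically higher than the CMB-inferred value.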

In the last few years the tension between the two values of the Hubble constant has steadily grown. This has led cosmologists to scrutinize both sets of measurements very closely, but so far no flaws have been found. Both of these measurements are incredibly complex, and many cosmologists still assumed that some unknown systematic error in one of them was the culprit. But recently, other measurements of both the early and late universe have started to weigh in, and they seem to agree with the Planck and distance ladder results, respectively. Currently the tension between the early and late measurements of the Hubble constant sits between 4 and 6 sigma, depending on which set of measurements you combine. While there are still many who believe there is something wrong with the measurements, others have started to take seriously that this is pointing to a real issue with ΛCDM, and that there is something in the universe we don’t understand. In other words, New Physics!
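For a concrete sense of what ‘4 to 6 sigma’ means here: the significance is the gap between the two values in units of their combined uncertainty. Taking as an example the Planck-inferred value of $67.4 \pm 0.5$ km/s/Mpc and a distance-ladder value of $74.0 \pm 1.4$ km/s/Mpc (numbers close to those quoted around the time of writing),

$$\frac{|74.0 - 67.4|}{\sqrt{1.4^2 + 0.5^2}} \approx \frac{6.6}{1.5} \approx 4.4\sigma.$$

Combining additional measurements on each side shifts this figure, which is why the quoted tension spans a range rather than a single number.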

A comparison of the early-universe and late-universe measurements of the Hubble constant. Different combinations of measurements are shown for each. The tension is between 4 and 6 sigma, depending on which set of measurements you combine.

The Models

So what ideas have theorists put forward to explain the disagreement? In general, theorists have actually had a hard time coming up with models that explain it without running afoul of the multitude of other cosmological data we have, but some solutions have been found. Two of the most promising approaches involve changing the composition of the universe just before the time the CMB was emitted.

The first of these is called Early Dark Energy. It is a phenomenological model that posits the existence of another type of dark energy that behaves similarly to a cosmological constant early in the universe but then fades away relatively quickly as the universe expands. This model is able to slightly improve Planck’s fit to the CMB data while changing the contents of the early universe enough to alter the predicted Hubble constant to be consistent with the local value. Critics of the model feel that its parameters had to be finely tuned for the solution to work. However, there has been some work on mimicking its success with a particle-physics-based model.

The other notable attempt at resolving the tension involves adding additional types of neutrinos and positing that neutrinos interact with each other much more strongly than the Standard Model predicts. This similarly changes the interpretation of the CMB measurements to predict a larger expansion rate. The authors also posit that this new physics in the neutrino sector may be related to current anomalies seen in neutrino physics experiments that likewise lack an explanation. However, follow-up work has shown that it is hard to reconcile such strongly self-interacting neutrinos with laboratory experiments and other cosmological probes.

The Future

At present the situation remains very unclear. Some cosmologists believe this is the end of ΛCDM, and others still believe there is an issue with one of the measurements. For those who believe new physics is the solution, there is no consensus about what the best model is. However, the next few years should start to clarify things. Other late-universe measurements of the Hubble constant, using gravitational lensing or even gravitational waves, should continue to improve in precision and could give skeptics greater confidence in the distance ladder result. Next-generation CMB experiments will eventually come online as well, offering greater precision than the Planck measurement. Theorists will probably come up with more possible resolutions, and point out additional measurements that can confirm or refute their models. For those hoping for a breakthrough in our understanding of the universe, this is definitely something to keep an eye on!

Read More

Quanta Magazine Article on the controversy 

Astrobites Article on Hubble Tension

Astrobites Article on using gravitational lensing to measure the Hubble Constant

The Hubble Hunters Guide