New LHC data has more of the same but could something be in the offing?

Run 2 results from the LHC show that QCD is scale-invariant – in keeping with the Standard Model prediction.

Dijet mass (TeV) v. no. of events. Source: ATLAS/CERN

Looks intimidating, doesn’t it? It’s also very interesting because it contains an important result acquired at the Large Hadron Collider (LHC) this year, a result that could disappoint many physicists.

The LHC reopened earlier this year after receiving multiple performance-boosting upgrades over the preceding 18 months. In its new avatar, the particle-smasher explores nature’s fundamental constituents at the highest energies yet, almost twice as high as in its first run. By Albert Einstein’s mass-energy equivalence (E = mc²), the proton’s mass corresponds to an energy of almost 1 GeV (giga-electron-volt). By comparison, the LHC’s beam energy was 3,500 GeV in its first run and is now 6,500 GeV.
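The arithmetic behind those figures is easy to check. Here’s a quick sketch using standard physical constants (nothing from the experiments themselves):

```python
# Sanity-checking the mass-energy figures above with E = mc^2,
# using CODATA values for the constants.

PROTON_MASS_KG = 1.67262192e-27   # proton rest mass
C = 2.99792458e8                  # speed of light, m/s
EV_PER_JOULE = 1.0 / 1.602176634e-19

energy_j = PROTON_MASS_KG * C ** 2
energy_gev = energy_j * EV_PER_JOULE / 1e9

print(f"proton rest energy ≈ {energy_gev:.3f} GeV")   # ≈ 0.938 GeV

# The beam energies quoted in the text, in units of proton rest energies:
for beam_gev in (3500, 6500):
    print(f"{beam_gev} GeV ≈ {beam_gev / energy_gev:,.0f} proton masses")
```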

At the start of December, it concluded data-taking for 2015. That data is being steadily processed, interpreted and published by the multiple topical collaborations working on the LHC. Two collaborations in particular, ATLAS and CMS, were responsible for plots like the one shown above.

This is CMS’s plot showing the same result:

Source: CMS/CERN

When protons are smashed together at the LHC, a host of particles erupts and flies off in different directions, showing up as streaks in the detectors. These streaks are called jets. The plots above look particularly at pairs of jets produced by quarks, antiquarks or gluons created in the proton-proton collisions (quarks and gluons are in fact the smaller particles that make up protons).

The sequence of black dots in the ATLAS plot shows the number of dijet events (i.e. pairs of jets) observed at different energies. The red line shows the predicted number of events. They both match, which is good… to some extent.

One of the biggest, and certainly among the most annoying, problems in particle physics right now is that the prevailing theory that explains it all is unsatisfactory – mostly because it has some really clunky explanations for some things. The theory is called the Standard Model and physicists would like to see it disproved, broken in some way.

In fact, those physicists will have gone to work today to be proved wrong – and be sad at the end of the day if they weren’t.

Maintenance work underway at the CMS detector, the heaviest of the detectors that straddle the LHC. Credit: CERN

The annoying problem at its heart

The LHC chips in by providing two kinds of opportunities: extremely sensitive particle-detectors that can make precise measurements of fleeting events, and extremely high collision energies that let physicists explore how particles behave across thousands of scenarios in search of a surprising result.

So, the plots above show three things. First, the predicted event-count and the observed event-count match, which is disappointing. Second, the biggest deviation from the predicted count is highlighted in the ATLAS plot (look at the red columns at the bottom, between the two blue lines). It’s small, corresponding to two standard deviations (symbol: σ) from the expectation. Physicists need at least three standard deviations (3σ) for license to be excited.

But this is the most important result (an extension of the first): the predicted event-count and the observed event-count match all the way up to 6,000 GeV. In other words: physicists see no cause for joy, and every reason to revalidate a section of the Standard Model, across a wide swath of scenarios.

The section in particular is called quantum chromodynamics (QCD), which deals with how quarks, antiquarks and gluons interact with each other. As theoretical physicist Matt Strassler explains on his blog,

… from the point of view of the highest energies available [at the LHC], all particles in the Standard Model have almost negligible rest masses. QCD itself is associated with the rest mass scale of the proton, with mass-energy of about 1 GeV, again essentially zero from the TeV point of view. And the structure of the proton is simple and smooth. So QCD’s prediction is this: the physics we are currently probing is essentially scale-invariant.

Scale-invariance is the idea that two particles will interact the same way no matter how energetic they are. To be sure, the ATLAS/CMS results suggest QCD is scale-invariant in the 0-6,000 GeV range. There’s a long way to go – in terms of energy levels and future opportunities.
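To make the idea concrete, here is a toy illustration of my own (not the ATLAS/CMS analysis): a pure power-law spectrum, of the kind QCD predicts for dijet masses in this regime, keeps its shape when every mass is rescaled – only the overall normalisation changes.

```python
# Toy demonstration of scale-invariance: rescaling the mass axis of a
# power-law spectrum multiplies it by a constant, leaving the shape intact.
# The amplitude and index below are illustrative, not fitted values.

def power_law_spectrum(m, amplitude=1.0, index=5.0):
    """Toy dijet-like falling spectrum, dN/dm ~ m^-index."""
    return amplitude * m ** (-index)

masses = [1000, 2000, 3000, 4000, 5000]   # GeV
scale = 2.0

ratios = [power_law_spectrum(scale * m) / power_law_spectrum(m) for m in masses]
# Every ratio is the same constant (2^-5 = 0.03125): doubling all the
# masses just rescales the curve; no feature singles out a special energy.
print(ratios)
```

A new heavy particle would break this: it would plant a bump at one particular mass, a feature that does not survive rescaling.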

Something in the valley

The folks analysing the data are helped along by previous results from the LHC as well. For example, with the collision energy having been ramped up, one would expect particles of higher masses to manifest in the data. However, the heavier the particle, the wider the bump in the plot, and the more focusing that will be needed to really tease out the peak. This is one of the plots that led to the discovery of the Higgs boson:

Source: ATLAS/CERN

That bump between 125 and 130 GeV is what was found to be the Higgs, and you can see it’s more of a smear than a spike. For heavier particles, that smear is going to be wider, with longer tails on the sides. So any particle that weighs a lot – a few thousand GeV – and is expected to be found at the LHC would have a tail showing in the lower-energy LHC data. But no such tails have been found, ruling out heavier candidates.
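The shape of such a smear is often modelled with a Breit-Wigner curve, whose width parameter Γ encodes (via Γ = ħ/τ) how short-lived the resonance is. A minimal sketch, with illustrative numbers rather than a fit to real data:

```python
# Non-relativistic Breit-Wigner: the classic resonance line shape.
# Gamma is the full width at half maximum of the peak.

def breit_wigner(m, m0, gamma):
    """Relative probability of observing invariant mass m for a
    resonance centred at m0 with width gamma (same units as m)."""
    return (gamma / 2) ** 2 / ((m - m0) ** 2 + (gamma / 2) ** 2)

m0, gamma = 125.0, 4.0   # GeV; hypothetical peak position and width

# The curve drops to half its peak height exactly gamma/2 away from m0:
peak = breit_wigner(m0, m0, gamma)
half_point = breit_wigner(m0 + gamma / 2, m0, gamma)
print(peak, half_point)   # 1.0 0.5
```

A larger Γ spreads the same number of events over more bins, which is why wide resonances take more data and more careful focusing to pick out.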

And because many replacement theories for the Standard Model involve the discovery of new particles, analysts will tend to focus on particles that could weigh less than about 2,000 GeV.

In fact that’s what’s riveted the particle physics community at the moment: rumours of a possible new particle in the range 1,900-2,000 GeV. A paper uploaded to the arXiv preprint server on December 10 shows a combination of ATLAS and CMS data logged in 2012, and highlights a deviation from the normal that physicists haven’t been able to explain using information they already have. This is the relevant plot:

Source: arXiv:1512.03371v1

The plots in the middle and on the right are particularly relevant. They each show the probability of occurrence of an event (observed as a bump in the data, not shown here) in which some heavier particle decays into two different final states: a W and a Z boson (WZ), and two Z bosons (ZZ). Bosons are a type of fundamental particle that carries forces.

The middle chart implies that the mysterious event is at least 1,000 times less likely to occur than normal, and the one on the right implies it is at least 10,000 times less likely to occur than normal. And both readings are at more than 3σ significance, so people are excited.

The authors of the paper write: “Out of all benchmark models considered, the combination favours the hypothesis of a [particle or its excitations] with mass 1.9-2.0 [thousands of GeV] … as long as the resonance does not decay exclusively to WW final states.”

But as physicist Tommaso Dorigo points out, these blips could also be a fluctuation in the data, which does happen.

Although the fact that the two experiments see the same effect … is suggestive, that’s no cigar yet. For CMS and ATLAS have studied dozens of different mass distributions, and a bump could have appeared in a thousand places. I believe the bump is just a fluctuation – the best fluctuation we have in CERN data so far, but still a fluke.

There’s a seminar due to happen today at the LHC Physics Centre at CERN where data from the upgraded run is due to be presented. If something really did happen in those ‘valleys’, which were filtered out of a collision energy of 8,000 GeV (basically twice the beam energy, where each beam is a train of protons), then those events would’ve happened in larger quantities during the upgraded run and so been more visible. The results will be presented at 1930 IST. Watch this space.

Featured image: Inside one of the control centres of the collaborations working on the LHC at CERN. Each collaboration handles an experiment, or detector, stationed around the LHC tunnel. Credit: CERN.

All goes well on LHC 2.0’s first day back in action

It finally happened! The particle-smasher known as the Large Hadron Collider is back online after more than two years, during which its various components were upgraded to make it even meaner. A team of scientists and engineers gathered at the collider’s control room at CERN over the weekend – giving up Easter celebrations at home – to revive the giant machine so it could resume feeding its four detectors with high-energy collisions of protons.

Before the particles enter the LHC itself, they are pre-accelerated to 450 GeV by the Super Proton Synchrotron. At 11.53 am (CET), the first beam of pre-accelerated protons was injected into the LHC at Point 2 (see image), starting a clockwise journey. By 11.59 am, it’d been reported crossing Point 3, and at 12.01 pm, it was past Point 5. The anxiety in the control room was palpable when an update was posted in the live-blog: “The LHC operators watching the screen now in anticipation for Beam 1 through sector 5-6”.

Beam 1 going from Point 2 to Point 3 during the second run of the Large Hadron Collider’s first day in action. Credit: CERN

Finally, at 12.12 pm, the beam had crossed Point 6. By 12.27 pm, it had gone full-circle around the LHC’s particle pipeline, signalling that the pathways were defect-free and ready for use. Along the way, whenever the beam snaked through a detector without glitches, some protons were smashed into static targets, producing a so-called splash of particles, like sparks, and groups of scientists erupted in cheers.
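The minutes-long gaps in that log reflect the operators’ cautious, sector-by-sector threading of the beam, not the beam’s speed: a proton at nearly the speed of light laps the roughly 26.7-km ring in well under a millisecond. A quick back-of-the-envelope check:

```python
# How fast does a beam actually circulate? The LHC ring is about
# 26,659 m around, and the protons travel at very nearly light speed.

RING_CIRCUMFERENCE_M = 26_659   # approximate LHC circumference
C = 2.99792458e8                # speed of light, m/s

lap_time_s = RING_CIRCUMFERENCE_M / C
laps_per_second = 1 / lap_time_s

print(f"one lap ≈ {lap_time_s * 1e6:.1f} microseconds")
print(f"≈ {laps_per_second:,.0f} laps per second")
```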

Both Rolf-Dieter Heuer, the CERN Director-General, and Frederick Bordry, Director for Accelerators and Technology, were present in the control room. Earlier in the day, Heuer had announced that another beam of protons – going anti-clockwise – had passed through the LHC pipe without any problems, providing the preliminary announcement that all was well with the experiment. In fact, CERN’s scientists were originally supposed to have run these beam-checks a week ago, when an electrical glitch spotted at the last minute thwarted them.

In its new avatar, the LHC sports almost double the energy it ran at, before it shut down for upgrades in early-2013, as well as more sensitive collision detectors and fresh safety systems. For the details of the upgrades, read this. For an ‘abridged’ version of the upgrades together with what new physics experiments the new LHC will focus on, read this. Finally, here’s to another great year for high-energy physics!

Fabiola Gianotti, the first woman Director-General of CERN

The CERN Council has elected a new Director-General to succeed the incumbent Rolf-Dieter Heuer. Fabiola Gianotti, who served as the ATLAS collaboration’s spokesperson from 2009 to 2013 – a period that included the discovery of the long-sought Higgs boson by the ATLAS and CMS experiments – will be the first woman to hold the position. Her mandate begins from January 2016.

A CERN press release announcing the appointment said the “Council converged rapidly in favor of Dr. Gianotti”, implying it was a quick and unanimous decision.

The Large Hadron Collider (LHC), the mammoth particle smasher that produces the collisions that ATLAS, CMS and two other similar collaborations study, is set to restart in early 2015 after a series of upgrades to increase its energy and luminosity. And so Dr. Gianotti’s term will coincide with a distinct phase of science, this one eager for evidence to help answer deeper questions in particle physics – such as the Higgs boson’s mass, the strong force’s strength and dark matter.

Dr. Gianotti will succeed 15 men who, as Directors-General, have been responsible not simply for coordinating the scientific efforts stemming from CERN but also for guiding research priorities and practices. They have effectively set the various agendas that the world’s preeminent nuclear physics lab has chosen to pursue since its establishment in 1954.

In fact, the title of ‘spokesperson’, which Dr. Gianotti held for the ATLAS collaboration for four years until 2013, is itself deceptively uncomplicated. The spokesperson not only speaks for the collaboration but is also the effective project manager who plays an important role when decisions are made about what measurements to focus on and what questions to answer. When on July 4, 2012, the discovery of a Higgs-boson-like particle was announced, results from the ATLAS particle-detector – and therefore Dr. Gianotti’s affable leadership – were instrumental in getting that far, and in getting Peter Higgs and Francois Englert their 2013 Nobel Prize in physics.

Earlier this year, she had likened her job to “a great scientific adventure”, but “also a great human adventure”, in an interview to CNN. To guide the aspirations and creativity of 3,000 engineers and physicists without attenuation1 of productivity or will must indeed have been so.

That she will be the first woman to become the DG of CERN can’t escape attention either, especially at a time when women’s participation in STEM research seems to be on the decline and sexism in science is being recognized as a prevalent issue. Dr. Gianotti will no doubt make a strong role model for a field that is only 25% women. There will also be much to learn from her past, from the time she chose to become a physicist after learning about Albert Einstein’s quantum explanation of the photoelectric effect. She joined CERN while working toward her PhD from the University of Milan. She was 25, it was 1987 and the W and Z bosons had recently been discovered by the facility’s UA1 and UA2 collaborations. Dr. Gianotti would join the latter.

It was an exciting time to be a physicist as well as exacting. Planning for the LHC would begin in that decade and launch one of the world’s largest scientific collaborations with it. The success of a scientist would start to demand not just research excellence but also a flair for public relations, bureaucratic diplomacy and the acuity necessary to manage public funds in the billions from different countries. Dr. Gianotti would go on to wear all these hats even as she started work in calorimetry at the LHC in 1990, on the ATLAS detector in 1992, and on the search for supersymmetric (‘new physics’) particles in 1996.

Her admiration for the humanities has been known to play its part in shaping her thoughts about the universe at its most granular. She has a professional music diploma from the Milan Conservatory and often unwinds at the end of a long day with a session on the piano. Her fifth-floor home in Geneva sometimes affords her a view of Mont Blanc, and she often enjoys long walks in the mountains. In the same interview, given to Financial Times in 2013, she adds,

There are many links between physics and art. For me, physics and nature have very nice foundations from an aesthetic point of view, and at the same time art is based on physics and mathematical principle. If you build a nice building, you have to build it with some criteria because otherwise it collapses.2

Her success in leading the ATLAS collaboration, and becoming the veritable face of the hunt for the Higgs boson, have catapulted her to being the next DG of CERN. At the same time, it must feel reassuring3 that as physicists embark on a new era of research that requires just as much ingenuity in formulating new ideas as in testing them, an era “where logic based on past theories does not guide us”4, Fabiola Gianotti’s research excellence, administrative astuteness and creative intuition is now there to guide them.

Good luck, Dr. Gianotti!


1Recommended read: Who really found the Higgs boson? The real genius in the Nobel Prize-winning discovery is not who you think it is. Nautilus, Issue 18.

2I must mention that it’s weird that someone with such strong aesthetic foundations used Comic Sans MS as the font of choice for her presentation at the CERN seminar in 2012 that announced the discovery of a Higgs-like boson. It was probably the beginning of Comic Sans’s comeback.

3Though I am no physicist.

4In the words of Academy Award-winning film editor Walter S. Murch.

Featured image credit: Claudia Marcelloni/CERN

Restarting the LHC: A timeline

CERN has announced the restart schedule of its flagship science “project”, the Large Hadron Collider, that will see the giant machine return online in early 2015. I’d written about the upgrades that could be expected shortly before it shut down in 2012. They range from new pixel sensors and safety systems to facilities that will double the collider’s energy and the detectors’ eyes for tracking collisions. Here’s a little timeline I made with Timeline.js, check it out.

(It’s at times like this that I really wish WP.com would let bloggers embed iframes in posts.)

Putting particle physics research to work

In the whole gamut of comments regarding the Higgs boson, there is a depressingly large number decrying the efforts of the ATLAS and CMS collaborations. Why? Because a lot of people think the Large Hadron Collider (LHC) is a yawning waste of time and money, an investment that serves mankind no practical purpose.

Well, here and here are some cases in point that demonstrate the practical good that the LHC has made possible in the material sciences. Another big area of application is in medical diagnostics: making the point is one article about hunting for the origin of Alzheimer’s, and another about the very similar technology used in particle accelerators and medical imaging devices, meteorology, VLSI, large-scale networking, cryogenics, and X-ray spectroscopy.

Moving on to more germane applications: arXiv has reams of papers that discuss the deployment of

… amongst others.

The LHC, above all else, is the brainchild of the European Organization for Nuclear Research, popularly known as CERN. These guys invented the World Wide Web, developed some of the first touch-screen devices, and pioneered the earliest high-energy medical imaging techniques.

With experiments like those being conducted at the LHC, it’s easy to overlook every development in such laboratories apart from the discovery of much-celebrated particles. All the applications I’ve linked to in this post were conceived by scientists working with the LHC, if only to argue that everyone, from the man whose tax money pays for these giant labs to the man who uses that money to work in them, is mindful of practical concerns.

Gunning for the goddamned: ATLAS results explained

Here are some of the photos from the CERN webcast yesterday (July 4, Wednesday), with an adjoining explanation of the data presented in each one and what it signifies.

This first image shows the data accumulated post-analysis of the diphoton decay mode of the Higgs boson. In simpler terms, physicists first put together all the data they had that resulted from previously known processes. This constituted what’s called the background. Then, they looked for signs of any particle that seemed to decay into two energetic photons, or gamma rays, in a specific energy window; in this case, 100-160 GeV.

Finally, knowing how the number of events would vary in a scenario without the Higgs boson, a curve was plotted that fit the data perfectly: the number of events at each energy level v. the energy level at which it was tracked. This way, a bump in the curve during measurement would mean there was a particle previously unaccounted for that was causing an excess of diphoton decay events at a particular energy.
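That procedure can be caricatured in a few lines. This is my own toy sketch, not the ATLAS analysis: model the background with a smooth falling curve, then flag any bin where the observed count exceeds the expectation by much more than the usual Poisson fluctuation.

```python
import math

def expected_background(energy_gev, norm=1e9, index=4.0):
    """Toy smoothly-falling background: expected events per bin.
    The normalisation and power-law index are illustrative only."""
    return norm * energy_gev ** (-index)

# Fake 'observed' counts in the 100-160 GeV window: pure background
# everywhere, plus an injected bump of 200 extra events at 125 GeV.
observed = {e: round(expected_background(e)) for e in range(100, 161, 5)}
observed[125] += 200

# 'Pull': the excess over background, in units of the rough Poisson
# fluctuation sqrt(background).
pulls = {e: (n - expected_background(e)) / math.sqrt(expected_background(e))
         for e, n in observed.items()}

for e, pull in pulls.items():
    if pull > 3:
        print(f"excess at {e} GeV, local significance ≈ {pull:.0f} sigma")
```

In the real analysis the background shape is itself fitted to data, and the significance calculation accounts for systematic uncertainties and the look-elsewhere effect, but the logic is the same: a bump is an excess the smooth curve can’t explain.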

This is the plot of the mass of the particle being looked for (x-axis) versus the confidence level with which it has (or has not, depending on how you look at it) been excluded as an event to focus on. The dotted horizontal line, corresponding to a signal strength μ of 1, marks off a 95% exclusion limit: any events registering above the line can be claimed as having been observed with “more than 95% confidence” (colloquial usage).

Toward the top-right corner of the image are some numbers. 7 TeV and 8 TeV are the values of the total energy going into each collision before and after March, 2012, respectively. The beam energy was driven up to increase the incidence of decay events corresponding to Higgs-boson-like particles, which, given the extremely high energy at which they exist, are viciously short-lived. In experiments that were run between March and July, physicists at CERN reported an increase of almost 25-30% of such events.

The two other numbers indicate the particle accelerator’s integrated luminosity. In particle physics, luminosity measures the number of particles passing through a unit area per second. The integrated luminosity is the same value accumulated over a period of time. In the case of the LHC, after the collision energy was ramped up, the luminosity, too, had to be increased: from about 4.7 fb-1 to 5.8 fb-1. You’ll want to Wiki the unit of area called the barn. Some lighthearted physics talk there.
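Why integrated luminosity matters: the expected number of events for a process is its cross-section times the integrated luminosity, and the units are chosen so that femtobarns (fb) times inverse femtobarns (fb-1) gives a plain count. The 50 fb figure below is a ballpark illustrative value, not an official one:

```python
# Expected event count = cross-section x integrated luminosity.
# With the cross-section in fb and the luminosity in fb^-1, the
# units cancel and the result is a number of events.

def expected_events(cross_section_fb, integrated_lumi_fb_inv):
    return cross_section_fb * integrated_lumi_fb_inv

# e.g. a process with a 50 fb effective cross-section (illustrative),
# at the two integrated luminosities quoted in the text:
for lumi in (4.7, 5.8):
    print(f"{lumi} fb^-1 -> {expected_events(50, lumi):.0f} expected events")
```

This is why the collaborations chase luminosity as doggedly as energy: a rare process only becomes visible once enough collisions have been integrated.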

In this plot, the y-axis on the left shows the chances of error, and the corresponding statistical significance on the right. When the chances of an error stand at 1, the results are not statistically significant at all because every observation is an error! But wait a minute, does that make sense? How can all results be errors? Well, when looking for one particular type of event, any event that is not this event is an error.

Thus, as we move toward the ~125 GeV mark, the statistical significance of the results shoots up drastically. Looking closer, we see two results registered just beyond the 5-sigma mark, where the chances of error are 1 in 3.5 million. This means that if the physicists recreated the conditions that produced this >5σ (five-sigma) observation 3.5 million times, only once would a random fluctuation play impostor.
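The “1 in 3.5 million” figure is just the one-sided tail probability of a Gaussian beyond five standard deviations, which you can check with nothing but the standard library:

```python
import math

def one_sided_p(n_sigma):
    """P(fluctuation >= n_sigma) for a standard normal distribution,
    via the complementary error function."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

for n in (3, 5):
    p = one_sided_p(n)
    print(f"{n} sigma -> p = {p:.2e} (about 1 in {1 / p:,.0f})")
```

Run it and the 5σ line comes out to about 2.9 × 10⁻⁷, i.e. roughly 1 in 3.5 million, while 3σ (the “evidence” threshold) is only about 1 in 740.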

Also, notice how the difference between successive levels of statistical significance increases with increasing significance? In terms of the chances of error: 5σ – 4σ > 4σ – 3σ > … > 1σ – 0σ. This means that the closer physicists get to a discovery, the exponentially more precise they must be!

OK, this is a graph showing the mass-distribution for the four-lepton decay mode, referred to as a channel by those working on the ATLAS and CMS collaborations (because there are separate channels of data-taking for each decay mode). The plotting parameters are the same as in the first plot in this post, except for the scale of the x-axis, which goes all the way from 0 to 250 GeV. Now, between 120 GeV and 130 GeV, there is an excess of events (light blue). Physicists know it is an excess, and not on par with expectations, because theoretical calculations made after discounting a Higgs-boson-like decay event show that, in that 10 GeV window, only around 5.3 events are to be expected, as opposed to the 13 that turned up.
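How unlikely is seeing 13 events when only 5.3 are expected? Counts like these follow a Poisson distribution, so a quick tail calculation gives a feel for it (illustrative only; the real analysis folds in uncertainties on the background itself):

```python
import math

def poisson_pmf(k, mu):
    """Probability of observing exactly k events when mu are expected."""
    return math.exp(-mu) * mu ** k / math.factorial(k)

def poisson_tail(k_obs, mu):
    """P(X >= k_obs) for X ~ Poisson(mu): the chance of seeing at
    least k_obs events from background fluctuations alone."""
    return 1.0 - sum(poisson_pmf(k, mu) for k in range(k_obs))

p = poisson_tail(13, 5.3)
print(f"P(>= 13 events | expecting 5.3) = {p:.4f}")
```

The answer comes out to a fraction of a percent: a fluctuation this large is rare, which is exactly why the light-blue excess drew so much attention.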