I was randomly rewatching The Big Bang Theory on Netflix today when I spotted this gem:
Okay, maybe less a gem and more a shiny stone, but still. The screenshot, taken from the third episode of the sixth season, shows Sheldon Cooper mansplaining to Penny the work of Peter Higgs, whose name is most famously attached to the scalar boson whose discovery the Large Hadron Collider collaborations announced to great fanfare in 2012.
My fascination pertains to Sheldon’s description of Higgs as an “accomplished self-promoter”. Higgs, in real life, is extremely reclusive and self-effacing, and journalists have found him notoriously hard to catch for an interview, or even a quote. His fellow discoverers of the Higgs boson, including François Englert, the Belgian physicist with whom Higgs won the Nobel Prize for physics in 2013, have been much less media-shy. Higgs has even been known to suggest that a mechanism in particle physics involving the Higgs boson should really be called the ABEGHHK’tH mechanism, including the names of everyone who hit upon its theoretical idea in the 1960s (Philip Warren Anderson, Robert Brout, Englert, Gerald Guralnik, C.R. Hagen, Higgs, Tom Kibble and Gerardus ‘t Hooft), instead of just the Higgs mechanism.
No doubt Sheldon thinks Higgs did right by choosing not to appear in interviews for the public or to write articles in the press himself, considering such extreme self-effacement is also Sheldon’s modus operandi of choice. At the same time, Higgs might have lucked out and been recognised for work he conducted 50 years prior probably because he’s white and from an affluent country, both of which attributes nearly guarantee fewer – if any – systemic barriers to international success. Self-promotion is an important part of the modern scientific endeavour, as it is of most modern endeavours, even if one is an accomplished scientist.
All this said, it is notable that Higgs was also a conscientious person. When he was awarded the Wolf Prize in 2004 – a prestigious award in the field of physics – he refused to receive it in person in Jerusalem because it was a state function and he was protesting Israel’s war against Palestine. He was a member of the Campaign for Nuclear Disarmament until the group extended its opposition to nuclear power as well; then he resigned. He also stopped supporting Greenpeace after it became opposed to genetic modification. If it is for these actions that Sheldon deemed Higgs an “accomplished self-promoter”, then I stand corrected.
Featured image: A portrait of Peter Higgs by Lucinda Mackay hanging at the James Clerk Maxwell Foundation, Edinburgh. Caption and credit: FF-UK/Wikimedia Commons, CC BY-SA 4.0.
If your job requires you to pore through a dozen or two scientific papers every month – as mine does – you’ll start to notice a few every now and then couching a somewhat well-known fact in study-speak. I don’t mean scientific-speak, largely because there’s nothing wrong with trying to understand natural phenomena in the formalised language of science. However, there seems to be something iffy – often with humorous effect – about a statement like the following: “cutting emissions of ozone-forming gases offers a ‘unique opportunity’ to create a ‘natural climate solution'”1 (source). Well… d’uh. This is study-speak: rephrasing mostly self-evident knowledge or truisms in unnecessarily formalised language, not infrequently in the style employed in research papers, without adding any new information but often including an element of doubt where there is likely to be none.
1. Caveat: These words were copied from a press release, so this could have been a case of the person composing the release being unaware of the study’s real significance. However, the words within single quotes are copied from the corresponding paper itself. That said, there have been some truly hilarious efforts to make sense of the obvious. For examples, consider many of the winners of the Ig Nobel Prizes.
Of course, it always pays to be cautious, but where do you draw the line between genuine caution and a scientific result that exists simply because one is required before a new course of action can begin? For example, the Univ. of Exeter study, the press release accompanying which discussed the effect of “ozone-forming gases” on the climate, recommends cutting emissions of substances that combine in the lower atmosphere to form ozone, a compound form of oxygen that is harmful to both humans and plants. But this is as non-“unique” an idea as the corresponding solution that arises (of letting plants live better) is “natural”.
However, it’s possible the study’s authors needed to quantify these emissions to understand the extent to which ambient ozone concentration interferes with our climatic goals, and to use their data to inform the design and implementation of corresponding interventions. Such outcomes aren’t always obvious but they are there – often because the necessarily incremental nature of most scientific research can cut both ways. The pursuit of the obvious isn’t always as straightforward as one might believe.
The Univ. of Exeter group may have accumulated sufficient and sufficiently significant evidence to support their conclusion, allowing themselves as well as others to build towards newer, and hopefully more novel, ideas. A ladder must have rungs at the bottom irrespective of how tall it is. But when the incremental sword cuts the other way, often due to perverse incentives that require scientists to publish as many papers as possible to secure professional success, things can get pretty nasty.
For example, the Cornell University consumer behaviour researcher Brian Wansink was known to advise his students to “slice” the data obtained from a few experiments in as many different ways as possible in search of interesting patterns. Many of the papers he published were later found to contain numerous irreproducible conclusions – i.e. Wansink had searched so hard for patterns that he’d found quite a few even when they really weren’t there. As the British economist Ronald Coase said, “If you torture the data long enough, it will confess to anything.”
The dark side of incremental research, and the virtue of incremental research done right, both stem from the fact that it is deceptively difficult to ascertain the truth of a finding when its expected effect is either so small that it tests the very notion of significance, or so large – so pronounced – that it transcends intuitive comprehension.
For an example of the former: among particle physicists, a result qualifies as ‘fact’ only if the chance of its being a fluke is smaller than 1 in 3.5 million. So the Large Hadron Collider (LHC), which was built to discover the Higgs boson, had to perform a stupendous number of proton-proton collisions capable of producing a Higgs boson – collisions its detectors could observe and its computers could analyse – to attain this significance.
But while protons are available abundantly – the LHC circulates some 645.8 trillion of them at a time – imagine undertaking an experiment that requires human participants to perform actions according to certain protocols. It’s never going to be possible to enrol billions of them for millions of hours to arrive at a rock-solid result. In such cases, researchers design experiments based on very specific questions, such that the experimental protocols suppress, or even eliminate, interference, sources of doubt and confounding variables, and accentuate the effects of whatever action, decision or influence is being evaluated.
Such experiments often also require the use of sophisticated – but nonetheless well-understood – statistical methods to further eliminate the effects of undesirable phenomena from the data and, to the extent possible, leave behind information of good-enough quality to support or reject the hypotheses. In the course of navigating this winding path from observation to discovery, researchers are susceptible to, say, misapplying a technique, overlooking a confounder or – like Wansink – overanalysing the data so much that a weak effect masquerades as a strong one but only because it’s been submerged in a sea of even weaker effects.
Similar problems arise in experiments that require the use of models based on very large datasets, where researchers need to determine the relative contribution of each of thousands of causes on a given effect. The Univ. of Exeter study that determined ozone concentration in the lower atmosphere due to surface sources of different gases contains an example. The authors write in their paper (emphasis added):
We have provided the first assessment of the quantitative benefits to global and regional land ecosystem health from halving air pollutant emissions in the major source sectors. … Future large-scale changes in land cover [such as] conversion of forests to crops and/or afforestation, would alter the results. While we provide an evaluation of uncertainty based on the low and high ozone sensitivity parameters, there are several other uncertainties in the ozone damage model when applied at large-scale. More observations across a wider range of ozone concentrations and plant species are needed to improve the robustness of the results.
In effect, their data could be modified in future to reflect new information and/or methods, but in the meantime, and far from being a silly attempt at translating a claim into jargon-laden language, the study eliminates doubt to the extent possible with existing data and modelling techniques to ascertain something. And even in cases where this something is well known or already well understood, the validation of its existence could also serve to validate the methods the researchers employed to (re)discover it and – as mentioned before – generate data that is more likely to motivate political action than, say, demands from non-experts.
In fact, the American mathematician Marc Abrahams, known much more for founding and awarding the Ig Nobel Prizes, identified this purpose of research as one of three possible reasons why people might try to “quantify the obvious” (source). The other two are being unaware of the obvious and, of course, to disprove the obvious.
Clockwork theory has been revived and reformulated by scientists from CERN to solve a difficult problem at the heart of particle physics.
It has come under scrutiny at various times by multiple prominent physicists and thinkers, but it’s not hard to see why, when the idea of ‘grand unification’ was first set out, it seemed plausible to so many. The first time it was seriously considered was about four decades ago, shortly after physicists had realised that two of the four fundamental forces of nature were in fact a single unified force if you ramped up the energy at which they acted (electromagnetic + weak = electroweak). The thought that followed was simply logical: what if, at some extremely high energy (like that of the Big Bang), all four forces unified into one? This was 1974.
There has been no direct evidence of such grand unification yet. Physicists don’t know how the electroweak force will unify with the strong nuclear force – let alone gravity, a problem that actually birthed one of the most powerful mathematical tools in an attempt to solve it. Nonetheless, they think they know the energy at which such grand unification should occur if it does: the Planck scale, around 10¹⁹ GeV. This is about as much energy as is contained in a full tank of petrol, but it’s stupefyingly large when you have to accommodate all of it in a particle that’s 10⁻¹⁵ metres wide.
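The conversion from the Planck scale to everyday units can be checked with a quick back-of-the-envelope calculation; the petrol comparison below assumes an energy density of roughly 34 MJ per litre, which is an approximation:

```python
import math

# Fundamental constants (SI units)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s
G = 6.67430e-11         # gravitational constant, m^3/(kg*s^2)

# Planck energy: E_P = sqrt(hbar * c^5 / G)
E_planck_J = math.sqrt(hbar * c**5 / G)       # ~2 gigajoules
E_planck_GeV = E_planck_J / 1.602176634e-10   # 1 GeV = 1.602e-10 J

# Assumed energy density of petrol: ~34 MJ per litre
litres = E_planck_J / 34e6  # roughly a car's tankful

print(f"Planck energy ~ {E_planck_GeV:.2e} GeV ~ {E_planck_J/1e9:.2f} GJ")
print(f"Equivalent petrol: ~{litres:.0f} litres")
```

So the Planck energy is only about two gigajoules – mundane at human scales, absurd when packed into a single subatomic particle.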
This is where particle accelerators come in. The most powerful of them, the Large Hadron Collider (LHC), uses powerful electric fields to accelerate protons to close to light-speed – and magnetic fields to keep them on track – until their energy approaches about 7,000 GeV. But the Planck energy is still about a million billion times higher, which means it’s not something we might ever be able to attain on Earth. Nonetheless, physicists’ theories show that that’s where all of our physical laws should be created, where the commandments by which all that exists does should be written.
… Or is it?
There are many outstanding problems in particle physics, and physicists are desperate for a solution. They have to find something wrong with what they’ve already done, something new or a way to reinterpret what they already know. The clockwork theory is of the third kind – and its reinterpretation begins by asking physicists to dump the idea that new physics is born only at the Planck scale. So, for example, it suggests that the effects of quantum gravity (a quantum-mechanical description of gravity) needn’t necessarily become apparent only at the Planck scale but at a lower energy itself. But even if it then goes on to solve some problems, the theory threatens to present a new one. Consider: If it’s true that new physics isn’t born at the highest energy possible, then wouldn’t the choice of any energy lower than that just be arbitrary? And if nothing else, nature is not arbitrary.
To its credit, clockwork sidesteps this issue by simply not trying to find ‘special’ energies at which ‘important’ things happen. Its basic premise is that the forces of nature are like a set of interlocking gears moving against each other, transmitting energy – rather potential – from one wheel to the next, magnifying or diminishing the way fundamental particles behave in different contexts. Its supporters at CERN and elsewhere think it can be used to explain some annoying gaps between theory and experiment in particle physics, particularly the naturalness problem.
Before the Higgs boson was discovered, physicists predicted, based on the properties of other particles and forces, that its mass would be very high. But when the boson’s discovery was confirmed at CERN in March 2013, its mass implied that the universe would have to be “the size of a football” – which is clearly not the case. So why is the Higgs boson’s mass so low, so unnaturally low? Scientists have put forward many new theories that try to solve this problem, but their solutions often require the existence of other, hitherto undiscovered particles.
Clockwork’s solution is a way in which the Higgs boson’s interaction with gravity – rather, gravity’s associated energy – is mediated by a string of effects described in quantum field theory that tamp down the boson’s mass. In technical parlance, the boson’s mass becomes ‘screened’. An explanation that is both physical and accurate is hard to draw up because of various abstractions. So, as Université libre de Bruxelles physicist Daniele Teresi suggests, imagine this series: Χ = 0.5 × 0.5 × 0.5 × 0.5 × … × 0.5. Even if each step reduces Χ’s value by only a half, it is already an eighth after three steps; after four, a sixteenth. The effect quickly becomes drastic because it’s exponential.
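Teresi’s series can be sketched as a toy calculation – purely an illustration of exponential suppression, not the actual clockwork computation:

```python
# Toy illustration: repeated suppression by a fixed factor at each
# 'gear' shrinks a quantity exponentially, the way the boson's mass
# is 'screened' step by step.
def screened(x0, factor, steps):
    for _ in range(steps):
        x0 *= factor
    return x0

print(screened(1.0, 0.5, 3))   # an eighth
print(screened(1.0, 0.5, 4))   # a sixteenth
print(screened(1.0, 0.5, 20))  # under one-millionth after just 20 steps
```

Twenty modest halvings already suppress the starting value by a factor of a million – which is the point: no single step needs to be special.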
And the theory provides a mathematical toolbox that allows for all this to be achieved without the addition of new particles. This is advantageous because it makes clockwork relatively more elegant than another theory that seeks to solve the naturalness problem, called supersymmetry, SUSY for short. Physicists also like SUSY because it allows for a large energy hierarchy: a distribution of particles and processes at energies between electroweak unification and grand unification, instead of leaving the region bizarrely devoid of action like the Standard Model does. But then SUSY predicts the existence of 17 new particles, none of which have been detected yet.
Even more, as Matthew McCullough, one of clockwork’s developers, showed at an ongoing conference in Italy, its solutions for a stationary particle in four dimensions exhibit conceptual similarities to Maxwell’s equations for an electromagnetic wave in a conductor. The existence of such analogues is reassuring because it recalls nature’s tendency to be guided by common principles in diverse contexts.
This isn’t to say clockwork theory is it. As physicist Ben Allanach has written, it is a “new toy” and physicists are still playing with it to solve different problems. Just that in the event that it has an answer to the naturalness problem – as well as to the question why dark matter doesn’t decay, e.g. – it is notable. But is this enough: to say that clockwork theory mops up the math cleanly in a bunch of problems? How do we make sure that this is how nature works?
McCullough thinks there’s one way, using the LHC. Very simplistically: clockwork theory induces fluctuations in the probabilities with which pairs of high-energy photons are created at some energies at the LHC. These should be visible as wavy squiggles in a plot with energy on the x-axis and events on the y-axis. If these plots can be obtained and analysed, and the results agree with clockwork’s predictions, then we will have confirmed what McCullough calls an “irreducible prediction of clockwork gravity” – for the case where the theory is used to solve the naturalness problem.
To recap: No free parameters (i.e. no new particles), conceptual elegance and familiarity, and finally a concrete and unique prediction. No wonder Allanach thinks clockwork theory inhabits fertile ground. On the other hand, SUSY’s prospects have been bleak since at least 2013 (if not earlier) – and it is one of the more favoured theories among physicists to explain physics beyond the Standard Model, physics we haven’t observed yet but generally believe exists. At the same time, and it bears reiterating, clockwork theory will also have to face down a host of challenges before it can be declared a definitive success. Tik tok tik tok tik tok…
Four years of the Higgs boson, live-tweeting and timezones, new music, and quickly reviewing an Erikson book.
Four years of the Higgs boson
Missed this, didn’t I. On July 4, 2012, physicists at CERN announced that the Large Hadron Collider had found a Higgs-boson-like particle. Though the confirmation would only come in March 2013 (that it was the Higgs boson and not any other particle), July 4 is the celebrated date. I don’t exactly mark the occasion every year except to recap whatever’s been happening in particle physics. And this year: everyone’s still looking for supersymmetry; there was widespread excitement about a possible new fundamental particle weighing about 750 GeV, but since data-taking began at the LHC in late May, strong rumours from within CERN have it that such a particle probably doesn’t exist (i.e. it’s vanishing in the new data-sets). Pity. The favoured way to anticipate what might come to be well before the final announcements are made in August is to keep an eye out for conference announcements in mid-July. If they’re made, it’s a strong giveaway that something’s been found.
Live-tweeting and timezones
I’ve a shitty internet connection at home in Delhi, which means I couldn’t get to see the live-stream NASA put out of its control room or whatever as Juno executed its orbital insertion manoeuvre this morning. Fortunately, Twitter came to the rescue; NASA’s social media team had done such a great job of hyping up the insertion (deservingly so) that it seemed as if all the 480 accounts I followed were tweeting about it. I don’t believe I missed anything at all, except perhaps the sounds of applause. Twitter’s awesome that way, and I’ll say that even if it means I’m stating the obvious. One thing did strike me: all times (of the various events in the timeline) were published in UTC and EDT. This makes sense because converting from UTC to a local timezone is easy (IST = UTC + 5.30) while EDT corresponds to the US east coast. However, the thing about IST being UTC + 5.30 isn’t immediately apparent to everyone (at least not to me), and every so often I wish an account tweeting from India, such as a news agency’s, would use IST. I do it every time.
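For what it’s worth, the UTC-to-IST arithmetic is a one-liner with Python’s standard library; the timestamp below is only a stand-in, not Juno’s actual insertion time:

```python
from datetime import datetime, timezone, timedelta

# IST is a fixed offset of UTC + 5 hours 30 minutes
IST = timezone(timedelta(hours=5, minutes=30), "IST")

# A hypothetical event time published in UTC
event_utc = datetime(2016, 7, 5, 2, 30, tzinfo=timezone.utc)
event_ist = event_utc.astimezone(IST)

print(event_ist.strftime("%Y-%m-%d %H:%M %Z"))  # 2016-07-05 08:00 IST
```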
I don’t know why I hadn’t found Yat-kha earlier considering I listen to Huun Huur Tu so much, and Yat-kha is almost always among the recommendations (all bands specialising in throat-singing). And while Huun Huur Tu likes to keep their music traditional and true to its original compositional style, Yat-kha takes it a step further, blending its sound with rock, and this tastes much better to me. With a voice like Albert Kuvezin’s, keeping things traditional can be a little disappointing – you can hear why in the song above. It’s called Kaa-khem; the same song by Huun Huur Tu is called Mezhegei. Bass evokes megalomania in me, and it’s all the more sensual when its rendition is accomplished with human voice, rising and falling. Another example of what I’m talking about is called Yenisei punk. Finally, this is where I’d suggest you stop if you’re looking for throat-singing made to sound more belligerent: I stumbled upon War horse by Tengger Cavalry, classified as nomadic folk metal. It’s terrible.
Fall of Light, a part 2
In fantasy trilogies, the first part benefits from establishing the premise and the third, from the denouement. If the second part has to benefit from anything at all, then it is the story itself, not the intensity of the stakes within its narrative. At least, that’s my takeaway from Fall of Light, the second book of Steven Erikson’s Kharkanas trilogy. Its predecessor, Forge of Darkness, established the kingdom of Kurald Galain and the various forces that shape its peoples and policies. Because the trilogy has been described as being a prequel (note: not the prequel) to Erikson’s epic Malazan Book of the Fallen series, and because of what we know about Kurald Galain in the series, the last book of the trilogy has its work cut out for it. But in the meantime, Fall of Light was an unexpectedly monotonous affair – and that was awesome. As a friend of mine has been wont to describe the Malazan series: Erikson is a master of raising the stakes. He does that in all of his books (including the Korbal Broach short-stories) and he does it really well. However, Fall of Light rode with the stakes as they were laid down at the end of the first book, through a plot that maintained the tension at all times. It’s neither eager to shed its burden nor is it eager to take on new ones. If you’ve read the Malazan series, I’d say he’s written another Deadhouse Gates, but better.
Although the data’s statistical significance isn’t as good as it would have to be for there to be a new ‘champagne bottle boson’ moment, it’s encouraging that the data itself isn’t vanishing.
On December 15 last year, scientists working with the Large Hadron Collider experiment announced that they had found slight whispers of a possible new fundamental particle, and got the entire particle physics community excited. There was good reason: should such a particle’s existence become verified, it would provide physicists some crucial headway in answering questions about the universe that our current knowledge of physics has been remarkably unable to cope with. And on March 17, members of the teams that made the detection presented more details as well as some preliminary analyses at a conference, held every year, in La Thuile, Italy.
The verdict: the case for the hypothesised particle’s existence has got a tad bit stronger. Physicists still don’t know what it could be or if it won’t reveal itself to have been a fluke measurement once more data trickles in by summer this year. At the same time, the bump in the data persists in two sets of measurements logged by two detectors and at different times. In December, the ATLAS detector had presented a stronger case – i.e., a more reliable measurement – than the CMS detector; at La Thuile on March 17, the CMS team also came through with promising numbers.
Because of the stochastic nature of particle physics, the reliability of results is encapsulated by their statistical significance, denoted by σ (sigma). So 3σ would mean the measurements possess about a 1-in-740 chance of being a fluke and marks the threshold for considering the readings as evidence. And 5σ would mean the measurements possess about a 1-in-3.5-million chance of being a fluke and marks the threshold for claiming a discovery. Additionally, tags called ‘local’ and ‘global’ refer to whether the significance is for a bump exactly at 750 GeV or anywhere in the plot at all.
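These thresholds follow from the tail probabilities of a Gaussian distribution. A minimal sketch, assuming the one-tailed convention particle physicists use for an excess:

```python
import math

def fluke_probability(sigma):
    """One-tailed p-value for an excess at a given significance,
    assuming Gaussian statistics."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

for s in (3, 5):
    p = fluke_probability(s)
    print(f"{s} sigma -> p = {p:.2e} (about 1 in {1/p:,.0f})")
# 3 sigma -> about 1 in 740 (evidence)
# 5 sigma -> about 1 in 3.5 million (discovery)
```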
And right now, particle physicists have this scoreboard, as compiled by Alessandro Strumia, an associate professor of physics at Pisa University, who presented it at the conference:
Pauline Gagnon, a senior research scientist at CERN, explained on her blog, “Two hypotheses were tested, assuming different characteristics for the hypothetical new particle: the ‘spin 0’ case corresponds to a new type of Higgs boson, while ‘spin 2’ denotes a graviton.” A graviton is a speculative particle carrying the force of gravity. The – rather, a – Higgs boson was discovered at the LHC in July 2012 and verified in March 2013. This was during the collider’s first run, when it accelerated two beams of protons to 4 TeV (1,000 GeV = 1 TeV) each and then smashed them together. The second run kicked off, following upgrades to the collider and detectors during 2014, with a beam energy of 6.5 TeV.
Although none of the significances are as good as they’d have to be for there to be a new ‘champagne bottle boson’ moment (alternatively: another summertime hit), it’s encouraging that the data behind them has shown up over multiple data-taking periods and isn’t failing repeated scrutiny. More presentations by physicists from ATLAS and CMS at the conference, which concludes on March 19, are expected to provide clues about other anomalous bumps in the data that could be related to the one at 750 GeV. If theoretical physicists have such connections to make, their ability to zero in on what could be producing the excess photons becomes much better.
But even more than new analyses gleaned from old data, physicists will be looking forward to the LHC waking up from its siesta in the first week of May, and producing results that could become available as early as June. Should the data still continue to hold up – and the 5σ local significance barrier be breached – then physicists will have just what they need to start a new chapter in the study of fundamental physics just as the previous one was closed by the Higgs boson’s discovery in 2012.
For reasons both technical and otherwise, such a chapter has its work already cut out. The Standard Model of particle physics, a theory unifying the behaviours of different species of particles and which requires the Higgs boson’s existence, is flawed despite its many successes. Therefore, physicists have been, and are, looking for ways to ‘break’ the model by finding something it doesn’t have room for. Both the graviton and another Higgs boson are such things although there are other contenders as well.
For many years, one of the world’s most powerful scopes, as in a microscope, was the Hadron-Elektron-Ringanlage (HERA) particle accelerator in Germany. Where scopes bounce electromagnetic radiation – like visible light – off surfaces to reveal information hidden to the naked eye, accelerators reveal hidden information by bombarding the target with energetic particles. At HERA, those particles were electrons accelerated to 27.5 GeV. At this energy, the particles can probe a distance of a few hundredths of a femtometre (earlier called fermi) – 2.5 million times better than the 0.1 nanometres that atomic force microscopy can achieve (of course, they’re used for different purposes).
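As a quick arithmetic check on those two figures – taking “a few hundredths of a femtometre” to be 0.04 fm, an assumption for illustration:

```python
# Compare the two quoted resolutions, both converted to metres.
hera_resolution = 0.04e-15   # 'a few hundredths of a femtometre'
afm_resolution = 0.1e-9      # atomic force microscopy, ~0.1 nanometres

improvement = afm_resolution / hera_resolution
print(f"{improvement:.1e}")  # 2.5e+06 -> 2.5 million times finer
```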
The electrons were then collided head on against protons accelerated to 920 GeV.
Unlike protons, electrons aren’t made up of smaller particles and are considered elementary. Moreover, protons are approx. 2,000-times heavier than electrons. As a result, the high-energy collision is more an electron scattering off of a proton, but the way it happens is that the electron imparts some energy to the proton before scattering off (this is imagined as an electron emitting some energy as a photon, which is then absorbed by the proton). This is called deep inelastic scattering: ‘deep’ for high-energy; ‘inelastic’ because the proton absorbs some energy.
One of the most famous deep-inelastic scattering experiments was conducted in 1968 at the Stanford Linear Accelerator Centre. Then, the perturbed protons were observed to ‘emit’ other particles – essentially hitherto undetected constituent particles that escaped the protons and formed other kinds of composite particles. The constituent particles were initially dubbed partons but later found to be quarks, anti-quarks (their matter/anti-matter counterparts) and gluons (the force-particles that held the quarks/anti-quarks together).
HERA was shut in June 2007. Five years later, the plans for a successor at least 100-times more sensitive than HERA were presented – in the form of the Large Hadron-electron Collider (LHeC). As the name indicates, it is proposed to be built adjoining the Large Hadron Collider (LHC) complex at CERN by 2025 – a timeframe based on when the high-luminosity phase of the LHC is set to begin (2024).
On December 15, physicists working on the LHC had announced new results obtained from the collider – two in particular stood out. One was a cause for great, yet possibly premature, excitement: a hint of a yet unknown particle weighing around 747 GeV. The other was cause for a bit of dismay: quantum chromodynamics (QCD), the theory that deals with the physics of quarks, anti-quarks and gluons, seemed flawless across a swath of energies. Some physicists were hoping it wouldn’t be so (because its flawlessness has come at the cost of being unable to explain some discoveries, like dark matter). Over the next decade, the LHC will push the energy frontier further to see – among other things – if QCD ‘breaks’, becoming unable to explain a possible new phenomenon.
Against this background, the LHeC is being pitched as the machine that could be dedicated to examining this breakpoint, and others like it, in more detail than the LHC is equipped to. One helpful factor is that when electrons are among the particles participating in a collision, physicists don’t have to worry about how the energy will be distributed among constituent particles, since electrons don’t have any. Hadron collisions, on the other hand, have to deal with quarks, anti-quarks and gluons, and are tougher to analyse.
So, to accomplish this, the team behind the LHeC is considering installing a pill-shaped machine called the energy recovery linac (ERL), straddling the LHC ring (shown above), to produce a beam of electrons that’d then take on the accelerated protons from the main LHC ring – making up the ‘linac-ring LHeC’ design. A first suggestion to install the LHeC as a ring, to accelerate electrons, along the LHC ring was rejected because it would hamper experiments during construction. Anyway, the electrons will be accelerated to 60 GeV while the protons, to 7,000 GeV. The total wall-plug power to the ERL is being capped at 100 MW.
The ERL has a slightly different acceleration mechanism from the LHC, and doesn’t simply accelerate particles continuously around a ring. First, the electrons are accelerated through a radio-frequency field in a linear accelerator (linac – the straight section of the ERL) and then fed into a circular channel, crisscrossed by magnetic fields, curving into the rear end of the linac. The length of the circular channel is such that by the time the electrons travel along it, their phase has shifted by 180º – i.e. they arrive half an RF cycle out of step with the field that accelerated them. So when the out-of-phase electrons reenter the linac, they decelerate. Their kinetic energy is lost to the RF field, which intensifies and so provides a bigger kick to the new batch of particles being injected into the linac at just that moment. This way, the linac recovers the kinetic energy from each circulation.
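The accelerate-then-decelerate trick can be caricatured with a cosine: a particle crossing the linac on-crest gains energy, and one arriving half an RF cycle later gives it back. A toy sketch, not a real beam-dynamics model:

```python
import math

def rf_kick(max_gain, phase_deg):
    """Energy change for a particle crossing the linac at a given RF
    phase: on-crest (0 deg) accelerates, 180 deg out of phase decelerates."""
    return max_gain * math.cos(math.radians(phase_deg))

energy = 0.0
energy += rf_kick(60.0, 0)    # first pass: accelerated (to 60 GeV, say)
print(energy)
energy += rf_kick(60.0, 180)  # return pass, half an RF cycle later
print(energy)                 # back to ~0 -- energy returned to the RF field
```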
Such a mechanism is employed at all because when particles are accelerated radially, using bending magnetic fields, the amount of energy they lose in a form called synchrotron radiation increases drastically as the particle’s mass gets lower – making lightweight electrons especially expensive to keep circulating in a ring.
Keeping in mind the need to explore new areas of physics – especially those associated with leptons (elementary particles of which electrons are a kind) and quarks/gluons (described by QCD) – the energy of the electrons coming out of the ERL is currently planned to be 60 GeV. They will be collided with accelerated protons by positioning the ERL tangential to the LHC ring. And at the moment of the collision, CERN’s scientists hope that they will be able to use the LHeC to study:
Predicted unification of the electromagnetic and weak forces (into an electroweak force): The electromagnetic force of nature is mediated by the particles called photons while the weak force, by particles called W and Z bosons. Whether the scientists will observe the unification of these forces, as some theories predict, is dependent on the quality of electron-proton collisions. Specifically, if the square of the momentum transferred between the particles can reach up to 8-9 TeV, the collider will have created an environment in which physicists will be able to probe for signs of an electroweak force at play.
Gluon saturation: To quote from an interview given by theoretical physicist Raju Venugopalan in January 2013: “We all know the nuclei are made of protons and neutrons, and those are each made of quarks and gluons. There were hints in data from the HERA collider in Germany and other experiments that the number of gluons increases dramatically as you accelerate particles to high energy. Nuclear physics theorists predicted that the ions accelerated to near the speed of light at the [Relativistic Heavy Ion Collider] would reach an upper limit of gluon concentration – a state of gluon saturation we call colour glass condensate.”
Higgs bosons: On July 4, 2012, Fabiola Gianotti, soon to be the next DG of CERN but then the spokesperson of the ATLAS experiment at the LHC, declared that physicists had found a Higgs boson. Widespread celebrations followed – while a technical nitpick remained: physicists only knew the particle resembled a Higgs boson and might not have been the real thing itself. Then, in March 2013, the particle was most likely identified as being a Higgs boson. And even then, one box remained to be checked: that it was the Higgs boson, not one of many kinds. For that, physicists have been waiting for more results from the upgraded LHC. But a machine like the LHeC would be able to produce a “few thousand” Higgs bosons a year, enabling physicists to study the elusive particle in more detail, confirm more of its properties – or, more excitingly, find deviations from them – and look for higher-energy versions of it.
A 2012 paper detailing the concept also notes that should the LHC find signs that ‘new physics’ could exist beyond the LHeC’s default energy levels, scientists are keeping in mind the need for the ERL to accelerate electrons to up to 140 GeV.
The unique opportunity presented by an electron-proton collider working in tandem with the LHC goes beyond the mammoth energies to a property called luminosity as well. It’s measured in inverse femtobarn per second, denoting the number of events occurring per 10⁻³⁹ square centimetres per second. For example, 10 fb⁻¹ denotes 10 events occurring per 10⁻³⁹ sq. cm – that’s 10⁴⁰ events per sq. cm (the luminosity accumulated over a specific period of time, i.e. without the ‘per second’ in the unit, is called the integrated luminosity). At the LHeC, a luminosity of 10³³ cm⁻² s⁻¹ is expected to be achieved, and physicists hope that with some tweaks it can be hiked by yet another order of magnitude. To compare: this is 100x what HERA achieved, providing an unprecedented scale at which to explore the effects of deep inelastic scattering, and 10x the LHC’s current luminosity.
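The unit conversion above is easy to check in a few lines of Python – a back-of-the-envelope sketch, with the numbers taken from the paragraph (the 1-fb cross-section at the end is a hypothetical example, not a real process):

```python
BARN_CM2 = 1e-24                    # 1 barn = 1e-24 sq. cm
FEMTOBARN_CM2 = BARN_CM2 * 1e-15    # 1 fb = 1e-39 sq. cm

# An integrated luminosity of 10 inverse femtobarn...
integrated_lumi_fb = 10.0
integrated_lumi_cm2 = integrated_lumi_fb / FEMTOBARN_CM2  # ~1e40 per sq. cm

# Expected event count = cross-section x integrated luminosity.
# For a hypothetical process with a 1-femtobarn cross-section:
events = 1.0 * FEMTOBARN_CM2 * integrated_lumi_cm2        # ~10 events
```

So “10 fb⁻¹” really does correspond to 10⁴⁰ per sq. cm, and multiplying by a process’s cross-section gives the number of such events physicists can expect to have recorded.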
It’s also 100x lower than that of the HL-LHC, which is the configuration of the LHC with which the ERL will be operating to make up the LHeC. And the LHeC’s lifetime will be the planned lifetime of the LHC – till the 2030s, about a decade. In the same period, if all goes well, a Chinese behemoth will have taken shape: the Circular Electron-Positron Collider (CEPC), with a circumference 2x that of the LHC. In its proton-proton collision configuration – paralleling the LHC’s – China claims it will reach energies of 70,000 GeV (as against the LHC’s current 14,000 GeV) and a luminosity comparable to the HL-LHC’s. And when its electron-positron collision configuration, which the LHeC will be able to mimic, is at its best, physicists reckon the CEPC will be able to produce 100,000 Higgs bosons a year.
As it happens, some groups at CERN are already drawing up plans, due to be presented in 2018, for a machine dwarfing even the CEPC. Meet the Future Circular Collider (FCC), by one account the “ultimate precision-physics machine” (and funnily named, by another). To be fair, the FCC has been under consideration since about 2013, independent of the CEPC. In sheer size, however, the FCC could swallow the CEPC – with an 80-100-km-long ring. It will also be able to accelerate protons to 50,000 GeV (by 2040), attain luminosities of 10³⁵ cm⁻² s⁻¹, continue to work with the ERL, function as an electron-positron collider (video), and look for particles weighing up to 25,000 GeV (currently the heaviest known fundamental particle is the top quark, weighing 169-173 GeV).
And should it be approved and come online in the second half of the 2030s, there’s a good chance the world will be a different place, too: besides the CEPC, there will be (or will have been?) either the International Linear Collider (ILC) or the Compact Linear Collider (CLIC). ‘Either’ because both are linear accelerators with similar physical dimensions, planning to collide electrons with positrons, their antiparticles, to study QCD, the Higgs field and the prospect of higher dimensions – so only one of them might get built. Each will require about a decade to be built, coming online in the late 2020s. The biggest difference between them is that the ILC will be able to reach collision energies of 1,000 GeV while the CLIC (whose idea was conceived at CERN) will reach 3,000 GeV.
Run 2 results from the LHC show that QCD is scale-invariant – in keeping with the Standard Model prediction.
Looks intimidating, doesn’t it? It’s also very interesting because it contains an important result acquired at the Large Hadron Collider (LHC) this year, a result that could disappoint many physicists.
The LHC reopened earlier this year after receiving multiple performance-boosting upgrades over the previous 18 months. In its new avatar, the particle-smasher explores nature’s fundamental constituents at the highest energies yet, almost twice as high as in its first run. By Albert Einstein’s mass-energy equivalence (E = mc²), the proton’s mass corresponds to an energy of almost 1 GeV (giga-electron-volt). The LHC’s beam energy, to compare, was 3,500 GeV and is now 6,500 GeV.
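That “almost 1 GeV” figure is easy to verify from Einstein’s relation – a quick sketch using standard values of the proton mass, the speed of light and the electron-volt:

```python
# Rest energy of the proton via E = mc^2
m_p = 1.67262e-27       # proton mass, kg
c = 2.99792458e8        # speed of light, m/s
eV = 1.602176634e-19    # joules per electron-volt

E_joules = m_p * c**2
E_GeV = E_joules / eV / 1e9   # ~0.938 GeV, i.e. "almost 1 GeV"
```

The beams, at 6,500 GeV each, thus carry roughly 7,000 times the proton’s rest energy.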
At the start of December, it concluded data-taking for 2015. That data is being steadily processed, interpreted and published by the multiple topical collaborations working on the LHC. Two collaborations in particular, ATLAS and CMS, were responsible for plots like the one shown above.
This is CMS’s plot showing the same result:
When protons are smashed together at the LHC, a host of particles erupt and fly off in different directions, showing up as streaks in the detectors. These streaks are called jets. The plots above look particularly at pairs of jets produced by quarks, anti-quarks or gluons in the proton-proton collisions (these are in fact the smaller particles that make up protons).
The sequence of black dots in the ATLAS plot shows the number of jets (i.e. pairs of particles) observed at different energies. The red line shows the predicted number of events. They both match, which is good… to some extent.
One of the biggest, and certainly among the most annoying, problems in particle physics right now is that the prevailing theory that explains it all is unsatisfactory – mostly because it has some really clunky explanations for some things. The theory is called the Standard Model and physicists would like to see it disproved, broken in some way.
In fact, those physicists will have gone to work today to be proved wrong – and be sad at the end of the day if they weren’t.
The annoying problem at its heart
The LHC chips in by providing two kinds of opportunities: extremely sensitive particle-detectors that can precisely measure fleeting events, and extremely high collision energies that let physicists explore how some particles behave in thousands of scenarios in search of a surprising result.
So, the plots above show three things. First, the predicted event-count and the observed event-count are a match, which is disappointing. Second, the biggest deviation from the predicted count is highlighted in the ATLAS plot (look at the red columns at the bottom between the two blue lines). It’s small, corresponding to two standard deviations (symbol: σ) from the normal. Physicists need at least three standard deviations (3σ) from the normal for license to be excited.
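For a sense of what those σ thresholds mean, the one-sided chance of a Gaussian fluctuation at least that large can be computed from the normal distribution – a sketch using only Python’s standard library (the interpretations in the comments are the conventional particle-physics ones):

```python
from math import erfc, sqrt

def p_value(n_sigma):
    """One-sided probability of a fluctuation of at least n_sigma
    above the mean of a Gaussian distribution."""
    return 0.5 * erfc(n_sigma / sqrt(2))

p2 = p_value(2)  # ~2.3%: happens often enough to shrug at
p3 = p_value(3)  # ~0.13%: interesting, "license to be excited"
p5 = p_value(5)  # ~3e-7: the conventional discovery threshold
```

So a 2σ deviation like the one in the ATLAS plot is expected in roughly one in forty measurements by chance alone, which is why nobody is celebrating.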
But this is the most important result (an extension of the first): the predicted event-count and the observed event-count match across the full range up to 6,000 GeV. In other words: physicists are seeing no cause for joy, and every cause to revalidate a section of the Standard Model, across a wide swath of scenarios.
The section in particular is called quantum chromodynamics (QCD), which deals with how quarks, antiquarks and gluons interact with each other. As theoretical physicist Matt Strassler explains on his blog,
… from the point of view of the highest energies available [at the LHC], all particles in the Standard Model have almost negligible rest masses. QCD itself is associated with the rest mass scale of the proton, with mass-energy of about 1 GeV, again essentially zero from the TeV point of view. And the structure of the proton is simple and smooth. So QCD’s prediction is this: the physics we are currently probing is essentially scale-invariant.
Scale-invariance is the idea that two particles will interact the same way no matter how energetic they are. To be sure, the ATLAS/CMS results suggest QCD is scale-invariant in the 0-6,000 GeV range. There’s a long way to go – in terms of energy levels and future opportunities.
Something in the valley
The folks analysing the data are helped along by previous results at the LHC as well. For example, with the collision energy having been ramped up, one would expect to see particles of higher energies manifesting in the data. However, the heavier the particle, the wider the bump in the plot and more the focusing that’ll be necessary to really tease out the peak. This is one of the plots that led to the discovery of the Higgs boson:
That bump between 125 and 130 GeV is what was found to be the Higgs, and you can see it’s more of a smear than a spike. For heavier particles, that smear is going to be wider, with longer tails on the sides. So any particle that weighs a lot – a few thousand GeV – and is expected to be found at the LHC would have a tail showing in the lower-energy LHC data. But no such tails have been found, ruling out heavier stuff.
And because many replacement theories for the Standard Model involve the discovery of new particles, analysts will tend to focus on particles that could weigh less than about 2,000 GeV.
In fact that’s what’s riveted the particle physics community at the moment: rumours of a possible new particle in the range 1,900-2,000 GeV. A paper uploaded to the arXiv preprint server on December 10 shows a combination of ATLAS and CMS data logged in 2012, and highlights a deviation from the normal that physicists haven’t been able to explain using information they already have. This is the relevant plot:
The ones in the middle and on the right are particularly relevant. They each show the probability of occurrence of an event (observed as a bump in the data, not shown here) in which some heavier particle decays into one of two different final states: a W and a Z boson (WZ), or two Z bosons (ZZ). Bosons are a type of fundamental particle; they carry forces.
The middle chart implies that the mysterious event is at least 1,000 times less likely to occur than normal and the one on the right implies it is at least 10,000 times less likely. And both readings are at more than 3σ significance, so people are excited.
The authors of the paper write: “Out of all benchmark models considered, the combination favours the hypothesis of a [particle or its excitations] with mass 1.9-2.0 [thousands of GeV] … as long as the resonance does not decay exclusively to WW final states.”
But as physicist Tommaso Dorigo points out, these blips could also be a fluctuation in the data, which does happen.
Although the fact that the two experiments see the same effect … is suggestive, that’s no cigar yet. For CMS and ATLAS have studied dozens of different mass distributions, and a bump could have appeared in a thousand places. I believe the bump is just a fluctuation – the best fluctuation we have in CERN data so far, but still a fluke.
There’s a seminar due to happen today at the LHC Physics Centre at CERN where data from the upgraded run is due to be presented. If something really did happen in those ‘valleys’, which were filtered out of a collision energy of 8,000 GeV (basically twice the beam energy, where each beam is a train of protons), then those events would’ve happened in larger quantities during the upgraded run and so been more visible. The results will be presented at 1930 IST. Watch this space.
Featured image: Inside one of the control centres of the collaborations working on the LHC at CERN. Each collaboration handles an experiment, or detector, stationed around the LHC tunnel. Credit: CERN.
The world’s single largest science experiment will restart on March 23 after a two-year break. Scientists and administrators at the European Organization for Nuclear Research – known by its French acronym CERN – have announced the status of the agency’s upgrades on its Large Hadron Collider (LHC) and its readiness for a new phase of experiments running from now until 2018.
Before the experiment was shut down in early 2013, the LHC became famous for helping discover the elusive Higgs boson, a fundamental (that is, indivisible) particle that gives other fundamental particles their mass through a complicated mechanism. The find earned two of the physicists who thought up the mechanism in 1964, Peter Higgs and François Englert, a Nobel Prize in 2013.
Though the LHC had fulfilled one of its more significant goals by finding the Higgs boson, its purpose is far from complete. In its new avatar, the machine boasts of the energy and technical agility necessary to answer questions that current theories of physics are struggling to make sense of.
As Alice Bean, a particle physicist who has worked with the LHC, said, “A whole new energy region will be waiting for us to discover something.”
The finding of the Higgs boson laid to rest speculations of whether such a particle existed and what its properties could be, and validated the currently reigning set of theories that describe how various fundamental particles interact. This is called the Standard Model, and it has been successful in predicting the dynamics of those interactions.
From the what to the why
But having assimilated all this knowledge, what physicists don’t know, but desperately want to, is why those particles’ properties have the values they do. They have realized the implications are numerous and profound: ranging from the possible existence of more fundamental particles we are yet to encounter to the nature of the substance known as dark matter, which makes up a great proportion of matter in the universe while we know next to nothing about it. These mysteries were first conceived to plug gaps in the Standard Model but they have only been widening since.
With an experiment now able to better test theories, physicists have started investigating these gaps. For the LHC, the implication is that in its second edition it will not be looking for something as much as helping scientists decide where to look to start with.
As Tara Shears, a particle physicist at the University of Liverpool, told Nature, “In the first run we had a very strong theoretical steer to look for the Higgs boson. This time we don’t have any signposts that are quite so clear.”
Higher energy, luminosity
The upgrades to the LHC that would unlock new experimental possibilities were evident in early 2012.
The machine works by using powerful electric currents and magnetic fields to accelerate two trains, or beams, of protons in opposite directions, within a ring 27 km long, to almost the speed of light and then colliding them head-on. The result is a particulate fireworks of such high energy that the most rare, short-lived particles are brought into existence before they promptly devolve into lighter, more common particles. Particle detectors straddling the LHC at four points on the ring record these collisions and their effects for study.
So, to boost its performance, upgrades to the LHC were of two kinds: increasing the collision energy inside the ring and increasing the detectors’ abilities to track more numerous and more powerful collisions.
The collision energy has been nearly doubled in its second life, from 7-8 TeV to 13-14 TeV. The frequency of collisions has also been doubled from one set every 50 nanoseconds (billionth of a second) to one every 25 nanoseconds. Steve Myers, CERN’s director for accelerators and technology, had said in December 2012, “More intense beams mean more collisions and a better chance of observing rare phenomena.”
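Halving the bunch spacing doubles the crossing rate, which is easy to see in numbers (a trivial sketch; the spacings are the ones given above):

```python
# Bunch-crossing rate = 1 / bunch spacing
spacing_run1 = 50e-9   # 50 nanoseconds
spacing_run2 = 25e-9   # 25 nanoseconds

rate_run1 = 1 / spacing_run1   # 20 million crossings per second (20 MHz)
rate_run2 = 1 / spacing_run2   # 40 million crossings per second (40 MHz)
```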
The detectors have received new sensors, neutron shields to protect from radiation damage, cooling systems and superconducting cables. An improved fail-safe system has also been installed to forestall accidents like the one in 2008, when failing to cool a magnet led to a shut-down for eight months.
In all, the upgrades cost approximately $149 million, and will increase CERN’s electricity bill by 20% to $65 million. A “massive debugging exercise” was conducted last week to ensure all of it clicked together.
Going ahead, these new specifications will be leveraged to tackle some of the more outstanding issues in fundamental physics.
CERN listed a few – presumably primary – focus areas. They include investigating whether the Higgs boson could betray the existence of undiscovered particles, what particles dark matter could be made of, why the universe today has much more matter than antimatter, and whether gravity is so much weaker than the other forces because it is leaking into other dimensions.
Stride forward in three frontiers
Physicists are also hopeful for the prospects of discovering a class of particles called supersymmetric partners. The theory that predicts their existence is called supersymmetry. It builds on some of the conclusions of the Standard Model, and offers predictions that plug its holes as well with such mathematical elegance that it has many of the world’s leading physicists enamored. These predictions involve the existence of new particles called partners.
In a neat infographic in Nature, Elizabeth Gibney explains that the partner that will be easiest to detect is the ‘stop squark’, as it is the lightest and can show itself in lower-energy collisions.
In all, the LHC’s new avatar marks a big stride forward not just in the energy frontier but also in the intensity and cosmic frontiers. With its ability to produce and track more collisions per second as well as chart the least explored territories of the ancient cosmos, it’d be foolish to think this gigantic machine’s domain is confined to particle physics and couldn’t extend to fuel cells, medical diagnostics or achieving systems-reliability in IT.
Here’s a fitting video released by CERN to mark this momentous occasion in the history of high-energy physics.
Featured image: A view of the LHC. Credit: CERN
Update: After engineers spotted a short-circuit glitch in a cooled part of the LHC on March 21, its restart was postponed from March 23 by a few weeks. However, CERN has assured that it’s a fully understood problem and that it won’t detract from the experiment’s goals for the year.
Through an extraordinary routine, the most powerful machine built by humankind is slowly but surely gearing up for its relaunch in March 2015. The Large Hadron Collider (LHC), straddling the national borders of France and Switzerland, will reawaken after two years of upgrades and fixes to smash protons at nearly twice the energy it did during its first run, which ended in early 2013. Here are 10 things to look out for: five upgrades and five possible exciting discoveries.
Higher collision energy – In its previous run, each beam of protons destined for collision with other beams was accelerated to 3.5-4 TeV. By May 2015, each beam will be accelerated to 6.5-7 TeV. By doubling the collision energy, scientists hope to be able to observe higher-energy phenomena, such as heavier, more elusive particles.
Higher collision frequency – Each beam has bunches of protons that are collided with other oncoming bunches at a fixed frequency. During the previous run, this frequency was once every 50 nanoseconds. In the new run, this will be doubled to once every 25 nanoseconds. With more collisions happening per unit time, rarer phenomena will happen more frequently and become easier to spot.
Higher instantaneous luminosity – This is a measure of the number of potential collisions per unit area per second. It will be increased by 10 times, to 1 × 10³⁴ per cm² per second. By 2022, engineers will aim to increase it to 7.73 × 10³⁴ per cm² per second.
New pixel sensors – An extra layer of pixel sensors, to handle the higher luminosity regime, will be added around the beam pipe within the ATLAS and CMS detectors. While the CMS was built with higher luminosities in mind, ATLAS wasn’t, and its pixel sensors are expected to wear out within a year. As an intermediate solution, a temporary layer of sensors will be added to last until 2018.
New neutron shields – Because of the doubled collision energy and frequency, instruments could be damaged by high-energy neutrons flying out of the beam pipe. To prevent this, advanced neutron shields will be screwed on around the pipe.
Dark matter – The LHC is adept at finding previously unseen particles, both fundamental and composite. One area of physics desperately looking for a particle of its own is dark matter. It’s only natural for both quests to converge at the collider. A leading candidate particle for dark matter is the WIMP: the weakly interacting massive particle. If the LHC finds it, or finds something like it, it could be the next big thing after the Higgs boson, perhaps bigger.
Dark energy – The universe is expanding at an accelerating pace. There is a uniform field of energy pervading it throughout that is causing this expansion, called the dark energy field. The source of dark energy’s potential is the vacuum of space, where extremely short-lived particles continuously pop in and out of existence. But to drive the expansion of the entire universe, the vacuum’s potential should be 10¹²⁰ times what observations show it to be. At the LHC, the study of fundamental particles could drive better understanding of what the vacuum actually holds and where dark energy’s potential comes from.
Supersymmetry – The Standard Model of particle physics defines humankind’s understanding of the behavior of all known fundamental particles. However, some of their properties are puzzling. For example, some natural forces are too strong for no known reason; some particles are too light. For this, physicists have a theory of particulate interactions called supersymmetry, SUSY for short. And SUSY predicts the existence of some particles that don’t exist in the Model yet, called supersymmetric partners. These are heavy particles that could show themselves in the LHC’s new higher-energy regime. Like with the dark matter WIMPs, finding a SUSY particle could be a Nobel Prize-winner.
Higgs boson – One particle that’s too light in the Standard Model is the Higgs boson. As a result, physicists think it might not be the only Higgs boson out there. Perhaps there are others with the same properties that weigh less or more.
Antimatter reactions – Among the class of particles called mesons, one – designated B0 – holds the clue to answering a question that has had astrophysicists stymied for decades: why does the universe have more matter than antimatter if, when it first came into existence, there were equal amounts of both? An older result from the LHC shows the B0 meson decays into more matter particles than antimatter ones. Probing further into why this is so will be another prominent quest of the LHC’s.
Bonus: Extra dimensions – Many alternate theories of fundamental particles require the existence of extra dimensions. The way to look for them is to create extremely high energies and then look for particles that might pop into one of the three dimensions we occupy from another that we don’t.
The CERN Council has elected a new Director-General to succeed the incumbent Rolf-Dieter Heuer. Fabiola Gianotti, who served as the ATLAS collaboration’s spokesperson from 2009 to 2013 – a period that included the discovery of the long-sought Higgs boson by the ATLAS and CMS experiments – will be the first woman to hold the position. Her mandate begins in January 2016.
A CERN press release announcing the appointment said the “Council converged rapidly in favor of Dr. Gianotti”, implying it was a quick and unanimous decision.
The Large Hadron Collider (LHC), the mammoth particle smasher that produces the collisions that ATLAS, CMS and two other similar collaborations study, is set to restart in early 2015 after a series of upgrades to increase its energy and luminosity. And so Dr. Gianotti’s term will coincide with a distinct phase of science, this one eager for evidence to help answer deeper questions in particle physics – such as the Higgs boson’s mass, the strong force’s strength and dark matter.
Dr. Gianotti will succeed 15 men who, as Directors-General, have been responsible not simply for coordinating the scientific efforts stemming from CERN but also for guiding research priorities and practices. They have effectively set the various agendas that the world’s preeminent nuclear physics lab has chosen to pursue since its establishment in 1954.
In fact, the title of ‘spokesperson’, which Dr. Gianotti held for the ATLAS collaboration for four years until 2013, is itself deceptively uncomplicated. The spokesperson not only speaks for the collaboration but is also the effective project manager who plays an important role when decisions are made about what measurements to focus on and what questions to answer. When on July 4, 2012, the discovery of a Higgs-boson-like particle was announced, results from the ATLAS particle-detector – and therefore Dr. Gianotti’s affable leadership – were instrumental in getting that far, and in getting Peter Higgs and Francois Englert their 2013 Nobel Prize in physics.
Earlier this year, she had likened her job to “a great scientific adventure”, but “also a great human adventure”, in an interview to CNN. To guide the aspirations and creativity of 3,000 engineers and physicists without attenuation[1] of productivity or will must indeed have been so.
That she will be the first woman to become the DG of CERN can’t escape attention either, especially at a time when women’s participation in STEM research seems to be on the decline and sexism in science is being recognized as a prevalent issue. Dr. Gianotti will no doubt make a strong role model for a field that is only 25% women. There will also be much to learn from her past, from the time she chose to become a physicist after learning about Albert Einstein’s idea of quantum mechanics to explain the photoelectric effect. She joined CERN while working toward her PhD from the University of Milan. She was 25, it was 1987 and the W/Z bosons had just been discovered at the facility’s UA1 and UA2 collaborations. Dr. Gianotti would join the latter.
It was an exciting time to be a physicist as well as exacting. Planning for the LHC would begin in that decade and launch one of the world’s largest scientific collaborations with it. The success of a scientist would start to demand not just research excellence but also a flair for public relations, bureaucratic diplomacy and the acuity necessary to manage public funds in the billions from different countries. Dr. Gianotti would go on to wear all these hats even as she started work in calorimetry at the LHC in 1990, on the ATLAS detector in 1992, and on the search for supersymmetric (‘new physics’) particles in 1996.
Her admiration for the humanities has been known to play its part in shaping her thoughts about the universe at its most granular. She has a professional music diploma from the Milan Conservatory and often unwinds at the end of a long day with a session on the piano. Her fifth-floor home in Geneva sometimes affords her a view of Mont Blanc, and she often enjoys long walks in the mountains. In the same interview, given to Financial Times in 2013, she adds,
There are many links between physics and art. For me, physics and nature have very nice foundations from an aesthetic point of view, and at the same time art is based on physics and mathematical principle. If you build a nice building, you have to build it with some criteria because otherwise it collapses.[2]
Her success in leading the ATLAS collaboration, and in becoming the veritable face of the hunt for the Higgs boson, has catapulted her to being the next DG of CERN. At the same time, it must feel reassuring[3] that as physicists embark on a new era of research that requires just as much ingenuity in formulating new ideas as in testing them, an era “where logic based on past theories does not guide us”[4], Fabiola Gianotti’s research excellence, administrative astuteness and creative intuition are now there to guide them.
Good luck, Dr. Gianotti!
[1] Recommended read: ‘Who really found the Higgs boson? The real genius in the Nobel Prize-winning discovery is not who you think it is’, Nautilus, Issue 18.
[2] I must mention that it’s weird that someone with such strong aesthetic foundations used Comic Sans MS as the font of choice for her presentation at the CERN seminar in 2012 that announced the discovery of a Higgs-like boson. It was probably the beginning of Comic Sans’s comeback.
[3] Though I am no physicist.
[4] In the words of Academy Award-winning film editor Walter S. Murch.
In a paper published in Physical Review Letters on July 17, 2014, a team of American researchers reported the most precisely measured value yet of the mass of the top quark, the heaviest fundamental particle. Its mass is so high that it can exist only in very high-energy environments – such as inside powerful particle colliders or in the very early universe – and not anywhere else.
Because of this, the American team’s efforts to measure its mass might come across as needlessly painstaking. However, there’s an important reason to get as close to the exact value as possible.
That reason is 2012’s possibly most famous discovery. It was drinks-all-round for the particle physics community when the Higgs boson was discovered by the ATLAS and CMS experiments at the Large Hadron Collider (LHC). While the elation lasted awhile, serious questions were already being asked about some of the boson’s properties. For one, it was much lighter than anticipated by some promising areas of theoretical particle physics. Proponents of an idea called naturalness pegged it to be 19 orders of magnitude higher!
Because the Higgs boson is the particulate residue of an omnipresent energy field called the Higgs field, the boson’s mass has implications for how the universe should be. With the boson being so much lighter, physicists couldn’t explain why it didn’t predicate a universe the size of a football – even though their calculations said it should.
In the second week of September 2014, Stephen Hawking said the Higgs boson will cause the end of the universe as we know it. Because it was Hawking who said it and because his statement contained the clause “end of the universe”, the media hype was ridiculous yet to be expected. What he actually meant was that the ‘unnatural’ Higgs mass had placed the universe in a difficult position.
The universe would ideally love to be in its lowest energy state, like you do when you’ve just collapsed into a beanbag with beer, popcorn and Netflix. However, the mass of the Higgs has trapped it on a chair instead. While the universe would still like to be in the lower-energy beanbag, it’s reluctant to get up from the higher-energy yet still comfortable chair.
Someday, according to Hawking, the universe might increase in energy (get out of the chair) and then collapse into its lowest energy state (the beanbag). And that day is trillions of years away.
What does the mass of the top quark have to do with all this? Quite a bit, it turns out. Fundamental particles like the top quark possess their mass in the form of potential energy. They acquire this energy when they move through the Higgs field, which is spread throughout the universe. Some particles acquire more energy than others. How much energy is acquired depends on two parameters: the strength of the Higgs field (which is constant), and the particle’s Higgs charge.
The Higgs charge determines how strongly a particle engages with the Higgs field. It’s the highest for the top quark, which is why it’s also the heaviest fundamental particle. More relevant for our discussion, this unique connection between the top quark and the Higgs boson is also what makes the top quark an important focus of studies.
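What the article calls the ‘Higgs charge’ is known in formal terms as a particle’s Yukawa coupling. As a minimal sketch of the relationship described above – assuming the standard relation m = y·v/√2, where v ≈ 246 GeV is the constant strength (vacuum expectation value) of the Higgs field:

```python
import math

# Relation between a fundamental fermion's mass and its Yukawa coupling
# (the "Higgs charge"): m = y * v / sqrt(2), where v is the Higgs
# field's vacuum expectation value, roughly 246 GeV.
HIGGS_VEV = 246.22  # GeV

def yukawa_coupling(mass_gev: float) -> float:
    """Infer the Yukawa coupling implied by a fermion's mass (in GeV/c^2)."""
    return math.sqrt(2) * mass_gev / HIGGS_VEV

# The top quark's coupling comes out very close to 1 -- the largest of
# any known fundamental particle -- while the much lighter bottom
# quark's is only about 0.024.
y_top = yukawa_coupling(174.98)
y_bottom = yukawa_coupling(4.18)
```

The numbers here (the vacuum expectation value and the bottom quark mass) are standard textbook values, used purely to illustrate why the heaviest particle is also the one most strongly tied to the Higgs field.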
Getting the mass of the top quark just right is important for better determining its Higgs charge – ergo the extent of its coupling with the Higgs boson, ergo the properties of the Higgs boson itself. Small deviations in the value of the top quark’s mass could spell drastic changes in when or how our universe will switch from the chair to the beanbag.
If that switch happens, all our natural laws would change. Life would become impossible.
The American team that measured the top quark’s mass used values obtained from the D0 experiment on the Tevatron particle collider at the Fermi National Accelerator Laboratory. The Tevatron was shut down in 2011, so these measurements are the collider’s last words on the top quark’s mass: 174.98 ± 0.76 GeV/c² (the Higgs boson weighs around 126 GeV/c²; a gold atom, considered pretty heavy, weighs around 210 GeV/c²). This is a precision of better than 0.5%, the finest yet. The value is likely to be updated once the LHC restarts early next year.
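The “better than 0.5%” figure is just the quoted uncertainty divided by the central value – a quick sanity check:

```python
# Quick check of the quoted precision: uncertainty over central value.
top_mass = 174.98   # GeV/c^2, D0's final Tevatron measurement
uncertainty = 0.76  # GeV/c^2

precision_pct = uncertainty / top_mass * 100  # ~0.43%, i.e. better than 0.5%
```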
Two years ago, physicists working on the Large Hadron Collider first announced the discovery of a Higgs boson-like particle, setting the high-energy physics community atwitter. And it was only a couple of weeks ago that physicists announced that the particle was definitely the one predicted by the sturdy Standard Model of particle physics, the theory that governs the Higgs boson’s properties and behavior.
But new results from the ongoing International Conference on High Energy Physics in Valencia, Spain, could add a twist to this plot. Physicists announced that they had evidence – albeit not strong enough – that the Higgs boson was showing signs of disobeying the model.
Members of the ATLAS and CMS collaborations, who work with the detectors of the same names, said they had results showing the Higgs boson decaying into pairs of particles called W bosons at a rate some 20% higher than the Standard Model predicts. This non-compliance will be a breath of fresh air for physicists who have been faithful to a potent but as yet unobserved theory of new physics called supersymmetry – SUSY, in short and fondly.
The W boson mediates the decay of radioactive substances in nature. At sufficiently high energies, such as those produced inside the Large Hadron Collider (LHC), these bosons emerge from a multitude of particle interactions. Since their discovery in 1983, they have been widely studied. In these circumstances, signs of SUSY showing up through Higgs decays into WW pairs leave little room for uncertainties.
SUSY predicts that for every fermion, or matter particle, of the Standard Model there is a partner particle that is a boson called a sfermion. Conversely, for every boson, or force particle, of the Standard Model there is a partner particle that is a fermion called a bosino. Physicists who believe SUSY is a plausible theory use these extra particles to solve problems that the Standard Model can’t. One of them is that of dark matter; another is to explain why the Higgs boson is much lighter than it should be.
Jong Soo Kim et al. have described how the anomalous decay rates could be explained using a simple version of SUSY in a pre-print paper uploaded to arXiv on June 27. The paper is playfully titled ‘Stop that ambulance! New physics at the LHC?’. The ‘Stop’ is a reference to the name of the supersymmetric partner of the top quark. The authors describe how a combination of supersymmetric particles including the stop squark could explain the new results with only a 1-in-370 chance of error. Even though this means physicists have a confidence of 99.7% in the results, it’s still not high enough to claim a discovery. When the LHC comes back online in 2015, physicists will be eager to put these results to the test.
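The ‘1-in-370 chance of error’ and the ‘99.7% confidence’ are two ways of quoting the same thing: a three-sigma deviation on a normal distribution. A quick check of the arithmetic, using only Python’s standard library:

```python
import math

def two_sided_p(sigma: float) -> float:
    """Two-sided p-value for a deviation of `sigma` standard deviations
    on a normal distribution."""
    return 1 - math.erf(sigma / math.sqrt(2))

p = two_sided_p(3.0)  # ~0.0027
odds = 1 / p          # ~370, i.e. a 1-in-370 chance of a statistical fluke
confidence = 1 - p    # ~0.997, i.e. the quoted 99.7% confidence
```

By particle physics convention, three sigma counts as ‘evidence’, while five sigma is required to claim a discovery.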
The paper’s title might also refer to a comment that physicist Chris Parkes, spokesperson for the UK’s participation in the LHCb experiment at the LHC, made to the BBC during the Hadron Collider Physics Symposium in Kyoto, Japan, in November 2012. Results had been announced showing the B_s meson decaying into lighter particles at a rate predicted exactly by the Standard Model, nudging SUSY further toward impossibility. Parkes had said, “Supersymmetry may not be dead but these latest results have certainly put it into hospital.”