One particle in the big zoo of subatomic particles is the B meson. It has a very short lifetime once it’s created. In rare instances it decays to three lighter particles: a kaon, a lepton and an anti-lepton. There are many types of leptons and anti-leptons; two of them are the electron/anti-electron and the muon/anti-muon. According to the existing theory of particle physics, the two should appear as decay products with equal probability: a B meson should decay to a kaon, an electron and an anti-electron as often as it decays to a kaon, a muon and an anti-muon (after adjusting for mass, since the muon is heavier).
Over the last 13 years, physicists studying B meson decays had found on four occasions that the particle decayed to a kaon, an electron and an anti-electron more often. They were glad for it, in a way. They had worked out the existing theory, called the Standard Model of particle physics, over the mid-20th century in a series of Nobel Prize-winning papers and experiments. Today, it stands complete, explaining the properties of a variety of subatomic particles. But it still can’t explain what dark matter is, why the Higgs boson is so heavy or why there are three ‘generations’ of quarks, not more or fewer. If the Standard Model is old physics, particle physicists believe there could be a ‘new physics’ out there – some particle or force they haven’t discovered yet – which could truly complete the Standard Model and settle the unresolved mysteries.
Over the years, they have explored various leads for ‘new physics’ in different experiments, but eventually, with more data, the findings have all turned out to be in line with the predictions of the Standard Model. Until 2022, the anomalous B meson decays were thought to be a potential source of ‘new physics’ as well. A 2009 study in Japan found that some B meson decays created electron/anti-electron pairs more often than muon/anti-muon pairs – as did a 2012 study in the US and a 2014 study in Europe. The last one involved the Large Hadron Collider (LHC), operated by the European Organisation for Nuclear Research (CERN) on the France-Switzerland border, and a detector on it called LHCb. Among other things, the LHCb tracks B mesons. In March 2021, the LHCb collaboration released data statistically significant enough to claim ‘evidence’ that some B mesons were decaying to electron/anti-electron pairs more often than to muon/anti-muon pairs.
But the latest data from the LHC, released on December 20, appears to settle the question: it’s still old physics. The formation of different types of lepton/anti-lepton particle pairs with equal probability is called lepton-flavour universality. Since 2009, physicists had been recording data that suggested that some B meson decays were violating lepton-flavour universality – hinting at a previously unknown particle or force acting on the decay process. In the new data, physicists analysed B meson decays along the pathway described above as well as along a second pathway, and at two different energy levels – thus, as the official press release put it, “yielding four independent comparisons of the decays”. The more data there is to compare, the more robust the findings will be.
This data was collected over the last five years. Every time the LHC operates, it’s called a ‘run’. Each run generates several terabytes of data that physicists, with the help of computers, comb through in search of evidence for different hypotheses. The data for the new analysis was collected over two runs. And it led physicists to conclude that these B meson decays do not violate lepton-flavour universality. The Standard Model still stands and, perhaps equally importantly, a 13-year-old ‘new physics’ lead has been returned to dormancy.
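For reference, the quantity physicists usually track in this test is a ratio of decay rates, conventionally written R_K. What follows is just a sketch of the textbook definition, not the LHCb analysis itself: the Standard Model expects the ratio to be very nearly 1, so a value persistently different from 1 is what would have signalled ‘new physics’.

R_K = \frac{\mathcal{B}(B \to K\,\mu^+\mu^-)}{\mathcal{B}(B \to K\,e^+e^-)} \approx 1 \quad \text{(Standard Model, after accounting for the leptons’ masses)}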
The LHC is currently in its third run; scientists and engineers working with the machine perform maintenance and install upgrades between runs, so each new cycle of operations is expected to produce more as well as more precise data, leading to more high-precision analyses that could, physicists hope, one day reveal ‘new physics’.
—
At 9 pm India time on April 7, physicists at an American research facility delivered a shot in the arm to efforts to find flaws in a powerful theory that explains how the building blocks of the universe work.
Physicists are looking for flaws in it because the theory doesn’t have answers to some questions – like “what is dark matter?”. They hope to find a crack or a hole that might reveal the presence of a deeper, more powerful theory of physics that can lay unsolved problems to rest.
The story begins in 2001, when physicists performing an experiment in Brookhaven National Lab, New York, found that fundamental particles called muons weren’t behaving the way they were supposed to in the presence of a magnetic field. This was called the g-2 anomaly (after a number called the gyromagnetic factor).
An incomplete model
Muons are subatomic and can’t be seen with the naked eye, so it could’ve been that the instruments the physicists were using to study the muons indirectly were glitching. Or it could’ve been that the physicists had made a mistake in their calculations. Or, finally, what the physicists thought they knew about the behaviour of muons in a magnetic field was wrong.
In most stories we hear about scientists, the first two possibilities are true more often: they didn’t do something right, so the results weren’t what they expected. But in this case, the physicists were hoping they were wrong. This unusual wish was the product of working with the Standard Model of particle physics.
According to physicist Paul Kyberd, the fundamental particles in the universe “are classified in the Standard Model of particle physics, which theorises how the basic building blocks of matter interact, governed by fundamental forces.” The Standard Model has successfully predicted numerous properties and behaviours of these particles. However, it has also been clearly wrong about some things. For example, Kyberd has written:
When we collide two fundamental particles together, a number of outcomes are possible. Our theory allows us to calculate the probability that any particular outcome can occur, but at energies beyond which we have so far achieved, it predicts that some of these outcomes occur with a probability of greater than 100% – clearly nonsense.
The Standard Model also can’t explain what dark matter is, what dark energy could be or if gravity has a corresponding fundamental particle. It predicted the existence of the Higgs boson but was off about the particle’s mass by a factor of 100 quadrillion.
All these issues together imply that the Standard Model is incomplete, that it could be just one piece of a much larger ‘super-theory’ that works with more particles and forces than we currently know. To look for these theories, physicists have taken two broad approaches: to look for something new, and to find a mistake with something old.
For the former, physicists use particle accelerators, colliders and sophisticated detectors to look for heavier particles thought to exist at higher energies, and whose discovery would prove the existence of a physics beyond the Standard Model. For the latter, physicists take some prediction the Standard Model has made with a great degree of accuracy and test it rigorously to see if it holds up. Studies of muons in a magnetic field are examples of this.
According to the Standard Model, a number associated with the way a muon swivels in a magnetic field is equal to 2 plus 0.00116591804 (with some give or take). This minuscule addition is the handiwork of fleeting quantum effects in the muon’s immediate neighbourhood, and which make it wobble. (For a glimpse of how hard these calculations can be, see this description.)
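To spell the numbers out (this is the standard definition, not a new calculation): the ‘2’ and the small addition are usually packaged into a single quantity called the anomalous magnetic moment of the muon, written a_μ, and the value quoted above is this anomalous part. That is also why the experiment is called g-2.

g = 2(1 + a_\mu), \qquad a_\mu = \frac{g - 2}{2} \approx 0.00116591804 \ \text{(the Standard Model prediction quoted above)}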
Fermilab result
In the early 2000s, the Brookhaven experiment measured this number to be slightly higher than the model’s prediction. Though the difference was small – about 0.00000000346 – the context made it a big deal. Scientists know that the Standard Model has a habit of being really right, so when it’s wrong, the wrongness becomes very important. And because we already know the model is wrong about other things, there’s a possibility that the two things could be linked. It’s a potential portal into ‘new physics’.
“It’s a very high-precision measurement – the value is unequivocal. But the Standard Model itself is also unequivocal,” Thomas Kirk, an associate lab director at Brookhaven, had told Science in 2001. The disagreement between the values implied “that there must be physics beyond the Standard Model.”
This is why the results physicists announced today are important.
The Brookhaven experiment that ascertained the g-2 anomaly wasn’t sensitive enough to say with a meaningful amount of confidence that its measurement was really different from the Standard Model prediction, or if there could be a small overlap.
Science writer Brianna Barbu has likened the mystery to “a single hair found at a crime scene with DNA that didn’t seem to match anyone connected to the case. The question was – and still is – whether the presence of the hair is just a coincidence, or whether it is actually an important clue.”
So to go from ‘maybe’ to ‘definitely’, physicists shipped the 50-foot-wide, 15-tonne magnet that the Brookhaven facility used in its Muon g-2 experiment to Fermilab, the US’s premier high-energy physics research facility in Illinois, and built a more sensitive experiment there.
The new result comes from tests at this facility: the observed value differs from the Standard Model’s prediction by 0.00000000251 (give or take a bit).
The Fermilab results are expected to become a lot better in the coming years, but even now they represent an important contribution. The statistical significance of the Brookhaven result was just below the threshold at which scientists could claim evidence, but the combined significance of the two results is well above it.
Potential dampener
So for now, the g-2 anomaly seems to be real. It’s not easy to say if it will continue to be real as physicists further upgrade the Fermilab g-2’s performance.
In fact there appears to be another potential dampener on the horizon. An independent group of physicists has had a paper published today saying that the Fermilab g-2 result is actually in line with the Standard Model’s prediction and that there’s no deviation at all.
This group, called BMW, used a different way to calculate the Standard Model’s value of the number in question than the Fermilab folks did. Aida El-Khadra, a theoretical physicist at the University of Illinois, told Quanta that the Fermilab team had yet to check BMW’s approach, but if it was found to be valid, the team would “integrate it into its next assessment”.
The ‘Fermilab approach’ itself is something physicists have worked with for many decades, so it’s unlikely to be wrong. But if the BMW approach checks out, then, according to Quanta, the very fact that two approaches lead to different predictions of the number’s value is likely to become a new mystery of its own.
But physicists are excited for now. “It’s almost the best possible case scenario for speculators like us,” Gordan Krnjaic, a theoretical physicist at Fermilab who wasn’t involved in the research, told Scientific American. “I’m thinking much more that it’s possibly new physics, and it has implications for future experiments and for possible connections to dark matter.”
The current result is also important because the other way to look for physics beyond the Standard Model – by looking for heavier or rarer particles – can be harder.
This isn’t simply a matter of building a larger particle collider, powering it up, smashing particles and looking for other particles in the debris. For one, there is a very large number of energy levels at which a particle might form. For another, there are thousands of other particle interactions happening at the same time, generating a tremendous amount of noise. So without knowing what to look for and where, a particle hunt can be like looking for a very small needle in a very large haystack.
The ‘what’ and ‘where’ instead come from the different theories physicists have worked out based on what we already know; experiments are then designed depending on which theory needs to be tested.
Into the hospital
One popular theory is called supersymmetry: it predicts that every elementary particle in the Standard Model framework has a heavier partner particle, called a supersymmetric partner. It also predicts the energy ranges in which these particles might be found. The Large Hadron Collider (LHC) in CERN, near Geneva, was powerful enough to access some of these energies, so physicists used it and went looking last decade. They didn’t find anything.
Other groups of physicists have also tried to look for rarer particles: ones that occur at an accessible energy but only once in a very large number of collisions. The LHC is a machine at the energy frontier: it probes higher and higher energies. To look for extremely rare particles, physicists explore the intensity frontier – using machines specialised in generating very large numbers of collisions.
The third and last is the cosmic frontier, in which scientists look for unusual particles coming from outer space. For example, early last month, researchers reported that they had detected an energetic anti-neutrino (a kind of fundamental particle) coming from outside the Milky Way participating in a rare event that scientists predicted in 1959 would occur if the Standard Model is right. The discovery, in effect, further cemented the validity of the Standard Model and ruled out one potential avenue to find ‘new physics’.
This event also recalls an interesting difference between the 2001 and 2021 announcements. The late British scientist Francis J.M. Farley wrote in 2001, after the Brookhaven result:
… the new muon (g-2) result from Brookhaven cannot at present be explained by the established theory. A more accurate measurement … should be available by the end of the year. Meanwhile theorists are looking for flaws in the argument and more measurements … are underway. If all this fails, supersymmetry can explain the data, but we would need other experiments to show that the postulated particles can exist in the real world, as well as in the evanescent quantum soup around the muon.
Since then, the LHC and other physics experiments have sent supersymmetry ‘to the hospital’ on more than one occasion. If the anomaly continues to hold up, scientists will have to find other explanations. Or, if the anomaly whimpers out, like so many others of our time, we’ll just have to put up with the Standard Model.
Featured image: A storage-ring magnet at Fermilab whose geometry allows for a very uniform magnetic field to be established in the ring. Credit: Glukicov/Wikimedia Commons, CC BY-SA 4.0.
—
For many years, one of the world’s most powerful scopes, as in a microscope, was the Hadron-Elektron Ring Anlage (HERA) particle accelerator in Germany. Where scopes bounce electromagnetic radiation – like visible light – off surfaces to reveal information hidden to the naked eye, accelerators reveal hidden information by bombarding the target with energetic particles. At HERA, those particles were electrons accelerated to 27.5 GeV. At this energy, the particles can probe a distance of a few hundredths of a femtometer (earlier called fermi) – 2.5 million times better than the 0.1 nanometers that atomic force microscopy can achieve (of course, they’re used for different purposes).
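A rough way to sanity-check those numbers is to treat the probing distance as the electrons’ de Broglie wavelength. The snippet below is only a back-of-envelope sketch under that assumption, not how HERA’s resolution was actually specified:

```python
# Back-of-envelope: de Broglie wavelength of a 27.5 GeV electron.
# For an ultra-relativistic particle, wavelength ~ h*c / E.
HC_MEV_FM = 1239.84   # Planck's constant times c, in MeV*femtometre
E_MEV = 27.5e3        # HERA's electron beam energy: 27.5 GeV in MeV

wavelength_fm = HC_MEV_FM / E_MEV   # ~0.045 fm, i.e. a few hundredths of a femtometre
afm_scale_fm = 0.1e-9 / 1e-15       # 0.1 nanometre expressed in femtometres (= 1e5 fm)

print(f"probe scale ~ {wavelength_fm:.3f} fm")
# ~2.2 million, the same ballpark as the 2.5 million quoted above
print(f"improvement over 0.1 nm ~ {afm_scale_fm / wavelength_fm:,.0f}x")
```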
The electrons were then collided head on against protons accelerated to 920 GeV.
Unlike protons, electrons aren’t made up of smaller particles and are considered elementary. Moreover, protons are approx. 2,000-times heavier than electrons. As a result, the high-energy collision is less a head-on smash-up and more an electron scattering off a proton: the electron imparts some energy to the proton before scattering away (imagined as the electron emitting some energy as a photon, which the proton then absorbs). This is called deep inelastic scattering: ‘deep’ because the probe is high-energy; ‘inelastic’ because the proton absorbs some energy.
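In the standard notation for deep inelastic scattering (sketched here only for context; it is the same quantity that appears later when this article mentions “the square of the momentum transferred between the particles”), the ‘depth’ of the probe is measured by the square of the four-momentum the electron hands to the proton through that photon:

Q^2 = -q^2, \qquad \text{resolved distance} \sim \frac{\hbar c}{\sqrt{Q^2}}

The larger Q² is, the smaller the structures inside the proton the collision can resolve.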
One of the most famous deep-inelastic scattering experiments was conducted in 1968 at the Stanford Linear Accelerator Center. Then, the perturbed protons were observed to ‘emit’ other particles – essentially hitherto undetected constituent particles that escaped their parent protons and formed other kinds of composite particles. The constituent particles were initially dubbed partons but later found to be quarks, anti-quarks (the matter/anti-matter particles) and gluons (the force-particles that hold the quarks/anti-quarks together).
HERA was shut down in June 2007. Five years later, plans for a successor at least 100-times more sensitive than HERA were presented – in the form of the Large Hadron-electron Collider (LHeC). As the name indicates, it is proposed to be built adjoining the Large Hadron Collider (LHC) complex at CERN by 2025 – a timeframe based on when the high-luminosity phase of the LHC is set to begin (2024).
On December 15, physicists working on the LHC announced new results obtained from the collider – two in particular stood out. One was a cause for great, yet possibly premature, excitement: a hint of an as-yet unknown particle weighing around 747 GeV. The other was cause for a bit of dismay: quantum chromodynamics (QCD), the theory that deals with the physics of quarks, anti-quarks and gluons, seemed flawless across a swath of energies. Some physicists were hoping it wouldn’t be so (because its flawlessness has come at the cost of being unable to explain some discoveries, like dark matter). Over the next decade, the LHC will push the energy frontier further to see – among other things – if QCD ‘breaks’, becoming unable to explain a possible new phenomenon.
Against this background, the LHeC is being pitched as the machine that could be dedicated to examining this breakpoint, and some others like it, in more detail than the LHC is equipped to. One helpful factor is that when electrons are one of the kinds of particles participating in a collision, physicists don’t have to worry about how the energy will be distributed among constituent particles, since electrons don’t have any. Hadron collisions, on the other hand, have to deal with quarks, anti-quarks and gluons, and are tougher to analyse.
So, to accomplish this, the team behind the LHeC is considering installing a pill-shaped machine called the energy recovery linac (ERL), straddling the LHC ring, to produce a beam of electrons that would then take on the accelerated protons from the main LHC ring – making up the ‘linac-ring LHeC’ design. An earlier suggestion, to install the LHeC as a ring for accelerating electrons along the LHC ring, was rejected because it would hamper experiments during construction. The electrons will be accelerated to 60 GeV while the protons, to 7,000 GeV. The total wall-plug power to the ERL is being capped at 100 MW.
The ERL has a slightly different acceleration mechanism from the LHC’s, and doesn’t simply accelerate particles continuously around a ring. First, the electrons are accelerated through a radio-frequency (RF) field in a linear accelerator (linac – the straight section of the ERL) and then fed into a circular channel, crisscrossed by magnetic fields, curving into the rear end of the linac. The length of the circular channel is such that by the time the electrons have travelled along it, they arrive 180º out of phase with the RF field – that is, the field that pushed them forward on their first pass now pushes against them. So when the out-of-phase electrons re-enter the linac, they decelerate. Their kinetic energy is lost to the RF field, which intensifies and so provides a bigger kick to the new batch of particles being injected into the linac at just that moment. This way, the linac recovers the kinetic energy from each circulation.
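The energy bookkeeping can be sketched with the textbook expression for the energy a charged particle gains from a radio-frequency cavity (a simplification, not the LHeC’s actual design maths):

\Delta E = qV_0 \cos\varphi

Here V₀ is the cavity voltage and φ the phase of the field when the particle arrives. Electrons injected near φ ≈ 0 gain energy; the same electrons returning 180º later see cos(φ + 180º) = −cos φ, give that energy back to the field, and the field passes it on to the next batch.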
Such a mechanism is employed in the first place because the amount of energy lost in a form called synchrotron radiation – when a particle is steered around a curve by bending magnetic fields – increases drastically as the particle’s mass gets lower.
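The scale of the problem follows from the textbook scaling of synchrotron radiation: the energy a particle loses per turn in a ring of bending radius ρ goes as

\Delta E_{\text{per turn}} \propto \frac{E^4}{m^4\,\rho}

so at the same beam energy an electron, roughly 2,000 times lighter than a proton, radiates on the order of 2,000⁴ – more than ten trillion – times as much. That is why simply whirling electrons around the existing LHC ring is not an attractive option.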
Keeping in mind the need to explore new areas of physics – especially those associated with leptons (elementary particles of which electrons are a kind) and quarks/gluons (described by QCD) – the energy of the electrons coming out of the ERL is currently planned to be 60 GeV. They will be collided with accelerated protons by positioning the ERL tangential to the LHC ring. And at the moment of the collision, CERN’s scientists hope that they will be able to use the LHeC to study:
Predicted unification of the electromagnetic and weak forces (into an electroweak force): The electromagnetic force of nature is mediated by the particles called photons while the weak force, by particles called W and Z bosons. Whether the scientists will observe the unification of these forces, as some theories predict, is dependent on the quality of electron-proton collisions. Specifically, if the square of the momentum transferred between the particles can reach up to 8-9 TeV, the collider will have created an environment in which physicists will be able to probe for signs of an electroweak force at play.
Gluon saturation: To quote from an interview given by theoretical physicist Raju Venugopalan in January 2013: “We all know the nuclei are made of protons and neutrons, and those are each made of quarks and gluons. There were hints in data from the HERA collider in Germany and other experiments that the number of gluons increases dramatically as you accelerate particles to high energy. Nuclear physics theorists predicted that the ions accelerated to near the speed of light at the [Relativistic Heavy Ion Collider] would reach an upper limit of gluon concentration – a state of gluon saturation we call colour glass condensate.”
Higgs bosons: On July 4, 2012, Fabiola Gianotti, soon to be the next DG of CERN but then the spokesperson of the ATLAS experiment at the LHC, declared that physicists had found a Higgs boson. Widespread celebrations followed – while a technical nitpick remained: physicists only knew the particle resembled a Higgs boson and might not have been the real thing itself. Then, in March 2013, the particle was most likely identified as being a Higgs boson. And even then, one box remained to be checked: that it was the Higgs boson, not one of many kinds. For that, physicists have been waiting for more results from the upgraded LHC. But a machine like the LHeC would be able to produce a “few thousand” Higgs bosons a year, enabling physicists to study the elusive particle in more detail, confirm more of its properties – or, more excitingly, find that that’s not the case – and look for higher-energy versions of it.
A 2012 paper detailing the concept also notes that should the LHC find that signs of ‘new physics’ could exist beyond the default energy levels of the LHeC, scientists are bearing in mind the need for the electrons to be accelerated by the ERL to up to 140 GeV.
The unique opportunity presented by an electron-proton collider working in tandem with the LHC goes beyond the mammoth energies to a property called luminosity as well. It’s measured in inverse femtobarn per second, denoting the number of events occurring per 10⁻³⁹ sq. cm per second. For example, 10 fb⁻¹ denotes 10 events occurring per 10⁻³⁹ sq. cm per second – that’s 10⁴⁰ events per sq. cm per second. (The luminosity over a specific period of time, i.e. without the ‘per second’ in the unit, is called the integrated luminosity.) At the LHeC, a luminosity of 10³³ cm⁻² s⁻¹ is expected to be achieved, and physicists hope that with some tweaks it can be hiked by yet another order of magnitude. To compare: this is 100x what HERA achieved, providing an unprecedented scale at which to explore the effects of deep inelastic scattering, and 10x the LHC’s current luminosity.
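Since the exponents fly by quickly, here is a small sketch of the unit conversion behind those figures (1 barn = 10⁻²⁴ sq. cm is a standard definition; the other numbers are just the ones quoted above):

```python
# 1 barn = 1e-24 cm^2, so 1 femtobarn (fb) = 1e-39 cm^2.
FEMTOBARN_CM2 = 1e-39

# The example in the text: a luminosity of 10 fb^-1 per second.
lumi_fb_per_s = 10
lumi_cm2_per_s = lumi_fb_per_s / FEMTOBARN_CM2   # = 1e40 per sq. cm per second

# The LHeC's target luminosity quoted in the text, for comparison.
lhec_target = 1e33                               # per sq. cm per second

print(f"10 fb^-1/s  = {lumi_cm2_per_s:.0e} cm^-2 s^-1")
print(f"LHeC target = {lhec_target:.0e} cm^-2 s^-1")
```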
It’s also about 100x lower than that of the HL-LHC, which is the configuration of the LHC with which the ERL will be operating to make up the LHeC. And the LHeC’s lifetime will be the planned lifetime of the LHC, till the 2030s – about a decade. In the same period, if all goes well, a Chinese behemoth will have taken shape: the Circular Electron-Positron Collider (CEPC), with a circumference 2x that of the LHC. In its proton-proton collision configuration – paralleling the LHC’s – China claims it will reach energies of 70,000 GeV (as against the LHC’s current 14,000 GeV) and a luminosity comparable to the HL-LHC’s. And when its electron-positron collision configuration, which the LHeC will be able to mimic, is at its best, physicists reckon the CEPC will be able to produce 100,000 Higgs bosons a year.
As it happens, some groups at CERN are already drawing up plans, due to be presented in 2018, for a machine dwarfing even the CEPC. Meet the Future Circular Collider (FCC), by one account the “ultimate precision-physics machine” (and funnily named by another). To be fair, the FCC has been under consideration since about 2013 and independently of the CEPC. However, in sheer size, the FCC could swallow the CEPC – with an 80-100 km-long ring. It will also be able to accelerate protons to 50,000 GeV (by 2040), attain luminosities of 10³⁵ cm⁻² s⁻¹, continue to work with the ERL, function as an electron-positron collider, and look for particles weighing up to 25,000 GeV (currently the heaviest known fundamental particle is the top quark, weighing 169-173 GeV).
And should it be approved and come online in the second half of the 2030s, there’s a good chance the world will be a different place, too: besides the CEPC, there will be (or will have been?) either the International Linear Collider (ILC) or the Compact Linear Collider (CLIC) as well. ‘Either’ because they’re both linear accelerators with similar physical dimensions, both planning to collide electrons with positrons, their antiparticles, to study QCD, the Higgs field and the prospects of higher dimensions – so only one of them might get built. Each will require about a decade to be built, coming online in the late 2020s. The biggest difference between them is that the ILC will be able to reach collision energies of 1,000 GeV while the CLIC (whose idea was conceived at CERN) will reach 3,000 GeV.
This article, as written by me, appeared in print in The Hindu on July 5, 2012.
—
The ATLAS (A Toroidal LHC Apparatus) collaboration at CERN has announced the sighting of a Higgs boson-like particle in the energy window of 125.3 ± 0.6 GeV. The observation has been made with a statistical significance of 5 sigma. This means the chance that the signal is a statistical fluke is about 1 in 3.5 million – sufficient to claim a discovery and publish papers detailing the efforts in the hunt.
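The ‘1 in 3.5 million’ figure is simply the tail probability of a normal distribution at 5 standard deviations; a quick sketch of that arithmetic (using the one-sided convention that yields the commonly quoted number):

```python
import math

def one_sided_p(sigma):
    """One-sided tail probability of a normal distribution at `sigma` standard deviations."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

p = one_sided_p(5.0)
print(f"5 sigma -> p = {p:.2e}, i.e. about 1 in {1/p:,.0f}")  # ~1 in 3.5 million
```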
Rolf-Dieter Heuer, Director General of CERN since 2009, said at the special conference called by CERN in Geneva, “It was a global effort, it is a global effort. It is a global success.” He expressed great optimism and concluded the conference saying this was “only the beginning.”
With this result, collaborations at the Large Hadron Collider (LHC), the atom-smashing machine, have vastly improved on their previous announcement on December 13, 2011, where the chance of a fluke was 1-in-50 for similar sightings.
Another collaboration, called CMS (Compact Muon Solenoid), announced the mass of the Higgs-like particle with a 4.9-sigma result. While insufficient to claim a discovery, it indicates only a roughly one-in-two-million chance that the signal is a fluke.
Joe Incandela, CMS spokesman, added, “We’re reaching into the fabric of the universe at a level we’ve never done before.”
The LHC will continue to run its experiments so that the results revealed on Wednesday can be revalidated before it shuts down at the end of the year for maintenance. Even so, scientists such as Dr. Rahul Sinha, a participant of the Belle collaboration in Japan, are confident that a conclusive result will be out by 2013.
“The LHC has the highest beam energy in the world now. The experiment was designed to yield quick results. With its high luminosity, it quickly narrowed down the energy-ranges. I’m sure that by the end of the year, we will have a definite word on the Higgs boson’s properties,” he said.
However, even though the Standard Model, the framework of all fundamental particles and the dominant explanatory model in physics today, predicted the particle’s existence, slight deviations have been observed in terms of the particle’s predicted mass. Even more: zeroing in on the mass of the Higgs-like particle doesn’t mean the model is complete when, in fact, it is far from it.
While an answer to the question of mass formation took 50 years to be reached, physicists are yet to understand many phenomena. For instance, why aren’t the four fundamental forces of nature equally strong?
The strong nuclear, weak nuclear, electromagnetic and gravitational forces were born in the first few moments succeeding the Big Bang 13.75 billion years ago. Of these, the weak force is, for some reason, almost 1 billion, trillion, trillion times stronger than the gravitational force! Called the hierarchy problem, this disparity evades a Standard Model explanation.
In response, many theories were proposed. One, called supersymmetry (SUSY), proposed that all fermions, which are particles with half-integer spin, were paired with a corresponding boson, or particles with integer spin. Particle spin is the term quantum mechanics attributes to the particle’s rotation around an axis.
Technicolor was the second framework. It rejects the Higgs mechanism, a process through which the Higgs boson couples stronger with some particles and weaker with others, making them heavier and lighter, respectively.
Instead, it proposes a new form of interaction with initially-massless fermions. The short-lived particles required to certify this framework are accessible at the LHC. Now, with a Higgs-like particle having been spotted with a significant confidence level, the future of Technicolor seems uncertain.
However, “significant constraints” have been imposed on the validity of these and such theories, labeled New Physics, according to Prof. M.V.N. Murthy of the Institute of Mathematical Sciences (IMS), whose current research focuses on high-energy physics.
Some other important questions include why there is more matter than antimatter in this universe, why fundamental particles manifest in three generations and not more or fewer, and the masses of the weakly-interacting neutrinos. State-of-the-art technology worldwide has helped physicists design experiments to study each of these problems better.
For example, the India-based Neutrino Observatory (INO), under construction in Theni, will house the world’s largest static particle detector to study atmospheric neutrinos. Equipped with its giant iron-calorimeter (ICAL) detector, physicists aim to discover which neutrinos are heavier and which lighter.
The LHC currently operates at the Energy Frontier, with high energy being the defining constraint on experiments. Two other frontiers, Intensity and Cosmic, are also seeing progress. Project X, a proposed proton accelerator at Fermilab near Chicago, Illinois, will push the boundaries of the Intensity Frontier by trying to look for ultra-rare processes. On the Cosmic Frontier, dark matter holds the greatest focus.
—
As the much-anticipated lead-up to the CERN announcement on Wednesday unfolds, the scientific community is rife with speculation and rumour. In spite of this deluge, it appears we can expect a confirmation of the God particle’s existence at the seminar called by physicists working on the Large Hadron Collider (LHC).
The most prominent indication of good news is that five of the six physicists who theorized the Higgs mechanism in a set of seminal papers in 1964 have been invited to the meeting. The sixth physicist, Robert Brout, passed away in May 2011. Peter Higgs, the man for whom the mass-giving particle is named, has also agreed to attend.
The other indication is much more subtle but just as effective. Dr. Rahul Sinha, a professor of high-energy physics and a participant in the Japanese Belle collaboration, said, “Hints of the Higgs boson have already been spotted in the energy range in which LHC is looking. If it has to be ruled out, four-times as much statistical data should have been gathered to back it up, but this has not been done.”
The energy window which the LHC has been combing through was based on previous searches for the particle at the detector during 2010 and at the Fermilab’s Tevatron before that. While the CERN-based machine is looking for signs of two-photon decay of the notoriously unstable boson, the American legend looked for signs of the boson’s decay into two bottom quarks.
Last year, on December 13, CERN announced in a press conference that the particle had been glimpsed in the vicinity of 127 GeV (GeV, or giga-electron-volt, is used as a measure of particle energy and, by extension of the mass-energy equivalence, its mass).
However, scientists working on the ATLAS detector, which is heading the search, could establish only a statistical significance of 2.3 sigma then – roughly a 1-in-50 chance that the signal was a fluke. To claim a discovery, a 5-sigma result is required, where the chance of a fluke is about one in 3.5 million.
Scientists, including Dr. Sinha and his colleagues, are hoping for a 4-sigma result announcement on Wednesday. If they get it, the foundation stone will have been set for physicists to explore further into the nature of fundamental particles.
Dr. M.V.N. Murthy, who is currently conducting research in high-energy physics at the Institute of Mathematical Sciences (IMS), said, “Knowing the mass of the Higgs boson is the final step in cementing the Standard Model.” The model is a framework of all the fundamental particles and dictates their behaviour. “Once we know the mass of the particle, we can move on and explore the nature of New Physics. It is just around the corner,” he added.