A gear-train for particle physics

Clockwork theory has been revived and reformulated by scientists from CERN to solve a difficult problem at the heart of particle physics.

The idea of ‘grand unification’ has come under scrutiny from prominent physicists and thinkers at various times, but it’s not hard to see why it seemed plausible to so many when it was first set out. It was first seriously considered about four decades ago, in 1974, shortly after physicists had realised that two of the four fundamental forces of nature were in fact a single unified force if you ramped up the energy at which it acted (electromagnetic + weak = electroweak). The thought that followed was simply logical: what if, at some extremely high energy, like that of the Big Bang, all four forces unified into one?

There has been no direct evidence of such grand unification yet. Physicists don’t know how the electroweak force will unify with the strong nuclear force – let alone with gravity, a problem whose attempted solution birthed one of the most powerful mathematical tools in physics. Nonetheless, they think they know the energy at which such grand unification should occur, if it does: the Planck scale, around 10¹⁹ GeV. This is about as much energy as is contained in a few litres of petrol, but it’s stupefyingly large when you have to accommodate all of it in a particle that’s 10⁻¹⁵ metres wide.

This is where particle accelerators come in. The most powerful of them, the Large Hadron Collider (LHC), uses powerful electric and magnetic fields to accelerate protons to close to light-speed, when their energy approaches about 7,000 GeV. But the Planck energy is still higher by a factor of about a million billion – some 15 orders of magnitude – which means it’s not something we might ever be able to attain on Earth. Nonetheless, physicists’ theories suggest that that’s where all of our physical laws should be created, where the commandments by which everything exists should be written.
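
The size of that gap is quick to check with back-of-the-envelope arithmetic; a minimal sketch in Python (the two energy figures are the approximate values quoted above):

```python
import math

# Approximate energies from the text (in GeV)
planck_energy_gev = 1e19  # the Planck scale
lhc_energy_gev = 7e3      # protons at the LHC

ratio = planck_energy_gev / lhc_energy_gev
print(f"Planck/LHC energy ratio: {ratio:.0e}")
print(f"orders of magnitude: {round(math.log10(ratio))}")  # 15
```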

… Or is it?

There are many outstanding problems in particle physics, and physicists are desperate for a solution. They have to find something wrong with what they’ve already done, discover something new, or find a way to reinterpret what they already know. The clockwork theory is of the third kind – and its reinterpretation begins by asking physicists to dump the idea that new physics is born only at the Planck scale. So, for example, it suggests that the effects of quantum gravity (a quantum-mechanical description of gravity) needn’t become apparent only at the Planck scale but could show up at lower energies. But even if it then goes on to solve some problems, the theory threatens to present a new one. Consider: if it’s true that new physics isn’t born at the highest energy possible, then wouldn’t the choice of any energy lower than that just be arbitrary? And if nothing else, nature is not arbitrary.

To its credit, clockwork sidesteps this issue by simply not trying to find ‘special’ energies at which ‘important’ things happen. Its basic premise is that the forces of nature are like a set of interlocking gears moving against each other, transmitting energy – or rather potential – from one wheel to the next, magnifying or diminishing the way fundamental particles behave in different contexts. Its supporters at CERN and elsewhere think it can be used to explain some annoying gaps between theory and experiment in particle physics, particularly the naturalness problem.

Before the Higgs boson was discovered, physicists had predicted, based on the properties of other particles and forces, that its mass would be very high. But when the boson’s discovery was confirmed at CERN in January 2013, its mass implied that the universe would have to be “the size of a football” – which is clearly not the case. So why is the Higgs boson’s mass so low, so unnaturally low? Scientists have put forward many new theories that try to solve this problem, but their solutions often require the existence of other, hitherto undiscovered particles.

Clockwork’s solution is a way in which the Higgs boson’s interaction with gravity – or rather with gravity’s associated energy – is mediated by a chain of effects, described in quantum field theory, that tamp down the boson’s mass. In technical parlance, the boson’s mass becomes ‘screened’. An explanation that is both physical and accurate is hard to draw up because of the various abstractions involved. So, as University of Bruxelles physicist Daniele Teresi suggests, imagine this series: Χ = 0.5 × 0.5 × 0.5 × 0.5 × … × 0.5. Even though each step reduces Χ’s value by only a half, Χ is already an eighth after three steps; after four, a sixteenth. The effect quickly becomes drastic because it’s exponential.
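
Teresi’s toy series is easy to play with numerically; a minimal sketch (the step factor of 0.5 is from the analogy above, while the 40-step figure is just an illustration of how fast the screening compounds):

```python
# Each of n interlocked 'gears' multiplies the quantity by 0.5,
# so the net factor after n steps is 0.5**n -- exponential suppression.
def suppression(n: int) -> float:
    return 0.5 ** n

print(suppression(3))   # 0.125 (an eighth)
print(suppression(4))   # 0.0625 (a sixteenth)
print(suppression(40))  # ~9.1e-13: forty steps screen by twelve orders of magnitude
```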

And the theory provides a mathematical toolbox that allows all this to be achieved without the addition of new particles. This makes clockwork more elegant than another theory that seeks to solve the naturalness problem: supersymmetry, SUSY for short. Physicists also like SUSY because it allows for a large energy hierarchy: a distribution of particles and processes at energies between electroweak unification and grand unification, instead of leaving the region bizarrely devoid of action the way the Standard Model does. But then SUSY predicts the existence of 17 new particles, none of which have been detected yet.

What’s more, as Matthew McCullough, one of clockwork’s developers, showed at an ongoing conference in Italy, its solutions for a stationary particle in four dimensions exhibit conceptual similarities to Maxwell’s equations for an electromagnetic wave in a conductor. The existence of such analogues is reassuring because it recalls nature’s tendency to be guided by common principles in diverse contexts.

This isn’t to say clockwork theory is the final answer. As physicist Ben Allanach has written, it is a “new toy” that physicists are still playing with to solve different problems. But in the event that it has an answer to the naturalness problem – as well as to questions like why dark matter doesn’t decay – it is notable. Still, is it enough to say that clockwork theory mops up the maths cleanly in a bunch of problems? How do we make sure that this is how nature actually works?

McCullough thinks there’s one way to find out, using the LHC. Very simplistically: clockwork theory induces fluctuations in the probabilities with which pairs of high-energy photons are created at some energies at the LHC. These should be visible as wavy squiggles in a plot with energy on the x-axis and events on the y-axis. If these plots can be obtained and analysed, and the results agree with clockwork’s predictions, then we will have confirmed what McCullough calls an “irreducible prediction of clockwork gravity” – the crucial test of using the theory to solve the naturalness problem.

To recap: no free parameters (i.e. no new particles), conceptual elegance and familiarity, and finally a concrete and unique prediction. No wonder Allanach thinks clockwork theory inhabits fertile ground. On the other hand, SUSY’s prospects have been bleak since at least 2013 (if not earlier) – even though it is one of the more favoured theories among physicists to explain physics beyond the Standard Model, physics we haven’t observed yet but generally believe exists. At the same time, and it bears reiterating, clockwork theory will also have to face down a host of challenges before it can be declared a definitive success. Tick tock, tick tock, tick tock…

Some notes and updates

Four years of the Higgs boson, live-tweeting and timezones, new music, and quickly reviewing an Erikson book.

Four years of the Higgs boson

Missed this, didn’t I. On July 4, 2012, physicists at CERN announced that the Large Hadron Collider had found a Higgs-boson-like particle. Though the confirmation that it was the Higgs boson, and not some other particle, came only in January 2013, July 4 is the celebrated date. I don’t exactly mark the occasion every year except to recap whatever’s been happening in particle physics. And this year: everyone’s still looking for supersymmetry; there was widespread excitement about a possible new fundamental particle weighing about 750 GeV when data-taking began at the LHC in late May, but strong rumours from within CERN have it that such a particle probably doesn’t exist (i.e. it’s vanishing in the new data-sets). Pity. The favoured way to anticipate what might come well before the final announcements are made in August is to keep an eye out for conference announcements in mid-July: if they’re made, it’s a strong giveaway that something’s been found.

Live-tweeting and timezones

I’ve a shitty internet connection at home in Delhi, which meant I couldn’t see the live-stream NASA put out of its control room as Juno executed its orbital insertion manoeuvre this morning. Fortunately, Twitter came to the rescue; NASA’s social media team had done such a great job of hyping up the insertion (deservingly so) that it seemed as if all the 480 accounts I followed were tweeting about it. I don’t believe I missed anything at all, except perhaps the sounds of applause. Twitter’s awesome that way, and I’ll say that even if it means I’m stating the obvious. One thing did strike me: all times (of the various events in the timeline) were published in UTC and EDT. This makes sense because converting from UTC to a local timezone is easy (IST = UTC + 5:30), while EDT corresponds to the US east coast. However, the fact that IST is UTC + 5:30 isn’t immediately apparent to everyone (at least it wasn’t to me), and every so often I wish an account tweeting from India, such as a news agency’s, used IST. I do the conversion every time.
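
The UTC-to-IST arithmetic is mechanical enough to automate; a small sketch (the event time here is made up, and `zoneinfo` assumes Python 3.9+ with timezone data available):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# A hypothetical event time, published in UTC as on a mission timeline
utc_time = datetime(2016, 7, 5, 2, 18, tzinfo=timezone.utc)

# IST = UTC + 5:30, year-round (India observes no daylight saving)
ist_time = utc_time.astimezone(ZoneInfo("Asia/Kolkata"))
print(ist_time.strftime("%H:%M IST"))  # 07:48 IST
```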

New music

I don’t know why I hadn’t found Yat-kha earlier, considering I listen to Huun Huur Tu so much and Yat-kha is almost always among the recommendations (both bands specialise in throat-singing). And while Huun Huur Tu likes to keep their music traditional and true to its original compositional style, Yat-kha takes it a step further, blending its sound with rock – and that tastes much better to me. With a voice like Albert Kuvezin’s, keeping things traditional can be a little disappointing – you can hear why in the song above. It’s called Kaa-khem; the same song by Huun Huur Tu is called Mezhegei. Bass evokes megalomania in me, and it’s all the more sensual when it is rendered by a human voice, rising and falling. Another example of what I’m talking about is a song called Yenisei punk. Finally, this is where I’d suggest you stop if you’re looking for throat-singing made to sound more belligerent: I stumbled upon War horse by Tengger Cavalry, classified as nomadic folk metal. It’s terrible.

Fall of Light, a part 2

In fantasy trilogies, the first part benefits from establishing the premise and the third, from the denouement. If the second part has to benefit from anything at all, then it is the story itself, not the intensity of the stakes within its narrative. At least, that’s my takeaway from Fall of Light, the second book of Steven Erikson’s Kharkanas trilogy. Its predecessor, Forge of Darkness, established the kingdom of Kurald Galain and the various forces that shape its peoples and policies. Because the trilogy has been described as being a prequel (note: not the prequel) to Erikson’s epic Malazan Book of the Fallen series, and because of what we know about Kurald Galain in the series, the last book of the trilogy has its work cut out for it. But in the meantime, Fall of Light was an unexpectedly monotonous affair – and that was awesome. As a friend of mine has been wont to describe the Malazan series: Erikson is a master of raising the stakes. He does that in all of his books (including the Korbal Broach short-stories) and he does it really well. However, Fall of Light rode with the stakes as they were laid down at the end of the first book, through a plot that maintained the tension at all times. It’s neither eager to shed its burden nor is it eager to take on new ones. If you’ve read the Malazan series, I’d say he’s written another Deadhouse Gates, but better.

Oh, and this completes one of my bigger goals for 2016.

Money for science

Spending money on science has been tied to evaluating the value of spin-offs, assessing the link between technological advancement and GDP, and dissecting the metrics of productivity – but the debate never settles, no matter how convincingly it seems to be resolved each time.

For a piece titled The Telescope of the 2030s, Dennis Overbye writes in The New York Times,

I used to think $10 billion was a lot of money before TARP, the Troubled Asset Relief Program, the $700 billion bailout that saved the banks in 2008 and apparently has brought happy days back to Wall Street. Compared with this, the science budget is chump change, lunch money at a place like Goldman Sachs. But if you think this is not a bargain, you need look only as far as your pocket. Companies like Google and Apple have leveraged modest investments in computer science in the 1960s into trillions of dollars of economic activity. Not even Arthur C. Clarke, the vaunted author and space-age prophet, saw that coming.

Which is to say that all that NASA money — whether for planetary probes or space station trips — is spent on Earth, on things that we like to say we want more of: high technology, education, a more skilled work force, jobs, pride in American and human innovation, not to mention greater cosmic awareness, a dose of perspective on our situation here among the stars.

And this is a letter from Todd Huffman, a particle physicist at Oxford, to The Guardian:

Simon Jenkins parrots a cry that I have heard a few times during my career as a research scientist in high-energy physics (Pluto trumps prisons when we spend public money, 17 July). He is unimaginatively concerned that the £34m a year spent by the UK at Cern (and a similar amount per year would have been spent on the New Horizons probe to Pluto) is not actually money well spent.

Yet I read his article online using the world wide web, which was developed initially by and for particle physicists. I did this using devices with integrated circuits partly perfected for the aerospace industry. The web caused the longest non-wartime economic boom in recorded history, during the 90s. The industries spawned by integrated circuits are simply too numerous to count and would have been impossible to predict when that first transistor was made in the 50s. It is a failure of society that funnels such economic largesse towards hedge-fund managers and not towards solving the social ills Mr Jenkins rightly exposes.

Conflict of interest? Not really. Science is being cornered from all sides, and if anyone is going to defend its practice, it’s going to be scientists. But we’re often too ready to confuse participation for investment – and, at the first hint of any allegation of conflict, we don’t wait to verify matters for ourselves.

I’m sure Yuri Milner’s investment of $100 million today to help the search for extra-terrestrial intelligence will be questioned, too, despite Stephen Hawking’s moving endorsement of it:

Somewhere in the cosmos, perhaps, intelligent life may be watching these lights of ours, aware of what they mean. Or do our lights wander a lifeless cosmos — unseen beacons, announcing that here, on one rock, the Universe discovered its existence. Either way, there is no bigger question. It’s time to commit to finding the answer – to search for life beyond Earth. We are alive. We are intelligent. We must know.

Pursuits like exploring the natural world around us are, I think, what we’re meant to do as humans, what we must do when we can, and what we must ultimately aspire to.

The Large Hadron Collider is back online, ready to shift from the “what” of reality to “why”

The world’s single largest science experiment will restart on March 23 after a two-year break. Scientists and administrators at the European Organization for Nuclear Research – known by its French acronym CERN – have announced the status of the agency’s upgrades on its Large Hadron Collider (LHC) and its readiness for a new phase of experiments running from now until 2018.

Before the experiment was shut down in late 2013, the LHC became famous for helping discover the elusive Higgs boson, a fundamental (that is, indivisible) particle that gives other fundamental particles their mass through a complicated mechanism. The find earned two of the physicists who had thought up the mechanism in 1964, Peter Higgs and Francois Englert, the Nobel Prize for physics in 2013.

Though the LHC had fulfilled one of its more significant goals by finding the Higgs boson, its purpose is far from complete. In its new avatar, the machine boasts of the energy and technical agility necessary to answer questions that current theories of physics are struggling to make sense of.

As Alice Bean, a particle physicist who has worked with the LHC, said, “A whole new energy region will be waiting for us to discover something.”

The finding of the Higgs boson laid to rest speculations of whether such a particle existed and what its properties could be, and validated the currently reigning set of theories that describe how various fundamental particles interact. This is called the Standard Model, and it has been successful in predicting the dynamics of those interactions.

From the what to the why

But having assimilated all this knowledge, what physicists don’t know, but desperately want to, is why those particles’ properties have the values they do. They have realized the implications are numerous and profound: from the possible existence of more fundamental particles we are yet to encounter to the nature of dark matter, the substance that makes up a great proportion of the matter in the universe even though we know next to nothing about it. These questions were first conceived to plug gaps in the Standard Model, but the gaps have only been widening since.

With an experiment now able to better test theories, physicists have started investigating these gaps. For the LHC, the implication is that in its second edition it will not be looking for something as much as helping scientists decide where to look to start with.

As Tara Shears, a particle physicist at the University of Liverpool, told Nature, “In the first run we had a very strong theoretical steer to look for the Higgs boson. This time we don’t have any signposts that are quite so clear.”

Higher energy, luminosity

The upgrades to the LHC that would unlock new experimental possibilities were evident in early 2012.

The machine works by using powerful electric currents and magnetic fields to accelerate two trains, or beams, of protons in opposite directions, within a ring 27 km long, to almost the speed of light – and then colliding them head-on. The result is a particulate fireworks of such high energy that the rarest, most short-lived particles are brought into existence before they promptly decay into lighter, more common particles. Particle detectors straddling the LHC at four points on the ring record these collisions and their effects for study.

So, to boost its performance, upgrades to the LHC were of two kinds: increasing the collision energy inside the ring and increasing the detectors’ abilities to track more numerous and more powerful collisions.

The collision energy has been nearly doubled in its second life, from 7-8 TeV to 13-14 TeV. The frequency of collisions has also been doubled from one set every 50 nanoseconds (billionth of a second) to one every 25 nanoseconds. Steve Myers, CERN’s director for accelerators and technology, had said in December 2012, “More intense beams mean more collisions and a better chance of observing rare phenomena.”

The detectors have received new sensors, neutron shields to protect from radiation damage, cooling systems and superconducting cables. An improved fail-safe system has also been installed to forestall accidents like the one in 2008, when failing to cool a magnet led to a shut-down for eight months.

In all, the upgrades cost approximately $149 million, and will increase CERN’s electricity bill by 20% to $65 million. A “massive debugging exercise” was conducted last week to ensure all of it clicked together.

Going ahead, these new specifications will be leveraged to tackle some of the more outstanding issues in fundamental physics.

CERN listed a few, presumably primary, focus areas. They include investigating whether the Higgs boson could betray the existence of undiscovered particles, what particles dark matter could be made of, why the universe today has much more matter than antimatter, and whether gravity is so much weaker than the other forces because it is leaking into other dimensions.

Stride forward in three frontiers

Physicists are also hopeful for the prospects of discovering a class of particles called supersymmetric partners. The theory that predicts their existence is called supersymmetry. It builds on some of the conclusions of the Standard Model, and offers predictions that plug its holes as well with such mathematical elegance that it has many of the world’s leading physicists enamored. These predictions involve the existence of new particles called partners.

In a neat infographic in Nature, Elizabeth Gibney explains that the partner easiest to detect will be the ‘stop squark’, as it is the lightest and can show itself in lower-energy collisions.

In all, the LHC’s new avatar marks a big stride forward not just in the energy frontier but also in the intensity and cosmic frontiers. With its ability to produce and track more collisions per second as well as chart the least explored territories of the ancient cosmos, it’d be foolish to think this gigantic machine’s domain is confined to particle physics and couldn’t extend to fuel cells, medical diagnostics or achieving systems-reliability in IT.

Here’s a fitting video released by CERN to mark this momentous occasion in the history of high-energy physics.

Featured image: A view of the LHC. Credit: CERN

Update: After engineers spotted a short-circuit glitch in a cooled part of the LHC on March 21, its restart was postponed from March 23 by a few weeks. However, CERN has assured that it’s a fully understood problem and that it won’t detract from the experiment’s goals for the year.

Fabiola Gianotti, the first woman Director-General of CERN

The CERN Council has elected a new Director-General to succeed the incumbent Rolf-Dieter Heuer. Fabiola Gianotti, who served as the ATLAS collaboration’s spokesperson from 2009 to 2013 – a period that included the discovery of the long-sought Higgs boson by the ATLAS and CMS experiments – will be the first woman to hold the position. Her mandate begins from January 2016.

A CERN press release announcing the appointment said the “Council converged rapidly in favor of Dr. Gianotti”, implying it was a quick and unanimous decision.

The Large Hadron Collider (LHC), the mammoth particle smasher that produces the collisions that ATLAS, CMS and two other similar collaborations study, is set to restart in January 2015 after a series of upgrades to increase its energy and luminosity. And so Dr. Gianotti’s term will coincide with a distinct phase of science, this one eager for evidence to help answer deeper questions in particle physics – such as the Higgs boson’s mass, the strong force’s strength and dark matter.

Dr. Gianotti will succeed 15 men who, as Directors-General, have been responsible not simply for coordinating the scientific efforts stemming from CERN but also for guiding research priorities and practices. They have effectively set the agendas that the world’s preeminent nuclear physics lab has pursued since its establishment in 1954.

In fact, the title of ‘spokesperson’, which Dr. Gianotti held for the ATLAS collaboration for four years until 2013, is itself deceptively uncomplicated. The spokesperson not only speaks for the collaboration but is also the effective project manager who plays an important role when decisions are made about what measurements to focus on and what questions to answer. When on July 4, 2012, the discovery of a Higgs-boson-like particle was announced, results from the ATLAS particle-detector – and therefore Dr. Gianotti’s affable leadership – were instrumental in getting that far, and in getting Peter Higgs and Francois Englert their 2013 Nobel Prize in physics.

Earlier this year, she had likened her job to “a great scientific adventure”, but “also a great human adventure”, in an interview to CNN. To guide the aspirations and creativity of 3,000 engineers and physicists without attenuation1 of productivity or will must indeed have been so.

That she will be the first woman to become the DG of CERN can’t escape attention either, especially at a time when women’s participation in STEM research seems to be on the decline and sexism in science is being recognized as a prevalent issue. Dr. Gianotti will no doubt make a strong role model for a field that is only 25% women. There will also be much to learn from her past, from the time she chose to become a physicist after learning about Albert Einstein’s use of quantum ideas to explain the photoelectric effect. She joined CERN while working toward her PhD from the University of Milan. She was 25, it was 1987, and the W and Z bosons had been discovered at the facility’s UA1 and UA2 collaborations a few years earlier. Dr. Gianotti would join the latter.

It was an exciting time to be a physicist as well as exacting. Planning for the LHC would begin in that decade and launch one of the world’s largest scientific collaborations with it. The success of a scientist would start to demand not just research excellence but also a flair for public relations, bureaucratic diplomacy and the acuity necessary to manage public funds in the billions from different countries. Dr. Gianotti would go on to wear all these hats even as she started work in calorimetry at the LHC in 1990, on the ATLAS detector in 1992, and on the search for supersymmetric (‘new physics’) particles in 1996.

Her admiration for the humanities has been known to play its part in shaping her thoughts about the universe at its most granular. She has a professional music diploma from the Milan Conservatory and often unwinds at the end of a long day with a session on the piano. Her fifth-floor home in Geneva sometimes affords her a view of Mont Blanc, and she often enjoys long walks in the mountains. In the same interview, given to Financial Times in 2013, she adds,

There are many links between physics and art. For me, physics and nature have very nice foundations from an aesthetic point of view, and at the same time art is based on physics and mathematical principle. If you build a nice building, you have to build it with some criteria because otherwise it collapses.2

Her success in leading the ATLAS collaboration, and in becoming the veritable face of the hunt for the Higgs boson, has catapulted her to being the next DG of CERN. At the same time, it must feel reassuring3 that as physicists embark on a new era of research that requires just as much ingenuity in formulating new ideas as in testing them, an era “where logic based on past theories does not guide us”4, Fabiola Gianotti’s research excellence, administrative astuteness and creative intuition are there to guide them.

Good luck, Dr. Gianotti!

1Recommended read: Who really found the Higgs boson? The real genius in the Nobel Prize-winning discovery is not who you think it is. Nautilus, Issue 18.

2I must mention that it’s weird that someone with such strong aesthetic foundations used Comic Sans MS as the font of choice for her presentation at the CERN seminar in 2012 that announced the discovery of a Higgs-like boson. It was probably the beginning of Comic Sans’s comeback.

3Though I am no physicist.

4In the words of Academy Award-winning film editor Walter S. Murch.

Featured image credit: Claudia Marcelloni/CERN

Restarting the LHC: A timeline

CERN has announced the restart schedule of its flagship science “project”, the Large Hadron Collider, that will see the giant machine return online in early 2015. I’d written about the upgrades that could be expected shortly before it shut down in 2012. They range from new pixel sensors and safety systems to facilities that will double the collider’s energy and the detectors’ eyes for tracking collisions. Here’s a little timeline I made with Timeline.js, check it out.

(It’s at times like this that I really wish WP.com would let bloggers embed iframes in posts.)

The hunt for supersymmetry: Reviewing the first run – 2

I’d linked to a preprint paper [PDF] on arXiv a couple days ago that had summarized the search for Supersymmetry (Susy) from the first run of the Large Hadron Collider (LHC). I’d written to one of the paper’s authors, Pascal Pralavorio at CERN, seeking some insights into his summary, but unfortunately he couldn’t reply by the time I’d published the post. He replied this morning and I’ve summed them up.

Pascal says physicists trained their detectors for “the simplest extension of the Standard Model” using supersymmetric principles, called the Minimal Supersymmetric Standard Model (MSSM), formulated in the early 1980s. This meant they were looking for a total of 35 particles. In the first run, the LHC operated at two different energies: first at 7 TeV (at an integrated luminosity of 5 fb⁻¹), then at 8 TeV (at 20 fb⁻¹; explainer here). The data was garnered from both the ATLAS and CMS detectors.

In all, they found nothing. As a result, as Pascal says, “When you find nothing, you don’t know if you are close or far from it!”

His paper has an interesting chart that summarizes the results of the search for Susy from Run 1. It is actually a superimposition of two charts: one shows the different Standard Model processes (particle productions, particle decays, etc.) at different energies (200-1,600 GeV); the second shows the Susy processes that are thought to occur at these energies.

Cross-sections of several SUSY production channels, superimposed with Standard Model processes at √s = 8 TeV. The right-hand axis indicates the number of events for 20 fb⁻¹.

The cross-section on the chart is the probability of an event-type appearing during a proton-proton collision, so what you can read off the plot is a ratio of probabilities. For example, stop-stop* production (the top quark’s Susy partner particle and antiparticle, respectively) at a mass of 400 GeV is 10¹⁰ (10 billion) times less probable than inclusive di-jet events (a Standard Model process). “In other words,” Pascal says, it is “very hard to find” a Susy process while Standard Model processes are on, but it is “possible for highly trained particle physicists” to get there.
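
The chart’s right-hand axis follows from a standard relation: expected event count = cross-section × integrated luminosity. A rough sketch with placeholder cross-sections (only the 20 fb⁻¹ figure and the 10-billion ratio come from the text above):

```python
# Expected events: N = sigma * L_int, the relation behind the right-hand axis
INTEGRATED_LUMINOSITY_FB = 20.0  # fb^-1, as in LHC Run 1 at 8 TeV

def expected_events(cross_section_fb: float) -> float:
    """Number of events for a process with the given cross-section (in fb)."""
    return cross_section_fb * INTEGRATED_LUMINOSITY_FB

# Illustrative, made-up cross-sections that preserve the 1e10 ratio in the text:
stop_pair_fb = 1.0  # a rare SUSY process
dijet_fb = 1e10     # an abundant Standard Model process

print(expected_events(stop_pair_fb))  # 20.0 events
print(expected_events(dijet_fb) / expected_events(stop_pair_fb))  # ratio of 10 billion
```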

Of course, none of this means physicists aren’t open to the possibility of there being a theory (and corresponding particles out there) that even Susy mightn’t be able to explain. The most popular among such theories posits the presence of a “possible extra special dimension” on top of the three that we already know. “We will of course continue to look for it and for supersymmetry in the second run.”

Which way does antimatter swing?

In our universe, matter is king: it makes up everything. Its constituents are incredibly tiny particles – smaller than even the protons and neutrons they constitute – and they work together with nature’s forces to make up… everything.

There was also another form of particle once, called antimatter. It is extinct today, but when the universe was born 13.82 billion years ago, there were equal amounts of both kinds.

Nobody really knows where all the antimatter disappeared to or how, but they are looking. Some others, however, are asking another question: did antimatter, while it lasted, fall downward or upward in response to gravity?

Joel Fajans, a professor at the University of California, Berkeley, is one of the physicists doing the asking. “It is the general consensus that the gravitational interaction of antimatter with matter is the same as the gravitational interaction of matter,” he told this correspondent.

But he wants to be sure, because what he finds could revolutionize the world of physics. Over the years, studying particles and their antimatter counterparts has revealed most of what we know today about the universe. In the future, physicists will explore their minuscule world, called the quantum world, further to see if answers to some unsolved problems are found. If, somewhere, an anomaly is spotted, it could pave the way for new explanations to take over.

“Much of our basic understanding of the evolution of the early universe might change. Concepts like dark energy and dark matter might have to be revised,” Fajans said.

Along with his colleague Jonathan Wurtele, Fajans will work with the ALPHA experiment at CERN to run an elegant experiment that could directly reveal gravity’s effect on antimatter. ALPHA stands for Anti-hydrogen Laser Physics Apparatus.

We know gravity acts on a ball by watching it fall when dropped. On Earth, the ball will fall toward the source of the gravitational pull, a direction called ‘down’. Fajans and Wurtele will study if down is in the same place for antimatter as for matter.

An instrument at CERN called the Antiproton Decelerator (AD) synthesizes the antimatter counterparts of protons for study in the lab at low energy. Fajans and co. will then use the ALPHA experiment’s carefully directed magnetic fields to guide these antiprotons into the presence of anti-electrons derived from another source.

When an anti-proton and an anti-electron come close enough, their opposite charges bind them together to form an anti-hydrogen atom.

Because antimatter and matter annihilate each other in a flash of energy, they couldn’t be let near each other during the experiment. Instead, the team used strong magnetic fields to form a force-field around the antimatter, “bottling” it in space.

Once this was done, the experiment was ready to go. Then, like fingers unclenching from around a ball, the magnetic fields were turned off, but not instantaneously: they were ramped from ‘on’ to ‘off’ over 30 milliseconds. In this period, the magnetic force wears off and gravity takes over.

And in this state, Fajans and his team studied which way the little things moved: up or down.

The results

The first set of results from the experiment has allowed no firm conclusions to be drawn. Why? Fajans answered, “Relatively speaking, gravity has little effect on the energetic anti-atoms. They are already moving so fast that they are barely affected by the gravitational forces.” According to Wurtele, about 411 out of 434 anti-atoms in the trap were so energetic that the way they escaped from the trap couldn’t be attributed to gravity’s pull or push on them.

Among the rest, they observed roughly equal numbers of anti-atoms falling out at the bottom of the trap as at the top (and the sides, for that matter).

They shared this data with their ALPHA colleagues and two people from the University of California, lecturer Andrew Charman and postdoc Andre Zhmoginov. They ran statistical tests to separate results due to gravity from results due to the magnetic field. Again, much statistical uncertainty remained.

The team has no reason to give up, though. For now, they know that gravity’s pull on antimatter would have to be at least 100 times stronger than it is on matter for its effects on the anti-hydrogen atoms to have shown up. That gives them a limit to improve on.

Moreover, the ALPHA experiment is also undergoing upgrades to become ALPHA-2. With this avatar, Fajans’s team also hopes to incorporate laser-cooling, a method of further slowing the anti-atoms, so that the effects of gravity are enhanced. Michael Doser, however, is cautious.

The future

As a physicist working with antimatter at CERN, Doser says, “I would be surprised if laser cooling of antihydrogen atoms, something that hasn’t been attempted to date, would turn out to be straightforward.” The challenge lies in bringing the systematics down to the point at which one can trust that any observation would be due to gravity, rather than due to the magnetic trap or the detectors being used.

Fajans and co. also plan to turn off the magnets more slowly in the future to enhance the effects of gravity on the anti-atom trajectories. “We hope to be able to definitively answer the question of whether or not antimatter falls down or up with these improvements,” Fajans concluded.

Like its larger sibling, the Large Hadron Collider, the AD is also undergoing maintenance and repair in 2013, so until the next batch of anti-protons is available in mid-2014, Fajans and Wurtele will be running tests at their university, checking if their experiment can be improved in any way.

They will also be taking heart from there being two other experiments at CERN that can verify their results if they come up with something anomalous, two experiments working with antimatter and gravity. They are the Anti-matter Experiment: Gravity, Interferometry, Spectroscopy (AEGIS), for which Doser is the spokesperson, and the Gravitational Behaviour of Anti-hydrogen at Rest (GBAR).

Together, they carry the potential benefit of an independent cross-check between techniques and results. “This is less important in case no difference to the behaviour of normal matter is found,” Doser said, “but would be crucial in the contrary case. With three experiments chasing this up, the coming years look to be interesting!”

This post, as written by me, originally appeared in The Copernican science blog at The Hindu on May 1, 2013.

Higgs boson closer than ever

The article, as written by me, appeared in The Hindu on March 7, 2013.

Ever since CERN announced that it had spotted a Higgs boson-like particle on July 4, 2012, its flagship Large Hadron Collider (LHC), like similar colliders around the world, has continued running experiments to gather more data on the elusive particle.

The latest analysis of the results from these runs was presented at a conference now underway in Italy.

While it is still too soon to tell if the particle spotted in July 2012 was the Higgs boson as predicted in 1964, the data is converging toward the conclusion that the long-sought particle does exist and with the expected properties. More results will be presented over the coming weeks.

In time, particle physicists hope that the particle’s confirmation will once and for all close an important chapter in physics called the Standard Model (SM).

The announcements were made by more than 15 scientists from CERN on March 6 via a live webcast from the Rencontres de Moriond, an annual particle physics forum that has been held in La Thuile, Italy, since 1966.

“Since the properties of the new particle appear to be very close to the ones predicted for the SM Higgs, I have personally no further doubts,” Dr. Guido Tonelli, former spokesperson of the CMS detector at CERN, told The Hindu.

Interesting results from searches for other particles, as well as the speculated nature of fundamental physics beyond the SM, were also presented at the forum, which runs from March 2-16.

Physicists exploit the properties of the Higgs to study its behaviour in a variety of environments and see if it matches the theoretical predictions. A key goal of the latest analyses has been to measure the strength with which the Higgs couples to other elementary particles, the process by which it gives them mass.

This is done by analysing the data to infer the rates at which the Higgs-like particle decays into known lighter particles: W and Z bosons, photons, bottom quarks, tau leptons, electrons, and muons. These particles’ signatures are then picked up by detectors to infer that a Higgs-like boson decayed into them.

The SM predicts these rates with good precision.

Thus, any deviation from the expected values could be the first evidence of new, unknown particles. By extension, it would also be the first sighting of ‘new physics’.

Bad news for new physics, good news for old

After analysis, the results were found to be consistent with a Higgs boson of mass near 125-126 GeV, measured at both 7- and 8-TeV collision energies through 2011 and 2012.

The CMS detector observed fairly strong agreement between how often the particle decayed into W bosons and how often theory says it ought to. The ratio between the two was pinned at 0.76 ± 0.21.

Dr. Tonelli said, “For the moment, we have been able to see that the signal is getting stronger and even the difficult-to-measure decays into bottom quarks and tau-leptons are beginning to appear at about the expected frequency.”

The ATLAS detector, in parallel, was able to observe with 99.73 per cent confidence (three standard deviations) that the analysed particle had zero spin, another property that brings it closer to the predicted SM Higgs boson.

At the same time, the detector also observed that the particle’s decay to two photons was 2.3 standard-deviations higher than the SM prediction.

Dr. Pauline Gagnon, a scientist with the ATLAS collaboration, told this Correspondent via email, “We need to assess all its properties in great detail and extreme rigour,” adding that for some aspects they would need more data.

Even so, the developments rule out any sign of new physics before 2015, when the LHC will reopen after a two-year shutdown and multiple upgrades to smash protons at double the energy.

As for the search for Supersymmetry, a favoured theoretical concept among physicists to accommodate phenomena that haven’t yet found definition in the Standard Model: Dr. Pierluigi Campana, LHCb detector spokesperson, told The Hindu that there have been only “negative searches so far”.

A different kind of experiment at CERN

This article, as written by me, appeared in The Hindu on January 24, 2012.

At the Large Hadron Collider (LHC) at CERN, near Geneva, Switzerland, experiments are conducted by many scientists who don’t quite know what they will see, but know how to conduct the experiments that will yield answers to their questions. They accelerate beams of particles called protons to smash into each other, and study the fallout.

There are some other scientists at CERN who know approximately what they will see in experiments, but don’t know how to do the experiment itself. These scientists work with beams of antiparticles. According to the Standard Model, the dominant theoretical framework in particle physics, every particle has a corresponding particle with the same mass and opposite charge, called an anti-particle.

In fact, at the little-known AEgIS experiment, physicists will attempt to produce an entire beam composed of not just anti-particles but anti-atoms by mid-2014.

AEgIS is one of six antimatter experiments at CERN that create antiparticles and anti-atoms in the lab and then study their properties using special techniques. The hope, as Dr. Jeffrey Hangst, the spokesperson for the ALPHA experiment, stated in an email, is “to find out the truth: Do matter and antimatter obey the same laws of physics?”

Spectroscopic and gravitational techniques will be used to make these measurements. They will improve upon “precision measurements of antiprotons and anti-electrons” that “have been carried out in the past without seeing any difference between the particles and their antiparticles at very high sensitivity,” as Dr. Michael Doser, AEgIS spokesperson, told this Correspondent via email.

The ALPHA and ATRAP experiments will achieve this by trapping anti-atoms and studying them, while ASACUSA and AEgIS will form an atomic beam of anti-atoms. All of them will continue testing and upgrading through 2013.

Working principle

Specifically, AEgIS will attempt to measure the interaction between gravity and antimatter by shooting an anti-hydrogen beam horizontally through a vacuum tube and then measuring how much it sags due to the gravitational pull of the Earth, to a precision of 1 per cent.
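The size of that sag can be estimated with ordinary projectile motion: a horizontally launched atom drops by ½gt² during its time of flight. A minimal sketch, with an assumed beam speed and flight length (illustrative numbers, not AEgIS’s actual operating parameters):

```python
# Sketch: vertical sag of a horizontal beam under ordinary gravity.
# Beam speed and flight length are illustrative assumptions,
# not AEgIS's actual operating parameters.
g = 9.81        # m/s^2, acceleration due to gravity (for ordinary matter)
v = 500.0       # m/s, assumed horizontal speed of the anti-hydrogen beam
length = 1.0    # m, assumed length of the flight tube

t = length / v            # time of flight, in seconds
sag = 0.5 * g * t ** 2    # vertical drop, in metres
print(f"time of flight: {t * 1e3:.1f} ms, sag: {sag * 1e6:.1f} micrometres")
```

Even with these gentle assumptions the drop comes out to only about 20 micrometres, which gives a sense of why the measurement is so demanding.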

The experiment is not so simple because preparing anti-hydrogen atoms is difficult. As Dr. Doser explained, “The experiments concentrate on anti-hydrogen because that should be the most sensitive system, as it is not much affected by magnetic or electric fields, contrary to charged anti-particles.”

First, antiprotons are derived from the Antiproton Decelerator (AD), a particle storage ring which “manufactures” the antiparticles at a low energy. At another location, a nanoporous plate is bombarded with anti-electrons, producing positronium (Ps), a highly unstable bound state of an electron and an anti-electron.

The Ps is then excited to a specific energy state by exposure to a 205-nanometre laser, and then to an even higher energy state, called a Rydberg level, using a 1,670-nanometre laser. Last, the excited Ps traverses a special chamber called a recombination trap, where it mixes with antiprotons controlled by precisely tuned magnetic fields. With some probability, an antiproton will “trap” an anti-electron to form an anti-hydrogen atom.


Before a beam of such anti-hydrogen atoms can be generated, however, there are problems to be solved. They involve large electric and magnetic fields to control the speed of the beams and collimate them, respectively, as well as powerful cryogenic systems and ultra-cold vacuums. Thus, Dr. Doser and his colleagues will spend many months making careful changes to the apparatus to ensure these requirements work in tandem by 2014.

While antiparticles were first discovered in 1932, “until recently, it was impossible to measure anything about anti-hydrogen,” Dr. Hangst wrote. Thus, the ALPHA and AEgIS experiments at CERN provide a seminal setting for exploring the world of antimatter.

Anti-particles have been used effectively in many diagnostic devices such as PET scanners. Consequently, improvements in our understanding of them feed immediately into medicine. To name an application: Antiprotons hold out the potential of treating tumors more effectively.

In fact, the feasibility of this application is being investigated by the ACE experiment at CERN.

In the words of Dr. Doser: “Without the motivation of attempting this experiment, the experts in the corresponding fields would most likely never have collaborated and might well never have been pushed to solve the related interdisciplinary problems.”

LHC to re-awaken in 2015 with doubled energy, luminosity

This article, as written by me, appeared in The Hindu on January 10, 2012.

After a successful three-year run that saw the discovery of a Higgs-boson-like particle in early 2012, the Large Hadron Collider (LHC) at CERN, near Geneva, Switzerland, will shut down for 18 months for maintenance and upgrades.

This is the first of three long shutdowns, scheduled for 2013, 2017, and 2022. Physicists and engineers will use these breaks to ramp up one of the most sophisticated experiments in history even further.

According to Mirko Pojer, Engineer In-charge, LHC-operations, most of these changes were planned in 2011. They will largely concern fixing known glitches on the ATLAS and CMS particle-detectors. The collider will receive upgrades to increase its collision energy and frequency.

Presently, the LHC smashes two beams, each composed of precisely spaced bunches of protons, at 3.5-4 tera-electron-volts (TeV) per beam.

By 2015, the beam energy will be pushed up to 6.5-7 TeV per beam. Moreover, the bunches, which were smashed at intervals of 50 nanoseconds, will collide at intervals of 25 nanoseconds.

After upgrades, “in terms of performance, the LHC will deliver twice the luminosity,” Dr. Pojer noted in an email to this Correspondent, with reference to the integrated luminosity: the total number of collisions the LHC can deliver per unit area, which the detectors can track.

The instantaneous luminosity, which is the luminosity per second, will be increased to 1×10³⁴ per centimetre-squared per second, ten times greater than before, and well on its way to peaking at 7.73×10³⁴ per centimetre-squared per second by 2022.
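Those figures connect directly to data volume: integrating the instantaneous luminosity over a year of beam time gives the inverse-femtobarn totals quoted for the detectors. A minimal sketch, assuming a commonly used ballpark of 10⁷ effective beam-seconds per year (an assumption, not an official LHC schedule figure):

```python
# Sketch: integrating instantaneous luminosity over a year of running.
# The 1e7 "effective beam-seconds per year" is a rule-of-thumb assumption.
inst_lumi = 1e34          # cm^-2 s^-1, the post-upgrade instantaneous luminosity
beam_seconds = 1e7        # assumed effective seconds of beam time per year

integrated = inst_lumi * beam_seconds   # total, in cm^-2
fb_inv = integrated / 1e39              # 1 inverse femtobarn = 1e39 cm^-2
print(f"integrated luminosity: {fb_inv:.0f}/fb per year")
```

By this rough accounting, the 20/fb collected in 2012 would correspond to only a few months of running at the upgraded rate.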

As Steve Myers, CERN’s Director for Accelerators and Technology, announced in December 2012, “More intense beams mean more collisions and a better chance of observing rare phenomena.” One such phenomenon is the appearance of a Higgs-boson-like particle.

The CMS experiment, one of the detectors on the LHC-ring, will receive some new pixel sensors, a technology responsible for tracking the paths of colliding particles. To make use of the impending new luminosity-regime, an extra layer of these advanced sensors will be inserted around a smaller beam pipe.

If results from this new layer are successful, CMS will receive the full unit in late 2016.

In the ATLAS experiment, unlike in CMS, which was built with greater luminosities in mind, the pixel sensors are expected to wear out within a year of the upgrades. As an intermediate solution, a new layer of sensors called the B-layer will be inserted within the detector until 2018.

Because of the risk of radiation damage due to more numerous collisions, specific neutron shields will be fit, according to Phil Allport, ATLAS Upgrade Coordinator.

Both ATLAS and CMS will also receive evaporative cooling systems and new superconducting cables to accommodate the higher performance that will be expected of them in 2015. The other experiments, LHCb and ALICE, will also undergo inspections and upgrades to cope with higher luminosity.

An improved failsafe system will be installed and the existing one upgraded to prevent accidents such as the one in 2008.

Then, an electrical failure damaged 29 magnets and leaked six tonnes of liquid helium into the tunnel, precipitating an eight-month shutdown.

Generally, as Martin Gastal, CMS Experimental Area Manager, explained via email, “All sub-systems will take the opportunity of this shutdown to replace failing parts and increase performance when possible.”

All these changes have been optimised to fulfil the LHC’s future agenda. This includes studying the properties of the newly discovered particle, and looking for signs of new theories of physics like supersymmetry and higher dimensions.

(Special thanks to Achintya Rao, CMS Experiment.)

Putting particle physics research to work

In the whole gamut of comments regarding the Higgs boson, there is a depressingly large number decrying the efforts of the ATLAS and CMS collaborations. Why? Because a lot of people think the Large Hadron Collider (LHC) is a yawning waste of time and money, an investment that serves mankind no practical purpose.

Well, here and here are some cases in point that demonstrate the practical good the LHC has made possible in the material sciences. Another big area of application is medical diagnostics: making the point are one article about hunting for the origin of Alzheimer’s, and another about the very similar technology used in particle accelerators and medical imaging devices, as well as in meteorology, VLSI, large-scale networking, cryogenics, and X-ray spectroscopy.

Moving on to more germane applications: arXiv has reams of papers that discuss the deployment of

… amongst others.

The LHC, above all else, is the brainchild of the European Centre for Nuclear Research, popularly known as CERN. These guys invented the World Wide Web, developed some of the first touch-screen devices, and pioneered the earliest high-energy medical imaging techniques.

With experiments like those being conducted at the LHC, it’s easy to forget every development in such laboratories other than the discovery of much-celebrated particles. Yet all the applications linked to in this post were conceived by scientists working with the LHC, which goes to show that everyone, from the man whose tax money pays for these giant labs to the man who uses that money to work in them, is mindful of practical concerns.