Curious Bends – counting tigers, curing PTSD, farming in battlefields and more

1. Tracking tigers by keeping an eye out for their stripes

“The software ExtractCompare, the first of its kind for tigers, projects photographs of tigers on to a three-dimensional surface, and tallies their stripe patterns, distinct as a barcode, against images stored in a database. When a “high similarity” score is reached, a successful identification is made. “This rapid pattern matching system speeds up the matching process when thousands of comparisons are involved,” said Karanth. But no software, no matter how sophisticated, has yet topped a field researcher’s well-trained eye.” (6 min read)

2. The Deccan Traps volcanoes did their bit to kill the dinos

“Based on the precise dates for the Deccan Traps, the scientists believe the massive eruptions may have played a significant role in extinguishing the dinosaurs – although the exact kill mechanism may never be known. “I don’t think the debate will ever go away. The asteroid impact may have caused the extinction. But perhaps its effect was enhanced because things were softened up a bit by the eruption of these volcanoes,” said co-author Prof Sam Bowring of Massachusetts Institute of Technology.” (3 min read)

3. On the cusp of a cure for post-traumatic stress disorder…

“The above findings clearly demonstrate that the potentially greater cost of failing to discriminate correctly pushed the animals’ behaviour towards playing it safe. They behaved as if there was potential danger in the safe sound as well. The neurons too reflected this tendency to play it safe at the behavioural level. The work by Chattarji and Ghosh thus marks a significant advance in our understanding of how information is processed in the amygdala at the level of individual neurons to maintain a balance between when one should be and should not be afraid.” (14 min read)

4. The first signs of resistance to the ‘last antibiotic’ are showing in India

““Colistin resistance is still rare. It is carbapenem (the strongest class of antibiotic) resistance that is increasing across the world. Colistin is used to treat cases that are resistant even to carbapenem. It is an emerging problem. Doctors in Greece had published colistin-resistance data in 2006, and the US recorded it two years ago,” said infectious diseases consultant Dr Abdul Ghafur, one of the authors of the paper on the 13 cases, and also the coordinator of the Chennai Declaration that has laid out guidelines for hospitals and doctors on antibiotic use.” (7 min read)

5. What’s the price you pay when you farm along the world’s most dangerous border?

“When the barbed fence was raised in the mid-1980s, it divided Harjinder Singh’s fields into two — 15 acres in India, 15 acres in no-man’s land (between the fence and the actual line of control) — almost 40 years after Partition divided his family’s fields between India and Pakistan. Half his land, across the fence, is now under curfew between 6 p.m. and 8 a.m. This makes his land at the international border half as profitable as that in India, Singh said. “They took away the food from our plate,” Singh said. “And gave us hardship in return.”” (8 min read)

Chart of the week

“China is not alone in having low transplant rates in Asia—even richer countries such as Japan and Singapore fall far short of Western countries. Most transplants in these countries also tend to come from live donors, compared with under a third in the West. That suggests there is a lot of room to increase the deceased-donor supply, whether through public-information campaigns or “opt-out” donor-consent regimes. China is considering a legal standard for brain death, enabling exploitation of intact organs while a patient’s heart is still functioning but recovery is deemed impossible.” The Economist has more.

Possible first signal of dark matter detected

A simulated map of the universe’s dark matter (in blue) compiled after extensive observations with the Hubble space telescope. Credit: J.-P. Kneib (Observatoire Midi-Pyrenees, Caltech) et al., ESA, NASA

Dark matter is thought to make up more than 80% of the matter in our universe. However, it is notoriously difficult to detect, for two main reasons:

  1. Scientists don’t know what the constituent particles of dark matter are, or how much they could weigh. There are various theories – each of them describes a different particle with distinct properties. Various observatories around the world and in space have been looking for them, with little success.
  2. Dark matter interacts with normal matter only through the force of gravity. And of the four known fundamental forces in nature, gravity is the weakest, even though it acts across the largest distances. Moreover, countless ordinary objects in the universe exert gravitational forces of their own, so filtering out a gravitational signal coming solely from dark matter is difficult.

Thankfully, many of these theories postulate other ways to find dark matter. One of them predicts that the particles of dark matter are sterile neutrinos. Neutrinos are a class of fundamental particles that have the lowest mass in nature – aside from the massless photon – and interact excruciatingly rarely with normal matter.

Ordinary neutrinos interact with other matter only through the gravitational force and the weak force. Sterile neutrinos, however, are unique because they would interact through the gravitational force alone.

The sterile neutrino

According to the sterile neutrino model, when a sterile neutrino decays, it yields one massless photon and one normal neutrino. The normal neutrino is far lighter than its parent, so the photon carries away roughly half the sterile neutrino’s mass-energy. If the dark matter particle has a mass in the keV range, then, the photon should be detectable as an X-ray emission with an energy of a few keV.
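The link between particle mass and photon energy is simple two-body kinematics. A short worked relation, assuming the sterile neutrino decays at rest and the daughter neutrino’s mass is negligible:

```latex
% Two-body decay \nu_s \to \nu + \gamma of a particle at rest:
% energy-momentum conservation splits the rest energy evenly, so
\[
  E_\gamma \;\approx\; \frac{m_{\nu_s} c^2}{2}
  \qquad\Longrightarrow\qquad
  m_{\nu_s} c^2 = 7~\mathrm{keV} \;\;\rightarrow\;\; E_\gamma = 3.5~\mathrm{keV}.
\]
```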

Precisely such an emission line was detected by two groups of astrophysicists studying the Perseus cluster of galaxies, located in the constellation of Perseus. The cluster is one of the most massive objects in the universe, and is thought to contain 190 million trillion trillion trillion kilograms of dark matter. This vast quantity means that even if dark matter decays slowly – with lifetimes of around 10^21 years – there are still about 10^77 sterile neutrinos of keV mass decaying into X-ray photons and neutrinos in the Perseus cluster.
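As a sanity check on those orders of magnitude, here is the arithmetic in a few lines of Python, assuming a 7 keV/c² sterile neutrino (the mass a 3.5 keV line would imply; the count is insensitive to the exact choice):

```python
# Rough count of sterile neutrinos in the Perseus cluster, using the
# figures quoted above and an assumed 7 keV/c^2 particle mass.
EV_TO_KG = 1.783e-36          # 1 eV/c^2 expressed in kilograms

m_dm_cluster = 1.9e44         # kg: "190 million trillion trillion trillion kg"
m_particle = 7e3 * EV_TO_KG   # kg per 7 keV sterile neutrino

n_particles = m_dm_cluster / m_particle
print(f"{n_particles:.1e} sterile neutrinos")  # ~1.5e76, within an order of 10^77

# Even with a 1e21-year lifetime, the decay rate stays enormous:
print(f"{n_particles / 1e21:.1e} decays per year")  # ~1.5e55
```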

One group, led from the Institute for Theoretical Physics at Leiden University in the Netherlands, used the ESA XMM-Newton X-ray observatory to measure an X-ray emission line at 3.5 keV coming from the Perseus cluster. Another group, led from the Harvard-Smithsonian Center for Astrophysics (CfA), USA, used the NASA Chandra X-ray observatory to observe the same emission. Both XMM-Newton and Chandra are space-based observatories.

Both groups published their papers in December 2014, and both measured the emission at confidence levels of more than 99.999%. However, their observations need further confirmation before they can graduate from being scientific evidence to knocking on the doors of scientific fact.

The nature of these confirmations will be twofold. On the one hand, scientists will have to gather more evidence that the X-ray emission is indeed from dark matter decays. To this end, they will need to check whether the emission intensity varies with the density of dark matter through space. They will also look for a Doppler effect, which would smear the line out slightly in a spectrograph.
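The size of that Doppler smear is easy to estimate. A back-of-the-envelope version, assuming an illustrative cluster velocity dispersion of 1,000 km/s (a typical figure for rich clusters, not a value from the papers):

```latex
% Doppler broadening of an emission line by the random motions of the
% emitting dark matter, with velocity dispersion \sigma_v:
\[
  \frac{\Delta E}{E} \approx \frac{\sigma_v}{c}
  \qquad\Longrightarrow\qquad
  \Delta E \approx 3.5~\mathrm{keV} \times
  \frac{10^{3}~\mathrm{km/s}}{3\times10^{5}~\mathrm{km/s}}
  \approx 12~\mathrm{eV}.
\]
```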

On the other hand, to deny that the X-ray emission could be from other sources will require a thorough knowledge of other sources in the same volume of space that could emit X-ray photons and their behavior over time. Fortunately, this knowledge already exists – it was on its strength that the two groups that made the observation were able to postulate that the emission was from dark matter decays. Nevertheless, more detailed descriptions of the gases, elements, compounds and other objects in the area will be sought.

The Astro-H telescope

Even so, there’s one more problem: if the observation was made with such high confidence, why is the signal weak? If this is indeed the dark matter signature that scientists have been hunting for over six decades, why isn’t the X-ray emission line more pronounced?

Alexey Boyarsky from the University of Leiden notes,

The [dark matter] decay line is much narrower than the spectral resolution of the present day X-ray telescopes and, as previous searches have shown, should be rather weak.

As if by coincidence, Esra Bulbul from the CfA highlights a solution that’s on the horizon: the planned 14-meter-long Astro-H X-ray telescope to be launched in 2015 by the Japanese Aerospace Exploration Agency. As Bulbul writes,

The future high-resolution Astro-H observations will be able to measure the broadening of the line, which will allow us to measure its velocity dispersion. To detect a dark matter decay line [that is weaker than other lines] will require a significantly long exposure.

The excitement these discoveries have set off is palpable, and deservedly so. Bulbul told NASA, “After we submitted the paper, theoreticians came up with about 60 different dark matter types which could explain this line. Some particle physicists have jokingly called this particle a ‘bulbulon’.”

Apart from trying to confirm the findings and rule out competing explanations, scientists could also look for sterile neutrinos through neutrino experiments on Earth. Although hints of similar particles have turned up in the past, experiments searching for sterile neutrinos will now also have to focus on the ~7 keV mass scale implied by the 3.5 keV emission line.

Curious Bends – where Indians come from, Irrawady dolphins, human spaceflight and more

1. A genetic history: Where do Indians come from?

“In 2005, K. Thangaraj and his colleagues at CCMB published their findings about the origin of Andaman islanders in the journal Science. The Onge turned out to have surprisingly unmixed origins. They had likely lived isolated in the islands since the arrival here of the first group of humans out of Africa. There were mutations in their mtDNA that were found nowhere else in the world. These mutations must have originated here and not spread. The Onge were an untouched link to the earliest humans who settled the planet.” (24 min read)

2. Oil spill in the Sunderbans threatens the endangered Irrawady dolphins

““Dolphins are at the top of the food chain so they will be affected sooner or later by eating the fish from these waters,” said Rubaiyat Mansur, Bangladesh head of the Wildlife Conservation Society. Mansur also worries about the more direct impact on the animals. “The oil slick collects at the confluences and meanders of the river and those are the places that the dolphins like to hang around in and look for prey,” he said. “Coming up in an oil slick, opening a blow hole and breathing in and breathing out won’t be a good idea because the air right above the oil slick will be quite toxic.”” (4 min read)

3. ISRO will launch its crew module on its first test flight on December 18

“While a capsule in orbit around Earth will re-enter with a velocity of over 28,000 km per hour, next week’s test will see the GSLV Mark III leave the crew module at a height of about 125 km with a velocity of around 19,000 km per hour. The crew module carries sensors that will make measurements of over 200 parameters during the flight, including the temperature, pressure and stress experienced at various points in the structure. “This flight will give us tremendous confidence in our design and provide important inputs for proceeding with development of the manned capsule,” observed S. Unnikrishnan Nair, project director for the Human Spaceflight Programme.” (4 min read)

4. Why are the women dying in India’s sterilisation camps?

“These dangerous conditions are not uncommon in sterilisation camps throughout India, claim women’s health activists. They say that such camps, favoured by the Indian government as a way to perform tubectomies on many women in one go, often exceed the prescribed limit for surgeries in a day, do not adequately sterilise the equipment used on patients, and do not provide counselling before operations or care afterwards. “This was waiting to happen,” Abhijit Das, a public health researcher at Delhi’s Centre for Health and Social Justice, told The BMJ.” (7 min read)

5. India is a breeding ground for the world’s super-bugs

“In the developing world, unregulated use of these drugs coupled with poor sanitation and health care are fueling the rise of resistant bacteria. In India, these factors have created the perfect breeding ground for so-called super bugs. Last year, more than 58,000 babies died from antibiotic-resistant infections.” (2 min read)

Chart of the week

“Japan is the third-largest economy after America and China. It is so wealthy that its regions boast the same economic heft as large countries. The entire economy of Brazil fits into the Kanto region that includes Tokyo, for example. Yet despite this wealth, Japan’s economic growth has been largely stagnant over a period known as the two “lost decades”. America’s GDP grew threefold during that time while China’s soared. After a short stint as prime minister in 2006-07, Shinzo Abe returned in 2012 calling for a bold, three-part plan of stimulus spending, monetary easing and structural reforms—the so-called “three arrows” of Abenomics.” The Economist has more.

The GSLV Mk-III is no jugaad

Scroll
December 18, 2014

(Note: This piece was written in the future tense and published before ISRO’s successful test flight this morning.)

Come Thursday, the Indian Space Research Organisation will launch its GSLV Mk-III rocket from its launch pad in Sriharikota. In the run-up, most media attention has been on a conical module the rocket will carry on board. But of greater interest is the rocket itself, which holds the key to making ISRO a serious contender in the international satellite-launch sector.

The module is part of the Crew-Module Atmospheric Reentry Experiment, which will see it being released at an altitude of 126 kilometres, upon which it will re-enter earth’s atmosphere and crash into the Bay of Bengal, some 200 kilometres west of the Andaman Islands.

Scientists at ISRO will monitor CARE during its journey and gather important data about its surface and interiors. If the module’s performance matches their predictions, India will be that much closer to using it as a crew capsule for a manned mission into space planned in the early 2020s.

Cashing in on the growth

Forgotten in the media buzz around the module is the rocket itself.

The Mk-III, a next-generation variant of ISRO’s fleet of geosynchronous satellite launch vehicles, boasts India’s highest payload capacity yet: 10,000 kilograms to low-earth orbit and 4,000 kilograms to the highly elliptical geostationary-transfer orbit.

If the launch is successful – and if future test flights establish reliability – ISRO’s commercial space programme will be in a position to cash in on the rapidly growing global satellite-launching industry as well as give domestic engineers the leeway to design more sophisticated satellites.

This was an important consideration during the Mars Orbiter Mission. The orbiter, currently revolving around the Red Planet, carries only about 15 kilograms of scientific instruments because the Polar Satellite Launch Vehicle’s payload limit to earth orbit is 1,350 kilograms – a budget that also has to accommodate the fuel and support systems needed for a smooth journey. A heavier lift could have accommodated more than the five instruments the orbiter carried.

Dependence on others

In this regard, the GSLV Mk-III will be important because it will determine where India’s native space research programme is headed and how it plans to leverage the increased payload mass option.

It will also reduce India’s dependence on foreign launch vehicles to get heavier satellites into orbit, although self-reliance comes with problems of its own. The common choice in lieu of a reliable GSLV has been the French Arianespace programme, which currently serves almost 65% of the Asia-Pacific market. The Mk-III bears many structural similarities to the Ariane 6: both rockets have a liquid main stage, a cryogenic upper stage and two solid-fuel boosters.

The Ariane 6 can lift 6,500 kilograms to the geostationary-transfer orbit, and each launch costs India about $95 million. Assuming the Mk-III’s cost-per-launch is comparable to the Mk-II’s, that figure comes down to approximately $40 million (likely to be slightly higher in practice). Compare this to the global average price-per-launch of vehicles capable of reaching the geostationary-transfer orbit: $145.57 million, as of 2013.
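A quick way to see the commercial case is to reduce these figures to cost per kilogram delivered to the geostationary-transfer orbit. A minimal sketch using only numbers quoted in this piece (the Mk-III price is the assumed ~$40 million above; the Falcon 9 figures appear further below):

```python
# Cost per kg to geostationary-transfer orbit, from the figures in this piece.
rockets = {
    # name: (payload to GTO in kg, price per launch in USD)
    "Ariane 6 (price to India)": (6500, 95e6),
    "GSLV Mk-III (assumed)": (4000, 40e6),
    "Falcon 9 (starting price)": (4850, 57e6),
}

for name, (payload_kg, price_usd) in rockets.items():
    print(f"{name}: ${price_usd / payload_kg:,.0f} per kg")
# Ariane 6: ~$14,615/kg; Mk-III: ~$10,000/kg; Falcon 9: ~$11,753/kg
```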

Skyrocketing profits

From 1999 to 2014, ISRO launched 40 foreign satellites, all with PSLV rockets, and earned EUR 50.47 million and $17.17 million (or Rs 505.74 crore) from 19 countries. Antrix, the commercial arm of ISRO in charge of handling the contracts with foreign space agencies, has reported profits ranging from Rs 19 crore to Rs 169 crore between 2002 and 2009.

This is a pittance compared to what Arianespace made in 2013 alone: EUR 680.1 million. A reliable launch vehicle to the geostationary-transfer orbit can change this for the better and position ISRO as a serious contender in the space-launch sector, assuming it is accompanied by a more efficient Antrix and an ISRO that is willing to work with foreign counterparts, both private and governmental.

It must also consider expanding its launch capabilities at the geostationary-transfer orbit and prepare to keep up with the 5-15% annual growth the satellite industry has recorded over the last five years. Now is an opportune time, too, to climb aboard: the agency’s flags are flying high on the success of the Mars Orbiter Mission.

Facing other challenges

ISRO has to be ready to confront the likes of SpaceX, a space transport services company which already has the Falcon 9 rocket that can launch 13,150 kilograms to low-earth orbit and 4,850 kilograms to the geostationary-transfer orbit at starting costs of $57 million per launch.

On another front, ISRO will have to move the public dialogue away from its fixation on big science missions and toward less grandiose but equally significant ones. These will help establish the space agency’s mettle in reliably executing higher-altitude launches, enhancing India’s capabilities in the space-launch and space-research sectors. They will also, in turn, make high-cost missions more meaningful than simple proofs of concept.

For example, ISRO Chairman K Radhakrishnan has announced that a project report compiled by the agency envisages a Rs 12,400-crore manned space mission by 2021. Over the next seven years, then, ISRO aims to master re-entry technology, human spaceflight and radiation protection. This will happen not just through repeated test flights and launches of crew modules but also through satellites, space-borne observatories and data analysis.

For all these reasons, the GSLV Mk-III marks an important step for ISRO: it will expose the agency to greater competition from European and American launchers, increase its self-reliance – obliging it to justify its growing launch capabilities with well-integrated projects – and help it establish a legacy over and beyond the jugaad that took it to Mars.

The Mars Orbiter Mission was launched around the same time as NASA’s MAVEN mission to Mars, and with comparable instrumental specifications. While MOM cost ISRO $74 million, MAVEN cost NASA $672 million. In fact, ISRO’s orbiter was by far the least expensive Mars satellite ever built.

A voyager on an unknown sea

Early 2012. The Voyager 1 space-probe is billions of kilometres beyond the orbit of the dwarf planet Pluto, in a region of space strewn with scattered rocks and constantly perturbed by charged particles streaming in from outer space. Has it left the Solar System, then? Nobody is sure.

Late 2012. Scientists still aren’t sure if Voyager 1 has crossed over into the interstellar medium. The ISM is the region of the universe between stars, where the probe would definitely have been outside the Solar System. The probe’s power supply had been dwindling for a while, and the one instrument on board that could have ‘sniffed’ at the charged particles and settled where the probe was had long been dead. Only something like luck could save the day.

June 2013. Three papers published in Science discuss changes in the magnetic fields around the probe. Some measurements indicate Voyager 1 is in the ISM. Others say it’s just entered a new region of space, a ‘transition zone’ between the Solar System’s outermost fringes and the first tastes of the universe beyond.

August 2013. Luck finally struck. A storm on the surface of the Sun had ejected a massive burst of its own charged particles, way back in March 2012. They coursed in waves throughout the Solar System. When the waves met the charged particles Voyager 1 was swimming in, they set the plasma ringing – a twang in the electromagnetic field that instruments still working aboard the probe could pick up. The pitch of that ringing betrayed a plasma far denser than the solar wind’s, the kind found only in the interstellar medium. It was confirmation that Voyager 1 was out and away.

September 2013. The announcement was made to much celebration.

But in December 2014, there was a surprise.

Tsunamis

When a burst of charged particles from the Sun, called a coronal mass ejection, meets the sea of charged particles in the ISM, it’s like a big wave hitting a placid shore. There is a tsunami, a disturbance spreading outward like ripples in water. Scientists didn’t know how potent these tsunamis could be, but assumed not very, given the distances and timescales involved.

They were wrong. On December 15, NASA reported that Voyager 1 was still recording the effects of a tsunami that had been unleashed ten months earlier, in February. As Don Gurnett, professor of physics at the University of Iowa, noted, “Most people would have thought the interstellar medium would have been smooth and quiet. But these shock waves seem to be more common than we thought.”

Just like a small ball floating on the surface of a pond bobs up and down when ripples pass under it, Voyager 1’s instruments pick up a bobbing of the electromagnetic field around it. These oscillations can be translated to and relayed as a sound with rising and falling pitches. Listen to it here.

One of the telltale signs that Voyager 1 is in interstellar space is that the sea of particles – or plasma – it’s cruising through gets thicker, as if more viscous. Based on observations, the plasma density has been increasing the farther out Voyager 1 goes. “Is that because the interstellar medium is denser as Voyager moves away from the heliosphere, or is it from the shock wave itself? We don’t know yet,” said Ed Stone, project scientist for the Voyager mission at Caltech.
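The ‘pitch’ and the plasma density are linked by a standard relation: the frequency of electron plasma oscillations grows with the square root of the electron density. A minimal sketch, with illustrative densities for the outer heliosphere and the local interstellar medium (typical literature values, not Voyager’s exact readings):

```python
import math

def plasma_frequency_hz(n_e_per_cc: float) -> float:
    """Electron plasma frequency: f_p ~ 8980 * sqrt(n_e) Hz, n_e in cm^-3."""
    return 8980 * math.sqrt(n_e_per_cc)

# Illustrative densities: outer heliosphere vs. local interstellar medium.
for label, n_e in [("heliospheric plasma", 0.002), ("interstellar plasma", 0.08)]:
    print(f"{label}: n_e = {n_e} cm^-3 -> f_p = {plasma_frequency_hz(n_e):.0f} Hz")
# The jump in pitch (~400 Hz to ~2,500 Hz here) is the denser medium
# 'sounding' at a higher frequency as ripples pass through it.
```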

If you’ve listened to the audio file, you’ll know how eerie it sounds. The Sun’s coronal mass ejections behave like a lighthouse in this sense. As their light – in the form of charged particles – sweeps through space, the little boat called Voyager 1 finds its way in a rough and uncharted sea, one bob at a time. Here’s to the Sun keeping it going.

 

A new LHC: 10 things to look out for

Through an extraordinary routine, the most powerful machine built by humankind is slowly but surely gearing up for its relaunch in March 2015. The Large Hadron Collider (LHC), straddling the national borders of France and Switzerland, will reawaken after two years of upgrades and fixes to smash protons at nearly twice the energy it did during its first run, which ended in early 2013. Here are 10 things to look out for: five upgrades and five possible exciting discoveries.

Technical advancements

  1. Higher collision energy – In its previous run, each beam of protons destined for collision with other beams was accelerated to 3.5-4 TeV. By May 2015, each beam will be accelerated to 6.5-7 TeV. By doubling the collision energy, scientists hope to be able to observe higher-energy phenomena, such as heavier, more elusive particles.
  2. Higher collision frequency – Each beam has bunches of protons that are collided with other oncoming bunches at a fixed frequency. During the previous run, this frequency was once every 50 nanoseconds. In the new run, this will be doubled to once every 25 nanoseconds. With more collisions happening per unit time, rarer phenomena will happen more frequently and become easier to spot.
  3. Higher instantaneous luminosity – This is a measure of how many collisions the machine can deliver per unit area per second. It will be increased tenfold, to 1 × 10^34 per cm² per second; by 2022, engineers aim to push it to 7.73 × 10^34 per cm² per second. (A sketch after this list shows what these numbers mean for event rates.)
  4. New pixel sensors – An extra layer of pixel sensors, to handle the higher luminosity regime, will be added around the beam pipe within the ATLAS and CMS detectors. While the CMS was built with higher luminosities in mind, ATLAS wasn’t, and its pixel sensors are expected to wear out within a year. As an intermediate solution, a temporary layer of sensors will be added to last until 2018.
  5. New neutron shields – Because of the doubled collision energy and frequency, instruments could be damaged by high-energy neutrons flying out of the beam pipe. To prevent this, advanced neutron shields will be screwed on around the pipe.
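As promised in the luminosity item above, here is what those numbers mean in practice: the expected rate of any process is the instantaneous luminosity multiplied by that process’s cross-section. A minimal sketch, using an assumed ballpark Higgs-production cross-section of 50 picobarns rather than any official LHC figure:

```python
# Event rate = instantaneous luminosity x cross-section.
LUMINOSITY = 1e34            # cm^-2 s^-1, the new run's target quoted above
PICOBARN_TO_CM2 = 1e-36      # 1 pb = 1e-36 cm^2

higgs_xsec_pb = 50           # assumed ballpark Higgs production cross-section
rate_hz = LUMINOSITY * higgs_xsec_pb * PICOBARN_TO_CM2
print(f"~{rate_hz:.1f} Higgs bosons per second")   # ~0.5 per second

# Bunches crossing every 25 ns means a 40 MHz crossing rate; even at this
# luminosity, a process like this occurs in only a tiny fraction of crossings.
crossing_rate_hz = 1 / 25e-9
print(f"{rate_hz / crossing_rate_hz:.1e} Higgs per bunch crossing")
```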

Research advancements

  1. Dark matter – The LHC is adept at finding previously unseen particles, both fundamental and composite. One area of physics desperately looking for a particle of its own is dark matter. It’s only natural for both quests to converge at the collider. A leading candidate particle for dark matter is the WIMP: the weakly-interacting massive particle. If the LHC finds it, or finds something like it, it could be the next big thing after the Higgs boson, perhaps bigger.
  2. Dark energy – The universe is expanding at an accelerating pace. A uniform field of energy pervading it throughout, called the dark energy field, is thought to drive this expansion. The source of dark energy’s potential is the vacuum of space, where extremely short-lived particles continuously pop in and out of existence. But the vacuum potential that particle physics calculations predict is about 10^120 times larger than what observations of the expansion imply – a colossal mismatch. At the LHC, the study of fundamental particles could drive a better understanding of what the vacuum actually holds and where dark energy’s potential comes from.
  3. Supersymmetry – The Standard Model of particle physics defines humankind’s understanding of the behavior of all known fundamental particles. However, some of their properties are puzzling. For example, some natural forces are too strong for no known reason; some particles are too light. To address this, physicists have a theory of particulate interactions called supersymmetry, SUSY for short. SUSY predicts the existence of particles missing from the Model, called supersymmetric partners. These are heavy particles that could show themselves in the LHC’s new higher-energy regime. As with the dark matter WIMPs, finding a SUSY particle could be a Nobel Prize-winner.
  4. Higgs boson – One particle that’s too light in the Standard Model is the Higgs boson. As a result, physicists think it might not be the only Higgs boson out there. Perhaps there are others with the same properties that weigh less or more.
  5. Antimatter reactions – Among the class of particles called mesons, one – designated B0 – holds the clue to answering a question that has stymied astrophysicists for decades: Why does the universe have more matter than antimatter if, when it first came into existence, there were equal amounts of both? An older result from the LHC shows the B0 meson decays into more matter particles than antimatter ones. Probing why this is so will be another prominent quest of the LHC’s.

Bonus: Extra dimensions – Many alternate theories of fundamental particles require the existence of extra dimensions. The way to look for them is to create extremely high energies and then look for particles that might pop into one of the three dimensions we occupy from another that we don’t.

Sauron’s singularity: Sucking in light but lighting up the universe

This composite image shows the central region of the spiral galaxy NGC 4151, dubbed the ‘Eye of Sauron’ by astronomers for its similarity to the eye of the malevolent character in ‘The Lord of the Rings’. Image: X-ray: NASA/CXC/CfA/J.Wang et al.; Optical: Isaac Newton Group of Telescopes, La Palma/Jacobus Kapteyn Telescope, Radio: NSF/NRAO/VLA

When heavier stars run out of hydrogen to fuse into helium, the fusion reactions that keep them from imploding under their own gravity become progressively harder to sustain – fusing helium into heavier elements yields ever-diminishing returns – and eventually stop. At this stage, such a star blows away its outermost layers of gas and collapses into a neutron star (if the parent star is heavy enough, the neutron star collapses further into a black hole).

The neutron star is an extremely dense, rapidly rotating body composed mostly of neutrons and ridden with powerful magnetic fields. These magnetic fields accelerate particles on and around some neutron stars and eject them in beams from the poles. Because the star is spinning, these beams periodically point toward and away from Earth, making them look like flashing points of light in the night sky.

For this reason, such neutron stars are called pulsars, and pulsars are used as ‘cosmic candlesticks’: relatively fixed points of light that astronomers use to gauge distances in the universe. Pulsars can remain stable for 10-100 million years, which also makes them timekeepers on par with atomic clocks.

The keys to their usefulness to observers are the stability and distinctness of the beams. Astronomers would gladly use any other natural object the same way, if only it emitted radiation that was long-lasting and distinguishable from the other light in the universe. Now, they might have a new kind of candidate, starting with the ‘Eye of Sauron’.

That’s the common name of the galaxy NGC 4151, located about 40 million light-years from Earth. A group of Danish astrophysicists have measured the distance between the supermassive black hole at the heart of this galaxy and Earth by studying how it is heating up gas clouds and a ring of dust that surround it.

The clouds are heated as they’re compressed by the black hole’s intense gravitational pull. In the process, they emit ultraviolet radiation. The UV radiation then heats up a ring of dust orbiting the black hole at a large distance, and in turn the ring emits infrared radiation. Effectively, thanks to the thermal cascade, there are two concentric ‘zones’ of radiation around the singularity.

Astronomers from the Niels Bohr Institute at the University of Copenhagen used the twin Keck Telescopes in Hawaii and this effect to their advantage when trying to measure how far the black hole is from Earth. Darach Watson, an associate professor at the institute, explained,

Using telescopes on Earth, we [measured] the time delay between the ultraviolet light from the black hole and the subsequent infrared radiation emitted from the dust cloud.

Keeping in mind the speed of light, Watson’s team calculated the delay to be 30 light-days, corresponding to a distance of about 777 billion km between the cloud of irradiated gas and the ring of dust.
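The delay-to-distance conversion is just the speed of light multiplied by the time delay; a two-line check of the figure above:

```python
# Distance = speed of light x time delay.
C_KM_PER_S = 299_792.458
delay_days = 30
distance_km = C_KM_PER_S * delay_days * 86_400  # 86,400 seconds per day
print(f"{distance_km:.3g} km")  # ~7.77e11 km, i.e. about 777 billion km
```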

If this weren’t cool enough, the astronomers then used a technique from 19th century (a.k.a. high school) optics to measure the distance between the black hole itself and Earth.

The most powerful astronomical telescopes are not built to observe electromagnetic radiation at all wavelengths because their resolution depends on the wavelength of the radiation they’re observing. Specifically, a telescope with a fixed lens diameter resolves finer detail – a smaller, and therefore better, angular resolution – when observing radiation of shorter wavelengths. So each of the 10-meter-wide Keck Telescopes has an angular resolution of 8.75 arc-seconds when observing infrared emissions but 1.6 arc-seconds when observing UV light: a 5.4-fold improvement.
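The scaling behind that factor is the diffraction limit, θ ≈ 1.22λ/D: for a fixed aperture D, angular resolution is proportional to wavelength. A minimal sketch, assuming representative wavelengths of 2.2 μm for the near-infrared and 0.4 μm for the near-UV (illustrative choices, not the study’s exact bands):

```python
# Diffraction limit: theta = 1.22 * wavelength / aperture. For a fixed
# aperture, the resolution ratio reduces to the wavelength ratio.
wavelength_ir_m = 2.2e-6   # assumed near-infrared band
wavelength_uv_m = 0.4e-6   # assumed near-UV band
print(f"resolution gain at UV: {wavelength_ir_m / wavelength_uv_m:.1f}x")  # ~5.5x
```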

But what makes Keck much better is a technique called interferometry. The two telescopes are separated by 85 meters, which makes their collective effective lens diameter 85 meters. Computers correct for the interference pattern that arises because light reaches the two lenses at slightly different times, giving rise to an image of the object as if it were observed by a single 85-meter-wide telescope.

Related: What is Very Long Baseline Interferometry?

Using interferometry, Watson and his colleagues were able to measure the apparent diameter of the entire dust ring. As a result, they had two fixed quantities in the night sky: the physical distance between the ring and the cloud of gas, and the angular width of the ring. Finding the black hole’s distance from Earth was then a matter of simple trigonometry, and one calculation later, the astronomers had their answer: 62 million light-years.
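That ‘simple trigonometry’ is the small-angle relation: an object of known physical size s spanning a measured angle θ lies at distance D = s/θ. A sketch using the ring radius from the light echo and a hypothetical angular radius of 0.27 milli-arc-seconds (a made-up but plausible value; the paper’s measured figure isn’t quoted in this piece):

```python
import math

LY_M = 9.461e15                       # metres per light-year
MAS_TO_RAD = math.radians(1) / 3.6e6  # milli-arc-seconds to radians

ring_radius_m = 777e9 * 1e3             # the 30-light-day radius above, in metres
angular_radius_rad = 0.27 * MAS_TO_RAD  # hypothetical interferometric measurement

distance_ly = ring_radius_m / angular_radius_rad / LY_M
print(f"~{distance_ly / 1e6:.0f} million light-years")  # ~63 million, near the published 62
```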

Clouds of gas and rings of dust are common around supermassive black holes, which often reside at the center of large galaxies (the one at the Milky Way’s center is called Sagittarius A*). This means the ‘Eye of Sauron’ needn’t be an uncommon occurrence and could instead join pulsars in holding up candles in space’s dark for astronomers.

And coolness wasn’t the only outcome of the Niels Bohr Institute group’s experiment. Their work heralds a long-sought element of precision missing until now in measuring the masses of black holes. As Watson explained, again,

The calculations of the mass of the supermassive black holes at the heart of galaxies depends on two main factors: the rotational speed of the stars in the galaxy and how far it is from the black hole to the stars. The rotational speed can be observed and the distance from the black hole out to the rotating disc of stars can now be calculated precisely using the new method.

Watson & co. were able to find that the ‘Eye of Sauron’ was 40% heavier than expected.

So, not just coolness…


… but also awesomeness.

Hardy DNA could mean we’re aliens

A TEXUS mission sounding rocket taking off in March 2011 from Kiruna, Sweden. Image: Adrian Mettauer

A team of European scientists have shown that DNA molecules can withstand the rough temperatures and pressures that rockets experience when they reenter Earth’s atmosphere from space. Their finding is important from the perspective of meteorites and other space rocks that crash on Earth. Many scientists think such objects could once have seeded our planet with the first molecules of life, billions of years ago.

The scientists had attached bits of plasmid DNA – the part physically separated from chromosomal DNA in biological cells and capable of reproducing independently – on 15 different parts of the outer shell of a TEXUS mission sounding rocket (powered by the Brazilian VSB-30 motor). On March 29, 2011, the rocket took off from the European Space and Sounding Rocket Range near Kiruna, Sweden, for a suborbital flight that exposed the DNA to the vacuum and low temperatures of space before shooting back toward Earth, exposing the samples to friction against the atmosphere.

The entire flight lasted 780 seconds and reached a height of 268 km. While going up, the acceleration maxed at 13.5 g and while coming down, 17.6 g. When outside Earth’s atmosphere, the rocket and samples also experienced about 378 seconds of microgravity. The maximum temperature experienced during atmospheric reentry was just below 130 degrees Celsius on the surface of the rocket; the gases in the air around the samples attached to the sides of the rocket could have reached 1,000 degrees Celsius.

A schematic showing the design of the TEXUS-49 payload and the various positions at which the DNA samples were attached. For full caption, see footnote. Image: Screenshot from paper

Promising results

In all, a maximum of 53% of the DNA could be recovered intact and 35% was fully biologically functional. Analysis also showed that “DNA applied to the bottom side of the payload had the highest degree of integrity followed by the samples applied in the grooves of the screw heads”, according to the study paper. It was published in PLOS ONE on November 26.

The DNA molecules’ ability to retain biological function was then gauged by observing how many bacterial colonies each of the 15 samples could engender per nanogram. A transformation efficiency of 100% was set at 1,955 colonies/nanogram – what an unflown bit of plasmid DNA could achieve.

Curiously, for sample #1, which was attached on the side of the rocket where there was minimum shielding especially during atmospheric reentry, 69 colonies/nanogram were identified. The highest density of colonies was for sample #10, which was attached in the grooves of screw-heads on the rocket: 1,368/nanogram.
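Set against the 1,955 colonies/nanogram benchmark, those counts translate directly into relative transformation efficiencies; a minimal sketch of the arithmetic:

```python
# Relative transformation efficiency vs. the unflown-plasmid benchmark.
BENCHMARK = 1955  # colonies per nanogram for unaffected plasmid DNA

samples = {
    "sample #1 (exposed side)": 69,
    "sample #10 (screw-head grooves)": 1368,
}
for name, colonies_per_ng in samples.items():
    print(f"{name}: {100 * colonies_per_ng / BENCHMARK:.0f}% efficiency")
# ~4% for the exposed sample, ~70% for the shielded one.
```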

“We were totally surprised,” said Cora Thiel and Oliver Ullrich, coauthors of the study and biologists at the University of Zurich, in a statement. “Originally, we designed this experiment as a technology test for biomarker stability during spaceflight and reentry. We never expected to recover so many intact and functional active DNA.”

Last molecule standing

It’s clear that the damage inflicted on the DNA samples by the harsh conditions of acceleration, microgravity, temperature fluctuations, solar radiation and cosmic rays was not sufficient to stop the molecules from retaining their biological functions. In fact, this study raises the bar on the survivability of life’s molecules: they may not be as fragile as we like to think.

Scientists have known temperature to be the most effective destroyer of DNA double-strands. Studies in the past have shown that the molecules weren’t able to withstand more than 95 degrees Celsius for more than five minutes without becoming denatured. During the TEXUS-49 mission, bacterial plasmid DNA temporarily withstood up to 130 degrees Celsius, maybe more.

By extension, it is not inconceivable that a fragment of a comet could have afforded any organic molecules on-board the same kind of physical shielding that a TEXUS-49 sounding rocket did. Studies dating from the mid-1970s have also shown that adding magnesium chloride or potassium chloride to the DNA further enhances its ability to withstand high temperatures without breaking down.

How big a hurdle is that out of the way? Pretty big. If DNA can put itself through as much torture and live to tell the tale, there’s no need for it to have been confined to Earth, trapped under the blanket of its atmosphere. In fact, in 2013, scientists from the Indian Centre for Space Physics were able to show, through computer simulations, that biomolecules like DNA bases and amino acids are capable of being cooked up in the interstellar medium – the space between stars – where they could latch on to passing comets or asteroids and hitch a ride into the Solar System.

According to the study, published in New Astronomy in April 2013, cosmic rays from stars can heat up particles in the interstellar medium and promote the formation of so-called precursor molecules – such as methyl isocyanate, cyanamide and cyanocarbene – which then go on to form amino acids. The only conditions the team presupposed were a particle density of 10,000-100,000 per cubic centimeter and an ambient temperature of 10 kelvin; under these, they calculated, about 1 gram of amino acids could be present in every 10^14 kg of matter.

Compared to the mass density of the observable universe (9.9 × 10^-27 kg/m³), that predicted density of amino acids, if true, is quite high. So, the question arises: Could we be aliens?

The first experiments

The first studies to entertain this possibility and send hapless living things to space and back began as far back as 1966, in the early days of the Space Age, alongside the Gemini IX and XII missions. Prominent missions since then include the Spacelab 1 launch (1983), the Foton 9, 11 and 12 rockets (1994-1999), the Foton M2 and M3 missions (2005-2007) and ISS EXPOSE-R mission (2009-2011). The Foton launches hosted the STONE and BIOPAN missions, which investigated if microbial lifeforms such as bacteria and fungi could survive conditions in space, such as a low temperature, solar radiation and microgravity.

Through most of these missions, scientists found that the damage to lifeforms often extended down to the DNA level. Now, we’re one step closer to understanding exactly what kind of damage is inflicted, and whether there are simple ways to fend it off, such as the addition of salts.

The STONE-5 mission (2005) was particularly interesting because it also tested how rocks – proxies for meteorites – would behave during atmospheric reentry. It was found that the surface of a rock reached temperatures of more than 1,800 degrees Celsius. However, mission scientists concluded that if the rock layer had been thick enough to provide insulation (at least 5 mm, as during this test, or 2 cm during STONE-6), the innards could survive.

Fragment of the Murchison meteorite (at right) and isolated individual particles (shown in the test tube). Image: Wikimedia Commons

In the same vein, the ultimate experiments – though not performed by humans – could have been the Murchison meteorite that crashed near a town of the same name in Australia in 1969 and the Black Beauty, a rock of Martian origins, that splintered over the Sahara a thousand years ago. The Murchison meteorite was found to contain more than 70 different amino acids, only 19 of which are found on Earth. The Black Beauty was found to be 4.4 billion years old and made of sediments, signalling that a young Mars did have water.

Their arrivals’ prime contribution to humankind was that they turned our eyes skyward in the search of our origins. The experiments conducted with the TEXUS-49 mission keep them there.


Full caption for second image: (a) Scheme of the TEXUS-49 payload with DNA sample 1–12 application sites. (b) Plasmid DNA samples 1–12 were applied on the outside of the TEM (TEXUS Experiment Module) EML 4. (c I) DNA samples 1–4 were applied circularly at 0, 90, 180 and 270 degrees directly on the surface of the payload; DNA samples 5–12 were also applied 90 degrees apart, in the screw heads of the payload. (c II) DNA samples 13–15 were applied directly on the payload surface at the bottom side. (d) DNA samples 1–4 were pipetted directly on the surface and locations were marked with a pen. (e) DNA samples 5–12 were applied in the grooves of the screw heads. (f) DNA samples 13–15 were applied directly on the payload surface on the bottom side and locations were marked with a pen.

Hardy DNA could mean we're aliens

The TEXUS mission sounding rocket taking off in March 2011 from Kiruna, Sweden.
A TEXUS mission sounding rocket taking off in March 2011 from Kiruna, Sweden. Image: Adrian Mettauer

A team of European scientists have shown that DNA molecules can withstand the rough temperatures and pressures that rockets experience when they reenter Earth’s atmosphere from space. Their finding is important from the perspective of meteorites and other space rocks that crash on Earth. Many scientists think such objects could once have seeded our planet with the first molecules of life, billions of years ago.

The scientists had attached bits of plasmid DNA – the part physically separated from chromosomal DNA in biological cells and capable of reproducing independently – on 15 different parts of the outer shell of a TEXUS mission sounding rocket (powered by the Brazilian VSB-30 motor). On March 29, 2011, the rocket took off from the European Space and Sounding Rocket Range near Kiruna, Sweden, for a suborbital flight that exposed the DNA to the vacuum and low temperatures of space before shooting back toward Earth, exposing the samples to friction against the atmosphere.

The entire flight lasted 780 seconds and reached a height of 268 km. While going up, the acceleration maxed at 13.5 g and while coming down, 17.6 g. When outside Earth’s atmosphere, the rocket and samples also experienced about 378 seconds of microgravity. The maximum temperature experienced during atmospheric reentry was just below 130 degrees Celsius on the surface of the rocket; the gases in the air around the samples attached to the sides of the rocket could have reached 1,000 degrees Celsius.

A schematic showing the design of the TEXUS-49 payload and the various positions at which the DNA samples were attached.
A schematic showing the design of the TEXUS-49 payload and the various positions at which the DNA samples were attached. For full caption, see footnote. Image: Screenshot from paper

Promising results

In all, a maximum of 53% of the DNA could be recovered intact and 35% was fully biologically functional. Analysis also showed that “DNA applied to the bottom side of the payload had the highest degree of integrity followed by the samples applied in the grooves of the screw heads”, according to the study paper. It was published in PLOS ONE on November 26.

The ability of the DNA molecules to sustain life was then recorded by observing how many bacterial colonies each of the 15 samples could engender per nanogram. The 100% transformation efficiency was set at 1,955 colonies/nanogram, which was what an unaffected bit of plasmid DNA could achieve.

Curiously, for sample #1, which was attached on the side of the rocket where there was minimum shielding especially during atmospheric reentry, 69 colonies/nanogram were identified. The highest density of colonies was for sample #10, which was attached in the grooves of screw-heads on the rocket: 1,368/nanogram.

“We were totally surprised,” said Cora Thiel and Oliver Ullrich, coauthors of the study and biologists at the University of Zurich, in a statement. “Originally, we designed this experiment as a technology test for biomarker stability during spaceflight and reentry. We never expected to recover so many intact and functional active DNA.”

Last molecule standing

It’s clear that the damage inflicted on the DNA samples by the harsh conditions of acceleration, microgravity, temperature fluctuations, solar radiation and cosmic rays may not have been sufficient in deterring the molecules from retaining their biological functions. In fact, this study imposes new lower limits on the survivability of life: it may not be as fragile as we like to think it is.

Scientists have known temperature to be the most effective destroyer of DNA double-strands. Studies in the past have shown that the molecules weren’t able to withstand more than 95 degrees Celsius for more than five minutes without becoming denatured. During the TEXUS-49 mission, bacterial plasmid DNA temporarily withstood up to 130 degrees Celsius, maybe more.

By extension, it is not inconceivable that a fragment of a comet could have afforded any organic molecules on-board the same kind of physical shielding that a TEXUS-49 sounding rocket did. Studies dating from the mid-1970s have also shown that adding magnesium chloride or potassium chloride to the DNA further enhances its ability to withstand high temperatures without breaking down.

How big a hurdle is that out of the way? Pretty big. If DNA can put itself through as much torture and live to tell the tale, there’s no need for it to have been confined to Earth, trapped under the blanket of its atmosphere. In fact, in 2013, scientists from the Indian Center for Space Physics were able to show, through computer simulations, that biomolecules like DNA bases and amino acids are capable of being cooked up in the interstellar medium – the space between stars – where they could latch on to trespassing comets or asteroids and bring themselves into the Solar System.

According to the study, published in New Astronomy in April 2013, cosmic rays from stars can heat up particles in the interstellar medium and promote the formation of so-called precursor molecules – such as methyl isocyanate, cyanamide and cyanocarbene – which then go on to form amino acids. The only conditions his team presupposed were a particle density of 10,000-100,000 per cubic centimeter and an ambient temperature of 10 kelvin to say about 1 gram of amino acids could be present in 1014 kg of matter.

Compared to the mass density of the observable universe (9.9 × 10-27 kg/m3), that predicted density of amino acids, if true, is quite high. So, the question arises: Could we be aliens?

The first experiments

The first studies to entertain this possibility and send hapless living things to space and back began as far back as 1966, in the early days of the Space Age, alongside the Gemini IX and XII missions. Prominent missions since then include the Spacelab 1 launch (1983), the Foton 9, 11 and 12 rockets (1994-1999), the Foton M2 and M3 missions (2005-2007) and ISS EXPOSE-R mission (2009-2011). The Foton launches hosted the STONE and BIOPAN missions, which investigated if microbial lifeforms such as bacteria and fungi could survive conditions in space, such as a low temperature, solar radiation and microgravity.

Through most of these missions, scientists were able to find that the damage to lifeforms often extended down to the DNA-level. Now, we’re one step closer to understanding exactly what kind of damage is inflicted, and if there are simple ways for them to be fended off like with the addition of salts.

The STONE-5 mission (2005) was particularly interesting because it also tested how rocks would behave during atmospheric reentry, being a proxy for meteorites. It was found that the surface of a rock reached temperatures of more than 1,800 degrees Celsius. However, mission scientists concluded that if the rock layer had been thick enough (at least more than 5 mm as during the test, or 2 cm during STONE-6) to provide insulation, the innards could survive.

Fragment of the Murchison meteorite (at right) and isolated individual particles (shown in the test tube). Image: Wikimedia Commons

In the same vein, the ultimate experiments – though not performed by humans – could have been the Murchison meteorite, which crashed near a town of the same name in Australia in 1969, and the Black Beauty, a rock of Martian origin that splintered over the Sahara a thousand years ago. The Murchison meteorite was found to contain more than 70 different amino acids, only 19 of which are found on Earth. The Black Beauty was found to be 4.4 billion years old and made of sediments, signalling that a young Mars did have water.

Their arrivals’ prime contribution to humankind was that they turned our eyes skyward in search of our origins. The experiments conducted with the TEXUS-49 mission keep them there.


Full caption for second image: (a) Scheme of the TEXUS-49 payload with the application sites of DNA samples 1–12. (b) Plasmid DNA samples 1–12 were applied on the outside of the TEM (TEXUS Experiment Module) EML 4. (c-I) DNA samples 1–4 were applied in a circle, at 0, 90, 180 and 270 degrees, directly on the surface of the payload; DNA samples 5–12 were likewise applied 90 degrees apart, in the screw heads of the payload. (c-II) DNA samples 13–15 were applied directly on the payload surface on the bottom side. (d) DNA samples 1–4 were pipetted directly on the surface and their locations marked with a pen. (e) DNA samples 5–12 were applied in the grooves of the screw heads. (f) DNA samples 13–15 were applied directly on the payload surface on the bottom side and their locations marked with a pen.

How Venus could harbor life: supercritical carbon dioxide

The dark spot of Venus crossed our parent star in 2012. Pictured above during the transit, the Sun was imaged in three colors of ultraviolet light by the Earth-orbiting Solar Dynamics Observatory. Image: NASA/SDO & the AIA, EVE, and HMI teams

A new study published in the online journal Life says a hotter, pressurized form of carbon dioxide could harbor life in a way similar to how water does on Earth. This is an interesting find, theoretical though it is, because it might obviate the need for water to be present for life to exist on other planets. In fact, of the more than 2,700 exoplanet candidates, more than 2,000 are massive enough to have such carbon dioxide present on their surfaces.

At about 305 kelvin and 73 times Earth’s atmospheric pressure, carbon dioxide becomes supercritical, a form of matter that exhibits the physical properties of both liquids and gases. Its properties in this state are very different from those in its common state – in the same way highly pressurized water is acidic but normal water isn’t. Supercritical carbon dioxide is often used as a sterilization agent because it can deactivate microorganisms quickly at low temperatures.
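
Those two thresholds together define carbon dioxide’s critical point. As a minimal sketch, assuming the textbook critical values for CO2 (about 304.13 K and 7.38 MPa, roughly 72.8 Earth atmospheres), checking whether a given set of conditions crosses it takes only a few lines:

```python
# Minimal check of whether given conditions put CO2 past its critical point.
# The critical-point values are textbook figures for CO2, not from the Life paper.

CO2_CRITICAL_TEMP_K = 304.13       # kelvin
CO2_CRITICAL_PRESSURE_PA = 7.38e6  # pascals (~72.8 Earth atmospheres)

def is_supercritical(temp_k: float, pressure_pa: float) -> bool:
    """True when both temperature and pressure exceed CO2's critical point."""
    return temp_k > CO2_CRITICAL_TEMP_K and pressure_pa > CO2_CRITICAL_PRESSURE_PA

# The conditions quoted in the text: about 305 K and 73 Earth atmospheres.
print(is_supercritical(305, 73 * 101325))  # True
```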

As the study’s authors found, some enzymes are more stable in supercritical carbon dioxide because it contains no water. This anhydrous property also enables a “molecular memory” in the enzymes, whereby they ‘remember’ their acidity from previous reactions to guide the future construction of organic molecules more easily. Moreover, as stated in the paper,

… the surface tension in carbon dioxide is much lower than that of water, whereas the diffusivity of solutes in scCO2 is markedly higher [because of lower viscosity]. Thus, scCO2 can much easier penetrate [cell membranes] than subcritical fluids can.

The easiest way – though still a difficult one – to check if life could exist in supercritical carbon dioxide naturally is to probe the oceans at about a kilometer’s depth, where pressures are sufficient to sustain pockets of supercritical fluids. As the authors write in their paper, supercritical carbon dioxide is less dense than water, so such pockets could be trapped under rocky formations, which in turn could be probed for signs of life.

A similarly accessible place to investigate would be at shallow depths below the surface of Venus. Carbon dioxide is abundant on Venus and the planet has the hottest surface in the Solar System. Pressures at shallow depths could then sustain supercritical carbon dioxide. Dirk Schulze-Makuch, a coauthor of the paper and an astrobiologist at Washington State University, notes,

An interesting twist is that Venus was located in the habitable zone of our Solar System in its early history. [He and his coworkers] suggested the presence of an early biosphere on the surface of this planet, before a run-away greenhouse effect made all life near the Venusian surface all but impossible.

The probability that Venus could once have harbored life is as strange as it is fascinating. In fact, if further studies indicate that supercritical carbon dioxide can play the role of a viable bio-organic solvent, the implications will stretch to anywhere a super-Earth or gas giant is found. Because its reactions with complex organic molecules such as amines will not be the same as water’s, the life-forms supercritical carbon dioxide could harbor will be different – perhaps more primitive and/or shorter-lived. We don’t know yet.

This study continues a persistent trend among astrobiologists, going back to the 1980s, of imagining, and then rationalizing, if and how life could take root in environments considered extreme on Earth. After NASA’s Kepler space telescope launched in 2009 and, in only four years of observation, yielded almost 4,100 exoplanet candidates (more than a thousand confirmed as of now), astrobiologists began to acquire a better picture of the natural laboratories their hypotheses had at their disposal, as well as of which hypotheses seemed more viable.

In August this year, Schulze-Makuch coauthored another paper, in Science, that discussed how a lake of asphalt in Trinidad harbors life despite a very low water content (13.5%), and what this says about the possibility of life on Saturn’s moon Titan, which exhibits a similar chemistry on its surface. The Science paper cited another study, from 2004, titled ‘Is there a common chemical model for life in the universe?’, which contains a pertinent paragraph about why the search for alien life is important as well as likely endless:

The universe of chemical possibilities is huge. For example, the number of different proteins 100 amino acids long, built from combinations of the natural 20 amino acids, is larger than the number of atoms in the cosmos. Life on Earth certainly did not have time to sample all possible sequences to find the best. What exists in modern Terran [i.e. Earth-bound] life must therefore reflect some contingencies, chance events in history that led to one choice over another, whether or not the choice was optimal.
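
That comparison is easy to sanity-check with arbitrary-precision arithmetic. The figure of roughly 10⁸⁰ atoms in the observable universe used below is a commonly quoted estimate, not a number from the 2004 paper.

```python
# Sanity check: distinct 100-residue proteins built from the 20 natural amino
# acids, versus atoms in the cosmos. The ~1e80 atom count is a standard rough
# estimate, not a figure from the quoted paper.

n_sequences = 20 ** 100     # every position is independently one of 20 residues
atoms_in_cosmos = 10 ** 80  # commonly quoted order-of-magnitude estimate

print(f"Possible sequences: ~1e{len(str(n_sequences)) - 1}")  # ~1e130
print(n_sequences > atoms_in_cosmos)                          # True
```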

SpaceShipTwo crash brings down Richard Branson with it

Virgin Galactic’s commercial spaceflight program was pushed back by more than a year after its SpaceShipTwo rocket-plane blew up mid-air during a test flight over the California desert on October 31, killing one of its two pilots. Before the incident, Virgin Galactic had planned to start operating suborbital flights as soon as 2015. The National Transportation Safety Board (NTSB), which is investigating the details of the accident, later said its inquiry would take until 2016 to conclude.

That date may be pushed back even further as far as Virgin Galactic is concerned, because new details have emerged that SpaceShipTwo had been plagued by numerous technical issues even before the test flight, issues that no one outside its engineering and contracting teams knew about. The Wall Street Journal reports,

Engineers and subcontractors working on SpaceShipTwo spent years wrestling with difficulties, ranging from inadequate rocket-motor thrust to problems in the flight-control system to structural deficiencies affecting the wings of the rocket’s carrier plane.

… Fixes were devised, flight tests were delayed and the result, [Virgin Galactic employees] said, was that some important elements of the project remained in flux for several years. It isn’t unusual for complex vehicles such as spacecraft and airliners to face repeated pitfalls and delays during development. Yet throughout the process, Virgin Galactic founder Richard Branson repeatedly announced timetables that were more aggressive than technical advances warranted, the people said.

Peter Siebold, the surviving co-pilot, told NTSB officials while in hospital that Mike Alsbury, who was killed, had prematurely unlocked the feathering mechanism and that he (Siebold) didn’t know about it. This violated the protocol that any unlocking be announced verbally, although it remains to be seen if Siebold simply didn’t hear the announcement. SpaceShipTwo boasts two ‘feathers’, one on each wing: hinged booms that can be rotated perpendicular to the wings to provide extra drag during atmospheric reentry (obviating the need for heat shields).

SpaceShipTwo schematic showing the raised feathers. Image: Virgin Galactic

Related: Feathering malfunction, not hyped motor, suspected in SpaceShipTwo crash

It is unclear why Branson would make such lofty promises, except for the money Virgin Galactic stood to make as one of the first commercial spaceflight operators. Tickets were to be priced at $250,000 apiece, with each flight ferrying six passengers and two crew. In fact, soon after the October 31 accident, the company’s CEO George Whitesides said in an interview to the Financial Times that another SpaceShipTwo model being built in New Mexico was 65% complete. Virgin Galactic would then have had at least two vehicles to operate in 2015 (it had announced in 2013 that it would eventually operate five). However, it is unknown if the second model is as problem-prone as the first one was.

Branson’s impatience to get started on a profitable note is brought out by another example. Earlier, media concerns had focused on the new hybrid engine SpaceShipTwo was using in the ill-fated test flight. Designed by the Sierra Nevada Corporation, it used a solid plastic-based fuel and a liquid nitrous oxide oxidizer. A previous version of the engine, which used a different, rubber-like fuel (HTPB), couldn’t provide enough bang to power the 60-foot vehicle, and Sierra Nevada had asked Branson to reduce the passenger limit. However, according to the WSJ, Branson declined, saying Virgin Galactic wouldn’t make money unless it flew six people per flight, prompting the switch to the plastic fuel.

In the week following the accident, the NTSB investigation managed to recover all the parts of the disintegrated SpaceShipTwo, which had broken up at an altitude of 50,000 feet (15.2 km). The lack of commensurate burn marks exonerated the engine, but the fragmentation of components meant the investigation would take at least a year to complete. Moreover, with news emerging that Virgin Galactic’s business practices could have overridden technical concerns during development, Richard Branson’s dream of commercial spaceflight even by 2016 could be pixie dust.