For coronavirus claims, there is a world between true and false

In high school, you must have learnt about Boolean algebra, possibly the most fascinating kind of algebra for its deceptive ease and simplicity. But thanks to its foundations in computer science, Boolean algebra – at least as we learnt it in school – is fixated on the states of ‘true’ and ‘false’, not on the state of ‘don’t know’ that falls in between. This state may not have many applications where the functioning of logic gates is concerned, but in the real world it is quite important, especially when the truth threatens to be spun out of control.

Amitabh Bachchan recently published a video in which he delivered a monologue claiming that when a fly alights on human faeces containing traces of the new coronavirus, then flies off and alights on some food, the food could also be contaminated by the same virus. The Wire Science commissioned a fact-check from Dr Deepak Natarajan, a reputed (and thankfully opinionated) cardiologist in New Delhi. In his straightforward article, Dr Natarajan presents evidence from peer-reviewed papers to argue that while we know the new coronavirus does enter the faeces of an infected person, we don’t know whether the virus remains viable there – that is, capable of precipitating an infection. And we know nothing of the participation of flies either.

The thing to remember here is that, during a panic – or in a pre-panic situation that constantly threatens to devolve into a panic – society at large has an unusually high uptake capacity for information that confirms its biases, irrespective of whether that information is true. This property, so to speak, amplifies the importance of ‘not knowing’.

Thanks to scientism, there is a common impression among many experts and most non-experts that science has, or could have, the answers to all questions that could ever be asked. So when a scientist says she does not know something, there is a pronounced tendency among some groups of people – particularly, if not entirely, those who may not be scientistic themselves but believe science itself is scientistic – to assume the lack of an answer means the absence of an answer. That is, to think “If the scientist does not have an answer, then the science does not have an answer”, rather than “If the scientist does not have an answer, then the science does not have an answer yet” or even “If the scientist does not have an answer yet, she could have an answer later”.

This response at a time of panic or pre-panic forces almost all information to be classified as either ‘true’ or ‘false’, precluding the agency science still retains to move towards a ‘true’ or ‘false’ conclusion, and rendering the truth-value of that information a foregone conclusion. That is, we need evidence to say something is true – but we also need to understand that saying something is ‘not true’, without outright saying it is ‘false’, is an important state of the truth itself.

Admitting ‘don’t know’ also forces the claimant to be more accountable. Here is one oversimplified but nonetheless illustrative example: when only ‘true’ and ‘false’ exist, any new bit of information has a 50% chance of being in one bin or the other. But when ‘not true/false’ or ‘don’t know’ is in the picture, new information has only a 33% chance of assuming any one truth value. Further, the only truth value based on which people should be allowed to claim something is true is ‘true’. ‘False’ has never been good enough, but ‘don’t know’ is not good enough either – which means that before we subject a claim to a test, it has a 66% chance of being ‘not true’.
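The arithmetic here can be checked with a toy simulation – a sketch that assumes, as the oversimplified example above does, that a new claim is equally likely to land in any of the available bins:

```python
import random

def chance_of_true(truth_values, trials=100_000, seed=42):
    """Estimate how often a randomly assigned claim lands in the
    'true' bin, given the set of available truth values."""
    rng = random.Random(seed)
    hits = sum(rng.choice(truth_values) == "true" for _ in range(trials))
    return hits / trials

# Two-valued world: 'true' or 'false'
p2 = chance_of_true(["true", "false"])                 # close to 0.50
# Three-valued world: 'true', 'false' or 'don't know'
p3 = chance_of_true(["true", "false", "don't know"])   # close to 0.33
```

With three bins, a claim’s prior chance of being ‘true’ drops to about a third – so its prior chance of being ‘not true’, in the sense above, is about two-thirds.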

Amitabh Bachchan’s mistake was to conflate ‘don’t know’ and ‘true’ without considering the possibility of ‘not true’, and has thus ended up exposing his millions of followers on Twitter to claims that are decidedly not true. As Dr Natarajan said, silence has never been more golden.

Lord of the Rings Day

An artist's impression of the iconic One Ring of 'Lord of the Rings' film trilogy.

A happy Lord of the Rings Day to you! (Previous editions: 2019, 2018, 2017, 2016, 2014)

Every year I pen a commemorative piece about Lord of the Rings, and share something about the books and films that I think about nearly every day. This year, I don’t have the strength, thanks to the workload due to the coronavirus pandemic, to say anything more than that you should take advantage of the lockdown – and the commute time it has likely saved you – to read more works of fantasy fiction.

It remains the single most rewarding thing in my life, even more than my blog, because fantasy – as I’ve said before, in quite clumsy terms – is fractal. It recapitulates itself, especially its careful – or deliberately and absurdly careless – inventiveness, demanding more answers of the writer than any other form of fiction ever could, simply because fantasy brings together three infinities: what is and what isn’t, the general attributes of all fiction, plus the preserve of ‘are you frigging kidding me’. Reading good fantasy is sure to give you ideas of your own, to push you towards (or away from) new worlds and new world-visions.

Fantasy is to my mind ergodic: riding its coattails, I get to visit all possibilities available to visit in the possibility-space of my mind; if I keep reading, I get to solipsistically encompass the worlds and world-visions of my fellow creators as well. Fantasy to me is newness, an endless font of it, in a world that has only been becoming more and more predictable; it is a secret place where goodness still lives, and on occasion even reaches a hand out and nudges me towards the right thing.

If I had been in Faramir’s shoes and stood before Denethor, bearing the full brunt of my father’s derision and being told he’d rather I had been killed instead of my brother, I would have done to him what he did to himself later: set him on fire. But Faramir rode out into a battle that he knew full well he was going to lose. Nothing about it was fair – just as nothing was fair about Anomander Rake’s tortuous, tortuous penance. Ours is a nasty world, and right and wrong aren’t always clear just as they might not have been to Faramir and Rake in moments of profound distress. In fact, the distinction is sometimes so blurry it might as well not be there.

When I’m lost for ideas, when I really don’t know what to do, when I would really like to just be told what I should do instead of having to think it up myself, I often turn to fantasy’s ideas about right and wrong, about what Faramir or Rake might have done, because fantasy is fundamentally empathetic in its alienness: its creations are often apart from this world – just as I feel sometimes, and you probably do too. It’s a place “infused with bright hope now so scarce in the realm of the real,” as a friend put it – a place to go when you don’t like this one (and from there to other places, picking and choosing what you like), and it’s a place that will let you go when you’d like to return, all in peace. The faith it demands is only the faith you’d like to give. What more could one want?

[Takes a break from the typing frenzy]

At least, good fantasy is all I want. And this Lord of the Rings Day, I invite you to take a short dip into a fantastic realm of your choice. If you’d like recommendations, I highly recommend starting with Lord of the Rings itself; if you’ve read that and want to try something more ambitious, try the Malazan Book of the Fallen series by Steven Erikson or Black Leopard, Red Wolf by Marlon James. If you’d like something that won’t consume the next three to five years of your life, I recommend Exhalation, a collection of short stories by Ted Chiang that I’m currently reading, or all of Terry Pratchett’s Discworld books.

If you’d like even more recommendations – or titles more gender-balanced, say – I also recommend recommendations by the following souls (all on Twitter):

  • @srividyatadpole
  • @thebekku
  • @dpanjana
  • @chitralekha_tcc
  • @notrueindian
  • @supriyan

There are many, many others, of course, but these people came immediately to mind.

I really need to get back to work now.

On India's path to community transmission

There’s a virus out there among many, many viruses that’s caught the world’s attention. This virus came into existence somewhere else, it doesn’t matter where, and developed a mutation at some point that allowed it to do what it needs to do inside the body of one specific kind of animal: Homo sapiens. And once it enters one Homo sapiens, it takes advantage of its new surroundings to produce more copies of itself. Then, its offspring wait for the animal to cough or sneeze – acts originally designed to expel irritating substances – to exit their current home and hopefully enter a new one. There, these viruses go through the same cycle of reproduction and expulsion, and so forth.

This way, the virus has infected over 210,000 people in the last hundred days or so. Some people’s bodies have been so invaded by the virus that their immune systems weren’t able to fight it off, and they – nearly 9,000 of them – succumbed to it.

Thus far, the virus has reportedly invaded the bodies of at least 282 people in India. There’s no telling how the virus will spread through the rest of the population – if it needs to – except by catching people who have the virus early, separating them from the rest of the population for long enough to ensure they no longer carry or transmit the virus (and providing treatment if they do), and finally reintegrating them with the general population.

But as the virus spreads among more and more people, it’s going to become harder and harder to tell how every single new patient got their particular infection. Ultimately, a situation is going to arise wherein too many people have the virus for public-health officials to be able to say how exactly the virus got to them. The WHO calls this phase ‘community transmission’.

India is a country of over 1.3 billion people, and is currently on the cusp of what the Indian Council of Medical Research (ICMR) has called ‘stage 3’ – the advent of community transmission. It’s impossible to expect a developing country as big and as densely populated as India to begin testing all 1.3 billion Indians for the virus as soon as there is news of the virus having entered the national border because the resource cost required to undertake such an exercise is extremely high, well beyond what India can generally afford. However, this doesn’t mean Indians are screwed.

Instead of testing every Indian, ICMR took a different route. Consider the following example: there’s a population of red flecks randomly interspersed with yellow flecks. You need to choose a small subset of flecks from this population such that counting the yellow flecks in the subset gives you a reliable idea of the number of yellow flecks overall.

The ideal subset would be the whole set, of course, so there is one more catch: you have a fixed amount of money to figure out the correct answer (as well as for a bunch of other activities), so it’s in your best interests to keep the subset as small as possible. In effect, you need to balance the tension between two important demands: getting to a more accurate answer while spending less.
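Here’s a minimal simulation of that trade-off. All the numbers are hypothetical: a population of 10,000 flecks, 5% of them yellow, and a budget that allows checking only 500:

```python
import random

def estimate_yellow_fraction(population_size=10_000, true_fraction=0.05,
                             sample_size=500, seed=1):
    """Estimate the fraction of yellow flecks by inspecting a random
    sample instead of the whole population."""
    rng = random.Random(seed)
    n_yellow = int(population_size * true_fraction)
    population = ["yellow"] * n_yellow + ["red"] * (population_size - n_yellow)
    sample = rng.sample(population, sample_size)
    return sample.count("yellow") / sample_size

# Inspecting 5% of the population already gets us close to the true
# fraction of 0.05, at a twentieth of the cost of checking every fleck.
estimate = estimate_yellow_fraction()
```

The smaller the sample, the cheaper the exercise – and the noisier the estimate; the whole game is choosing a sample size that keeps both tolerable.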

Similarly, ICMR assumed that the virus is randomly distributed in the Indian population, and decided to divide the population into different groups, for example by their relative proximity to a testing centre. That is, each testing centre would correspond to the group of all people who live closer to it than to any other centre. Then, ICMR would pick a certain number of people from each group, collect their nasal and throat samples and send them to the corresponding labs for tests.

Say the sample size equals 100. For a Bernoulli random variable with unknown probability p, if no events occur in n independent trials, the upper bound on p at 95% confidence is approximately 3/n. In our case, n = 100, so the bound is 3/100, or 3%. Since this is an upper bound, it means we can say with 95% confidence that less than 3% of the group’s population has experienced the ‘event’ that didn’t occur in the n trials – which in our case is the event of ‘testing positive’. Do note, this is what is safe to say; it’s not necessarily what is happening on the ground. So by increasing the sample size n as much as possible, ICMR can ascertain with higher and higher confidence whether the corresponding group has community transmission.
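The 3/n figure is the statisticians’ ‘rule of three’, an approximation of the exact bound obtained by solving (1 − p)^n = 0.05 for p. A short check of how close the two are (a sketch, not ICMR’s actual computation):

```python
def p_upper_exact(n, confidence=0.95):
    """Exact upper bound on p when zero events occur in n independent
    Bernoulli trials: solve (1 - p)^n = 1 - confidence for p."""
    return 1 - (1 - confidence) ** (1 / n)

def p_upper_rule_of_three(n):
    """The rule-of-three approximation to the same bound."""
    return 3 / n

# For n = 100: exact bound is roughly 0.0295, rule of three gives 0.03.
bounds = {n: (p_upper_exact(n), p_upper_rule_of_three(n)) for n in (100, 1000)}
```

Note how the bound tightens as n grows: with 1,000 all-negative samples, the same logic caps the prevalence at roughly 0.3% instead of 3%.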

Thus far, ICMR has said there is no community transmission in India based on these calculations. Independent experts have been reluctant to take its word, however, because while ICMR has publicised what the sample size and the number of positives are, there is very little information about two other things.

First: we don’t know how ICMR selected the samples that it did for testing. While the virus’s distribution in the population can be considered to be random, especially if community transmission is said to have commenced, the selection of samples needs to have an underlying logic. What is that logic?

Second: we don’t know the group sizes. It’s important for the sample size to be proportionate to the group size. So without knowing what the group size underlying each sample is, it becomes impossible to tell if ICMR is doing its job right.

On March 17, one ICMR scientist said that some testing centres had admitted fewer people with COVID-19-like symptoms whose source of infection was unknown (i.e. potential community transmission) than the size of the sample chosen from their corresponding group. She was suggesting that ICMR’s sample from each group was large enough to not overlook community transmission. To translate in terms of the example above: she was saying ICMR’s subset was big enough to catch at least one yellow fleck – and didn’t catch any.

As it happens, on March 20, ICMR announced that it would begin testing a potential category of community-transmission cases even though its sampling exercise had produced 1,020 negative results in 1,020 samples (distributed across 51 testing centres).

The reasons for this are as yet unclear, but the decision suggests that ICMR suspects there is community transmission of the virus in the country even though its methods – which ICMR has always stood by – haven’t found evidence of such transmission. This in turn prompts the following question: why not test for all types of community transmission? The answer is the same as before: ICMR has limited resources but at the same time has been tasked with discovering how many yellow flecks there are in the total population.

The virus is not an intelligent creature. In fact, it’s extremely primitive. Each virus is in its essence a packet of chemical reactions, and when each reaction happens depends on a combination of internal and external conditions. Other than this, the virus does not harbour any intentions or aspirations. It simply responds to stimuli that it cannot manipulate or affect in any way.

The overarching implication is that beyond how good the virus is at spreading from person to person, a pandemic is what it is because of human interactions, and because of human adaptation and mitigation systems. And as more and more people get infected, and their groups verge towards the WHO’s definition of ‘community transmission’, the virus’s path through the population becomes less and less obvious, but at the same time a greater depth of transmission opens the path to better epidemiological modelling.

When such transmission happens in a country like India, the body responsible for keeping people safe – whether the Union health ministry, ICMR or any other entity – faces the same challenge that ICMR did. This is also why direct comparisons of India’s and South Korea’s testing strategies are difficult to justify, especially of the number of people tested per million: India has nearly 26 times as many people but spends 11.5 times less on healthcare per capita.

At the same time, ICMR isn’t making it easy for anyone – least of all itself – when it doesn’t communicate properly, and leaves itself open to criticism, which in turn chips away at its authority and trustworthiness in a time as testing as this. Demonetisation taught us very well that a strategy is only as good as its implementation.

But on the flip side, it wouldn’t be amiss to make a distinction here: between testing enough to get a sense of the virus’s prevalence in the population – in order to guide further action and policy – and the fact that low expenditure on public healthcare is always going to incentivise India to skew towards a sampling strategy instead of an alternative that requires mass-testing. ICMR and the Union health ministry haven’t inspired confidence on the first count, but it’s important to ensure criticism of the former doesn’t spill over into criticism of the latter as well.

Anyway, the corresponding sampling strategy is going to have to be based on a logic. Why? Because while the resources for the virus to spread exist abundantly in nature (in the form of humans), the human response to containing the spread requires resources that humans find hard to get. Against the background of this disparity, sampling, testing and treatment logics – such as Italy’s brutal triaging policy – help us choose better sampling strategies; predict approximately how many people will need to be quarantined in the near future; prepare our medical supplies; recruit the requisite number of health workers; stockpile important drugs; prepare for economic losses; issue rules of social conduct for the people; and so forth.

A logic could even help anticipate (or perpetuate, depending on your appetite for cynicism) ‘leakages’ arising due to, say, caste or class issues. Think of it like trying to draw a circle with only straight lines of a fixed length: with 200 strokes, you could technically draw a polygon with 200 sides that looks approximately like a circle – but it will still have discernible edges and vertices that won’t map exactly onto a circle, leaving a small part of the latter out. Similarly, a properly designed technique to predict who might get infected and who might not can still catch a large number of people – but it won’t catch all of them.

One obvious way to significantly improve the technique’s efficacy as it stands is to account for the fact that more than half of all Indians are treated at private hospitals, whereas you can be tested for COVID-19 only at a government facility – and not all Virus Research and Diagnostic Laboratories (VRDLs) receive samples from all private hospitals in their respective areas.

Ultimately, the officials who devise the logics must be expected to justify how the combination of all logics can – even if only on paper – uncover most, if not all, cases of the virus’s infection in India.

The sea of metal

A scene from 'Greatest Events of WWII in Colour' showing the Allied armada crossing the English Channel on June 6, 1944.

Two of the most decisive moments of the Second World War that I can’t get enough of are the Battle of Stalingrad and the D-Day landings. In the Battle of Stalingrad, Adolf Hitler’s army suffered its first major defeat, signalling to Nazi Germany that it was just as capable of bleeding as any other regime, that its forces – despite the individual formidability of each German soldier – were capable of defeat. The D-Day landings were the proximate beginning of the end, allowing Allied forces to penetrate Hitler’s Atlantic Wall and, in due course, bring the fight to Germany.

These two battles played out differently in one way (among others, of course). The Battle of Stalingrad began on German initiative but turned into a Soviet siege that slowly but continuously drove home the point to the German soldiers trapped in the Soviet city that they couldn’t possibly win. Eventually, on January 31, 1943, the Germans surrendered together with their leader, Friedrich Paulus, who also became the first Field Marshal of the Nazi armed forces to be captured by the enemy during the war. Operation Overlord – of which the D-Day landings were a part – on the other hand hinged on a single, potentially decisive event: blowing a hole in the Atlantic Wall at Normandy on June 6, 1944, and securing it for long enough for more Allied troops to land ashore as well as for those already inside France to assemble and establish communications.

The Allies succeeded, of course, albeit more slowly than planned at first, in all successfully marching from there to liberate France and then take Berlin on May 2, 1945 (Hitler had committed suicide on April 30 to avoid capture), effectively ending the war.

Operation Overlord is well-documented, particularly so from the Allied point of view, with records as well as video footage describing the great lengths to which American, Australian, Belgian, British, Canadian, Czech, Danish, Dutch, French, Greek, Luxembourger, New Zealander, Norwegian and Polish forces went to ensure it was a success. The Allies had to do five things: keep Hitler in the dark, or at least confused, about where the Allies were going to attack the Atlantic Wall; sabotage the Germans’ ability to respond quickly to wherever the Allies attacked; transport an army across the English Channel and land it ashore on a heavily fortified beach; establish and then link five beachheads; and capture the city of Caen. The documentary Greatest Events of WWII in Colour narrates these events to the accompaniment of riveting visual detail – a must-watch for anyone interested in military history, especially the Second World War.

I enjoyed some bits of it more than others, one of them about Operation Overlord itself. The Allied beach-landing at Normandy is perhaps the most important event of the Second World War, and it’s quite easy to find popular historical material about it; the opening scenes of Saving Private Ryan (1998) come to mind. However, I’ve always wondered how the German soldiers sitting in their bunkers and pill-boxes on the shores of Normandy might have felt. To behold one of the largest armies in modern history rise unexpectedly out of the horizon is no trivial thing. Greatest Events of WWII in Colour documents this.

Narrator: As the dawn breaks, it’s the German soldiers in Normandy, not Calais [where Hitler et al were made to believe the Allies would attack], who witness the enormity of the Allied invasion fleet for the first time.

Peter Lieb, historian: For the Germans sitting in their bunkers in Normandy, the sight of the Allied armada must have been terrifying. A sea full of metal.

Geoffrey Wawro, professor of military history: Witnesses recall just absolute stunned disbelief. This was the greatest armada assembled in world history, and this thing suddenly appears out of the darkness off the coast of Normandy.

A sea of metal!

There’s a certain masculinity imbued in the picture, a grand combination of brawn, self-righteousness and exhibition that wartime rhetoric prizes because its adrenaline elides the tragedy of war itself. The Second World War was a particularly brutal affair, with crimes against humanity perpetrated by the Allied and Axis powers both, and continuing even after 1945 across multiple continents. However, it is also tempting to believe that the start of Operation Overlord, by striking fear in the Germans and bearing down upon a fascist government that had to be destroyed, is one of those rare acts of war that deserves to be recounted with this rousing rhetoric. Greatest Events of WWII in Colour is shrewd enough to play along.

‘Hunters’, sci-fi and pseudoscience

One of the ways in which pseudoscience is connected to authoritarian governments is through its newfound purpose and duty to supply an alternate intellectual tradition – one that subsumes science and culminates in the identitarian superiority of a race, culture or ethnic group. In return, aspects of the tradition are empowered by the regime, both to legitimise it and to catalyse its adoption by the proverbial masses, tying faith in its precepts to agency, and of course giving the regime divine sanction to rule.

Readers of this blog will recognise that the spiritual features of Hindutva the Bharatiya Janata Party regularly draws on fit the bill. Willy Ley, a German rocket scientist who emigrated to the US before World War II, published an essay entitled ‘Pseudoscience in Naziland’ in 1947, in which he describes the sort of crazy beliefs that, together with other conditions, prepared the ground for the advent of Nazism.

In Hunters, the Amazon Prime show about Jewish Nazi-hunters in 1970s America, Edward Bulwer-Lytton’s sci-fi novel The Coming Race (1871) finds brief mention as a guiding text for neo-Nazis. In the novel, a subterranean race of angelic humanoids has acquired great power and superhuman abilities by manipulating a magical substance called Vril, and threatens to rise to the surface and destroy the human race one day.

Bulwer-Lytton also wrote that Vril alludes to electricity (i.e. the flow of electrons) and that The Coming Race is an allegory about how an older generation of people finds itself culturally and politically incompatible with a new world order powered by electric power. (At the same time, he believed these forces were a subset of the aether, so to speak.) In a letter to John Forster on March 20, 1870 – precisely 150 years ago in twelve days – Bulwer-Lytton wrote:

I did not mean Vril for mesmerism, but for electricity, developed into uses as yet only dimly guessed, and including whatever there may be genuine in mesmerism, which I hold to be a mere branch current of the one great fluid pervading all nature. I am by no means, however, wedded to Vril, if you can suggest anything else to carry out this meaning – namely, that the coming race, though akin to us, has nevertheless acquired by hereditary transmission, etc., certain distinctions which make it a different species, and contains powers which we could not attain through a slow growth of time, so that this race would not amalgamate with, but destroy us.

And yet this race, being in many respects better and milder than we are, ought not to be represented terrible, except through the impossibility of our tolerating them or they tolerating us, and they possess some powers of destruction denied to ourselves.

The collection of letters is available here.

In Bulwer-Lytton’s conception, higher technological prowess was born of hereditary traits. In a previous letter, dated March 15, Bulwer-Lytton had written to Forster:

The [manuscript] does not press for publication, so you can keep it during your excursion and think over it among the other moonstricken productions which may have more professional demand on your attention. The only important point is to keep in view the Darwinian proposition that a coming race is destined to supplant our races, that such a race would be very gradually formed, and be indeed a new species developing itself out of our old one, that this process would be invisible to our eyes, and therefore in some region unknown to us.

So this is not a simple confusion or innocent ignorance. Bulwer-Lytton’s attribution of the invention of electricity to genetic ability was later appropriated by interwar German national socialists.

This said, I’m not sure how much I can read into the reimagination of technological ability as a consequence of evolution or racial superiority because another part of Bulwer-Lytton’s letters suggests his example of electricity was incidental: “… in the course of the development [of the new species], the coming race will have acquired some peculiarities so distinct from our ways … and certain destructive powers which our science could not enable us to attain to, or cope with. Therefore, the idea of electrical power occurred to me, but some other might occur to you.”

Now, according to Ley, the Society for Truth believed Vril to be a real thing and used its existence to explain how the Britons created their empire. I don’t know how much stock Adolf Hitler and his “shites of the round table” (to quote from Hunters) placed in this idea but the parallels must have been inescapable – especially so since Ley also writes that not just any pseudoscientific belief could have supported Hitler’s rise nor have acquired his patronage. Instead, the beliefs had to be culturally specific to Germany, pandering to local folklore and provincialism.

Without commenting on whether this conclusion would apply to Fascism 2.0 in a world with the internet, civil aviation and computerised banking, and in naïve spite of history’s fondness for repeating itself and the politico-corporate-media complex, I wonder what lessons there are here – if any – for science educators, a people already caught between political anti-intellectualism and a stronger sense of their purpose in an intellectually debilitated society.

Review: ‘Hunters’ (2020)

Just binge-watched the first season of Hunters, the bizarre Amazon Prime original about a covert group of Jews in 1970s New York City tracking down and killing Nazis who were integrated into American society by the US government under Operation Paperclip. It’s obvious how this premise could be presented through 10 hours of grit and moral dilemma, but instead we get 10 hours of grit mixed with satire and melodrama – a combination that only brings to mind a certain journalist’s words in 2013, delivered as a comment on a prominent newspaper’s suddenly disagreeable design: “pastiche and mishmash”.

I’m not sure what Hunters is trying to be, beyond a vessel for Al Pacino as its protagonist and patriarch, because its story is weak and its violence is neither realistic nor purposeful; the only exception everyone seems able to agree on, with good reason, is Jerrika Hinton as Agent Morris. But worst of all, the show gives its neo-Nazi characters more than ample screen-time to air their newly sharpened anti-Semitic and supremacist points of view.

Hunters seems to believe that such views are instantaneously and automatically disqualified by their implicit absurdity whereas the opposite is true. We live today in a world where conspiracy theories have moved from the fringes of society to the centre. So beyond the first time the Nazis are allowed to spew their bile, the show resembles porn for the sufficiently misguided bigot looking for a new language and new methods to assert his dominance. Makes you want to skip forward in cringe. Even the concentration camp scenes are awfully close to being voyeuristic.

A great discussion on the history of India’s tech.

On February 27, the Bangalore International Centre and Carnegie India hosted a panel discussion around Midnight Machines, the new book by Arun Mohan Sukumar that traces the interplay of technology and politics in independent India (read The Wire Science‘s review here). The panelists were Arun (my friend and former colleague at The Hindu), space entrepreneur Susmita Mohanty, Rajya Sabha MP Rajeev Gowda, historian of science Jahnavi Phalkey, and Anu Singh of Carnegie India.

The whole discussion was about 90 minutes long, and picked up steam after the first 15 minutes or so. If you’re at all interested in the history of science and technology in India, I recommend you watch the whole thing on YouTube. If not, I’d like to draw your attention to a few passages of the discussion that were interesting (to me), and which I’ve transcribed below. In the parts where Arun and Phalkey directly debated each other, Arun emerged with only minor bruises, which I shouldn’t have to tell you is a considerable feat and may not have been the case in a full-on, two-person debate!

Jahnavi Phalkey, 32:00 – The political ambition of a state is now technological ambition. That’s why the technological story of the latter half of the 20th century is a political one, and is therefore also political in India. The other aspect of this is centralisation. While we in India have argued that the Indian state centralised research funding through the CSIR, DAE, the space programme, etc., with all money going into a few facilities, look at Europe. The European answer was CERN, with countries coming together to build facilities. Apart from the US, there was no economy then that could conduct scientific research at the scale the tone for which was set during the Second World War.

Therefore, the centralisation solution adopted (also) in India was no different from what was happening globally. So what was happening in India was not anomalous. It’s a part of the larger story. To add a footnote to the Nehru story: Nehru spoke science, he said “scientific temper”, but look at the institutions he established: the IITs (when it was 60 years before India set up the IISERs) and the CSIR (he didn’t go for the Max Planck Institutes model, the Kaiser Wilhelm Institutes model or the Harnack principle but focused on industrial research); the IISc came 50 years before independence. So the accusation that Nehru spoke science, did science but didn’t do technology does not hold up.

[At one point, Arun also talks about how India needed a Nehru to navigate the Non-Aligned Movement to still secure favours from different governments without upsetting the precarious balance of powers (so to speak) to help set up some of India’s preeminent IITs. I skimmed through the video twice but couldn’t find the exact timestamp.]

Arun Mohan Sukumar, 43:50 – A CSIR scientist said the failure of the solar cooker project basically ensured that all the scientists [who worked on it] retreated into the comfort of their labs and put them off “applied science”.

Here’s a project commenced almost immediately after independence meant to create technology by Indians for Indians, and after it failed for various reasons, the political spotlight that had been put on the project was counterproductive. Nehru himself investing this kind of capital exposed him and the scientific establishment to criticism that they were perhaps not used to. These were venerated men in their respective fields and they were perhaps unused to being accountable in this visceral way. India offered a prototype of the solar cooker to Egypt and, I believe, Rhodesia or South Africa, and the joke goes that the order was never repeated. D.D. Kosambi says in an opinion piece at the time that the only person who made any profit out of the solar cooker affair is the contractor who sold it for scraps.

This is the kind of criticism confronted by the scientific establishment and it is a consequence of politics. I agree with Prof Phalkey when she says it was a consequence of the political establishment not insulating the scientific establishment from the sort of criticism which may or may not be informed but you know how the press is. That led to a gradual breaking of ranks between the CSIR and the political vision for India where you’d have these mass technologies that [Phalkey] mentioned, and you can see the best evidence for that is Nehru’s pursuit of massive industrialisation in the second Five Year Plan, from 1956 to 1961.

This isn’t to say that Nehru was surrounded by advisers who all believed in the same thing; there was of course [P.C.] Mahalanobis who believed in a more aggressive form of industrialisation. But at various points of time one constituency was trumping another, within even the establishment. But it needs to be said that the PM was not in favour of introducing tractors in agriculture… Again, this is all criticism with the wisdom of hindsight.

Jahnavi Phalkey, 53:16 – In the 1970s, look at the number of democratic regimes that fell due to hot wars fought during the Cold War in the rest of the world. You’ll start to see why the need for control was felt.

Arun Mohan Sukumar [following after Rajeev Gowda’s comments], 55:05 – Another dimension is the presence of universities in the US, which incubated the military-industrial complex. Harvard and MIT in Boston and Stanford in the Silicon Valley were the nuclei for research. In India, some of these are truly unfortunate circumstances that the government has no control over. When the first batch of graduates passed out of IIT Kanpur in 1965, Lyndon B. Johnson passed the Immigration and Nationality Act giving Indians, and people of other nationalities, an automatic path to citizenship. So the best minds of our country were prompted by the fact that there aren’t enough jobs or enough well-paying jobs in India [to enter] a feeder line created between India and the US, from which it is very difficult to come back. Those circumstances too must be acknowledged.

Susmita Mohanty, 56:20 – Even brain drain is hugely exaggerated. I’ve lived in four different countries. The talent pool we have in India today is as big or bigger. There are people leaving the country but not everyone is the best coder in town.

Arun Mohan Sukumar, 57:24 – The appropriate technology movement that started in the late 1960s and early 1970s was this philosophy that grew out of Western Europe and the US which called for lesser consumption of natural resources and labour-intensive jobs with a view to conserving resources for the planet, a lot of which was precipitated by a report called ‘Limits to Growth’, which essentially predicted this catastrophe that would befall humanity by 2000.

And then economist [E.F.] Schumacher writes this book called ‘Small is Beautiful’ [in 1973] and creates a revolution incidentally not just in advanced societies but also in developing countries, where leaders like Indira Gandhi coopted the movement to say to the people that you should consume less, conserve your natural resources and deploy labour-intensive technologies that will essentially be beneficial to you and your way of life. Seminar after seminar was organised by top institutes of the time to talk about how you can create fuel out of biogas, how you can mechanise bullock carts – technologies that are not scalable but nevertheless are quick-fixes, and this is where ‘jugaad’ has its historical origin: in the valorisation of frugal innovation.

[Phalkey shakes her head in disagreement.]

This would’ve been acceptable had it not been for the fact that investments in the space and nuclear programmes continued unabated. … So on the one hand the state was promoting big science and it wasn’t as if they had an ideological or political compulsion against Big Machine and big technologies. There were just factors such as financial considerations and the government’s own inability to develop technology at home which, I argue, led Indira Gandhi to co-opt the appropriate technology movement. … In India, perhaps it’s harsh to say that we moved backwards, but the objective was not to redefine technology but to shun it altogether. [Phalkey is quite in disagreement at this point.] That unfortunately is I feel a byproduct of the legacy of the 1970s.

Jahnavi Phalkey, 1:01:14 – I have to disagree because there’s been only one science plan in the country in its history, and that was done in the 1970s under Mrs Gandhi’s regime. Eighteen-hundred people from user ministries, the Planning Commission, scientific institutions and industry sat together over 18-24 months and came up with a comprehensive plan as to how to take research happening in the institutions and in the CSIR through Planning Commission allocation of money to the user ministries. We haven’t seen anything on this scale before or since.

The problem was, as soon as Mrs Gandhi implemented the plan, she also implemented the Emergency. When the Emergency was pulled back, the Morarji Desai regime decided that India did not need [the science plan]. So the argument you’re making [addressing Arun] of scaling back on technology or technology as a solution to the social, political and other problems that India had was more due to the Janata regime and not Mrs Gandhi’s. One needs to make this small distinction because this was simply not true at the time.

Arun Mohan Sukumar, 1:06:09 – What was remarkable to me while writing this book was this factoid that comes from this book on the history of computing in India by C.R. Subramanian: he says the import of computers to India tripled during the years of the Emergency. For the life of me, I can’t imagine why! But it goes to show that despite the anti-automation protests of the 1960s and 1970s, and remember that 1978 is the year when IBM quit India for whatever reasons, there was beginning to be this gradual embrace of technology and which really takes off from the 1980s. And from the moment of liberalisation in 1991, it’s a different story altogether.

Some of these legacies continue to haunt us, whether it is popular protests against nuclear plants, which really came of age in the 1960s and 1970s, not just in India but also in other parts of the world. Some of that really bore on India as well, and I believe continued into the debate on genetically modified crops. If you ask a person who really has a strong opinion on these subjects, I wonder whether he or she would have a clear idea of what the technology is. But they evoke such strong views, and perhaps some of it is due to the constant politicisation of the virtues and vices of the technology.

Arun Mohan Sukumar, 1:09:04 – One of the reasons why the Indian opposition to the Human Genome Project was so pronounced in the early 1990s, when the hand of invitation was extended to the Indian government, was because the Vaccine Action programme signed by Rajiv Gandhi and Ronald Reagan just a few years earlier ran into a great deal of controversy within and without government; defence ministry officials said here is an effort to take DNA materials from Indians to be turned against India as an agent of biological warfare, and all sorts of rubbish.

[How history repeats itself!]

Adding to this, some private institutes in the US were involved in smuggling anti-rabies vaccines into developing countries. All of this spooked the scientific establishment and which, the book argues, led to us staying away from the Human Genome Project.

… And we missed the bus. Today we say we are able to map the genome of some man from Jharkhand at a fraction of the cost – it is at a fraction of the cost because most of the work has already been done. There is some historical legacy there that unfortunately continues to haunt us.

[Susmita Mohanty mentions ISRO’s famous reluctance to share information about components of its civilian space programme.]

Jahnavi Phalkey, 1:12:26 – There’s also a little bit of politics to it. The information that NASA and ESA share is backed by a very, very, very strong politics of sharing. What can and cannot be shared are clearly divided.

Jahnavi Phalkey, 1:13:57 – If you begin with Robert Clive, we have a history of about 300 years of building suspicion. And to dismantle that kind of suspicion is going to take lots of work. I’m not saying we shouldn’t have participated in the Human Genome Project, or that it’s not a good thing to share, or that we shouldn’t embark on certain projects. I think we might just be erring on the side of caution.

Arun Mohan Sukumar, 1:17:58 – There are different kinds of technocracies, and the three people surveyed in the book [who represented those kinds] are M. Visvesvaraya, Vikram Sarabhai and Nandan Nilekani. They forged three different organisational structures within government (of course Visvesvaraya did so before independence), and they had different views of technology. I wouldn’t say they were all political animals but they certainly had a good appreciation of politics which was crucial to their success.

For example, Visvesvaraya was a very astute navigator of colonial-era politics but then resigned as the diwan of Mysore over what he perceived as anti-Brahmin protests in the Madras presidency and the threat of that spilling over into Mysore. Finally, after independence, his views were totally marginalised by the establishment of the time.

Sarabhai was in currency throughout but also in many respects was able to tell the leadership what it wanted to hear and at the same time insulate his own team from politics to the extent that ISRO today has a separate recruitment process. Some degree of autonomy was built-in.

Nilekani’s work on Aadhaar goes the exact opposite way: he is very clear that he does not want scientists or technologists running the programme beyond the infancy… He was very sure at the beginning that an IAS officer should be running UIDAI. We can debate the merits of the decision but the fact is, in his view and the view of the team, the technocracy could only survive if it was built from within government. Whereas when Sarabhai died, Satish Dhawan was brought from Caltech to run ISRO. It was very clear for the folks behind Aadhaar that that model would not have survived.

fin.

Featured image: The panelists (L-R): Arun Mohan Sukumar, Susmita Mohanty, Rajeev Gowda, Jahnavi Phalkey and Anu Singh.

The Nobel intent

A Nobel Prize award ceremony underway. Credit: nobelprize.org

You’ve probably tired of this but I can’t. The Nobel Prize folks just sent out a newsletter ahead of Women’s Day, on March 8, describing the achievements of female laureates of each of the six prizes. This is a customary exercise we’ve come to expect from organisations and companies trying to make themselves look good on the back of an occasion presumably designed to surmount the sort of barriers to women’s empowerment and professional success the organisations and companies often themselves perpetuate. For example, this Nobel Prize newsletter shows off with some truly ironic language. Consider the notes accompanying the science prize winners:

“I remember being told over and over again: Women, you can do anything, so it never entered my mind that I couldn’t.” Donna Strickland was awarded the Nobel Prize in Physics 2018 for her work with laser pulses.

She was also the first female laureate of the physics prize in 55 years.

[Marie Curie] is the first Nobel Prize awarded woman and the only one to have received it in Physics as well as Chemistry.

… because the prize committee chose not to award anyone else.

Gert Cori was the first woman to receive a Medicine Prize.

… because the prize committee chose not to award anyone else.

This is also what baffles me, especially in October and December every year when the awards are announced and conferred, respectively: Why do people take seriously an award that doesn’t take their issues seriously? Any other institution that did the same thing, while self-aggrandising as often as it can, would’ve been derided, turned into memes even. But every year we – millions of Indians at least, scientists and non-scientists – look to the Nobel Prizes to acknowledge Indian contributions to science, missing entirely the point that the prizes are a deeply flawed human enterprise riven by their own (often Eurocentric) politics, that they have no responsibility to be fair, and that they often aren’t.

Dehumanising language during an outbreak

It appears the SARS-CoV-2 coronavirus has begun local transmission in India, i.e. infecting more people within the country instead of each new patient having recently travelled to an already affected country. The advent of local transmission is an important event in the lexicon of epidemics and pandemics because, at least until 2009, that’s how the WHO differentiated between the two.

As of today, the virus has become locally transmissible in the world’s two most populous countries. At this juncture, pretty much everyone expects the number of cases within India to only increase, and as it does, the public healthcare system won’t be the only one under pressure. Reporters and editors will be too, and they’re likely to be more stressed on one front: their readers.

For example, over the course of March 4, the following sentences appeared in various news reports of the coronavirus:

The Italian man infected 16 Italians, his wife and an Indian driver.

The infected techie boarded a bus to Hyderabad from Bengaluru and jeopardised the safety of his co-passengers.

Two new suspected coronavirus cases have been reported in Hyderabad.

All 28 cases of infection are being monitored, the health ministry has said.

Quite a few people on Twitter, and likely in other fora, commented that these lines exemplify the sort of insensitivity towards patients that dehumanises them, elides their agency and casts them as perpetrators – of the transmission of a disease – and which, perhaps given enough time and reception, could engender apathy and even animosity towards the poorer sick.

The problem words seem to include ‘cases’, ‘burden’ and ‘infected’. But are they a problem, really? I ask because though I understand the complaints, I think they’re missing an important detail.

Referring to people as if they were objects only furthers their impotency in a medical care setup in which doctors can’t be questioned and the rationale for diagnoses is frequently secreted – both conditions ripe for exploitation. At the same time, the public part of this system has to deal with a case load it is barely equipped for and whose workers are underpaid relative to their counterparts in the private sector.

As a result, a doctor seeing 10- or 20-times as many patients as they’ve been trained and supported to see will inevitably precipitate some amount of dehumanisation, and it could in fact help medical workers cope with circumstances in which they’re doing all they can to help but the patient suffers anyway. So dehumanisation is not always bad.

Second, and perhaps more importantly, the word ‘dehumanise’ and the attitude ‘dehumanise’ can and often do differ. For example, Union home minister Amit Shah calling Bangladeshi immigrants “termites” is not the same as a high-ranking doctor referring to his patient in terms of their disease, and this doctor is not the same as an overworked nurse referring to the people in her care as ‘cases’. The last two examples are progressively more forgivable because their use of the English language is more opportunistic, and the nurse in the last example may not be intentionally dehumanising her patients at all; she might well choose different words if she knew what hers implied.

(The doctor didn’t: his example is based on a true story.)

Problematic attitudes often manifest most prominently as problematic words and labels but the use of a word alone wouldn’t imply a specific attitude in a country that has always had an uneasy relationship with the English language. Reporters and editors who carefully avoid potentially debilitating language as well as those who carefully use such language are both in the minority in India. Instead, my experiences as a journalist over eight years suggest the majority is composed of people who don’t know the language is a problem, who don’t have the time, energy and/or freedom to think about casual dehumanisation, and who don’t deserve to be blamed for something they don’t know they’re doing.

But by fixating on just words, and not the world of problems that gives rise to them, we risk interrogating and blaming the wrong causes. It would be fairer to expect journalists of, say, the The Guardian or the Washington Post to contemplate the relationship between language and thought if only because Western society harbours a deeper understanding of the healthcare system it originated and exported, idiosyncrasies included, to other parts of the world, and because native English speakers are likelier to properly understand the relationship between a word, its roots and its use in conversation.

On the other hand, non-native users of English – particularly non-fluent users – have no option but to use the words ‘case’, ‘burden’ and ‘infected’. They might actually prefer other words if:

  • They knew that (and/or had to accommodate their readers’ pickiness for whether) the word they used meant more than what they thought it did, or
  • They knew alternative words existed and were equally valid, or
  • They could confidently differentiate between a technical term and its most historically, socially, culturally and/or technically appropriate synonym.

But as it happens, these conditions are seldom met. In India, English is mostly reserved for communication; it’s not the language of thought for most people, especially most journalists, and certainly doesn’t hold anything more than a shard of mirror-glass to our societies and their social attitudes as they pertain to jargon. As such, pointing to a reporter and asking them to say ‘persons infected with coronavirus’ instead of ‘case’ will magically reveal neither the difference between ‘case’ or ‘infected’ the scientific terms and ‘case’ or ‘infected’ the pejoratives nor the negotiated relationship between the use of ‘case’ and dehumanisation. And without elucidating the full breadth of these relationships, there is no way either doctors or reporters are going to modify their language simply because they were asked to – nor will their doing so, on the off chance, strike at the real threats.

On the other hand, there is an equally valid problem concerning those who know how ‘case’ and ‘infected’ can be misused and who regularly read news reports whose use of English may or may not intend to dehumanise. Considering the strong possibility that the author may not know they’re using dehumanising language and is unlikely to be persuaded to write differently, those in the know have a corresponding responsibility to treat what they read as typically a case of unknown unknowns, and not of ignorance or incompetence, and almost surely not of malice.

This is also why I said reporters and editors might be stressed by their readers, or rather by their readers’ perspectives, and not on account of their language.


A final point: Harsh Vardhan, the Union health minister and utterer of the words “The Italian man infected 16 Italians”, and Amit Shah belong to the same party – a party that has habitually dehumanised Muslims, Dalits and immigrants as part of its nationalistic, xenophobic and communal narratives. More recently, the same party from its place at the Centre suspected a prominent research lab of weaponising the Nipah virus with help from foreign funds, and used this far-fetched possibility as an excuse to terminate the lab’s FCRA license.

So when Vardhan says ‘infected’, I reflexively, and nervously, double-check his statement for signs of ambiguity. I’m also anxious that if more Italian nationals touring India are infected by SARS-CoV-2 and the public healthcare system slips up on control measures, a wave of anti-Italian sentiment could follow.

Writing itself is fantasy

The symbols may have been laid down on paper or the screen in whatever order but when we read, we read the words one at a time, one after another – linearly. Writing, especially of fiction, is an act of using the linear construction of meaning to tell a story whose message will be assimilated bit by bit into a larger whole that isn’t necessarily linear at all, and manages to evade cognitive biases (like the recency effect) that could trick the reader into paying more attention to parts of the story instead of the intangible yet very-much-there whole. Stories in fact come in many shapes. One of my favourites, Dune, is so good because it’s entirely spherical in the spacetime of this metaphor, each of its concepts like a three-dimensional ouroboros, connected end to end yet improbably layered over, under and around each other. The first four Harry Potter books are my least favourite pieces of good fantasy for their staunch linearity, even despite the use of time travel.

The plot of Embassytown struggles with this idea a little bit, with its fraction-like representation of meaning using pairs of words. Even then, China Miéville has a bit of a climb on his hands: his (human) readers consume the paired words one at a time, first the one on the top then the one on the bottom. So a bit of translation becomes necessary, an exercise in projecting a higher dimensional world in which words are semantically bipolar, like bar magnets each with two ends, onto the linguistic surface of one in which the words are less chimerical. Miéville is forced to be didactic (which he musters with some reluctance), expending a few dozen pages constructing rituals of similes the reader can employ to sync with the Ariekei, the story’s strange alien characters, but always only asymptotically so. We can after all never comprehend a reality that exists in six – or six-thousand – dimensions, much the same way the Higgs boson’s existence is a question of faith if you’re unfamiliar with the underlying mathematics and the same way a human mind and an alien mind can never truly, as they say, connect.

Arrival elevates this challenge, presenting us with alien creatures – the ‘heptapods’ – the symbols of whose communication are circular, each small segment of the circumference standing for one human word and the whole assemblage for meaning composed by a non-linear combination of words. I’m yet to read the book by Ted Chiang on which the film is based; notwithstanding the possibility that Chiang has discussed their provenance, I wonder if the heptapods think a complex thought that is translated into a clump of biochemical signals that then encode meaning in a stochastic process: not fully predictably, since we know through the simpler human experience that a complicated idea can be communicated using more than one combination of simpler ideas. One heptapod’s choice could easily differ from that of another.

The one human invention, and experience if you will, that recreates the narrative anxiety encoded in the Ariekei’s and heptapods’ attempts (through their respective authors’ skills, imagination, patience and whatever else) to communicate with humans is writing insofar as the same anxiety manifests in the use of a lower order form – linearity – to construct a higher order image. Thus from the reader’s perspective the writer inhabits an inferior totality, and the latter performs a construction, an assimilation, by synthesising the sphericity and wholeness of a story using fundamentally linear strands, an exercise in building a circle using lines, and using circles to build a sphere, and so forth.

Writing a story is in effect like convincing someone that an object exists but having no way other than storytelling to realise the object’s existence. Our human eyes will always see the Sun as a circle but we know it’s a sphere because there are some indirect ways to ascertain its sphericity, more broadly to ascertain the universe exists in three dimensions at least locally; the ‘simplest’ of these ways would be to entirely assume the Sun is spherical because that seems to simplify problem-solving. However, say a writer’s conceit is that the Sun really exists in eight dimensions, and they go on to construct an elaborate story of adventure, discovery and contemplation to convince the reader that they’re right.

In this sense, the writer would draw upon our innate knowledge of the universe in three dimensions, and our knowledge and experience of the ways in which it is and isn’t truthful, to build an emergent higher-order Thing. While this may seem like a work of science and/or fantasy fiction, the language humans use to build all of their stories, even the nonfiction, renders every act of story-telling a similarly architecturally constructive endeavour. No writer commences narration with the privilege of words meaning more than they stand for in the cosmos of three dimensions and perpetually forward-moving time nor sentences being parsed in any way other than through the straightforward progression of a single stream of words. Everything more complicated than whatever can be assembled with two-dimensional relationships requires a voyage through the fantastic to communicate.

A new beginning

When The Wire was launched on May 11, 2015, we (the editors) decided to organise the site’s content within six principal categories: politics, political economy, foreign affairs, science, culture and law.

In the five years since, the Big Three categories — politics, political economy and foreign affairs — have come to dominate The Wire‘s identity as a digital news site, even as our science category has acquired a voice of its own and performed much better than we expected. And yet, given the crush of ‘conventional’ news, science has not been able to speak at its fullest on the crowded pages of The Wire.

To fix this issue as well as to give our science stories the freedom — and responsibility — to constitute their own publication (of sorts), we launched on February 28 The Wire Science as its own beast: https://science.thewire.in.

While we remain strapped for resources, we recognise that it’s a necessary step on the road to our goal: an independent, fully reader-funded Indian website for science news, analysis and commentary. For now, we will begin populating the new site with shorter, longer and different types of stories that we can already afford and which now have the breathing room they need.

As always, please engage with The Wire Science, share the stories you like, comment and discuss on Twitter and Facebook, send your bouquets and brickbats to science at thewire dot in, and please donate (especially if you can). This is all we need for the trek. 🙂

Freeman Dyson’s PhD


The physicist, thinker and writer Freeman Dyson passed away on February 28, 2020, at the age of 96. I wrote his obituary for The Wire Science; excerpt:

The 1965 Nobel Prize for the development of [quantum electrodynamics] excluded Dyson. … If this troubled Dyson, it didn’t show; indeed, anyone who knew him wouldn’t have expected differently. Dyson’s life, work, thought and writing is a testament to a philosophy of doing science that has rapidly faded through the 20th century, although this was due to an unlikely combination of privileges. For one, in 1986, he said of PhDs, “I think it’s a thoroughly bad system, so it’s not quite accidental that I didn’t get one, but it was convenient.” But he also admitted it was easier for him to get by without a PhD.

His QED paper, together with a clutch of others in mathematical physics, gave him a free pass to more than just dabble in a variety of other interests, not all of them related to theoretical physics and quite a few wandering into science fiction. … In 1951, he was offered a position to teach at Cornell even though he didn’t have a doctorate.

Since his passing, many people have latched on to the idea that Dyson didn’t care for awards and that “he didn’t even bother getting a PhD”, and celebrate it as if it were a difficult but inspiring personal choice. It’s certainly an unlikely position to assume and makes for the sort of historical moment that those displeased with the status quo can anchor themselves to and swing from for reform, considering the greater centrality of PhDs to the research ecosystem together with the declining quality of PhD theses produced at ‘less elite’ institutions.

This said, I’m uncomfortable with such utterances when they don’t simultaneously acknowledge the privileges that secured for Dyson his undoubtedly deserved place in history. Even a casual reading of Dyson’s circumstances suggests he didn’t have to complete his doctoral thesis (under Hans Bethe at Cornell University) because he’d been offered a teaching position on the back of his contributions to the theory of quantum electrodynamics, and was hired by the Institute for Advanced Study in Princeton a year later.

It’s important to mention – and thus remember – which privileges were at play so that a) we don’t end up unduly eulogising Dyson, or anyone else, and b) we don’t attribute Dyson’s choice to his individual personality alone instead of also admitting the circumstances Dyson was able to take for granted and which shielded him from adverse consequences. He “didn’t bother getting a PhD” because he wasn’t the worse for it; in one interview, he says he feels himself “very lucky” he “didn’t have to go through it”. On the other hand, even those who don’t care for awards today are better off with one or two because:

  • The nature of research has changed
  • Physics has become much more specialised than it was in 1948-1952
  • Degrees, grants, publications and awards have become proxies for excellence when sifting through increasingly overcrowded applicants’ pools
  • Guided by business decisions, journals’ definitions of ‘good science’ have changed
  • Vannevar Bush’s “free play of free intellects” paradigm of administering research is much less in currency
  • Funding for science has dropped, partly because The War ended, and took a chunk of administrative freedom with it

The expectations of scientists have also changed. IIRC Dyson didn’t take on any PhD students, perhaps as a result of his dislike for the system (among other reasons because he believed it penalises students not interested in working on a single problem for many years at a time). But considering how the burdens on national education systems have shifted, his decision would be much harder to sustain today even if all of the other problems didn’t exist. Moreover, he has referred to his decision as a personal choice – that it wasn’t his “style” – so treating it as a prescription for others may mischaracterise the scope and nature of his disagreement.

However, questions about whether Dyson might have acted differently if he’d had to really fight the PhD system, which he certainly had problems with, are moot. I’m not discussing his stomach for a struggle nor am I trying to find fault with Dyson’s stance; the former is a pointless consideration and the latter would be misguided.

Instead, it seems to me to be a question of what we do know: Dyson didn’t get a PhD because he didn’t have to. His privileges were a part of his decision and cemented its consequences, and a proper telling of the account should accommodate them even if only to suggest a “Dysonian pride” in doing science requires a strong personality as well as a conspiracy of conditions lying beyond the individual’s control, and to ensure reform is directed against the right challenges.

Featured image: Freeman Dyson, October 2005. Credit: ioerror/Wikimedia Commons, CC BY-SA 2.0.