It’s widely accepted in India that the humanities and social sciences aren’t given the same due as the sciences. But the two remain linked, and the influence of the humanities on the sciences is rarely studied. This is where Science and Technology Studies (STS) comes in.
An academic field that sits at this intersection, STS examines how science and technology emerge from society, how they shape society in turn, and what risks they carry. In a nutshell, it seeks to understand the relationship between science, politics, societal challenges, and law and policy. A pioneer of the field, Sheila Jasanoff, Pforzheimer Professor of Science and Technology Studies at the Harvard Kennedy School, speaks to TNM about the origins of the field and the need for it, policy playing catch-up with the progress of science, data collection in democracies, and more.
Sheila, who is originally from India, says that human values are at the centre of science. “In the US, STS has grown hand in hand with engineering, which hasn’t been the case in India,” she says.
The origins of STS, Sheila says, lie in the late 1960s, in the youth turmoil and rebellion against the Vietnam War. “By the late 1990s, it begins to be institutionalised in lots of universities. Now, one of my particular interests is both to try to understand why STS is so unknown in India and to try to counteract it,” she says. This, given India’s current situation, makes for an interesting case, she adds.
A social scientist, Sheila says that science is a very forgetful discipline when it comes to its own mistakes, but that it has, especially in biology, acquired a degree of authority where “everyone turns to the inventor of any new technology for ethical prescriptions and dispensations for which they have zero competence.” She addressed this in her book, Can Science Make Sense of Life?
When one talks about the intersection of science and technology with policy and law, a common assumption is that policymaking plays catch-up with science. Sheila disagrees: often the policy exists, she says, but science runs ahead because nobody has bothered to check. Similarly, many things are not standardised, and science often has human biases built into it that should have been addressed.
“Why are we still using a system that in a way perpetuates gender inequality in the bowels of a machine? You need an army of people like me in all countries of the world who are studying these technological systems in a critical way, to then say that the policy gap exists because we decided to let private companies build their machines and we didn’t standardise them. That’s policy too. Not having a policy is also policy, and much of what we see in the tech world today has grown in a vacuum of policy. In a sense, we created a world in which it looks like policy is playing catch-up,” she says.
On governments using technology
Pointing to how this system of policy playing catch-up was created, she asks, “One of the astonishing stories about India is the rapidity with which the Aadhaar system was introduced, and more than a billion people signed up. Was there a nationwide debate about any of the social dimensions, like the error-proneness of the system? If you’ve been denied something, how does that feed back into the system? What does that do to your credibility if you’re operating with a card that’s been denied?”
Technology and surveillance are nothing new in human history, and she says one must not be surprised that governments will use technology to whatever their ends of the moment are.
“But how you hold them accountable then is a deeply political issue. It will not be acted upon unless people become aware of the levels at which something called the ‘political’ can get built in,” she says, citing the example of research on facial recognition software in the USA, which has shown that white faces are recognised better than black faces.
“In order to understand modernity and the kind of societies that we live in now, we have to go back and understand quite deeply all the different levels at which our technological systems have social assumptions built into them. We are so seduced by the promises of the digital now that we attribute a kind of smartness to the mere act of digitising but the smartness is in the designer,” she says.
She adds that because the values and biases are introduced in the designing of the system, they are built in from the start. By the time one gets the black box of the engineered object, untangling them becomes very complicated, she says.
On data protection
While Sheila admits that she hasn’t studied the Indian data protection bill in detail, she says, “My first cynical question would be protection for whom and against what. Is it protecting people in power so that they can use the data without accounting for it?”
There is precedent for what happens when the state controls data, she adds, but says that concern about data protection at large didn’t exist till private companies started messing up on a colossal scale.
“The Indian government perfectly recognises the power of data, and it’s not minded to let other people have control of it and certainly not citizens themselves. I think that for public interest lawyers, this is an extraordinarily important moment in which to recognise that the bland word ‘data’ is really a stand-in for civil rights in every sense,” she says.
“This is an area that we’re walking into with our eyes not open and we just need to act and not just get railroaded into positions that will become very hard to undo when and if they’re already in law,” she adds.
Furthermore, Sheila says that framing the issue differently could make people look up and take notice, especially with regard to tech companies and how much data they collect.
“Getting people to see just how much of the technological world is outside of their control, but could be in their control, is one of the big lessons of democracy. Women go on marches and say ‘Take Back the Night’. I think technology in a democracy is a form of night and we need to take it back,” she says.