Note: This Week in Disruptive Tech brings to you five interesting stories that highlight a new development or offer an interesting perspective on technology and society. Plus, a curated set of links to understand how technology is shaping the future, here in India and across the world. If you want to get it delivered to your inbox every week, subscribe here.
The colour of money
Is it ethical to accept money for your high-profile research lab from a bad person (who, by the way, seeks no publicity for the donation)? There are two ways to approach the question.
If you are a deontologist, you will go by some kind of code—like Warren Buffett’s newspaper rule—and you will say no if it violates that code.
If you are a utilitarian, you will weigh the benefits against the costs. The benefits: the funding can support top talent in science and technology and potentially help solve some of the most wicked problems facing the world today. The costs: the bad person gains legitimacy by association with your prestigious institute—at least among the group that knows of the association. And if news of the relationship comes out, it will hurt your reputation.
It looks like MIT went by some kind of utilitarian logic when it accepted money from Jeffrey Epstein. The MIT Media Lab’s director, Joi Ito, has now resigned, and tough questions are being asked of others who accepted money from Epstein.
The lesson from all this is not that utilitarianism is bad, but that utilitarianism without transparency can be.
- How an elite university research center concealed its relationship with Jeffrey Epstein | The New Yorker
- Why MIT Media Lab thought it was doing right by secretly accepting Jeffrey Epstein’s money | Vox
- On Joi and MIT | Lawrence Lessig
The bite of a mosquito
Many years ago, an NGO, concerned about the smoke-filled kitchens of the poor in a village (indoor smoke leads to lung disease and early death), decided to install chimneys that would take the smoke right out of the houses. It seemed to work. But sometime later, almost all the houses that had installed chimneys reported collapsing roofs. It turned out that the smoke which caused disease in the women and children of the house also killed the termites in the beams supporting the roof. With the smoke gone, the termites came back, and the roofs started collapsing.
It’s difficult to escape the law of unintended consequences. A few years ago, millions of genetically modified mosquitoes were released in a city in Brazil. The idea was that female mosquitoes mating with the GM mosquitoes would not be able to produce viable offspring, reducing the risk of diseases like Zika and dengue. However, a Yale study showed that the GM mosquitoes passed genes on to the local mosquito population. What’s worse, after a small decline, mosquito numbers rebounded within 18 months.
Nassim Taleb, a proponent of the precautionary principle in such matters, had this to tweet:
Of course, Bill Gates doesn't get that neomania works with software with reversible errors, not with complex systems.— Nassim Nicholas Taleb (@nntaleb) September 14, 2019
- Transgenic mosquitoes pass on genes to native species | Yale News
- The prophet of unintended consequences | Strategy + Business
- The law of unintended consequences: Shakespeare, cobra breeding, and a tower in Pisa | Farnam Street
The genie is out of the bottle
In 1995, a man named Timothy Ray Brown was diagnosed with HIV. Twelve years later, he underwent a stem cell transplant to treat leukemia, and had a second transplant a year later. The donor of the stem cells had defective copies of a gene called CCR5. It’s rare to find people with this mutation, but those who have it are naturally resistant to HIV. Having received stem cells carrying those genes, Brown was cured of HIV and is considered the first person to be cured of the disease.
Since it’s rare to find donors with mutant CCR5, a group of Chinese scientists edited the gene using CRISPR and used the edited stem cells for transplantation in a patient infected with HIV. They were able to edit about 17.8% of the donor’s stem cells, and post-transplantation, edited cells made up 5% to 8% of the recipient’s stem cells. The idea is that these cells will replicate and ultimately cure the man of HIV. We don’t yet know what the unintended consequences will be.
- Chinese scientists have tried to cure HIV with CRISPR gene editing: ‘The genie is out of the bottle’ | Newsweek
- A simple guide to CRISPR, one of the biggest science stories of the decade | Vox
Man-machine interfaces and society
In a new report, iHuman: Blurring Lines between Mind and Machine, top scientists at the UK’s Royal Society call for people to engage more deeply with science and technology:
“All technologies create both benefits and risks and this report seeks to set out a course towards maximising the former while minimising the latter. However, the unavoidable point for opinion formers, decision takers, policymakers and the public is that they have the opportunity to shape the future of neural interface technologies. These technologies are being developed now. Investment is accelerating. The impacts will be profound—and if they are to be positive ones, society needs to be engaged early and often.”
The biggest challenge is not that scientists and technologists are unwilling or unable to connect and engage with society (even though they are a big part of the problem). The biggest challenge is that the loudest voices in this space belong to a group who claim to speak for society, but who can’t see beyond their own social circles or funding opportunities.
- iHuman perspective: Neural interfaces | Royal Society
- What can we do about the science communication crisis? | Scientific American
- When you give your biometrics to Modi | Manu Joseph, Mint
The problem with global platforms
For ages, human beings have lived their lives drawing thick circles around their neighbourhoods, workplaces, towns, cities and states. The internet promised the death of distance, yet even there, as Pankaj Ghemawat pointed out in his book World 3.0, most interactions happen within national boundaries. The world is not flat.
Big Tech businesses, which have built global platforms, are feeling pulls and pushes from various directions. The US is aghast that Russia intervened in its 2016 elections by spreading misinformation on platforms such as Facebook, YouTube and Twitter, and doesn’t want a repeat in 2020. Russia, meanwhile, has complained to Facebook and Google that the ads they carry on their platforms interfere in its local elections.
Meanwhile, tech companies are strengthening their editorial control over their platforms. Google announced that it has changed its algorithms to prioritise original news reporting. Facebook is hiring editors to determine how news will be displayed. Snapchat is creating a channel for election debates.
Whether these moves will help control propaganda on these platforms, or merely raise expectations only to disappoint further, is a more difficult question to answer.