Speed: Can technology slow down the spread of lies and abuse?
Most people would accept that the free flow of information leads to human progress and that technology often aids that flow. Movable type, invented by Johannes Gutenberg in the mid-15th century, kick-started an era of mass communication, changing the social, political and economic structures of the world for the better by dramatically bringing down the cost of printed books. More recently, in the 1990s, many welcomed the advent of the internet because they believed it would have a similar impact. The internet has lived up to its promise to a large extent. It has cut down distance and time. Ideas travel faster. There is more pressure on governments and big business. Online courses from Coursera or EdX let us learn from some of the best teachers in the world at the tap of a screen. Quora, Stack Exchange and a host of other platforms get our questions answered by experts, wherever they are.
However, the same technology that spreads information fast can also spread lies and hatred equally fast. “By the time a fact-checker has caught a lie, thousands more have been created, and the sheer volume of ‘disinformation cascades’ make unreality unstoppable,” Peter Pomerantsev wrote in Granta recently. Many are willing to believe these lies because they confirm their prejudices.
Telling lies and spewing hate often go together, and one can see both on full display on Twitter. “We suck at dealing with abuse and trolls on the platform and we've sucked at it for years. It's no secret and the rest of the world talks about it every day,” Dick Costolo, then Twitter CEO, wrote in an internal memo last year. Costolo is no longer the CEO, but the trolls have stayed and continue to rule. Earlier this year, Microsoft launched an artificial intelligence (AI) chatbot called Tay, which learned from the tweets other users sent it. It turned into a racist jerk in less than 24 hours.
Such hatred in what Pomerantsev termed the ‘post-fact’ world can have serious consequences for democracy: the agenda gets taken over by a small, passionate and voluble minority who use technology to band together, amplify their voice and gain outsized influence. That’s the basic argument of Vyacheslav Polonski, a network scientist at the Oxford Internet Institute, in his piece entitled “The biggest threat to democracy? Your social media feed” on the World Economic Forum website.
Can technology solve some of the problems it created? Today, such checks mostly happen through human intervention: on Facebook, for instance, you can report spam and abusive posts. What if technology could recognise lies and hatred, and slow their spread? The first step would be recognising them at all. When it comes to recognising and understanding human language, however, technology is still in its early stages. As anyone who has talked to Siri (or any of its rivals) can attest, machines aren’t quite there yet.
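To make the recognition problem concrete: a common baseline is to treat it as text classification. The sketch below is a toy naive Bayes classifier in Python, trained on a handful of made-up example posts (the training data and labels here are invented for illustration, not any platform's actual system), that labels a new post ‘abusive’ or ‘ok’:

```python
import math
from collections import Counter

def train(examples):
    """Count word frequencies per label from (text, label) pairs."""
    word_counts = {"abusive": Counter(), "ok": Counter()}
    doc_counts = Counter()
    for text, label in examples:
        word_counts[label].update(text.lower().split())
        doc_counts[label] += 1
    vocab = {w for counts in word_counts.values() for w in counts}
    return word_counts, doc_counts, vocab

def classify(text, word_counts, doc_counts, vocab):
    """Pick the label with the highest Laplace-smoothed log-probability."""
    total_docs = sum(doc_counts.values())
    scores = {}
    for label in word_counts:
        score = math.log(doc_counts[label] / total_docs)  # class prior
        denom = sum(word_counts[label].values()) + len(vocab)
        for word in text.lower().split():
            score += math.log((word_counts[label][word] + 1) / denom)
        scores[label] = score
    return max(scores, key=scores.get)

# Tiny, made-up training set -- a real system would need far more data.
examples = [
    ("you are an idiot", "abusive"),
    ("i hate you idiot", "abusive"),
    ("shut up you troll", "abusive"),
    ("thanks for sharing great post", "ok"),
    ("great article very helpful", "ok"),
    ("interesting point well said", "ok"),
]
model = train(examples)
```

A platform could, in principle, use such a score not to delete a post but to slow its spread, say by ranking it lower in feeds. Real systems are far more sophisticated, and the hard cases (sarcasm, context, coded language) are exactly where bag-of-words models like this fail.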
It’s true that a lot of progress has been made. IBM’s supercomputer Watson won the quiz show Jeopardy! back in 2011. Google, which captured the search market with PageRank, an algorithm that analyses the links pointing to a page to gauge its importance, now uses RankBrain to understand the content of a page and return more relevant results. Google and its rivals are at the forefront of the race to make machines understand natural language, with their investments in AI, machine learning and a host of related technologies. An MIT Technology Review article by Will Knight says it’s not going to be easy. Even if machines eventually manage that, a tougher problem would be ensuring freedom of speech while checking its abuse. Human beings might have to learn that first before they can teach it to machines.
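For the curious, PageRank's core idea fits in a few lines: a page's importance is the chance that a random surfer ends up on it, computed by repeatedly redistributing rank along links. Here is a minimal power-iteration sketch over a three-page toy web (an illustration of the published algorithm, not Google's production system):

```python
def pagerank(links, damping=0.85, iterations=100):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    ranks = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_ranks = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # Each page passes its rank, damped, to the pages it links to.
                share = damping * ranks[page] / len(outlinks)
                for target in outlinks:
                    new_ranks[target] += share
            else:
                # A dangling page spreads its rank evenly everywhere.
                for target in pages:
                    new_ranks[target] += damping * ranks[page] / n
        ranks = new_ranks
    return ranks

# Toy web: A links to B and C, B links to C, C links back to A.
ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
```

In this toy graph C accumulates the most rank, because both A and B link into it. RankBrain, by contrast, looks at what a page says rather than at who links to it.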
Automation: What it takes away with one hand, does it give with another?
Japan’s SoftBank Robotics recently hit the headlines in the US for a rather cute reason: it brought its humanoid robot Pepper to the b8ta showroom in Palo Alto this month. The four-foot-tall robot can read customers’ movements, answer questions, and learn from it all. It has been around in Japan and Europe for a couple of years now, and it can effectively take over some of the boring work that salespeople do (and do some other activities better than humans). That places Pepper at exactly the stage where we are most likely to accept robots among us: let machines do the boring stuff, so we can focus on the more interesting and more important stuff.
That’s also one of the biggest lessons from technology so far. In an essay in The Wall Street Journal that traces the impact of technology on jobs, Irving Wladawsky-Berger talks about ATMs. ATMs might actually have helped create more banking jobs: by reducing the cost of operations, they helped banks expand their branch networks, employing more people. That might turn out to be the case with today's disruptive technologies too. However, we still haven’t found answers to two important questions: 1) How will we deal with those who lose out in the short term? Some jobs seem to be more vulnerable than others. And 2) will this rule still hold good when technology is growing at an exponential rate? What if technology starts performing the interesting and important jobs better than humans?
The most optimistic answer one can offer is that technology, thankfully, is nowhere close to doing that.
EdTech: The visible and invisible hands in education
While technologists worry about how to make machines learn faster and better, former Googler Max Ventilla’s main concern is how to make children learn better and faster. AltSchool, the startup he founded in 2014, uses technology to provide highly personalised education to students. The idea that personalised education is more effective is almost intuitive, but the process involves much iteration and much pain, and a lot of money too. AltSchool has raised $133 million so far, employs about 160 people, and has five schools in its network. An essay entitled ‘Learn Different’ in The New Yorker gives a fascinating view of AltSchool’s model.
In the second phase of its growth, Ventilla wants to scale it up through partners: start with small, independent private schools, and expand the base of schools that use its platform.
How to keep your job when robots take over | Fortune
“With AI, there will always be a need for people to code and build the machines, which will lead to a new wave of innovation and jobs that will pay more. Our focus now should be on the training and education to provide the displaced workers with the skills they need to keep up with the jobs of tomorrow.” Tiger Tyagarajan, CEO, Genpact
Uber’s first self-driving fleet arrives in Pittsburgh this month | Bloomberg
“We are going commercial. This can’t just be about science,” Uber co-founder and chief executive Travis Kalanick
Technology vs. human - Who is going to win? An interview with Gerd Leonhard | Forbes
“The reality is that if you ask the question IF technology can do something or not, the answer will almost always be ‘yes’, already—there is pretty much no limit to what technology can do in the very near future. The WHY question will replace the HOW question.”
India’s ascent: Five opportunities for growth and transformation | McKinsey
“Twelve powerful technologies will benefit India, helping to raise productivity, improving efficiency across major sectors of the economy, and radically altering the provision of services such as education and healthcare. These technologies could add $550 billion to $1 trillion a year of economic value in 2025.”
Software programs the world | a16z podcast
“‘All of a sudden you can program the world’— it’s the continuation of the software eating the world thesis we put out over five years ago, and of the trajectory of past and current technology shifts… This episode of the a16z Podcast covers all things distributed systems—encompassing cloud and SaaS; A.I., machine learning, deep learning; and quantum computing—to the role of hardware; future interfaces; and data, big and small.”
Beyond CRISPR: A guide to the many other ways to edit a genome | Nature
“Many labs use CRISPR–Cas9 only to delete sections in a gene, thereby abolishing its function. ‘People want to declare victory like that’s editing,’ says George Church, a geneticist at Harvard Medical School in Boston, Massachusetts. ‘But burning a page of the book is not editing the book.’”