What, me trust?

June 24, 2019: A roundup of news and perspectives on disruptive technology from around the world. In this issue: Deepfakes, Facebook’s Libra, CRISPR, Slack IPO and Oxford University

N S Ramnath

[Photo by Bernard Hermant on Unsplash, cropped from original.]

Note: This Week in Disruptive Tech brings you five interesting stories that highlight a new development or offer a fresh perspective on technology and society, plus a curated set of links to understand how technology is shaping the future, in India and across the world. If you want it delivered to your inbox every week, subscribe here.

Identity politics

Take a look at the two photographs below.

To your left is Yumi, a new brand ambassador for the Japanese skincare brand SK-II. In a video, Yumi says she is “totally obsessed with skincare” and “ready to chat anytime, any place”. On your right is Katie Jones, a Russia and Eurasia Fellow at the Center for Strategic and International Studies, according to her LinkedIn profile. She studied at the University of Michigan, and is connected to the who’s who of Washington: a deputy assistant secretary of state; Paul Winfree, an economist being considered for a seat on the Federal Reserve; and a host of others from top think tanks in Washington.

What’s common between the two? Both images were generated by artificial intelligence (AI). Spend a couple of seconds watching the video and you will figure out that Yumi is not real; even if you don’t, you will hear Yumi herself say so. Katie, however, is a different story. AP investigated her claims and found that the image was most probably AI-generated, and that the account was probably run by Russian intelligence to lure people with access to sensitive information.

Encountering realistic fake people online is one thing. Encountering real people appearing to say fake things is another. You might have come across this speech that Mark Zuckerberg never gave.

In his 1995 book Trust, Francis Fukuyama argued that what separates rich societies from poor ones is the prevalence of trust. Trust reduces friction, increases efficiency and thus aids prosperity; its absence does the opposite. Will eroding trust undermine the gains that new technologies have given us?

Dig Deeper

  • Experts: Spy used AI-generated face to connect with targets, by Raphael Satter | AP
  • Cult beauty brand SK-II’s new spokesperson is a total fake, by Ruth Reader | Fast Company
  • Deepfakes and the new disinformation war, by Robert Chesney and Danielle Citron | Foreign Affairs


Doing the right thing vs. doing things right

Denis Rebrikov, a Russian molecular biologist, wants to implant gene-edited embryos into HIV-positive women. Before implantation, he will disable a gene called CCR5, which encodes a receptor that allows HIV to enter cells, reducing the chances of the baby contracting the virus from its mother.

[Denis Rebrikov]

A Chinese scientist, He Jiankui, did something similar last year, only to be condemned by most of his peers and society at large. Editing genes is a risky business, subject to the law of unintended consequences. It emerged that He might have made the kids, Lulu and Nana, more susceptible to West Nile virus, which can cause neurological disease, and may have shortened their life span. He also chose couples in which only the father was HIV-infected, and such fathers only rarely pass the virus to their children. In short, he probably made things worse for all concerned.

Rebrikov insists that what he is trying to do is different. His approach to editing genes is different, and he is choosing HIV-positive women, for whom the risk of passing the virus to the child is high. In short, he insists he will do the thing ‘right’ this time.

However, the big question around CRISPR-edited babies right now is not whether there is a right way to do it. It’s whether it’s the right thing to do in the first place. We don’t know what the consequences are, and when we don’t have enough information, the best course of action is inaction.

As Nassim Taleb says in his new working paper: “Absence of information is, simply, uncertainty. As an example, if you are unsure about the reliability of the airline, you drive or take the train; if you do not know whether the water is poisonous or not, you just avoid drinking it. Many modelers fail to realize that model uncertainty and disagreements about, say, a certain policy, is itself potent information that command the maximally prudent route.”

Libra and the question of trust

Back when most of us saw Facebook as a force for good, some of my friends, such as Karunakar Raykar, insisted that it couldn’t be trusted. They deleted their accounts and took pains to explain why it was up to no good. When Facebook bought WhatsApp in 2014, it promised the messaging app would remain independent. Many believed Facebook; my friends rushed to their phones to delete the app. Facebook proved them right just two years later, when it said WhatsApp would share data with Facebook. Some users saw it as a betrayal of trust. My friends smiled and said, we told you so.

When Facebook announced its ambitious plans for its cryptocurrency Libra last week, it was questioned on two grounds. One, can Facebook even pull it off? In Axios, Felix Salmon pointed to two previous initiatives, neither of which was short of ambition: Facebook Mobile and Facebook Credits (which this column touched upon last month). Two, can we trust Facebook to do the right thing? It is selling Libra as a decentralised system, but it is, in fact, an elite club. As the title of Ethereum co-founder Joe Lubin’s piece in Quartz puts it: “Facebook’s cryptocurrency is a centralised wolf in decentralised sheep’s clothing”.

Naval Ravikant, one of the sharpest observers of tech and society, summed it up in a tweet:

“Libra is a fine digital currency, but it’s not a cryptocurrency.

Cryptocurrencies have value precisely due to decentralization.

Muddling the two may be in Libra’s interests in the short term, but is against everyone’s interests in the long term.”

Jane Jacobs, in her fantastic book Systems of Survival: A Dialogue on the Moral Foundations of Commerce and Politics, argued that commerce and politics come with their own sets of values. When you mix the two, there is a risk of creating “monstrous moral hybrids”. Libra faces that risk.

Dig Deeper

  • Facebook’s cryptocurrency is a centralised wolf in decentralised sheep’s clothing, by Joe Lubin | Quartz
  • Don't bother worrying about Libra, by Felix Salmon | Axios
  • Why USV is joining the Libra association, by Fred Wilson | AVC

Pop psychology

If you meet entrepreneurs who speak glowingly about how a company’s stock price jumped on the first day of trading after its IPO, question their ability to be rational. First-day “pops” are a fraud on entrepreneurs: they mean the offering was underpriced, and that money could have gone to the company instead.

Venture capitalist Bill Gurley recently tweeted a thread:

“One perspective - CrowdStrike (and other way underpriced deals) are the true definition of a ‘broken’ IPO. The stock was grossly under-priced by over 80%. As there were 20.7mm shares sold, this delta represents $575mm (20.7*27.8) that is absent from the company's coffers.”

“This same math holds for the Zoom IPO. Roughly 24mm shares, & the stock is underpriced by $26/share. Priced properly the company could have over $600mm in its bank account with the exact same dilution.”

“Imagine if a CFO/CEO gave away a half a billion dollars? Or simply squandered it. How would that be viewed? This is similar, but it's institutionalized, and therefore everyone is numb to it. And the press views a ‘pop’ as success, which is just poor financial comprehension.”
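The arithmetic in the thread is straightforward: shares sold times the gap between the offer price and where the stock actually traded. A minimal sketch, using only the figures quoted in Gurley’s tweets (not independently verified):

```python
def money_left_on_table(shares_mm, price_delta):
    """Dollar value (in $ millions) absent from the company's coffers
    when an IPO is priced below where the market actually clears."""
    return shares_mm * price_delta

# Figures from Gurley's thread:
crowdstrike = money_left_on_table(20.7, 27.8)  # ~$575mm
zoom = money_left_on_table(24.0, 26.0)         # $624mm, i.e. "over $600mm"
print(round(crowdstrike), round(zoom))
```

The point of the exercise: the same dilution, at a properly set price, would have left the cash with the company rather than with the IPO allocants.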

So when Slack took the direct listing route, he was happy: “Kudos to @stewart & the team @SlackHQ for pushing forward with direct listing model.”

“With a traditional IPO, allocation is determined based on personal relationships, your firm’s brand, how well you know the banker, etc. Believe it or not, with a traditional IPO, a willingness to bid the highest price WILL NOT assure you allocation in the IPO.”

The question comes back to trust. Whom will you trust more?

Dig Deeper

  • Slack’s unusual offering isn't designed to have a first-day pop. Here's why most IPOs still factor in a pop, even though they can cost startups billions, by Troy Wolverton | Business Insider
  • Slack goes public: What we learned from its direct listing debut, by Anne Sraders | Fortune


Techies and fuzzies

Take a look at the top seven universities in the world, and the top seven in India, according to the QS World University Rankings, and count how many of their names include “Technology” or “Science”.

World:

  • Massachusetts Institute of Technology
  • Stanford University
  • Harvard University
  • University of Oxford
  • California Institute of Technology
  • ETH Zurich - Swiss Federal Institute of Technology
  • University of Cambridge

India:

  • Indian Institute of Technology Bombay
  • Indian Institute of Technology Delhi
  • Indian Institute of Science
  • Indian Institute of Technology Madras
  • Indian Institute of Technology Kharagpur
  • Indian Institute of Technology Kanpur
  • Indian Institute of Technology Roorkee

In the world, it’s three out of seven, and in India it’s seven out of seven. Is this an indicator of the big gap between techies and fuzzies in India?
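The count is easy to verify. A quick sketch over the names listed above:

```python
world = [
    "Massachusetts Institute of Technology",
    "Stanford University",
    "Harvard University",
    "University of Oxford",
    "California Institute of Technology",
    "ETH Zurich - Swiss Federal Institute of Technology",
    "University of Cambridge",
]
india = [
    "Indian Institute of Technology Bombay",
    "Indian Institute of Technology Delhi",
    "Indian Institute of Science",
    "Indian Institute of Technology Madras",
    "Indian Institute of Technology Kharagpur",
    "Indian Institute of Technology Kanpur",
    "Indian Institute of Technology Roorkee",
]

def tech_or_science_count(names):
    # Count the names containing "Technology" or "Science"
    return sum(1 for n in names if "Technology" in n or "Science" in n)

print(tech_or_science_count(world), tech_or_science_count(india))  # 3 7
```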

It’s not that the gap doesn’t exist in the West. But there is a growing realisation that, in the age of artificial intelligence, the gap is not doing anybody any good, and that realisation is backed by efforts to bridge it. The most recent is the record £150 million donation by Blackstone boss Stephen Schwarzman to the University of Oxford. Last year he contributed $350 million to the Schwarzman College of Computing at the Massachusetts Institute of Technology. He has said the two donations are complementary, and that he hopes they will bring Oxford’s “world leading expertise on ethics and philosophy” to AI.

N. Dayasindhu of itihaasa Research and Digital, a Founding Fuel contributor, recently pointed me to the interesting work being done in this area at IIT Madras, where a colloquium held earlier this year explored these themes.

Dig Deeper

  • The Fuzzy and the Techie: Why the Liberal Arts Will Rule the Digital World, by Scott Hartley | Buy the book on Amazon
  • Everyone’s talking about ethics in AI. Here’s what they’re missing, by SA Applin | Fast Company
  • Leading your organisation to responsible AI, by Roger Burkhardt, Nicolas Hohn, and Chris Wigley | McKinsey Quarterly




About the author

N S Ramnath

Senior Writer

Founding Fuel

NS Ramnath is a senior writer and part of the core team at Founding Fuel, and co-author of the book, The Aadhaar Effect. His main interests lie in technology, business, society, and how they interact and influence each other. He writes a regular column on disruptive technologies, and takes regular stock of key news and perspectives from across the world. 

Ram, as everybody calls him, experiments with newer storytelling formats tailored for the smartphone and social media, and shares the outcomes with everybody on the team. These become part of a knowledge repository at Founding Fuel and are continuously used to experiment with content formats across all platforms.

He is also involved with data analysis and visualisation at a startup, How India Lives.

Prior to Founding Fuel, Ramnath was with Forbes India and Economic Times as a business journalist. He has also written for The Hindu, Quartz and Scroll. He has degrees in economics and financial management from Sri Sathya Sai Institute of Higher Learning.

He tweets at @rmnth and spends his spare time reading on philosophy.
