I met Scott Hartley a few times last year. We bumped into each other at various lit fests. The Aadhaar Effect that I had co-authored with my colleague NS Ramnath was just out and Hartley’s The Fuzzy and the Techie was being talked about as well.
I found the theme he wrote about intriguing. The central hypothesis of his book is that those who have come from a liberal arts background are better equipped to navigate the complexities of the contemporary world. I thought it a contrarian narrative. I have always maintained that a strong grounding in the pure sciences is important for everyone. Incidentally, over the last few months, this theme has been the subject of some vociferous debate internally as well.
When I met Scott, and after having read his book, I asked him if he would be interested in engaging in a podcast on the theme with me.
I enjoyed listening to him as he took my questions head on. What we did agree upon is that there are no absolutes. The truth is nuanced.
I hope you enjoy listening to this conversation. And all of us on the team would very much appreciate it if you share the thoughts Scott’s comments trigger in your mind.
Why purely technical ability is not the ticket to the future
Charles Assisi: I read your book, The Fuzzy and the Techie, with much interest. Over the last few weeks, many of us have been discussing your book. On the one side, Scott, there are a bunch of people on the team who strongly believe in what you have articulated—that the next century will belong to fuzzies, as you like to call them. I’m a sceptic. I have articulated it in as many words and I have said that this is a challenger narrative. So what I’d really like to start with, Scott, is this: if I were to go back to your growing up years, for instance, you’ve had the privilege of being mentored by some very fine minds, such as Esther Wojcicki…
Scott Hartley: Yes, that’s right. It’s a pleasure to be here with you, Charles. We like to call her Woj. That was her nickname, because her Polish last name was difficult for us to say. She is the mother of both the CEO of YouTube [Susan Wojcicki] and also [the founder of] 23andMe [Anne Wojcicki], the genomics company. So she’s sort of dubbed the godmother of Silicon Valley, if you will.
Charles: Exactly. That’s where I was coming from. You had the privilege of being mentored by someone like her in your very early years. And if I were to argue that you can make the case for a strong grounding in the liberal arts or humanities, because you have had privilege on your side, and you can safely ignore the hard sciences and technologies—how may you take to that? This, incidentally, is an argument that is also used closer home when I speak to friends who graduated out of premier institutions such as the IITs and who have chosen to branch out into the other liberal streams. They can, since they have already built a rock-solid base for themselves, branch out into whatever they choose. How would you take this argument?
Scott: Taking a step back, the title of the book is The Fuzzy and the Techie. These terms actually come from Stanford University, where they were the light-hearted ways of referring to social science or humanities majors as fuzzies and hard science or technical majors as techies—I'm going back to the 1970s. Really, though, this is a false opposition. The book is not one versus the other. This is not a battle of the fuzzies versus the techies. What this is, is really going back to 1959, to a famous lecture that was delivered at Cambridge University by Charles Percy Snow. CP Snow lamented the chasm opening between the sciences and the humanities, saying, we've divided our world into people who study thermodynamics and engineering, and people who study Shakespeare and Hamlet. And it’s not one versus the other. It’s really the intersection of the two. Great minds, as F. Scott Fitzgerald said at one point, have the ability to hold two opposing ideas in the mind at the same time and still retain the ability to function.
We’re in this knee-jerk one versus the other mentality today. And that was really the notion behind writing the book — taking a look at Silicon Valley where the narrative — it’s very similar to the narrative in India, where I lived for over a year — where if you have technical skills, if you study the science, technology, engineering and math (STEM) majors, you suddenly have this golden ticket to the future. And that’s fundamentally false.
The truth is that the leaders of all these great companies have some technical ability to navigate our world today, but they also really have an ability to take technology and apply it to the most meaningful human problems that exist. And really the story behind The Fuzzy and the Techie is saying, the value of technology is in solving human problems. And to solve human problems, we fundamentally need an understanding of the humanities, the social sciences, how we come together and self-organise, how we navigate society together, our own psychology, what we need as people and individuals. These are really the drivers of the best tech companies.
The observation that I had in Silicon Valley was this one-sided narrative that if you study computer code, you suddenly have this ticket to the future. In actuality, when you look around some of the best founders, whether it be Stewart Butterfield, who's the CEO of Slack — Stewart has two degrees in philosophy, no engineering degree. If you look at the CEOs of YouTube, Susan Wojcicki, who is a history and literature major, or Peter Thiel, who is famous for founding PayPal and Palantir, and for being the first investor in Facebook — Peter Thiel has a law degree and a philosophy degree. Reid Hoffman, who founded LinkedIn, also is a philosopher.
If you go down the list of some of the best founders in Silicon Valley, you actually find that they are global thinkers who’ve partnered with techies, partnered with somebody who's coming out of IIT. But certainly, it’s not this monolithic notion that you have technical skills, therefore, you can become a driver, a steward of innovation tomorrow. If anything, I think the rise of purely technical folks — if you look at some of the data, the job growth is actually, according to David Deming, who’s an economist at Harvard — David has done research showing that the job growth, at least in the United States, is not in engineering, it’s actually a combination of what he calls high math and high social. So it’s really putting an emphasis on soft skills and social skills. Purely technical ability is actually not the ticket. It’s both sort of fuzzy and techie. And that’s really the driver behind why I wrote the book and what the book is about.
The liberal arts is actually the study of the sciences and humanities in equal measure
This notion that liberal arts is just humanities is also false. The liberal arts comes from the Latin phrase artes liberales, which was really about stretching the human mind by exposing it to a plurality of subjects. These subjects included logic, mathematics and biology; they also included history, geography, and literature. So, the liberal arts is not just the study of poetry. The liberal arts is actually the study of the sciences and humanities in equal measure.
Self-driving cars aren’t just about tech. They’re also about anthropology and sociology
Charles: That brings to mind Walter Isaacson’s biography of Leonardo da Vinci. And how he portrayed Da Vinci’s character and diverse interests — the way you portrayed it, actually.
Scott: Yes, the Walter Isaacson book is really incredible. When you look beneath the surface of innovation, whether it’s going back to Da Vinci, or we’re talking about the really cutting edge stuff today like autonomous vehicles and robotics, we tend to think of these as monolithically techie topics, right? We think self-driving cars is about sensors, it’s about computational power, it’s about using LIDAR sensors and video to better understand what’s happening on the road.
What’s interesting though is, if you go to Silicon Valley, the head of the Nissan autonomous vehicle research centre is actually an anthropologist named Melissa Cefkin. Melissa has a PhD in anthropology from Rice University in Texas. And it's maybe surprising to people to say, ‘well, why would an anthropologist be the head of a self-driving car, autonomous vehicle unit for Nissan?’ And the reason is, because it’s not purely a technical problem. It’s actually a problem of intuiting and understanding how humans communicate. If you are Nissan and you are exporting cars to many countries around the world, you have to understand the tacit ability of humans to communicate: what a hand gesture means somewhere, or what a head nod means, or just verbal communication that may happen.
What’s fascinating about this is, something that we think of as a purely technical problem, actually gets its roots in understanding human communication. So there’s a huge anthropological or sociological challenge there.
Similarly, with robotics. End of life care in hospital systems is one of the primary use cases for robotics. Yet, the trust around robotics is one of the biggest blockers to adoption within these organisations. It’s no wonder that there are people like Catie Cuan, a trained ballerina who’s now getting a PhD in mechanical engineering, focusing on ballet choreography for robots — basically teaching robots graceful manoeuvres. The reason is, because scientific studies have shown that graceful manoeuvres lead to greater human trust of those robotic processes.
So, we think of self-driving cars and robotics as these techie things, yet, there are these incredible roles and opportunities for folks that have come from other backgrounds.
Similarly, in data science we think if we have more information, it leads to more knowledge and wisdom. But we forget that this argument goes back to Plato, Sir Francis Bacon, to Voltaire saying, judge a person by their questions, not by their answers or their information. Today, if you look at — for example, predictive policing is something where we often point to having data and being able to deploy it to predict crimes in advance. Actually, the data that we have on crimes is not omniscient. It’s not where crimes happen. It’s where crimes have been reported in the past. And obviously, if you’re in certain communities in Mumbai, or Delhi, or Bangalore or any other city in India, where crimes are reported is highly reflective of the socio-economic status of communities, of where people feel safe, where people feel welcome to report crime. So, certain crimes are chronically underreported and others are over reported. So there’s a huge context that’s sociological, that’s criminological. That’s reflective of religion and economics. That’s deeply rooted in what the data is. So if we believe that this is purely a technical problem, and that we can create an algorithm that is somehow objective, that somehow creates fairness in our society, we’re fundamentally flawed in thinking that, because ultimately, these are sociological, anthropological issues that require a plurality of insights.
Technical literacy is certainly necessary, but it’s definitely not sufficient
So, if we have somebody who’s only studied data science and computer code, suddenly trying to deploy an algorithm that says, I can understand crimes, we're at a major loss for the context beneath the code. And this code is not objective. This code is reflective of the inputs and the people and the biases and the backgrounds that go into codifying things into ones and zeros. All this is to say that technical literacy is certainly necessary, but it’s definitely not sufficient.
Does the developing world need a deeper, sharper focus on tech?
Charles: I see where you’re coming from. But if I were to still hold on to my position for a moment — and this has much to do with the geography where I am, in Mumbai in India, and there are a lot of imponderables. Let me put it this way: In the Western world, technology has had a certain time to move from the industrial age to the digital age, and the masses, the population, had a certain time frame to absorb all of it; the economy followed a certain trajectory. Now, if I look at what is going on around me, the ecosystem around me, the technologies are leapfrogging an entire generation. And you have an entire new ecosystem coming in. This was brought out in stark contrast when I was working on writing The Aadhaar Effect, about the kind of changes that will happen to society. There's this massive debate going on about what kind of upheavals this will cause in society. And should this be slowed down? Now, how may you view it? In that you [in the Western world] had this luxury where you could absorb all of it. But in this part of the world, you need to study technology far more deeply, have a deeper grounding in it. And you need to understand it far more sharply before you take that big jump into the liberal arts.
Scott: It's interesting. I think that there’s always an infrastructure layer of technology being laid down. There’s certainly a role for deeply technical, scientific inquiry. The book is not in opposition to that reality. If anything, I’ve cut my teeth and spent my career as a fuzzy, as somebody who’s studied philosophy, and political science, political theory, international relations. Yet the last 20 years of my career have been at companies like Google and Facebook and advising startups, investing in startups, working in venture capital, where I have met literally thousands of entrepreneurs. And my observation was that some of the best entrepreneurs that I've met with, who have built some of the biggest companies across the world, not just in the United States — they tend to have both an appreciation and familiarity with technology, to be certain, but they also have a broader curiosity, a broader passion for observing the world around them, and being able to apply that technology to solve what they think to be some of the most meaningful problems.
One example is a colleague of mine, who studied computer science, but came to Columbia University in New York. He’s originally from India and now lives in Bangalore, and has just raised money from Y Combinator in Silicon Valley to build what is basically a platform to onboard delivery workers across India who work for Zomato, or Dunzo, or Uber Eats, or a number of different delivery programmes.
What he observed was a kind of sociological challenge — there was high turnover for people that were entering the formal economy, people who spoke a plurality of languages — as anyone knows, India is as complex a society and country as any. I think in many ways, it’s much more similar to Europe, in the plurality of languages, backgrounds and culture than it is to the United States that’s more of a monolithic kind of uniform place. Yet a lot of people get that wrong.
His name is Madhav [Krishna]. He basically observed that there is this friction where people are trying to enter the formal economy, but they don’t quite know how to do it. He also witnessed that there are over 200 million people in India who use WhatsApp. And so he built an AI tool that helps delivery workers, or anyone really, enter the formal economy by chatting with the company — it’s called vahan.ai. Basically, by word of mouth, and by sort of spreading this, enabling somebody to speak or to type into Vahan — whether they're typing in Telugu, or in Hindi, or Malayalam — basically allowing people to speak their own language, allowing people to communicate via the platform that makes sense to them — WhatsApp. And then allowing them to potentially get a job as a delivery person for one of these companies that are looking to hire close to a million people a year.
So, of course, it’s a technical product. He’s building artificial intelligence to build a chat bot that can onboard somebody in 10 languages over WhatsApp. And he has an engineering team. But the insight and what actually gave this company the ability to raise money from someone like Vinod Khosla, and gave it the ability to go to a place like Y Combinator, and the thing that excited investors, the thing that excited people was applying technology to a really fundamentally meaningful human problem that millions of people in India are facing around entering the formal economy.
You can say this is a technical product, and of course it is, but the insight and I would say the driver of what will make this company successful, is really a keen awareness and a keen empathy with the people around him, and observing what the opportunities were in India, and then building a technology to solve a fundamental human pain point.
I think that there are kernels in there of his exposure to and his curiosity about so many things broader than just technology that really will give him the chance to be a successful entrepreneur.
It’s not that emerging markets must focus on technology and developed markets [have the] privilege of focusing on reading poetry and literature. It’s much more nuanced than that. There is an infrastructure layer of technology that will always be at the forefront, whether today that’s cryptocurrencies or Bitcoin, or artificial intelligence or deep learning or machine learning. Of course, there are jobs for the top IIT graduates, and they’ll do quite well.
The argument is basically that you don’t have to be a technical person steeped in the perfect degrees from IIT Delhi or IIM Kanpur or wherever it might be. You can take your experience and your observations of the world, you can be a student of life. And you can also these days learn enough through these incredibly diverse and democratic platforms online, to basically level up in technical skills where you have sufficient ability to hire somebody around you, you have sufficient ability to story tell, to raise capital from investors around a big idea like Vahan, where you don’t need to be able to build the whole thing yourself.
You need not be a machine learning expert to apply machine learning to a problem that you see in the world
My takeaway is that, observing Silicon Valley through the 1990s, there was a lot of infrastructure being laid, the bare bones of the internet. But then it quickly became an application layer, where the opportunities were building dotcom websites that were solving different problems. They weren’t necessarily writing the HTTP protocols, or the TCP/IP protocols, or creating Netscape. Those were the infrastructure layers. And then, a million entrepreneurs were building the application layer where they were looking at what problems to solve.
Today, the same thing applies. You need not be a machine learning expert to apply machine learning to a problem that you see in the world. Or, you need not be an expert in some sort of AI to be able to say, I believe that there are processes within my work organisation where there must be some technology that I could steer to help solve some process that is highly repetitive.
The ultimate irony is that some of the fastest processes to be automated have to do with coding itself
The other thing that I would say in closing here is that when we think about automation, when we think about the people on the fringe of society — what are the skills that allow us to have a path to relevance in tomorrow’s world — the ultimate irony is that some of the fastest processes to be automated are routine processes. And some of those actually have to do with coding itself.
Does the entrepreneur need to blend fuzzy-techie in his mind?
Charles: That’s an interesting point you make Scott. The example you quoted, Vahan — what to your mind is working well for them? Is it a fuzzy and the techie working together? Or is it the mind of the entrepreneur there — is that a combination of fuzzy and the techie, as you like to put it? Are they amalgamated in one head, in one brain?
Scott: That’s a great question, Charles. I've seen both be successful.
Where I think success is difficult, where we kind of lose the argument in the book is, the book is not saying — I’m not arguing that one ought to be, at least in this transactional sense of where opportunity is, as far as entrepreneurship, or economic development — there certainly is an argument for someone to just be a poet for poetry’s sake, and for the beauty of art and culture, and the importance that plays in society. That is fundamentally an important part of our world. I think that there is an incredible place for purity of study in all these different domains.
As it pertains to entrepreneurship, and the ability to found and create meaningful companies, I’ve seen both — people like Madhav Krishna, who is a technical person, but has a deep appreciation, a deep curiosity, has studied a number of different things, and has travelled the world, has paid keen attention to some of the problems and challenges that he's witnessed in India, and then around the world, to be able to apply technology to those problems. He’s an example of somebody who’s technical with a very kind of fuzzy appreciation—he blends both pretty well himself.
In other teams, I've seen a highly technical person, and a highly un-technical person who has a deep ability to story tell, a deep ability to hire and communicate a vision.
I don’t think that there’s one that’s better than another. I would say, though, if you have two people, three people who have no broader interest than just writing computer code — I’ve invested, unfortunately, in those companies, I’ve worked with a lot of those companies. And I have to say, those are typically the companies that do the poorest, in my experience.
We think of a CEO as chief executive officer, but it’s really the chief evangelist or the chief storyteller
[They have an] under-appreciation of some of the je ne sais quoi — the design, the communication, the ability to story tell and hire.
We think of a CEO as chief executive officer, but it’s really the chief evangelist or the chief storyteller. And if you have somebody who lacks some of that ability, they may be as technical as can be — but Larry [Page] and Sergey [Brin] without Eric Schmidt would not be Google. Facebook without Sheryl Sandberg would not be Facebook. LinkedIn or Slack without Reid Hoffman or Stewart Butterfield would not be successful. So, I think it’s a blend of these two sides, for sure.
Charles: You made the point earlier on, you’re familiar with India, and you’ve seen the Indian obsession with technology. And with all things pure sciences, and engineering. I started out by asking you and trying to articulate a case for the pure sciences. Where do you reckon this comes from? Is this an Indian or an Asian way of looking at the world? In your experience as an investor, as a writer, what are your observations?
Scott: What’s interesting in some of the feedback over the course of writing and speaking about the book over the last year or two, the overwhelming observation that I've had is — for example, we did a panel at Stanford University with Marissa Mayer, who was the first female engineer at Google. She’s a computer scientist. She was also the CEO of Yahoo. And one of the things that was fascinating is, going into the panel, a couple people said, are you nervous, you’re speaking with Marissa Mayer, who’s a computer scientist, you’re speaking with the head of computer science at Stanford, these people are surely going to disagree with your premise. And actually on the panel, not only was there agreement, but we had these breakthroughs where people like Marissa said, I’ve never actually said this out loud. I’ve never thought about this. But one of the ways that I’ve been able to develop products at places like Google and Yahoo — she basically said, the course that she thinks about the most, the course that has become relevant in her life the most over the 20-25 years, since she graduated with a slip of paper that she put in a frame on the wall that says ‘computer science’, the class that she said was the most impactful was actually a theatre arts class.
You might scratch your head and say, how is it that somebody who works in engineering and product development at a tech company thinks a theatre class was useful 20 years later? And the reason she said is, if you watch a Bollywood film from the 1970s, or you watch a Bollywood film today, with Hrithik Roshan and people running around, the expectation of how you had to unpack a love scene in a 70s Bollywood film, the love scene would go on for 20 minutes of dancing through beautiful flower fields in Switzerland. Today, it’s a single glance across a bar at a trendy Mumbai nightclub. See, the expectation of how you unpack, explain something, the presumption of knowledge in your audience is so different. So she basically said, when she studied Broadway plays in the 50s, again, you’d have a 10 minute dancing scene to explain that a couple was in love. Whereas today, it’s a 10 second glance that the audience understands and understands what’s going on. She said the same thing — five years ago, if you were building a tech product, a mobile app, you would have to explain through maybe a 10-step process, how somebody was supposed to swipe or scroll through a Facebook newsfeed or an Instagram newsfeed. Today, because of the ubiquity of these platforms, and the ease of [use] — anyone in a village, in any part of India could show you how to scroll through Facebook. It’s second nature at this point. Yet, that was something that needed to be explained five or 10 years ago. And so she said, what was fascinating to her was, she was thinking metaphorically across these different domains, and really gleaning an appreciation and sort of an insight for good product development through this experience with studying theatre.
At great companies like Apple, I've talked to people as well. There was a lecture by Joshua Cohen. He was brought in after the death of Steve Jobs to try to retain some of the culture of Apple. Josh is a moral philosopher who studied under John Rawls at Harvard. He was brought in to basically lecture to engineers at Apple about things like landscape architecture. One of the lectures that Josh would deliver at Apple was on the development of Central Park and how landscape architect Frederick Law Olmsted designed Central Park to have paths that are all curved, because you want to democratise the ability to experience nature for people that didn’t have the privilege of going out into the woods. And so you might say, well, what’s the purpose of having a moral philosopher talk to engineers at Apple about the landscape architecture of Central Park? But there’s some metaphorical — there’s some breakthrough where maybe there is some beauty to the appreciation of drawing these parallels, connecting these dots. Where, if you look at a company like Apple, and you say, how have they been so innovative, how have they created such beautiful products? How have they really driven the innovation engine forward? Of course, we have Xiaomi in China, and we have other companies that have followed quickly in their footsteps and developed cheaper, faster products. But really, the pioneers of creating these beautiful products were the people at Apple. And you wonder, what is it that gave them that ability? I think it's some of this orthogonal thinking. I think it’s this appreciation, as Steve Jobs had, of calligraphy and the beauty of fonts that created the WYSIWYG interface. Otherwise we’d be staring at dark Unix screens with code on them, had it not been for Steve Jobs’ appreciation for the beauty of sans serif and serif fonts and sort of bringing that to the user interface of the personal computer.
So, all these little overlaps and epiphanies point to the importance of having a balance between these two sides.
One final point I would make is, if you talk to somebody who is a deep scientist — for example, if you talk to a physicist, whether that physicist is in China, India, Britain, or the United States — for example, you talk to Italian cosmologist Carlo Rovelli, and I had the chance to speak with him when he was here. He’s an incredible physicist, but also a philosopher. Where this circle comes all the way around when you talk to a deep scientist, is that these are not mutually exclusive principles, right? If you’re going into the depths of cosmology, there comes a point where fact runs into faith, and where faith becomes philosophy.
What is the merging of computer science and philosophy? Well, that’s artificial intelligence
I was talking to somebody the other day, and they said the reason they became a computer scientist was because they love mathematics. And they also love foreign languages. And they thought of computer science as a blend of a foreign language and math. Then he said, and then I developed this passion for not just computer science, but for philosophy. And then I thought, what is the merging of computer science and philosophy? Well, that’s artificial intelligence. So we have these new terms that are not so dissimilar from old terms like architecture, where architecture is art and mathematics put together. So we have these things that are sort of already blended, fuzzy and techie; we just maybe think of them in monolithic terms like physics. But in fact, cosmology is really a blend of particle and theoretical physics and philosophy.
Charles: I'll have to concede that one. Going back to my college days, I have to admit that probably some of the most fascinating conversations I've had, have been with my teachers and with my seniors, biochemists and physicists, who had their heads deep into philosophy and poetry and who would keep us awake through the nights. And they used to be absolutely fascinating conversations. After listening to you, I don't think I'll come in the way of my daughters and me arguing about whether or not you should get into the humanities or take up the pure sciences. I suspect that argument will not stand to scrutiny. It's been a pleasure listening to you. Thank you so much for taking time out.
Scott: The pleasure is all mine, Charles. Thank you so much for having me.