The moral crisis at Facebook

Is it because of its business model, its culture, or something deeper?

N S Ramnath

[Mark Zuckerberg on stage at Facebook's F8 Developers Conference 2015. The screen shows updated user numbers for Facebook, Instagram, WhatsApp, Messenger, and Groups. Photo by Maurizio Pesce from Milan, Italia (CC BY 2.0), via Wikimedia Commons]

These are tough days for Facebook. It has been accused of causing addiction, violating privacy, spreading fake news, aiding hate speech and violence, and undermining democracy—and of throwing muck at those who criticise it.

The crisis has set off intense debates about what caused it and how to fix Facebook. Some say the problem is with its business model; others point a finger at its culture.

Business Model vs Culture

Jonathan Haidt, a professor at New York University Stern School of Business, best known for his moral foundations theory, recently said he knew people inside Facebook, and that they were good people who were interested in ethics and in the ethics of business. The problem was with Facebook’s business model, which had “a fundamental flaw”. Tim Wu, a professor at Columbia Law School and author of The Curse of Bigness: Antitrust in the New Gilded Age, has argued that the solution is turning Facebook into a non-profit like Wikipedia.

It’s easy to understand where Haidt, Wu and other business-model determinists are coming from: your business model determines most of the decisions you take as a company. When your business model depends on collecting more and more data from your users and selling that to advertisers, and when your investors have bet that this will go on and on, you end up making the decisions that Facebook has made. In short, you can’t serve two masters. And the person who doesn’t pay you can’t be your master.

The opposite view is that these problems entered Facebook along with its leaders, who set its culture. A recent PBS documentary, The Facebook Dilemma, traces the company’s culture back to its early days—a culture defined by statements and slogans such as:

“Move Fast and Break Things”

“Done is better than perfect”

“What would you do if you weren’t afraid?”

“In a world that’s changing really quickly, the only strategy that is guaranteed to fail is not taking risks.”

Similarly, a piece in The New Yorker by Evan Osnos (author of Age of Ambition: Chasing Fortune, Truth, and Faith in the New China) earlier this year placed the blame on CEO Mark Zuckerberg for not having the will to change Facebook. The question, Osnos wrote, “is not whether Zuckerberg has the power to fix Facebook but whether he has the will; whether he will kick people out of his office—with the gusto that he once mustered for the pivot to mobile—if they don’t bring him ideas for preventing violence in Myanmar, or protecting privacy, or mitigating the toxicity of social media.”

It’s not just Zuckerberg, but the entire leadership team. After The New York Times revealed that Facebook used a PR firm to discredit its critics (the story opened with the line: “Sheryl Sandberg was seething”), much of the blame was directed at the author of Lean In, a book that inspired thousands of women professionals. The title of a story in the New Republic read: The Punctured Myth of Sheryl Sandberg. The gist of these arguments goes like this: leaders define a company’s culture; Facebook’s leaders preferred scale and speed over other values, and that led the company into a series of wrong decisions.

Who then is the villain? Business model or culture?

The Three Buckets

To find an answer, getting a better grasp of its problems is not a bad place to start. Facebook’s problems can be placed into three buckets.

1. The addictive nature of Facebook: Facebook used subtle psychological triggers to make its users spend more time on the platform. The notification badge that makes you want to click to see more, or the craving for ‘likes’ as soon as you post a photograph, works on the same principles a casino uses to keep its patrons on the floor. The more time you spend, the more data Facebook can collect; and the more data it has, the better it knows you. That’s important for many reasons. Some of them are apparently good (Facebook has in the past nudged its users to donate their organs and to go to the polling station, and has even stopped some from committing suicide). But primarily it is neither for the good of the users nor of society. It is for the good of its advertisers. When a senator asked Zuckerberg how Facebook made money while staying free, he replied with a smirk: “Senator, we run ads.”

It’s easy to underestimate this power. One study showed that Facebook could influence users’ moods simply by showing them more joyful or more depressing posts.

In the movie The Social Network, Sean Parker, the co-founder of Napster, tells Zuckerberg: “You don’t even know what the thing is yet. How big it can get, how far it can go. This is no time to take your chips down. A million dollars isn’t cool. You know what’s cool? A billion dollars.”

In real life, though, Parker, Facebook’s founding president, looked at how big Facebook had become and said it worked by “exploiting a vulnerability in human psychology”. At another point, he exclaimed, “God only knows what it’s doing to our children’s brains.”

2. Facebook’s susceptibility to being misused: When it emerged that Cambridge Analytica had obtained the data of about 50 million Facebook users through a third-party app, the news made global headlines. Similarly, when it emerged that Russian ads and posts meant to influence American elections had reached 150 million people, it caused global outrage. Yet, Facebook’s platform has been used for worse purposes for much longer. In 2010, for example, Facebook was used to publish hit lists of people involved in the drug trade in Colombia. In Germany, a study found a high correlation between hate messages on Facebook and physical violence against refugees. Sri Lanka, Myanmar, Libya, and of course India—in country after country we have seen Facebook and WhatsApp used to spread fake news and fan hatred, polarising societies and leading to violence.

3. Omissions and commissions by Facebook: Facebook’s own role in all this has been a series of things it did and things it failed to do. It moved too fast in hacking its way to growth, and too slowly in responding to complaints about the casualties of its speed. Its acquisitions of WhatsApp and Instagram are great examples of its ability to see the future, and its transition to a mobile-first world is a great example of how fast and how well it can execute. But Facebook seldom displayed the same vision and zeal when it came to recognising the problems on the ground, even after complaints. The title of The New York Times investigation read: Delay, Deny and Deflect.

The Infinite Loop

These three are interrelated.

Facebook made its platform addictive to expand its user base and drive engagement. That very scale attracted more people, many of whom pushed the psychological levers even harder. After all, hatred is a stronger passion, and doomsday stories have a bigger audience (‘plane lands safely’ won’t catch our eyeballs; ‘plane crashes’ will). Some used those same levers with ill intent, to push their own causes. Facebook, meanwhile, had developed a love for accelerators and a blind spot for brakes. As an organisation, it was designed to scale fast, to constantly expand its user base.

It’s a cycle. Its culture and business model reinforce each other. So, it doesn’t matter where one starts—whether one fixes the culture first or the business model. What matters is whether the actions are big enough to stop the cycle.

The debate, therefore, is really about which fix is effective enough. Will breaking up Facebook into smaller Facebooks help? Maybe, maybe not. In the long run, the very same network effects, platform approach, and data that helped Facebook achieve scale so fast will help one of the smaller Facebooks achieve the same even faster.

Will turning Facebook into a “public benefit corporation”, as suggested by Tim Wu, help? Will clipping the wings of Mark Zuckerberg and having people more powerful than him in the company help? Maybe. There are no clear answers, in part because there are no precedents that offer us a way to deal with the problem in the digital era.

The New Data Capitalists

What makes the past an inadequate guide for the present is that something fundamental has changed. Data is the new oil, the new asset class, the new capital. That is why, as the popular meme goes, “the world’s largest taxi company owns no vehicles; the most popular media owner creates no content; the most valuable retailer has no inventory; and the world’s largest accommodation provider owns no real estate”.

They own data. Data is both wealth and power. The people who run these companies are not traditional capitalists, who have to use their wealth to wield power. They are data capitalists, and they get both on a platter.

This has two implications.

One, data should be treated like money, but for many reasons it cannot be. Unlike money, data is non-rivalrous—that is, it is simultaneously available to multiple users: I can give you my data and still keep it. Unlike currency, it is not fungible. It is difficult to standardise, which makes it difficult to value. It is hard to determine ownership, not only once the data is crunched but even when it is generated. Without my Fitbit, I would not be generating a whole bunch of data, and yet I would consider that data mine, not Fitbit’s. Yet I know Fitbit has more data about me, and can do a lot more with my data, than I can. Facebook is no different.

Two, who does data empower: users and society, or the companies and their investors?

The fundamental problem with Facebook is that it doesn’t acknowledge that users own their data. Facebook assumes that, in return for the opportunity to connect with friends, users have ‘given’ it their data.

Consider this chat from the days when Zuckerberg was building Facebook from his dorm at Harvard:

Zuck: Yeah so if you ever need info about anyone at Harvard

Zuck: Just ask

Zuck: I have over 4,000 emails, pictures, addresses, SNS

Friend: What? How'd you manage that one?

Zuck: People just submitted it.

Zuck: I don’t know why.

Zuck: They “trust me”

Zuck: Dumb fucks

Both Facebook and Zuckerberg have changed a lot since he typed those words as a 19-year-old, but we only have to try exporting our Facebook social graph to another social network of our choice to know that the power over that data still rests with Facebook, not with us.

The fundamental problem with Facebook is that our data has empowered Facebook, more than it has empowered us.

One good way to understand the issues with Facebook is to contrast it with India’s approach to data.

Platforms for the People, by the People

India’s approach to solving some of these problems has been different.

For example, Aadhaar turned a critical element of the digital economy—digital identity—into public infrastructure. Nandan Nilekani and his team worked on the premise that data belongs to the individual, and the state is merely a custodian of that data. When Charles Assisi and I were reporting on Aadhaar for our book The Aadhaar Effect, we heard about the debates within the team about eKYC, and whether they were exceeding their brief by launching the service. The clinching argument was that the individual has a right to the data held by UIDAI, and UIDAI is obligated to share that data upon the individual’s request.

The Data Empowerment and Protection Architecture is an extension of that logic. It provides a framework to build applications that give individuals more control over how others can access their data, and over how they themselves can gain from it. Nilekani’s presentation remains one of the best explanations of the framework.

Societal Platforms is another framework; it allows new kinds of organisations to emerge, in which the value of data goes to society rather than to large organisations.

Senator, We Run Ads

It might be unfair to judge Facebook by the standards of societal platforms. But we should judge it against the definition of platforms in general. Bill Gates, one of Zuckerberg’s mentors, once said: “A platform is when the economic value of everybody that uses it, exceeds the value of the company that creates it.”

For this to happen, either the value of Facebook has to come down drastically, or it has to find ways to provide economic value to all of its users. Can it? Will it? That’s not clear. What is clear is that it should start by acknowledging that data should ultimately empower society.

About the author

N S Ramnath

Senior Editor

Founding Fuel

N S Ramnath is a member of the founding team and Lead - Newsroom Innovation at Founding Fuel, and co-author of the book The Aadhaar Effect. His main interests lie in technology, business, society, and how they interact and influence each other. He writes a regular column on disruptive technologies and takes regular stock of key news and perspectives from across the world.

Ram, as everybody calls him, experiments with newer storytelling formats tailored for the smartphone and social media, and shares the outcomes with everybody on the team. These then become part of a knowledge repository at Founding Fuel, which is continuously used to experiment with content formats across all platforms.

He is also involved with data analysis and visualisation at a startup, How India Lives.

Prior to Founding Fuel, Ramnath was a business journalist with Forbes India and The Economic Times. He has also written for The Hindu, Quartz and Scroll. He has degrees in economics and financial management from Sri Sathya Sai Institute of Higher Learning.

He tweets at @rmnth and spends his spare time reading philosophy.