Fixing the Social Media Machine

Written by Caitlin Carlson, Center for Business Ethics Faculty Fellow, Associate Professor of Communication and Media at Seattle University
February 14, 2022

Siva Vaidhyanathan, a cultural historian and the Robertson Professor of Media Studies at the University of Virginia, wants to drive a stake through the heart of Facebook. The problem with Facebook, says Vaidhyanathan, is Facebook. And it’s not a problem he thinks corporate social responsibility or government regulation is currently up to the task of fixing.

Last month, I had the pleasure of speaking with Professor Vaidhyanathan at a virtual event hosted by the Center for Business Ethics, “Fixing the Social Media Machine” (which is also now available to watch on demand at the end of this post).

To start my conversation with Professor Vaidhyanathan, I asked him to situate social media within media history. He was quick to warn against talking about "social media" writ large versus talking about Facebook, which presents unique challenges of its own. Facebook has 3 billion users worldwide, and that number grows larger still when you add the other platforms owned by its new parent company, Meta, such as Instagram, Messenger, and WhatsApp. Twitter, by comparison, has just over 400 million users.

Facebook's scale is one of the most significant problems Professor Vaidhyanathan sees with the platform. That, combined with its surveillance of users through data collection and the algorithmic amplification of certain content, is what makes Facebook so dangerous. In the United States, we’ve seen the harm Facebook can cause when Russia used it to interfere with the 2016 presidential election and, more recently, in its promotion of disinformation around the 2020 election and the COVID-19 vaccine. These recent problems underscore the initial worries raised in Professor Vaidhyanathan’s 2018 book Antisocial Media: How Facebook Disconnects Us and Undermines Democracy.

Given these harms, you would think we'd walk away. But Facebook, says Vaidhyanathan, is like potato chips. It’s built to hook us. It gives us just enough good, in the form of personal connections with long-lost cousins or college roommates, that we keep coming back for more. And “the more” is often the extreme viewpoints favored by the algorithm, which Vaidhyanathan says is built to “quash the reasonable.” The algorithm prioritizes controversial content to keep people on the platform as long as possible to sell those eyeballs to advertisers. Given that, it’s no wonder that hate speech and disinformation flourish on Facebook.

When asked about whose responsibility it is to fix this problem – the users, the company, or the government – Vaidhyanathan isn’t sure any of these entities are in a position to make meaningful changes.

Individual users deciding to leave the platform, particularly in the U.S., would hardly faze the company given its current user numbers. Moreover, advertisers, not users, are Facebook’s real customers. Right now, advertisers are very happy, so there's very little motivating Facebook to address the harms it causes.

Professor Vaidhyanathan also said that if we're looking to corporate social responsibility to motivate Facebook to fix itself, we will be disappointed. Corporate social responsibility requires businesses to act in ways that benefit society, and Facebook’s leaders think they’re doing just that. Mark Zuckerberg and other top executives at Meta sincerely believe that Facebook and Instagram make the world a better place. This perspective makes it nearly impossible for them to honestly wrestle with the damage their platforms have caused by undermining journalism or spreading disinformation.

Drawing on the research of my politics colleague and Center for Business Ethics Faculty Fellow, Onur Bakiner, I asked Professor Vaidhyanathan about the agency individuals working in these companies have to make incremental changes. Vaidhyanathan said that only those who share the “Facebook is saving the world” ideology are likely to be successful enough in the company to make any meaningful changes. In other words, to get to the top, you’ve got to drink the Kool-Aid. By the time you've been there long enough to amass any real power and influence, you've likely lost sight of what needed to be changed in the first place.

This opened up a larger conversation about corporate social responsibility and social entrepreneurship, both of which have dominated business programs in the United States. Professor Vaidhyanathan said that the message we’re giving to the privileged, educated, cosmopolitan young person is, “don't bother with the clergy, the classroom or government. Private enterprise is the more exciting destination. You can have a comfortable and rewarding life, celebrate profits by day, and sleep well at night.” 

I asked Vaidhyanathan what he tells his students, many of whom go on to work at platforms like Facebook. He says he can hardly fault anyone for making a living but would encourage young people to also consider becoming activists or regulators. He believes that corporate social responsibility's popularity has come about mainly because of the now widely held perception that the government is not up to the task of fixing collective challenges.

When asked whether the government has a place in regulating social media, Vaidhyanathan said addressing the surveillance and privacy problems of social media is the most viable path for success. We discussed whether the European Union’s General Data Protection Regulation could serve as a model for the United States. Vaidhyanathan isn't so sure we have the political will to make that a reality. However, he recognized that allowing people to own their data and decide what to do with it could make Facebook more accountable to users rather than advertisers. Professor Vaidhyanathan also said that governments should require more transparency from platforms regarding how our data is collected and used.

Until something changes, however, Professor Vaidhyanathan anticipates that Facebook and its parent company Meta will continue to make decisions that are best for their bottom line without being forced to wrestle with the harms their platforms cause to democracy and society.
