Facebook Disputes Claims It Fuels Political Polarization And Extremism

Apr 1, 2021

Facebook is making changes to give users more choice over what posts they see in their news feeds, as the social media company defends itself against accusations that it fuels extremism and political polarization.

The changes, announced Wednesday, include making it easier for people to switch their feeds to a "Most Recent" mode, in which the newest posts appear first, and allowing users to pick up to 30 friends or pages to prioritize. Users can also now limit who can comment on their posts.

The goal is to "give people real transparency in how the systems work and allows people to pull levers," Nick Clegg, Facebook's vice president for global affairs and communications, told NPR's Morning Edition. "You'll be able to, in effect, override the algorithm and curate your own news feed."

Facebook has come under escalating scrutiny over the impact of its platform on society since the Jan. 6 assault on the U.S. Capitol by a pro-Trump mob, which was planned and documented on social media sites including Facebook.

Many critics have zeroed in on the role of Facebook's algorithms, which determine what posts users are shown and which groups and accounts are recommended to them, and on how those systems may push people toward more inflammatory content.

In his interview with NPR and in a 5,000-word Medium post published on Wednesday, Clegg pushed back on that criticism.

"Central to many of the charges by Facebook's critics is the idea that its algorithmic systems actively encourage the sharing of sensational content and are designed to keep people scrolling endlessly," Clegg wrote in the Medium post.

While "content that provokes strong emotions is invariably going to be shared," he acknowledged, he said that was because of "human nature" — not Facebook's algorithms.

"Facebook's systems are not designed to reward provocative content. In fact, key parts of those systems are designed to do just the opposite," he wrote.

Clegg disputed claims that social media contributes to political partisanship, saying academic research into the matter has been "mixed." He also defended the benefits social media provides, from personalized advertising to "a dramatic and historic democratization of speech."

Clegg told NPR his intention was not to blame Facebook users for the divisiveness on the platform, but to highlight the "complex" interactions between humans and technology.

"It's foolish to say it's all the users fault, but equally to say it's all somehow a faceless machine's fault," he said.

"People want simple answers to what are complex issues. I'm urging us nonetheless to try and grapple with the complexity of this, and not ... reduce it to some faceless machine that we blame for things that sometimes lie deep within society itself."

Facebook CEO Mark Zuckerberg gave a similar defense of the platform last week at a congressional hearing about the spread of extremism and misinformation on social media.

When lawmakers pressed him about whether Facebook bore responsibility for the Jan. 6 attack, Zuckerberg pinned blame on the rioters and on former President Donald Trump.

"I think the responsibility lies with the people who took the action to break the law and do the insurrection," he said. "And secondarily with the people who spread that content."

Facebook's stepped-up efforts to defend its platform and promise users greater transparency and control come as the prospect of new internet regulations looms larger.

Several bills are circulating that would attempt to hold companies, including Facebook, more responsible for the content posted by their users and for the real-world consequences of online activity.

Facebook itself is calling for reform in an extensive ad campaign, and Zuckerberg laid out his vision for updating regulations in his testimony last week.

"Everybody accepts that new rules of the road need to be written," Clegg told NPR.

Editor's note: Facebook is among NPR's financial supporters.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

NOEL KING, HOST:

One line of criticism against Facebook is this. The platform captures and holds our attention while its algorithms dictate what we see. And so the company just has too much influence on the national conversation. Those critics say Facebook has to take some responsibility for the polarized times that we're living in. Nick Clegg is Facebook's vice president of global affairs and communications. Yesterday on Medium, he posted a 5,000-word essay addressing critics. We called him on Zoom to talk about it. And full disclosure, Facebook is one of NPR's financial supporters.

NICK CLEGG: This essay tries to deal with some of those wider concerns about social media, concerns about social media and the impact on polarization in society and so on, but also and more specifically, focuses on this fundamental issue. Who's in the driving seat? Is it us, you know, human beings? Or is it the machines? Is it the algorithms? And what I seek to explain is that, certainly, as far as Facebook is concerned, people are often, if you like, much more in control or much more in charge than they sometimes feel like they are.

And what I announced in this article is some additional controls which give people real transparency in how the systems work and allows people to pull levers. So for instance, in a new feature, which I announced today, you'll be able to, in effect, override the algorithm and curate your own news feed composed of posts and groups and people who are your favorites. And so it's all about, in a sense, enhancing the control that users have as they use these sophisticated, new social media communication tools.

KING: I understand how that might help in some senses. But what if I'm the user who wants a bunch of posts about how to overthrow the U.S. government?

CLEGG: Well, if you are invoking violence, then of course that will be removed. And you won't be able to post that altogether. And we remove significant amounts of content all the time where it breaks the company's own rules. For instance, just on COVID, we've been removing misinformation which could lead to imminent physical harm since March of last year. So over the last year, for instance, we've removed more than 12 million pieces of content on Facebook and Instagram where we feel that the information or the post would, you know, promote fake preventative measures or exaggerated cures, which would harm people. So if that's what you want, well, you can't do that on Facebook and Instagram. If you want to discuss politics, of course, you know, we live in a free society, thankfully. And you're free to do so, you know, on social media, just as much as you're free to do so sitting around your kitchen table.

KING: Toward the end of your essay, you warn against blaming Facebook's algorithm for divisiveness and for hatred. You write, quote, "we need to look at ourselves in the mirror and not wrap ourselves in the false comfort that we've simply been manipulated by machines all along." In your view, how much of the solution and how much of the blame lies with individual users?

CLEGG: This really isn't about apportioning blame. The point I was trying to make was that there are very, very popular ways of communicating, you know, messaging apps - iMessage, Telegram, Signal, WhatsApp and so on - where, you know, millions, billions of people use that, you know, every second of the day to communicate with others. And yet there's no algorithm involved in those. And yet they're also, of course, a route by which people say unpleasant and hateful things as well as beautiful and uplifting things. And so the point I was just trying to make was that it's just foolish to say it's all the user's fault, but equally to say it's all somehow a faceless machine's fault. It's the interaction between the two.

And the research, which I cite in the piece, you know, suggests that the reasons, for instance, for polarization in the U.S., you know, precede - polarization was developing decades before social media was even invented. And I guess what I'm trying to do, which is difficult because sometimes, quite understandably, people want, you know, simple answers to what are complex issues - I'm sort of urging us nonetheless to try and grapple with the complexity of this and not try and all reduce it to some faceless machine that we blame for things that sometimes lie deep within society itself.

KING: Nick Clegg is vice president for global affairs and communications at Facebook. Mr. Clegg, thank you for your time.

CLEGG: Thank you.

Transcript provided by NPR, Copyright NPR.