01.12.2022

Did Facebook Contribute to Jan. 6? Whistleblower Weighs In


CHRISTIANE AMANPOUR: Now, speaking of electoral integrity, critics say that Facebook was a prime tool used to spread misinformation and sow doubt about the 2020 election. And since Facebook whistleblower Frances Haugen came forward with leaked documents, she says there’s now evidence to prove it. And here she is speaking to our Walter Isaacson.

(BEGIN VIDEO CLIP)

WALTER ISAACSON: Thank you, Christiane. And Frances Haugen, welcome to the show.

FRANCES HAUGEN, FACEBOOK WHISTLEBLOWER: Thank you for inviting me.

ISAACSON: It has been four months since the information you released became public, causing a huge national scandal, articles in the “Wall Street Journal” and around the world, congressional hearings. What’s changed since then?

HAUGEN: There have been activists who have been working for years, articulating most of the problems that were found in these disclosures. But every time they came to Facebook or brought it to the public that these problems existed, Facebook actively denied they were real. They gaslit them. The thing that has changed is we now have evidence that Facebook knew itself that human trafficking was running rampant on the platform. That Facebook knew that they weren’t taking care of terrorist content appropriately. That Facebook knew they were underinvesting in languages that weren’t English. That’s the difference now: we know that when Facebook said these problems weren’t real, they were real, and Facebook knew about them.

ISAACSON: Last week, we commemorated the anniversary of January 6th. To what extent do you think Facebook’s actions were responsible for that insurrection in the U.S. Capitol?

HAUGEN: Facebook has known for a long time, at least several years, that choices it makes in how it designs its product, not calls about individual pieces of content but choices in how it designs the product itself, have real-world implications for safety. Before the November 2020 election, Facebook surveyed a variety of settings in how its products were configured and came to the conclusion that there was a range of settings where the system was being optimized for growth over safety. An example of this is massively amplifying live video even though it knew it was bad at supervising those videos. That’s why you often see instances of, say, graphic violence in the news, because Facebook doesn’t have a way to make sure those videos comply with its terms of service. Facebook came and said, given we know we have these vulnerable spots in our product, maybe we shouldn’t hyper-amplify content that we can’t adequately supervise, or maybe we shouldn’t maximize for virality to the same extent when we know there’s a moment of potential instability. After the 2020 election passed, Facebook reset all those safety settings. They went and optimized again for growth over safety. And there was no one paying attention: in December of 2020, Facebook had just dissolved the civic integrity team, the team that was responsible for election safety. It meant that even though many experts were saying there was a large movement forming online and there were ways in which Facebook was amplifying the most extreme messages, no one was awake at the wheel over the holidays to put those safety settings back to where they were before the 2020 election. And I think that that instance of not paying attention and not being ready to act in a moment of crisis is something that Facebook is responsible for.

ISAACSON: So, tell us what’s happened since the January 6th insurrection. Have things gotten better? Did they learn from this?

HAUGEN: In the immediate aftermath of January 6th, we’re talking about the day of and the day after, they reinstated many of the safety systems that were in place for the November 2020 election. But over the months after that, they reset over and over again back to the hypergrowth settings. I haven’t worked at Facebook since May of 2021, so I can’t tell you how Facebook is operating today. But until we have transparency into Facebook’s operations, we can’t ensure that Facebook is prioritizing safety as much as it needs to.

ISAACSON: Doesn’t it really come down to Mark Zuckerberg and Sheryl Sandberg, whether they are going to prioritize safety or not?

HAUGEN: It seems there’s a pattern — there’s a meta problem at Facebook, or Meta as they call themselves now, which is that they are always focusing on the next big thing and they don’t spend enough time making sure the things that they’re already doing are being done at the level of rigor they need. For example, last fall, the disclosures I brought forth outlined a number of very, very serious problems at Facebook, everything from them only taking down 3 to 5 percent of hate speech to knowing that their platforms were being used by cartels and terrorists and then not adequately moderating that content. What was Facebook’s response? They chose to pivot and focus on video games, to focus on the metaverse. That is a choice that Mark is responsible for, and he hasn’t shown the level of responsibility and leadership needed to make sure Facebook solves its own problems before moving on to new, greener fields.

ISAACSON: In response to your claims this fall, Facebook said it spends $5 billion a year keeping its platform safe. And here is a quote, “As a company, we have every commercial and moral incentive to give the maximum number of people as much of a positive experience as possible on our app.” What’s your response to that effort? I mean, $5 billion is a lot of money.

HAUGEN: I think the number that they should mention is how much profit they make per year. When Facebook made that quote, they were on track to make $45 billion a year in profit. There’s this question of, you know, would Facebook really be suffering if they only made $40 billion of profit or $35 billion of profit? When you think about it, you know, Facebook claims to support on the order of 50 languages around the world, in a world that has 5,000 languages. And 3.1 billion of the people in the world are on Facebook. If you spoke one of the languages that wasn’t supported, would you say the $5 billion was enough money to spend on safety, or should you spend another $1 million or $2 million to make sure your language was supported as well? The fundamental issue is Facebook is unaccountable. And as a result, they can’t be trusted to set safety spending at an appropriate level, because none of us get to see what actions are actually taken to keep us safe.

ISAACSON: There is something called Section 230, which, as you know full well, keeps platforms like Facebook somewhat exempt from private legal action, from being held liable for what people post on there. Should we tweak or change that type of law so that Facebook itself is responsible if private citizens feel they have a cause of action against Facebook?

HAUGEN: I am against changing 230 with regard to individual pieces of content, because it’s not possible to run services like Facebook, or many of the other things we take for granted on the internet, if any individual piece of content can result in a lawsuit. But I do support the idea that if Facebook has made a long series of consistent decisions to optimize for growth over safety, it should have to take responsibility for that.

ISAACSON: Well, is that true? Is that what they’ve done?

HAUGEN: Unquestionably.

ISAACSON: So, then, what should the responsibility be?

HAUGEN: So, one of the challenges here is that because we are dealing with such an opaque system, it is difficult for us to see those patterns of behavior. We have to hypothesize them unless you’ve been on the inside and you’ve seen them happen. What we see, and we have documents in the disclosures covering this pattern, is that there’s basically an expectation internally that if you have a safety solution that will decrease misinformation, but it comes at even a slight cost to the frequency with which you visit Facebook, to the number of pieces of content you view, that’s considered a solution that’s dead on arrival. If Facebook consistently chooses to optimize for growth over things like our information environment, they should be held responsible for that.

ISAACSON: Now, you’ve said that the profits are driving this. Is there some regulatory thing that the Federal Trade Commission or Congress should do where you say, you’re a corporation that’s using a profit-driven mechanism that you know is harming people and you know is harming the environment, and you’re doing it intentionally simply because you’re prioritizing growth?

HAUGEN: I do believe there’s an opportunity for, say, the SEC to step in, because one of the responsibilities of the SEC is to ensure that public companies are not lying to the public. And we’ve seen repeatedly, through the filings that my legal team has submitted to the SEC, that Facebook has said one thing publicly about, say, child safety and a different one internally in their research, or said one thing publicly about misinformation or COVID and said a different thing internally. And that’s illegal. Like, we can’t have corporations lying to the public. But I do believe there’s an opportunity for us to have a more systematic way of doing transparency. Right now, it’s unacceptable that academics can, in detailed ways, document problems with Facebook and then be reliant on Facebook to either validate them or not validate them, right? The fact that there is no independent accountability is unacceptable, because right now, Facebook knows they have to publicly report their profit, so they optimize for their profit. They don’t have to publicly report the volume of misinformation or whether or not people are overexposed to it.

ISAACSON: Do you think the people at Facebook, especially those on high, see the collective cost to democracy and the individual cost?

HAUGEN: I think one of the very dangerous things about how Facebook is currently governed is that Facebook is largely populated internally, on its engineering teams, by people who are quite privileged. These are people most of whose friends are college educated. Most of them have comfortable lives. They don’t see a negative Facebook because their friends all post pleasant content. And so, one of the things that is repeated over and over and over across harm types on Facebook, whether it’s violent imagery, hate speech, or misinformation, it doesn’t matter which, is that most users are OK, maybe 60 percent, 70 percent, but the top 10 percent or 5 percent are hyper-exposed, you know, over and over again; they see vaccine misinfo on their news feeds. If you are a Facebook employee and when you log on to Facebook it’s always a very pleasant experience, it’s very hard for you to feel the urgency of the need to fix things like misinformation or the urgency to deal with these rabbit holes people are falling down.

ISAACSON: Wait, wait. That makes no sense to me. I know maybe they can see a nice wonderful Facebook feed but they must know, I mean, it’s just been so well documented what happened to our democracy, for example, what’s happened to our vaccine awareness, for example. I mean, are they that clueless?

HAUGEN: A refrain that you hear quite often inside of Facebook, and one that major executives like Bosworth (he’s, I believe, the CTO now as of maybe six months ago, four months ago) have publicly come out and said, is: there have always been problems in the world, Facebook is not responsible for those things. People have always been crazy —

ISAACSON: Is that a valid argument?

HAUGEN: I don’t think that’s true. And one way to think about it is that Facebook’s own research shows that one of the dangers is something called engagement-based ranking. That’s where you prioritize content, you give more reach to content that gets more reactions, you know, more comments, more likes, and you end up giving the most reach to the most extreme content, because people are drawn to engage with extreme content. Mark Zuckerberg wrote this publicly in a whitepaper in 2018. Facebook has come out and said the solution to this problem is AI, right? That we can have computer systems that will remove the worst things and that will keep us safe. The problem with that solution is they have to rebuild those safety systems over and over again, language by language, and Facebook’s own internal data says that, the way Facebook has configured them, they’re so afraid of making mistakes that in the case of things like hate speech, they only catch 3 percent to 5 percent of it.

ISAACSON: This is also a huge problem internationally. And we don’t focus on it quite as much because, you know, our own democracy is in peril. But let’s take Ethiopia, for example. What role has Facebook played in the problems happening there?

HAUGEN: Ethiopia is a great example of how Facebook’s strategy of using AI to solve safety problems doesn’t extend to the most fragile places on earth. The most fragile places in the world are often linguistically diverse. They often speak languages that might be spoken by 5 million, 10 million, 20 million speakers. Ethiopia has 100 million people, six major language families and 95 dialects. When I left Facebook, it only supported two of those languages for any of its safety systems. When you choose to run the system hot and use AI to fix things, you open the door to flying blind for most problems in most places in the world.

ISAACSON: So, what’s been the consequence in Ethiopia?

HAUGEN: You see, right now, Ethiopia is facing a civil conflict with different ethnic groups engaged in violence. I believe it’s the Tigrayans who are being targeted currently. And people are distributing misinformation that is dehumanizing folks on both sides of the conflict, that is calling for violence. You also see things like coordinated action, where people come out and create a larger sense of public pressure, like, you know, artificial accounts, mass pile-ons of commenting when people make calls for peace, mass reporting. So, maybe someone is trying to deescalate the situation; you can not only slander that person online, you can also mass-report them, which can cause their account to get disabled. All of these things are seen in this conflict, and it’s the same kind of thing that happened (INAUDIBLE) during the ethnic violence incident there a few years ago.

ISAACSON: The Biden administration is currently in talks with Russia, and a Facebook report released recently said that Russia is still the largest producer of disinformation on social media. What should be done to deal with the Russian influence?

HAUGEN: That is such a good question, and it’s one of vital national security importance. One of the things that I have talked about repeatedly is the fact that Facebook is significantly less transparent, and that has significant national security implications. When I worked with an investigator (INAUDIBLE) inside of Facebook, that’s the team responsible for catching things like people using the platform for spying or for information operations, the kind Russia uses to distribute disinformation. One of the things I was shocked by was that because Facebook doesn’t give any data out, Russian information operations are often caught on Twitter, because Twitter has a firehose of tweets that people can analyze. And there are maybe 10,000 researchers in the world who are always analyzing those tweets. Because Facebook gives out so little data, information operations on Facebook are often caught using Twitter’s data. One of the most basic things we need is a larger partnership between private security researchers, governments around the world and Facebook, on mechanisms for us to have more eyes on task, more people looking for these information operations, because Facebook hiding behind the curtain actively threatens all of our safety.

ISAACSON: Frances Haugen, thank you so much for joining the show.

HAUGEN: My pleasure. Thank you for inviting me.

About This Episode

U.S. Deputy Secretary of State Wendy Sherman offers analysis of tensions between Russia and Ukraine. Journalist Tom McTague discusses recent calls for Boris Johnson’s resignation. State Senator Kathy Bernier (R-WI) explains how the Big Lie could come back to haunt her party. Whistleblower Frances Haugen explains how Facebook’s prioritization of “growth over safety” contributed to the Jan. 6 riot.
