CHRISTIANE AMANPOUR: Whether it is conversations around race or COVID-19, social media has helped spread distrust, division and misinformation. The White House has taken on the social media giants, with YouTube the latest to face its wrath. Now, a new book called “An Ugly Truth,” by reporters Cecilia Kang and Sheera Frenkel, reveals what they claim are Facebook's deliberately dark strategies and its battle for domination. Here they are speaking to Hari Sreenivasan.
(BEGIN VIDEO CLIP)
HARI SREENIVASAN: Christiane, thanks. Sheera Frenkel, Cecilia Kang, thanks so much for joining us. I want to start with the tension that is bubbling right now between Facebook, the administration and really everyone, which is the disinformation and misinformation about vaccines that's spreading right now. We know that the majority of people who are in hospitals and dying today, as we record this interview, are unvaccinated. And a big subsection of that population gets their news from social media, and a larger subsection of those probably say they heard it on Facebook. The administration has tried to turn up the heat on Facebook and to say there is responsibility here. Cecilia, let me start with you. What responsibility does Facebook have to police the misinformation and disinformation about vaccinations on its platform?
CECILIA KANG, CO-AUTHOR, “AN UGLY TRUTH: INSIDE FACEBOOK'S BATTLE FOR DOMINATION”: Yes. Hari, the White House sees misinformation about the COVID vaccines as the last hurdle to solving the pandemic. And they believe that Facebook has a lot more within its control than it is exercising to stop the spread of this misinformation. So, there is responsibility, in that it is the biggest social media platform. And when the White House asks Americans why they aren't getting vaccinated, as you said, they say they learned theories about the harms of the vaccines, theories that are false, often on Facebook. Facebook says that it is trying, and it absolutely does have responsibility. But the question is how hard they are actually trying to clamp down on the spread of misinformation. Not only what efforts they make to promote authentic information, but what are they actually doing on the misinformation side?
SREENIVASAN: Sheera, one of the responses that Facebook had to the administration was, look, we have done all this good. We have connected so many millions of people to good information about vaccines, about where they can get a shot. This is the kind of stuff that we're doing as a social media platform. But at the same time, you know, I interviewed Imran Ahmed a month ago, and the Center for Countering Digital Hate had put out this report that showed how literally a dozen people are responsible for, what, 70-plus percent of the misinformation that is happening on their platform. So, is Facebook not reading that report? Are they in denial? What are you hearing?
SHEERA FRENKEL, CO-AUTHOR, “AN UGLY TRUTH: INSIDE FACEBOOK'S BATTLE FOR DOMINATION”: You know, that report has been out for some time from the Center for Countering Digital Hate. It shows that, you know, over 65 percent of the anti-vaccine misinformation being seen on social media traces back to those few people. And people within Facebook have absolutely seen that report. I think what it shows, and really it is a pattern that repeats itself in our book, is that Facebook is often quite, you know, reactive, rather than being proactive about these problems. Instead of, at that point, perhaps contacting the White House themselves or launching a plan and saying, right, here are the 12 people, here is what we have already done, here is what we hope to do to collect more data on how misinformation spreads, they sort of, you know, continue with their status quo. And what we know from our own reporting is that they reached a point last week where the White House was asking them for information they just didn't have, because they had not set up the systems that could track how far misinformation about the COVID vaccine was spreading on their platforms.
SREENIVASAN: President Biden recently accused Facebook of killing people. They kind of walked back that statement a little bit. But is there something that government can or should do to try and stop misinformation?
FRENKEL: You know, I think that government often gets lost in very specific discussions about really specific pieces of misinformation. And in a way, that's missing the forest for the trees here. You know, there are a lot of questions around, you know, what pieces of individual content could be moderated, or even specific people, as we saw when the White House listed the disinformation dozen. Instead, the White House could be having a discussion about why does Facebook still recommend that people join anti-vaccine groups? Why does Facebook's algorithm still push you into those groups if you join one? I, yesterday, went and joined one group that was, you know, natural cures for the common cold. And within one click, Facebook's algorithms were pushing me into a group that was anti-vaccine. Now, if I were the White House, I would be having discussions with Facebook about that, which is something that they have already said they want to change and do differently. And so, I would be thinking more about how does Facebook as a platform spread that movement, rather than individual players within the movement.
SREENIVASAN: Cecilia, you know, whether it is information or misinformation about the vaccines, or groups that are organizing to plan an insurrection at the Capitol, time and time again, what you detail in this book is that there were deliberations between people inside Facebook, and Facebook took intentional actions that did change what billions of people saw. I mean, they sort of proved that it was possible, that they could turn the dial, so to speak, or flip some switches and change things.
KANG: Yes. It was really an important moment for us in reporting this book to discover that Facebook has a lot of power within its control and that it keeps this from the public, that it is not transparent about how it turns those dials. In that instance, they had decided to change the rankings of the newsfeed to prioritize more authoritative sources, news outlets like CNN, the “New York Times,” the “Wall Street Journal,” et cetera. And that actually surfaced more authoritative news to the top of the newsfeed. It shows that this is within their control. And it also shows, importantly, that they are not explaining and revealing to the public what they are doing. But another very important pattern that we discovered is that oftentimes these big decisions are made by a small group of people within Facebook, and particularly the CEO, Mark Zuckerberg, who has really seized so much more control, not just over the technology, but over these really important policy decisions about content and speech and the spread of misinformation online.
SREENIVASAN: Maybe this is an overly simple or overly complex question, but what determines on Facebook what it is that I see versus what it is that you see?
KANG: This is a really key takeaway that we hope readers of the book will learn, which is that Facebook is a public square with algorithms that surface the most emotive content to the top of newsfeeds, algorithms that also direct its users to groups and other pages based on their activity. And those algorithms always bias towards the most emotive content. What I mean by that is content that spurs some sort of emotional reaction, be it anger or fear, happiness, what have you. That kind of content gets you to like, to put an emoji on some sort of post you see, and to share. And that's the kind of engagement that Facebook needs to feed its business. Facebook is in the business of gathering your data based on how much input you put into the site, how much engagement you have on the site. And so, the analogy of it being a town square is not quite right. We quote somebody in our book, Renee DiResta, who is quite smart on this, who really aptly says, freedom of speech does not mean freedom of reach. And the reach part is where the algorithms come in, and that is how Facebook differs from any other communications platform that we've seen in history.
SREENIVASAN: You know, Sheera, one of the things that's interesting is that if you go with this notion that Facebook can moderate and turn things down, say, for example, under the threat of imminent violence, right, that our protections of political speech or free speech are going to apply less if you are threatening someone's life or threatening violence. Well, if they can do that in America, what about the rest of the world that is using Facebook, in some cases, to actually cause harm to groups of people? I mean, we've seen this play out, whether it is in Sri Lanka or India or Myanmar.
FRENKEL: I'm so glad you asked that. I mean, we have a whole chapter about Myanmar because that really is the worst-case scenario, where Facebook launched in a country which didn't have a local and free press, which didn't have NGOs and civil society groups that could openly counter some of the hate speech that began to run rampant on Facebook. And we saw what happened. Facebook had just one Burmese-speaking content moderator, who wasn't even in the country, responsible for moderating all of the content from Myanmar, where over a hundred languages are spoken. And, you know, in Myanmar, we obviously know that led to a genocide. But looking across the world, in most countries, Facebook just doesn't have the same teams of content moderators. They don't have people who speak every language in Indonesia, for instance, or in India, for that matter. And so, the question is, how do you responsibly make yourself the preeminent social media company in the world, launching into all these countries, without giving the same kind of resources to content moderators, to make sure that whatever you are doing here in the United States, being on top of the hate speech, being on top of elections, you can responsibly do in every country where you have launched?
SREENIVASAN: Cecilia, you both write a lot about the relationship between Sheryl Sandberg and Mark Zuckerberg, and, in a way, how she came in to juice up the business but also to balance some of Mark's strengths and weaknesses. But in the past couple of years, we've seen Sheryl less and less. What is her status within the company right now, from the folks you are in touch with who are still there?
KANG: Yes. I mean, I think there are two ways to look at her status right now. First of all, she still leads an incredibly important part of the business, which is the actual revenue creation, the profit center, which is the advertising business. And she is, by all accounts internally and by shareholders, a great success. The company is thriving: $85 billion in revenues and a $1 trillion valuation. She has, however, run up against Mark Zuckerberg's concern about all the public fallout over all the scandals at Facebook. Mark Zuckerberg has taken over a lot of the responsibilities that, early on in their relationship, he left to her because he frankly thought they were not interesting: responsibilities over policy, communications, legal and security, for example. So, he has lost some confidence in her, as well as taken over more control of those kinds of roles. And the other part of her role, the very public-facing part that deals with Facebook at the intersection of society and politics, is much more influenced by his decision-making. We understand that she's become much more isolated.
SREENIVASAN: I want to quote a response that Facebook had to your book. It says, this book tells a false narrative based on selective interviews, many from disgruntled individuals, and cherry-picked facts. The fault lines that the authors depict between Mark and Sheryl and the people who work with them do not exist. All of Mark's direct reports work closely with Sheryl, and hers with Mark. Sheryl's role at the company has not changed. Sheera, do your sources tell you otherwise?
FRENKEL: They do. And I would point out that we spoke to over 400 people for this book, and the vast majority of them still work at Facebook. We didn't, at any point in time, go seeking disgruntled ex-employees. We went seeking people who were in the room, who witnessed the relationship firsthand and could speak to it. And I think that if Facebook were to be honest about its upper echelons, it would say quite a few people have been hired at the very top and that they currently do jobs within the company that used to be handled by Sheryl Sandberg, and, more to the point, Mark Zuckerberg currently does many of the jobs that were once handled by Sheryl Sandberg. And so, her role has very much changed.
SREENIVASAN: The time period that you are looking at also covers the rise of Donald Trump to the presidency and the actions his administration takes. And you repeatedly have employees in there speaking up and challenging hate speech, racist speech, but repeatedly the company seems to side with the politics of the day, taking that into account as well as the business side of it. So, you see these kinds of deliberations and you say, well, wait, is that separate from what we're actually looking at? And why do the politics matter so much, almost as a false equivalence, when they publish and say, here are some things that we've taken down from the right, oh, and here are also some things that we've taken down from the left?
FRENKEL: Right. I mean, we see them taking this political calculus into account really at the start of Trump's campaign for president, ultimately making the decision that they will carve out an exception for Donald Trump. And that decision, made early on and really quite ad hoc, kind of haunts them through the Trump presidency, along with this idea that grows among conservatives that there is an anti-conservative bias within Facebook. Facebook constantly tries to fight against that by making quite a few decisions out of fear of upsetting more conservatives. One moment that I remember quite vividly was when they decide to take down QAnon, which is a very far-right fringe conspiracy movement, and the security team makes this recommendation of, here are the QAnon groups we really think we should remove; I think it is really important to national security that we remove these groups. And they are essentially told by the policy team to wait and to find groups on the left which would be somehow equivalent, that they could take down at the same time, so that Facebook could announce a larger takedown, not just of far-right groups. And it takes them several weeks to come up with this bigger list. To me, that was a key moment where they are showing that they will slow down action they need to take just to create some kind of really false equivalence, so the right-wing media doesn't spin it as, oh, they only took down a certain type of group on the platform.
SREENIVASAN: Cecilia, you have covered technology in Washington a while. And you know administrations change and things go in and out of favor. But here we are now, perhaps because of how Facebook was used in the campaign against Hillary Clinton and for the ascension of Donald Trump. The left seems far less enthused; the shine has come off Silicon Valley a little bit for them. And on the right, you have political conservatives saying that their speech is being squelched. So, it almost seems like a sort of strange bedfellows moment, where you have some bipartisan agreement that something needs to be done. And at the same time, you have an administration that has staffed up with folks who have really been paying attention to antitrust, who have been paying attention to the spread and power of social media. With all of that, going forward, is it likely that anything will change for Facebook?
KANG: Yes. I mean, I do think that change will have to come from the outside. I don't think you are going to see change internally. Mark Zuckerberg is not stepping down any time soon, and he doesn't have accountability inside. From the outside, it will take some time, but there is incredible enthusiasm and energy right now in Washington. And as you said, Republicans and Democrats are united in their real desire to rein in big tech companies like Facebook. There are going to be some obstacles. Even though you have Jonathan Kanter, who is appointed to the DOJ, Lina Khan at the FTC, and people like Tim Wu at the White House, who are very vocal big tech critics and who have called for the breakup of big tech companies, they are still encumbered in some ways by the courts, the Supreme Court as well as other courts, and by the lack of legislation that can really fit internet companies into a framework of enforcement today. These are laws that were created during the big oil and big steel days. And so, there are some big challenges ahead. But I think, if history is a guide, it is inevitable that these companies, with so much power and dominance, will see some sort of regulation at some point.
SREENIVASAN: What is the most, you know, surprising or disturbing thing that you learned when you were doing this reporting? Sheera, let me start with you.
FRENKEL: You know, I think it was just how much people tried to sound the alarm. For me, especially, it was the Myanmar chapter: to have someone in their own offices telling them a genocide is not just likely, it's probably going to happen if you don't change something. This is someone in their building in Menlo Park telling them this. And to have the company still not take action, I myself was shocked at that. I was shocked at just how many Facebook employees internally have been trying to do good, have been trying to tell their bosses that they see major problems and major issues. And I think all of us should be aware that there are really good people who work within the company who are trying to do better. It is just unfortunate that they are often not listened to.
SREENIVASAN: Cecilia?
KANG: One of the things that surprised me the most is how much this is Mark Zuckerberg's company. I mean, I suspected that in the beginning. But to hear that he was making the calls on a doctored video of House Speaker Nancy Pelosi, and making the calls on the former president calling for Americans to drink bleach and take UV light to cure COVID, and that he was saying those posts should remain up, that was actually surprising to me. He was, in many ways, the final decider on these really important decisions.
SREENIVASAN: The book is called “An Ugly Truth: Inside Facebook's Battle for Domination.” Authors Sheera Frenkel and Cecilia Kang, thank you both.
FRENKEL: Thank you.
KANG: Thank you, Hari.
About This Episode
Dr. Matshidiso Moeti and Jeremy Farrar discuss global vaccine distribution. Jeremy Farrar and Robin Rue Simmons discuss the efforts for reparations for the Black community. Cecilia Kang and Sheera Frenkel discuss their new book “An Ugly Truth: Inside Facebook’s Battle for Domination.”