CHRISTIANE AMANPOUR: What is your mission? What is the mission of this Oversight Board, first and foremost?
HELLE THORNING-SCHMIDT, CO-CHAIR, FACEBOOK OVERSIGHT BOARD: Well, Facebook has given voice to millions of people across the world who would otherwise not have a voice. Facebook is used every day to share videos of cats and discussions of politics, all of which is very good. But we also know that there is a downside to Facebook, because Facebook can be used to spread speech that is hateful and harmful and deceitful. And until now, it has been Facebook itself that regulated which content gets to stay up on the platforms and which content gets removed. Ultimately, this has been a decision for Facebook and Mark Zuckerberg. In 2018, Facebook recognized that that was not a sustainable way of taking these decisions and launched this idea of an Oversight Board. An Oversight Board consisting of independent members — we have an independent structure, so we don’t have any obligation towards Facebook. And the decisions that we take will be binding for Facebook and final. So, it is that independence and the binding decisions that Facebook has to follow that attracted me to this Oversight Board, because I agree with Facebook that Facebook should not be taking these very, very important decisions on their own. And now, they are no longer doing that.
AMANPOUR: So, look. Let me just read a couple of commentaries and have you respond to them. As you know, Kara Swisher, she’s one of the best tech reporters around, and she has written to this effect in “The New York Times”: the Oversight Board has all the hallmarks of the United Nations, except potentially much less effective. It may be beyond the capabilities of anyone, given Facebook and its founder and chief executive, Mark Zuckerberg, have purposefully created a system that is ungovernable. So, just respond to that, because your jurisdiction is fairly limited. You can’t deal with algorithms, you can’t deal with all the stuff that, you know, people have problems with. I mean, you’re only at the beginning anyway going to be dealing with particular content that’s already been taken down and that people want put back.
THORNING-SCHMIDT: Well, over time we will be dealing also with content that Facebook decides to remove that other users want to remain up, and also ads, of course. And over time, we will also impact, I think, Facebook’s own community standards — the guidelines they use now to decide which content stays up and which gets taken down. And that also means dealing with the algorithms doing this.
About This Episode
Christiane speaks with co-chair of the Facebook Oversight Board Helle Thorning-Schmidt about moderating the company’s content. She also speaks with Nikole Hannah-Jones about her Pulitzer Prize-winning essay as part of the 1619 Project; and Anya Hindmarch about the handbags she is designing for frontline workers. Walter Isaacson speaks with Richard Haass about global cooperation.