CHRISTIANE AMANPOUR: No doubt, Facebook will play a big role in getting out the candidate’s messages. Meanwhile, the global behemoth is facing its toughest trial yet. The 34-year-old CEO and chairman Mark Zuckerberg reigns over an empire that is more populous than any country on earth. But he’s facing serious and ever-mounting questions about how his platform is used to spread lies and hate, and the bare-knuckle tactics he’s been using to respond. Here’s what he said in an interview just yesterday.
(BEGIN VIDEO CLIP)
MARK ZUCKERBERG, CEO AND CHAIRMAN, FACEBOOK: Well, look, there are always going to be issues. If you’re serving a community of more than two billion people, there’s going to be someone who is posting something that is problematic, that gets through the systems that we have in place no matter how advanced the systems are. And I think by and large, a lot of the criticism around the biggest issues has been fair, but I do think that if we’re going to be real, there is this bigger picture as well which is that we have a different world view than some of the folks who are covering this —
LAURIE SEGALL, CORRESPONDENT, CNN: But if we’ve given the world a voice, look at what’s happened in the last year. You’ve had elections manipulated, hate speech that’s gone viral and turned offline. It certainly seems like this mission has been accomplished in many ways, and there’s a whole new set of problems that, perhaps, you guys didn’t foresee, and now we’re in a very complicated place where there’s not an easy solution.
ZUCKERBERG: Yes. These are complex issues that you can’t fix. You manage them on an ongoing basis.
(END VIDEO CLIP)
AMANPOUR: A lot of people will be hoping they can be fixed. Few people have Zuckerberg’s ear and understand his business like Tim O’Reilly. Over decades, he’s been the mediator for honest conversations in the tech industry, and he can count among his accomplishments coining the term Web 2.0. He sat down with our Walter Isaacson to break down what’s gone awry and how to move forward.
(BEGIN VIDEO TAPE)
WALTER ISAACSON: Tim, thank you for joining us.
TIM O’REILLY, FOUNDER, O’REILLY MEDIA: Oh, it’s great to be here.
ISAACSON: Let me dive right into what is a big question of our time. Why have Facebook and Twitter and some of these platforms suddenly become so divisive in our society, rather than connecting us the way that they were supposed to?
O’REILLY: I think the thing that’s so important to understand about these platforms is they are object lessons on how our modern society is built and how it goes wrong. We have a mythology that we live in a world of free markets, but, in fact, we live in a series of designed ecosystems and these tech platforms are just the latest and most powerful examples of it. So designers make mistakes.
ISAACSON: So what’s a mistake that was made at the original Facebook?
O’REILLY: It wasn’t really the original Facebook. It was just that, as time went on, Facebook learned that the way to get more attention was to, you know, show people more of what they like, and they had a theory that that would bring people closer together. Mark really believed that.
ISAACSON: Mark Zuckerberg thought it was going to connect people, which it has done.
O’REILLY: Which it has done, but we saw gradually that there were untoward effects, and it spiraled out of control, and Facebook is rapidly trying to come to grips with this. I’ve spent time with Mark, and he’s taking it very, very, very seriously.
ISAACSON: But what was built in to it was an incentive for engagement.
O’REILLY: That’s right, and the point is that that incentive turned out to have the wrong impact and one of the —
ISAACSON: Well, give me an example. In other words, engagement tends to be driven by things that inflame me, so it sort of became an inflammatory platform.
O’REILLY: Yes, that’s right. It turns out that what engages people are things that make them mad. I mean, Fox News realized this long before Facebook, but Facebook has built an algorithmic system for reinforcing that engagement by showing people more and more of the things that they like.
ISAACSON: Well, not just things they like. It’s things they’ll engage with, which are often things that get them upset, inflame them, and get them to retweet.
O’REILLY: That’s right. But it’s this algorithmic reinforcement. You show people more of what they respond to and of course that becomes a cycle.
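To make the cycle O’Reilly describes concrete, here is a minimal sketch in Python. Every name and number is hypothetical, and real feed-ranking systems are vastly more elaborate, but the reinforcing feedback loop has this basic shape.

```python
# Minimal sketch of the engagement feedback cycle described above.
# All names and numbers are hypothetical; real ranking systems are far
# more complex, but the reinforcing loop looks like this.
from collections import defaultdict

engagement_weight = defaultdict(lambda: 1.0)  # learned score per topic

def rank_feed(posts):
    """Order candidate posts by learned engagement, highest first."""
    return sorted(posts, key=lambda p: engagement_weight[p["topic"]], reverse=True)

def record_reaction(post, engaged):
    """Learn from the user's response. An outrage click counts like any
    other engagement, so inflammatory topics get ranked higher, get shown
    more, and draw still more reactions: the cycle feeds itself."""
    if engaged:
        engagement_weight[post["topic"]] *= 1.1

posts = [{"topic": "local news"}, {"topic": "outrage bait"}]
for _ in range(10):
    for post in rank_feed(posts):
        # in this toy world, the user reliably reacts to the inflammatory item
        record_reaction(post, engaged=(post["topic"] == "outrage bait"))

print([p["topic"] for p in rank_feed(posts)])
# -> ['outrage bait', 'local news']: the feed now leads with what inflames
```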
ISAACSON: Was one of the problems that it is all based on advertising revenue?
O’REILLY: I think you can have that cycle regardless, but yes, I think the need to grow revenue is, in some sense, the master algorithm of these companies, and it’s the master algorithm of our society. That’s really the point that I try to make in my book, which is that this is a real learning moment for us. If we can see that Facebook got their algorithms wrong and we’re asking them to change them, why can we not see the design choices that led us, for example, to incentivize drug companies to sell opioids, leading to the opioid crisis? It’s exactly parallel. We literally have a system of incentives in place that told companies that it is okay to maximize for shareholder value, it is okay to tell the FDA, hey, downplay the risk of addiction here. We tell companies —
ISAACSON: Now did the company —
O’REILLY: That only one thing matters.
ISAACSON: How did we get to a system where the algorithm, not just of technology but of all of our platforms, seems to be focused on this one thing?
O’REILLY: I think it’s really that what we believe shapes what we do, and our policy makers came to believe something, you know. After World War II, for example, we believed that we wanted full employment, we believed that we needed to rebuild. We learned the lessons of World War I; we didn’t want to go down that path again. We wanted to rebuild Europe and Japan after World War II. We weren’t going to go down the slippery slope of another Great Depression. So we put in place policies for that, and then we forgot. And then we had a theory that said, well, we really have to improve the performance of our companies, and there was a series of people kind of putting out this idea of shareholder value, and people said, that sounds like a good idea. Let’s try it. And in fact, it worked at first.
ISAACSON: But do you really think that was the main cause of Facebook going down this route that has led us —
O’REILLY: No, no. The point I’m making is that when you design a system, you have a theory about what works, and we are designing incredibly complex systems today that we don’t really understand, and that’s the real fear of AI. It’s not the rogue AI, independent of us, that becomes artificially intelligent. It really should be that we’re building these hybrid systems of humans and machines that are incredibly complex and that we don’t fully understand. So we’re all like Mickey Mouse in Disney’s version of “The Sorcerer’s Apprentice.” We have this idea, we’ve got the master’s spell book, and we’re trying out some spells, and after a while, we suddenly find out that things aren’t turning out as we expected, and so the —
ISAACSON: Is that because it’s algorithm-driven too, and we lose a bit of the control?
O’REILLY: That’s right. I think we have to understand that our society is increasingly algorithm driven and it’s not just the tech platforms. It’s really across our systems, but tech also gives us the recipe for success.
ISAACSON: You once said that technology is the canary in the coal mine. Explain what you meant by that.
O’REILLY: Well, the point really is that, as we talk about the problems of Facebook and Twitter today, we very often act as though it’s just Facebook and Twitter, just a problem with tech. My point is that they are showing us in a very obvious way what happens when you have these high-speed systems. I call them in my book hybrid AIs, because they’re hybrid artificial intelligences: massive collections of humans, in the case of Facebook two billion humans connected in this network, where the human intelligence is augmented in some good ways, but also amplified in unexpected ways, by the algorithms that are being designed. And it’s a little bit like the early days of flight, when they were trying to figure out how to fly. We’re trying to figure out how you weave billions of people into this dynamic system, and we have not figured out the equivalent of aeronautics yet.
ISAACSON: Can an algorithm be racist?
O’REILLY: Absolutely. That’s of course one of the things we’ve learned increasingly as we look at the design of algorithms and the data that you feed into them, particularly as we move into AI-style algorithms, which learn from the data. If you feed them biased data, they will come out very biased, and that’s sort of another version of what we see here on Facebook, the fact that the machines can amplify a human bias —
ISAACSON: In other words, by reinforcing what already excites us, the algorithm learns and feeds us more of that which then reinforces our biases.
O’REILLY: That’s right. Well, in the case of these learning algorithms, you have to understand. Let’s say it’s a predictive policing algorithm. In that case, it’s not necessarily dynamic; it’s just that the algorithm is actually, as they call it, trained by feeding it lots and lots of historical data, and it says people of color are more likely to commit crimes. Well, that’s because for 40 years they’ve been arrested at higher rates because of biased policing. And so if that turns out to be the case, the predictive policing algorithm is also going to repeat that process, you know. A white person gets picked up with drugs, they get a slap on the wrist; a black person goes to jail, and oh, well, guess what, that got encoded …
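As a toy illustration of the point O’Reilly is making, the following sketch, with entirely invented numbers, shows how a naive “predictive” score trained on historically skewed arrest records simply reproduces, and then reinforces, the skew.

```python
# Toy illustration of biased training data (all numbers invented).
# Assume the true offense rates in both neighborhoods are equal, but
# decades of heavier policing produced more arrests in one of them.
historical_arrests = {
    "neighborhood_a": 40,  # arrests per 1,000 residents; heavily policed for 40 years
    "neighborhood_b": 10,  # lightly policed
}

def predicted_risk(neighborhood):
    """Naive 'predictive policing' score: past arrests as a proxy for
    future crime. Biased data in, biased prediction out."""
    total = sum(historical_arrests.values())
    return historical_arrests[neighborhood] / total

for name in historical_arrests:
    print(name, round(predicted_risk(name), 2))
# neighborhood_a 0.8, neighborhood_b 0.2: patrols follow the score,
# generate more arrests in neighborhood_a, and next year's training
# data is even more skewed -- the bias gets encoded and amplified.
```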
ISAACSON: But what if the algorithm is not just biased, but goes against some of our values?
O’REILLY: To me, one of the really big opportunities here is that the algorithm is in many cases a mirror for our values, and once we have encoded them into an algorithm, it can show us what our values actually are —
ISAACSON: And then we can tweak it.
O’REILLY: Yes, that’s right.
ISAACSON: Or even say, hold on, hold on.
O’REILLY: That’s right.
ISAACSON: We now see what’s gotten encoded and we don’t like it. So what advice have you given Mark Zuckerberg to make the platform better?
O’REILLY: Well, the first piece of advice I’ve given him is to stop this idea that he can somehow discover the values of all the people in the network and algorithmically reflect those values. I said, first and foremost, it has to reflect your values, because you, and when I say you, I don’t mean just you, Mark, personally, but you, the organization, are, in fact, curating the news feed. Facebook and Twitter have actually been making choices about what to reinforce, and those choices are a reflection of their values, and their value so far has been, we want more attention. That has turned out not to be a very good value. So now they have to say, we need a much more complex set of values. My advice has been: you have to really interrogate your values, and you have to decide, these are the things that we’re going to encode into our system, because we’re going to respect the laws of the countries in which we operate; oh, but these might be unjust laws in some countries, and those we’re not going to respect.
ISAACSON: Platforms like Facebook are now in the middle: they’re curating, they’re taking responsibility for what they do, but they’re sort of a platform where anybody can speak. Do we need a new set of rules for these hybrids?
O’REILLY: I think absolutely, we need a new set of rules, because they are, in fact, not creating the content but they are curating the content and so they have to be responsible for what they curate and how they curate.
ISAACSON: Does that mean Facebook should have taken off Alex Jones?
O’REILLY: I don’t know that the question should be whether to take off anything. The question should be how you promote it, because if, for example, you are doing a good job of taking multiple factors into account, you might say, well, lots of people want to see this. But, you know, lots of people seem to want to see Nigerian scams, too, and we don’t show those.
ISAACSON: But people want to see all sorts of things from dogs playing the piano or whatever, so you have to put your political values in.
O’REILLY: But this is not a question of political values. You look at this and go, this is clearly disinformation for profit. It’s not actually political speech. It’s commercial speech that’s attempting to deceive people.
ISAACSON: What about Twitter? What do you think went wrong there, if anything, to make it seem to be a place where a lot of bullying and hatred and divisiveness have come to the fore?
O’REILLY: I think in each of these cases, the companies have abdicated responsibility, basically, again, with the wrong theory, the wrong theory that they were neutral platforms, and also an incentive in the CDA exception for being a platform —
ISAACSON: The Communications Decency Act says if you police your content, you can be held liable for what goes on. If you take a hands-off attitude, you’re not liable for what goes on. That’s oversimplifying, and it’s not what the law intended, but that was the consequence.
O’REILLY: That’s right, and it’s another great example of you can give people the wrong incentives in the design of the system and in this particular case, based on that theory, they said, oh, okay, we need to be hands off.
ISAACSON: If you could tweak the law just a little bit, what would it be?
O’REILLY: I think first of all, to say, there is a class of platform that is not responsible for the content, but they are responsible for the curation and that we have to decide what is the responsibility for the curation if you promote things that are harmful to your users.
ISAACSON: That’s a very interesting and useful distinction. We have platforms, we have publishers, and you’re saying create sort of a third concept, curators, where you have some responsibility but not total responsibility for what’s on the platform.
O’REILLY: That’s correct.
ISAACSON: And that allows a Web 2.0 to emerge.
O’REILLY: What you have is responsibility for the curation algorithms that you make. And so think about it a little bit in the case of fraud and abuse. If you promote a fraud and people are taken in by it, you should be liable.
ISAACSON: And on a positive note, tell me some of the things you’re really optimistic about.
O’REILLY: The thing I am most optimistic about is the human ability to make better choices and to learn from our mistakes. When I look at how we’ve dealt with past technological disruption, we went through a very dark period each time as people were struggling, and then we figured it out. Think about the first industrial revolution: think about the children being forced to climb chimneys and, you know, work in factories, and we basically got over that and started sending them to school instead. You look at the difference between the choices made after World War I and World War II, and we made much better choices to rebuild the world. And I think that we’re about to face a really big set of tests, in climate change, for example. We will either rise to those or we will fail miserably, but I like to think that it’s going to lead to an amazing rethinking of our society. We have another great set of challenges around the rise of new technologies that will do more of what we used to call white-collar jobs, and they give us again this enormous opportunity to rethink the fundamentals of our economy: to rethink who gets what and why, and how we distribute the fruits of that immense productivity. Because civilization has improved every time we have made humans more productive, the question is not, should we keep doing that? It’s, how do we direct it? Do we direct it to solve new problems? Do we direct it to make everyone more prosperous? When we do that, we have a very, very bright future.
ISAACSON: Tim, thank you for joining us. Appreciate it.
O’REILLY: Thank you.
About This Episode
Christiane Amanpour speaks with Leon Panetta, former U.S. Secretary of Defense and former Director of the CIA; and Oby Ezekwesili, a Nigerian presidential candidate. Walter Isaacson speaks with Tim O’Reilly, Founder and CEO of O’Reilly Media.