11.23.2020

Wired EIC: Facebook is “Sticking Band-Aids on Problems”

While technology and social media have kept us connected throughout the pandemic, they have also caused indisputable damage. Nicholas Thompson is editor-in-chief of WIRED Magazine and argues that to prevent the spread of harmful misinformation, sites like Facebook need to change their algorithms. Thompson speaks with Walter Isaacson about how information gets distorted and how we get trapped in echo chambers.


CHRISTIANE AMANPOUR: So, now, while technology and social media have kept us connected throughout the pandemic, they have also caused damage. Nicholas Thompson is editor-in-chief of “WIRED” magazine and he argues that to prevent the spread of harmful information, sites like Facebook need to change their algorithms. And here he is talking to our Walter Isaacson about how certain stories get distorted and how we remain trapped in our echo chambers.

(BEGIN VIDEOTAPE)

WALTER ISAACSON: Thank you, Christiane. And, Nick Thompson, welcome to the show.

NICHOLAS THOMPSON, EDITOR-IN-CHIEF, WIRED MAGAZINE: Hi, Walter. It is a great pleasure to be here.

ISAACSON: “WIRED” recently featured a story about a small town in Washington State that feared antifa was coming in, and it was about the whole spread of misinformation. Tell me about that story.

THOMPSON: Yes. It’s one of the most compelling small stories that illuminates this huge issue in American life. It’s about a town, Forks, Washington, where the residents were convinced that this bus was full of antifa protesters and, you know, riots were going to come, horrible things were going to happen. And it was just a couple of people going camping. And what’s illuminating is that we live in these little filter bubbles where tiny bits of misinformation appear and then spread, and inside of the bubbles we become completely convinced of totally different realities. It is part of the nature of how social media works, and it can have real consequences. Fortunately, the story in Forks, Washington ended without anyone getting hurt. Scary moments, but they got through it. But who knows what happens next.

ISAACSON: Do you think that Facebook has some responsibility with the information that they amplify to try to do more to make sure it’s correct?

THOMPSON: That is one of the great questions of American civic life right now. I mean, I think, first off, absolutely, yes, Facebook does have a responsibility, both to its business model, right, I think Facebook would be a better business were it more true to the people who have put their trust in Facebook, and, as a company founded in the United States, it has a responsibility to play a positive role in democracy as opposed to a negligent role in democracy. The hard question then is what exactly should they do? The approach they have taken is to label misinformation and lies, to remove some of the people who spread misinformation and lies, to sometimes shut down groups. But my fundamental critique of Facebook is that the core problem is the way the algorithm works and the way the algorithm has been trained over the last decade, the core news feed algorithm. They need to fundamentally change the way they think about it and the signals that feed into it so that it doesn’t have all these consequences. Right now, they are sticking Band-Aids on the problems caused by the core algorithm, and they need to completely rejigger it.

ISAACSON: You know Mark Zuckerberg as well as any other journalist; you’ve covered him closely and interviewed him for years. What do you think he really feels about this problem?

THOMPSON: I think that he doesn’t think that Facebook is causing as much harm as I do. He probably weighs the trade-offs differently; each of the solutions that I would propose has trade-offs, right. There are always trade-offs: to have a Facebook that supports democracy, you may have to push back on some idealism about free speech, right. And so, there may be some moments where he weighs free speech over safety, where I would weigh safety over free speech, right. There may be points where it’s simply a shift in values. But I think the fundamental difference would be that he may be in a filter bubble of his own, where he doesn’t see some of the danger that’s been created and where he believes that Facebook doesn’t bear as much responsibility as I think it does for some of these problems.

ISAACSON: But do you think Facebook has been harmful to American democracy?

THOMPSON: Yes, I do think Facebook has been harmful to American democracy. Yes.

ISAACSON: Compare the way it works with Reddit.

THOMPSON: OK. So, there’s some data, some very interesting data, and the data suggests that on Facebook people are more likely to become more partisan, right. The more time they spend on Facebook, the more likely they are to become extra partisan. If you go in moderately conservative and you spend 10 hours on Facebook, you will come out extremely conservative. Reddit seems to have the opposite effect. Now, I haven’t looked at all of the data behind these studies, but this seems to match what one would think would happen, because Reddit is just a list of information, and stories get voted up and voted down based on users’ sense of quality. On Facebook, there’s a much more complicated algorithm that guides you into a graph of information based on the people you know and the people you interact with. So, what happens on Facebook is you’re more likely to get pushed into a bubble of, OK, these people think the same way, so you interact with them more often, and that bubble becomes a little more extreme and a little tighter. On Reddit, it’s more of a list; you’re more likely to confront counter information.
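[Editor’s illustration: to make the contrast Thompson draws concrete, here is a minimal sketch, not from the interview and not the real ranking system of either platform. It compares a Reddit-style feed ranked purely by community votes with a Facebook-style feed that also rewards similarity between a post’s author and the reader, the kind of signal that can tighten a filter bubble. All names, fields, and weights are hypothetical.]

```python
# Illustrative sketch only: toy ranking rules, not Reddit's or Facebook's actual algorithms.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    upvotes: int
    downvotes: int
    author_leaning: float  # hypothetical signal, -1.0 (left) .. +1.0 (right)

def reddit_style_feed(posts):
    """Rank by net community votes; every reader sees the same ordering."""
    return sorted(posts, key=lambda p: p.upvotes - p.downvotes, reverse=True)

def facebook_style_feed(posts, reader_leaning):
    """Rank by similarity to the reader plus raw engagement.

    Rewarding similarity nudges readers toward like-minded posts; flipping the
    sign of that term would nudge them toward disagreement instead, which is
    the kind of change Thompson argues Facebook could make.
    """
    def score(p):
        similarity = 1.0 - abs(p.author_leaning - reader_leaning) / 2.0
        engagement = p.upvotes + p.downvotes  # any reaction counts as engagement
        return 0.7 * similarity + 0.3 * (engagement / 100.0)
    return sorted(posts, key=score, reverse=True)

if __name__ == "__main__":
    posts = [
        Post("a", "local news", 80, 10, 0.0),
        Post("b", "partisan take", 40, 35, 0.9),
        Post("c", "opposing view", 55, 20, -0.8),
    ]
    print([p.author for p in reddit_style_feed(posts)])                      # ['a', 'c', 'b']
    print([p.author for p in facebook_style_feed(posts, reader_leaning=0.9)])  # ['b', 'a', 'c']
```

[In this toy example, the vote-ranked feed surfaces the most broadly approved post first, while the similarity-weighted feed promotes the post that matches the reader’s existing leaning.]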

ISAACSON: And so, Facebook could solve its problem, instead of just trying to play whack-a-troll and fix this post or that post, by having algorithms that did not nudge us towards more and more like-minded and more and more extreme people.

THOMPSON: Right. And the beauty is Facebook with its infinitely complex algorithm could nudge us in the opposite direction. If Facebook said tomorrow, you know what, we really think filter bubbles are a terrible problem. We’re going to start showing you more information from people you disagree with as opposed to people you agree with, that would profoundly change the nature of conversations on Facebook. They haven’t made that choice to do that, but it would be a good thing to try.

ISAACSON: Why haven’t they made that choice?

THOMPSON: Well, I would say two reasons. The fundamental one, based on my conversations with executives at Facebook, is that they don’t believe the supposition that I’m basing some of my arguments on. They don’t believe that Facebook puts you into filter bubbles. They think Facebook introduces you to more conflicting information, or information that challenges your prior assumptions. They think it does a better job of that than, say, cable news does, and they will cite surveys that show that people who get their news from cable TV are more partisan than people who get it from Facebook. I don’t believe that. There’s information that counters it. But I think the top executives genuinely don’t think that there is a big problem. Secondly, Facebook is a huge company with a lot of different problems and a lot of different priorities, and I don’t think they have settled on this approach to solving this particular problem as a top priority.

ISAACSON: As you know, Section 230 is a part of the law that “WIRED” has written about quite often that sort of inoculates all social media platforms from being responsible for what people post there. Do you think there should be a fundamental change in that law?

THOMPSON: Actually, no. So, you know, the beauty of Section 230 is it has two parts. The first is that it says every internet provider has a certain shield from most of the content that people post on it, right. So, I’m the editor of “WIRED.” If someone posts a comment on wired.com, I am not legally liable for it. If I were legally liable for it, it would be a world of hurt; I don’t know if I could run comments on wired.com. It’s a much bigger problem at Facebook, right. People are posting stuff all the time, and if anything posted created a legal liability for Facebook, something the company could be sued for, you couldn’t possibly run that company. So, that’s part one of the law. Part two of the law is that the company can take good-faith efforts to remove content from its platform without losing the protective shield. So, they can go and say, hey, we don’t want pornography, right, we’re going to remove all the pornography. We’re going to remove all the terror content, and maybe we’re going to remove some lies about, you know, information about the election. And they are allowed to do that under Section 230. So, I actually think Section 230 is a good law. I think it’s well structured. I think changing it would create huge problems. I want Facebook to behave differently, but I don’t want to force them to do that by changing a law whose existence, I think, is part of the reason why these big tech companies are based in the United States and not elsewhere.

ISAACSON: But when I was at “Time” magazine, we used to have to take responsibility even for the letters to the editor or the posts that were on “Time’s” website. And you can go all the way back to the 1960s, to the great case of Times v. Sullivan, where the “New York Times” had to take responsibility for an advertisement that people bought in the “New York Times.” Why do people who disseminate and then amplify information have to take no responsibility? Shouldn’t there be some way to tweak the law so that there is at least some vestige of responsibility that you have?

THOMPSON: Well, there is some vestige of responsibility. There are some categories where they can get into trouble. But I think the better parallel to “Time” magazine would be this: you bore responsibility for any letter that was sent in that you published, but imagine if you bore legal responsibility for every letter that was sent to you, right, the thousands of letters that came in, some of which may have revealed personal information that violated somebody’s legal rights. The problem with Facebook is that all of this information can be posted by anybody, there’s no editorial decision, and Facebook couldn’t implement a review team, right. At “Time,” you would read every letter and decide which of the two, four, 10, I don’t remember how many letters would appear, and the ones that, you know, were slanderous or broke the rules, you wouldn’t run. With Facebook having built this giant platform, there’s no way to have a human review everything. So, it’s a slightly different situation.

ISAACSON: And so, do you see any solution?

THOMPSON: I see — well, I see lots of solutions, right. There are a bunch of ways to solve this problem. One, Facebook takes greater responsibility and changes its algorithms; by changing the guts of Facebook, you reduce all the downstream problems. Two, you can bring antitrust action against Facebook. You could conceivably break Facebook into multiple companies, you could split off WhatsApp and Instagram. Maybe by doing that you create a competitive marketplace where different social networks have different priorities and compete against each other, and that leads to higher-quality information streams, right. One of the problems with monopolies is they can offer low-quality products; maybe more competition changes that. The third approach, which is what they’re doing in Europe, is you pass a bunch of laws that regulate their behavior. You don’t necessarily remove their liability shield, but you make them do certain things that lead to more competitive marketplaces and better user rights. So, those different approaches can all be taken.

ISAACSON: We hear a lot these days about how the pandemic is accelerating technology, but I have a counter question to ask. We still know so little about how COVID spreads and the exact places where it happens. Why hasn’t technology been able to help us trace the epidemic, trace how people get infected, and know how this virus works?

THOMPSON: Well, I would respond by saying there are ways where technology has helped, right. We have all become fluent in R-naught, right. My 12-year-old son has learned how to go onto the internet and find the infection rate in every county in America, which he delightedly reports back to us at the dinner table. So, the amount of information available and the presentation of it has been extraordinary. Also, this virus came at us very quickly, and it’s been a relatively short amount of time during which we have gotten a number of vaccine candidates. So, there has been real technological progress. But your question is right, because we have also had massive information failures, because of Facebook, because of Twitter, because of the way that misinformation can spread, because of the filter bubbles and partisanship that turned a scientific issue into a political issue and made a large portion of America, for political reasons, take a different approach to the virus than the scientific community would have advised. So, I think the underlying information failure with the coronavirus is the same problem underlying the democratic failure that has happened and that we talked about earlier.

ISAACSON: In other words, technology and data mining could have been used to figure out how effective mask-wearing is just by looking at hundreds of thousands of cases, and instead, technology has been used to confuse us and spread misinformation about mask-wearing.

THOMPSON: Yes, that’s absolutely true. If technology had functioned ideally during this pandemic, we would have had a whole lot of truthful, trusted information spreading from the very beginning, we wouldn’t have had the bad information, and we would have had a contact tracing system that works. One of the great mysteries is why we don’t have a good contact tracing system, why I can’t get an alert on my phone if I’ve been near someone who has tested positive. I’ve installed the New York contact tracing app, but it only came out in, what, September. Very few people have installed these apps. A lot of people don’t trust them. There was an element of the response to this pandemic that the tech industry could have and should have solved, but it didn’t really. And why not? Well, it’s hard to communicate across states, across tech companies, there are privacy concerns, there are a lot of reasons. But given the tragedy that we’ve been through, one has to say we could have done better.

ISAACSON: “WIRED” magazine writes a lot about artificial intelligence especially during the four years you’ve been the editor. Do you think China is about to move ahead of us in artificial intelligence and does that worry you?

THOMPSON: Yes and yes, I do worry a lot about that. It’s a little hard to define who is ahead, but it’s unquestionably true that China as a nation has prioritized education in A.I., has provided a lot of infrastructure to A.I. companies, and takes A.I. much more seriously than we do. So, why does that matter? Well, it matters because the nature of A.I. will be defined partly by the countries that shape it, right. If China is the pioneer in A.I., there will be fewer concerns about privacy and more of a focus on, for example, face recognition tech, which is booming in China. And so, in some ways the values of the system that builds the tech are baked into the tech. So, that’s worrisome. It’s also worrisome militarily, right. If we end up in a conflict with China, the country that has better A.I. in its military will have a huge advantage. So, I have, I guess, a two-part answer. One is, I am worried; I do think they are advancing very quickly. I also hope that the policy response isn’t to push China further apart but to try to pull China closer in, which is a complicated, complicated dance.

ISAACSON: Besides editing “WIRED,” you occasionally write some wonderful stories, and the one that struck me the most in the past few months was about this hiker who went off the grid and disappeared, but had run into a whole lot of people, and nobody can figure out who he was. Tell me about that.

THOMPSON: Oh, it’s an incredible story. So, there’s a guy, and in 2017 he starts to hike the Appalachian Trail, and he hikes all the way down to Florida over the course of the next year and some months. He meets hundreds of people, has his photograph taken. And then in July of 2018, he’s found dead in his tent in Florida; he seems to have just sort of starved to death there. We’re not quite sure why he died. And then the crazy thing, the crazy thing, is that despite all these photographs being on the internet, despite thousands of people trying to identify who he was, no one can figure out his name. They used facial recognition technology, compared his DNA to known DNA databases; he had no I.D. with him, he had no phone with him. A million and a half people have read the story that I wrote in “WIRED,” and not one of them, as far as I know, recognized him. Certainly, I got lots of tips, none of which panned out. In this age of surveillance and the internet, and this is why I was drawn to the story, you sort of think that when somebody’s picture is put online, the hive mind of the internet will find him in about two minutes, and we’re two years and four months into this and no one knows who this guy was.

ISAACSON: Nick Thompson, it’s always fascinating. Thanks for joining us.

THOMPSON: Oh, it’s so much fun to talk to you, Walter. Thank you so much for having me on.

About This Episode

Christiane speaks with Archbishop of Canterbury Justin Welby about the importance of the UK maintaining its international aid commitments. She also speaks with Bard College President Leon Botstein about how the college has extended education opportunities to both disadvantaged youth and inmates. Walter Isaacson speaks with WIRED editor-in-chief Nicholas Thompson about misinformation.
