08.05.2019

Danah Boyd on the Spread of Conspiracies and Hate Online

Danah Boyd is a senior researcher at Microsoft and founder of the research institute “Data & Society,” where she studies how media manipulators may be responsible for mass shootings and other crisis events. She sat down with Hari Sreenivasan to explain how digital media amplifies the spread of false information.


CHRISTIANE AMANPOUR: We turn now to the insidious spread of conspiratorial and hateful messages online, the sort of content that may have contributed to the radicalization of this weekend’s murderer in El Paso. Danah Boyd is a Senior Researcher at Microsoft and Founder of the research institute Data & Society, where she studies a range of social and technological issues, including how media manipulators may be responsible for mass shootings and other crisis events. Our Hari Sreenivasan asked her how digital media amplifies the spread of false information.

(BEGIN VIDEO TAPE)

HARI SREENIVASAN: Danah Boyd, thanks for joining us.

DANAH BOYD, FOUNDER OF DATA & SOCIETY: Thank you for having me.

SREENIVASAN: So we live in a world, I think, you know, where facts are presented, represented, misrepresented, and manipulating those facts is as old as, well, society, right? What’s so different about this information web that we’re enmeshed in today that we should be concerned?

BOYD: The simplest answer to that is amplification. You take any sort of misinformation or even doubt, which is actually the most powerful, you start asking questions. And your question or your proposed alternative fact can reach millions of people if you stage it right, and you can do it at one level by purchasing advertisements that, you know, anybody can access, at another level by figuring out how to produce viral media, even more powerfully by figuring out how to get journalists to tell your story or ask your question by engaging with them on Twitter. And so, anyone in the world has the ability to learn a set of techniques and amplify in a, you know, hyper-networked media ecosystem.

SREENIVASAN: So why isn’t googling or binging the answer? Looking for this question, I’m going to find my own facts, I’m going to do my own research, I’m just going to type this in. Why doesn’t that work?

BOYD: So search engines return results based on the information that’s available to them. And so, they crawl across the Internet looking for content and then they try to find ways of ranking that content. And they try to find what they call authoritative content, right? Wanting to give you the stuff that really means something to you. But search engine optimization has been gamed literally since the beginning. Most people think of Google and Bing when they think of search engines, but actually YouTube is the dominant search engine for the under-25 set. It’s where they go to do their homework or they, you know, try to figure out a problem or how to tie a tie. And as a result, they go there with a basic question – what’s going on with this topic? And what they get there is manipulated by people who have been optimizing the search results for YouTube. But the difference is that the amount of information for Google or Bing to soak up on the broad, public web is extraordinary compared to the small amount that’s available on YouTube. And most high-quality content isn’t available on YouTube. Depending on what you’re looking for, it can be pretty bad pretty fast.

SREENIVASAN: So what happens when there is no existing data? You research this area, this idea called data voids. If some breaking news event happens and nobody’s really been looking for that town or that area before, what happens today?

BOYD: So Michael Golebiewski at Bing was really interested in this question of absent data, because search engines require you to have data to return, and he calls it data voids, and we spent a lot of time looking at different types of data voids. And some of them are just a natural byproduct of literally no one having searched for a town. No one produced content about a word. Right, there’s just nothing there. But sometimes they’re staged, and they can be staged in different ways. What we see people do is they coin a term and they make (inaudible) and their websites are filled with that term. So let’s give a concrete example. A bunch of conspiracy theorists decided to focus in on a term called crisis actor. The idea is that any time somebody would appear on TV after a shooting, it was only to manufacture the shooting – that the shooting didn’t happen and that these people who were distraught because their loved ones were lost were just crisis actors. And what the conspiracy theorists did is they built hundreds of pages, videos, detailed content about this conspiracy, and then they worked really hard to get journalists to cover the story. They wanted journalists to use that phrase. And –

SREENIVASAN: Just the phrase, crisis actor.

BOYD: Just the phrase, crisis actor. And they got Anderson Cooper to say the phrase, crisis actor, by raising the question of the conspiracy. And what happened was that, you know, millions of people poured into Google and they –

SREENIVASAN: And started searching (inaudible) …

(CROSSTALK)

BOYD: And started searching Google and YouTube. And what they got was conspiratorial content. And that happens over and over again. So consider, for example, what happened in Christchurch. Right. A terrorist went and opened fire and killed 51 people. Right. It’s an atrocious act of violence. But he also played the media and he played tech companies. So he played the media by making certain he had a manifesto floating around, so that when journalists were trying to figure out what the explanation for this was, they immediately went to the manifesto. And they immediately published the title. The title is a hate frame; if you searched for it, you would get nothing but some of the worst anti-Semitic and white nationalist content. And he also decided to troll specific people, so that the news media would repeat the names of those people and assume that they were caught up in it, and the result is that these people got massively attacked.

SREENIVASAN: You know you said there was a phrase called — I want to say it correctly — agnotology, the strategic manufacturing of ignorance. What does that mean?

BOYD: A group of scholars – and the term was coined by Robert Proctor and Iain Boal – came up with agnotology, which is the study of ignorance. And the idea is that ignorance is not just what we don’t yet know; ignorance is sometimes actively seeded. It’s put out there to achieve a particular agenda. And what they were looking at was climate denial. Right. All of these coordinated efforts to create, you know, fake science to create doubt about climate, or to create doubt about vaccines, or to create doubt about the relationship between tobacco and cancer. And that of course is a political agenda; we’ve seen different governments use it as a tactic of propaganda for a long time. It’s a lot easier to ask questions of doubt than it is to actually try to provide alternative facts. So you say, well, maybe we don’t yet know why that plane came down. Maybe we don’t yet know what happened in that election. And that seeding of doubt is so powerful because what it also motivates is for the public to then go and self-investigate, to go see if there’s something real. So think about Pizzagate. Right. By having newsrooms all around the country talking about Pizzagate as a conspiracy, well, people who don’t trust the news media felt the need to go and self-investigate. So what do they do, they turn to Google. And what do they find, conspiracy all the way down, until we got to a point where people started visiting that pizza shop. And as we know, one of them showed up with a gun. That’s a moment where the amplification and the desire to self-investigate is the act of achieving ignorance in a coordinated and systematic way. And the question is always who’s doing it, why, and why are the amplifiers – including both formal news as well as social media platforms – helping amplify content that is designed intentionally to fragment knowledge?

SREENIVASAN: And you said something about how information is now networked. So if you wanted to manipulate this stream, it’s not like going to the card catalog and messing with a single file and scratching it out. Right. Now you’re talking about this entire interconnected nature of all of these pieces of information, which has a much more powerful effect.

BOYD: Absolutely. And I think that’s also what makes the system easier to trick, which is that once you get one domino going, it’s not hard to get the rest of them going. Once you get one newsroom to cover a particular frame, it’s not hard to get the rest of them going. Once you get a frame in motion on Facebook, it’s not hard for that to go everywhere. And that’s what makes it hard to tamp down. Right. And that’s – you know, this of course predates what we’re seeing as (inaudible) – think about anti-vaccination movements, right? It was one delegitimized study –

SREENIVASAN: Yes.

BOYD: – that has kick-started a measles epidemic in New York City, right? And that’s dominoes. And so, what are the dominoes that we’re dealing with, and who’s motivated by what, right? Where are the different motivations across them? Because once you have a population who doubts vaccination, every time we have new research coming out showing that there is no correlation between autism and vaccination, you get a boomerang effect. The more you start telling people that there’s no correlation, the more there are people who believe there is. It’s one of the biggest challenges for the Centers for Disease Control. Their science is unquestionable.

SREENIVASAN: So what happens then to good sources of information? I mean, when that doubt creeps in past a certain threshold, are conspiracy sites and legitimate institutions kind of seen with the same level of skepticism, when they shouldn’t be?

BOYD: I don’t know that I would say they’re seen with the same level of skepticism, but there’s no doubt that trust in –

SREENIVASAN: Right.

BOYD: – longstanding institutions declines rapidly. What it will take for the news media to rebuild trust is going to be a really hard hike.

SREENIVASAN: How do we begin that?

BOYD: You know, a lot of it comes down to actually being connected in the community, and I think about this in relationship to the decline of local news, which we often think of as being associated with the Internet, but it actually isn’t. Its roots are actually in finance. So finance, hedge funds, and private equity started taking over newsrooms in the ’80s as part of takeover culture, mostly to extract the real estate and turn it into condos, and the result is we saw an absolute desolation of newsrooms, a corruption of them, and then, you know, once you started having online ads, it was just the nail in the coffin.

SREENIVASAN: Yes.

BOYD: And the result is that most people in the United States don’t know a journalist. And if you don’t know a journalist, why should you trust them? So that goes back to this very, you know, nature of networks. Think about George Washington. He mostly sat back and let other people debate at the Constitutional Convention, except for one thing, which he argued intensely. He argued that you couldn’t possibly have a representative in the House of Representatives represent 40,000 people. It had to be 30,000, because in his mind, above that you wouldn’t know your representative and you wouldn’t trust the government, which is really notable in a moment where it’s, what, 750,000.

SREENIVASAN: Right.

BOYD: Most people don’t trust the government. They trust what they can know, what they can feel, what they can touch, what’s networked to them. The more that we have a fragmented society, the more we have segmented populations, the more people stop trusting it.

SREENIVASAN: What role do sort of the tech companies, the platforms, have in this? I mean, we certainly ascribe a lot to how we got here, but in trying to get our way out of this, is there a way for them to become more conscientious? Is it about engineers red-teaming their product, looking at what the worst-case scenario is? Is it how somebody could abuse it, or is it about having staff whose emphasis is asking these engineers those questions? How do they get – how do they change course?

BOYD: Right. So, you know, the platforms have amplified everything, and that’s what we really need to acknowledge: they are the amplifiers and the escalators. They have taken the good, bad, and ugly and taken it to a whole new level. And so, the question is, are we trying to get them back to a point where they’re just amplifying the status quo, or what kind of intervention are we asking them to make? And depending on where you stand politically, you’re going to have a different view on what role you want them to have in society. But at the end of the day, they’re not public institutions. They are private corporations. Is the responsibility of the tech platforms to give their users what they want in the moment? Is their responsibility to their advertisers, which means tricking their users into staying on the platform as much as possible? Or do they have –

SREENIVASAN: And give their shareholders more value.

BOYD: Give their shareholders more value, or is their responsibility to public citizenry? If so, we have to have a very intense conversation about how to restructure companies to have a double bottom line because right now they have a single bottom line, and that bottom line is Wall Street.

SREENIVASAN: Yes.

BOYD: And what I would argue is that they can say whatever they want, and a lot of people working in those companies really mean it. They think that they can uphold capitalism and do good. I’m not convinced that they can in the long term.

SREENIVASAN: But what about our responsibility? I mean, we are the ones that are using the platforms, that are making them successful. We’re buying the products. We’re playing into this formula that they figured out that says, “hey, this is a great business to be in. And oh, by the way, I’m getting all of this information on every single one of these users for peanuts.”

BOYD: Right. So again, this is a question of where responsibility and agency lie. Is an individual the meaningful actor to stand up? American society, which has always been committed to the individual first, thinks that if you don’t like it, you boycott it. Right. You walk away from it. So you don’t like Facebook, you walk away. Market choice. And if we had market, you know, competition, we would solve all the problems. And I don’t think that that’s true in an information ecosystem. You know, what it ends up creating is known amongst scholars as isomorphism, which is the idea that you start emulating other institutions. So, you know, news media organizations emulate each other because they want to make certain that they cover the stories that each other covers. They don’t differ. News organizations and social media companies have started emulating each other and feeding into each other. So we have this weird moment where, if we put the onus on individuals, it’s only going to get worse. And let’s be honest, we put the onus on individuals for so many other things. Has it made healthcare better that we are responsible for figuring out all of the medical costs and making informed decisions about which doctors we go to? Has it made, you know, long-term debt and savings better that we are now all responsible for our own, you know, 401(k)s or savings rather than having pensions? What this does is privilege those who have time, money, energy, choice to stand up and do something. And that’s fabulous, but I would argue that if you want a functioning society, you have to do a meaningful division of labor and division of responsibility. And that means not putting all of the burden back onto individuals. And we want individuals now to be informed about everything. I’m sorry; an individual is never going to be as informed as a doctor when it comes to medical decisions and scientific knowledge. And expecting an individual to do that is devastating for the health of the (inaudible) individual at scale.

SREENIVASAN: Given that we’re transacting where we think we’re getting some value – hey, I’m getting free email or I’m getting free access to the social network – but the cost seems to be all the information that we’re transacting behind the scenes. Is there any way that we can regain a grasp of that? I mean, is the genie so far out of the bottle that it’s just pointless to try, or what do you do?

BOYD: Data can be used in some of the most beneficial ways possible. It can also be used in some of the most egregiously abusive ways possible. And advertising is somewhere in between. Right. So what’s difficult is, how do we create an ecosystem that makes certain that data is used to benefit individuals in society to the best it can? What I think we’re coming up against is that we need a new form of governance. If you expect the individual to take power, it’s not going to function. If you expect nation-states individually to govern this, it’s going to create fragmentation. If you assume that the companies can do this, you’re going to end up with a different kind of exploitation, because that is the nature of late-stage capitalism. Think about even advertising. A company like Facebook has three choices. They can find more users, and I’m not sure how well that’s going to go. They can, you know, find ways to make more per user, which means more time on site. It means more advertisements on the page. It means different ways of trying to get you to spend more money. Or they can diversify their profit-loss structures. In other words, diversify their products. And they certainly are trying, as are many of the other companies. But none of those are about a sustained and stable information ecosystem for the public. All of those come down to ways of pulling more data from more people over more time, or just asking for their money directly. And that’s why I say that, you know, in an information ecosystem, when the expectation is that you have to make more money quarter over quarter, I don’t see anything but long-term devastation. It’s just a matter of when we will say enough and what it will mean to say enough. And the same, I would argue, is true with the news media. Right. Like, it is hard to produce the news when you’re supposed to turn profits every quarter, not just be stably profitable but to return more profit quarter by quarter. It also gets dicier, into, like, what will get you more, you know, viewers, what will get you a broader reach, rather than thinking about what it means to be sustainable. And so that’s the question of what sustainable capitalism is, rather than return-on-investment, you know, forms of capitalism, or do we need to be thinking about other models.

SREENIVASAN: Danah Boyd, thanks so much.

BOYD: Thanks for having me.

About This Episode

Kellyanne Conway joins Christiane Amanpour to comment on the mass shootings that happened over the weekend in El Paso, Texas and Dayton, Ohio. Daniel Benjamin and Rev. William Barber join the program to weigh in on the attacks. Danah Boyd sits down with Hari Sreenivasan to explain how digital media amplifies the spread of false information.
