09.11.2020

In the Digital Age, Lies Spread Faster, Truth Will Never Win

Does our growing dependence on social media have a dark side? A new docudrama, “The Social Dilemma,” warns that it is breaking down our shared reality. Director Jeff Orlowski joins Hari Sreenivasan to discuss this, alongside Tristan Harris, who heads up the Center for Humane Technology and also appears in the film.

CHRISTIANE AMANPOUR: And talking about social media and polarization, we tweet, we like, and we share, but our growing dependence on social media has a dark side. In fact, a new docudrama, “The Social Dilemma,” warns that it is breaking down our shared reality. Jeff Orlowski is the film’s director, and Tristan Harris, who makes a cameo appearance, heads up the Center for Humane Technology. Here they are talking to our Hari Sreenivasan about the documentary.

(BEGIN VIDEOTAPE)

HARI SREENIVASAN: Thanks, Christiane. Jeff Orlowski and Tristan Harris, thanks so much for joining us. Let’s talk a little bit about the state that we’re in, drawing the line that at least some of the people in your film have drawn between the rise of mobile phones, mobile technology, and social platforms, and how we got to where we are, and maybe some of the medical effects that were the unintended consequences.

(CROSSTALK)

JEFF ORLOWSKI, DIRECTOR, “THE SOCIAL DILEMMA”: Totally. Totally. I would say there are two big turning points in this story. One was when the iPhone came out and gave us individualized relationships with the Internet. It was no longer a desktop computer that the whole family shared. It was something that you carried with you 24/7. It was unique to you. And when you factor that in with these machine learning algorithms that we’re describing, which can constantly and recursively learn and customize for you, those are the two big things that gave everybody a very personalized version of the world. And I think what we have seen here is the spectrum of consequences at the individual level and at the societal level. As we were saying, these incentives around what’s going to get you to come back, get you to stay, and spend more time on the platform are customized for every single user, so we all have different things that bring us back. It’s also slightly different on an Instagram platform vs. a YouTube platform vs. Facebook vs. Twitter vs. TikTok. But each of those is asking that same fundamental question: how are we going to get them to come back and spend more time? In some cases, as with the Instagram story and young girls and mental health, we are seeing increasing self-harm and increasing suicide rates, not just attempted, but completed suicides, through that process. And we’re seeing all the evidence to back that up with Instagram stories and self-image comparison and fear of missing out. That’s at the individual self-harm level. And then you can factor all the way through to the societal level, with political polarization, misinformation, and conspiracy theories running rampant. We’re seeing a breakdown of shared truth. And it just seems extremely harmful to me when we have this concept of customized news. Customized news is an oxymoron in and of itself. And yet that’s what we have scaled across the globe.

SREENIVASAN: Tristan, Jeff mentioned something interesting there, which is a notion of an objective truth, something that we kind of took for granted, that there is a fact. We can all agree on that fact. We might disagree around the policy about what to do about that. How did social media, the rise of the platforms, the algorithms, I mean, how did it get us to a place where there is such disagreement about something like the idea of a truth?

TRISTAN HARRIS, PRESIDENT AND CO-FOUNDER, CENTER FOR HUMANE TECHNOLOGY: The thing to look at here, which is very subtle, is the business model of these technology companies. What is their business model? I mean, how much have you paid for your Facebook account or your YouTube account recently? Nothing. So, how is it worth $500 billion? What are they selling? They’re selling the ability to change your attention, to change what you’re thinking and feeling. And that business model means that, if I give you your own “Truman Show,” your own reality, that’s going to be more profitable than if I gave you a reality with information that’s more general, more shared, more broad. That’s not going to be as successful at getting your attention as giving you your own reality show, your own “Truman Show.” And so each of us is now 10 years into this experiment of YouTube, Facebook, and Twitter personalizing information to us, and we’re not seeing the other information. So, imagine one version of Facebook, let’s not call it objective truth, but a shared-reality kind of Facebook, where we were mostly seeing the same kinds of things. And let’s say another Facebook came along called personalized Facebook. Personalized Facebook just shows you what you want to see, what agrees with your views. That personalized Facebook will outcompete the broad, shared-reality Facebook. That’s part of this race to the bottom of the brain stem, this race to go deeper and deeper into appealing to our innate instincts. And one of those appeals is personalization. That’s what has shredded our reality into a million different fragments. The thing I’m excited about with the film, actually, is that it will create a shared reality about the breakdown of our shared reality. And that gives us a place to stand, so we can now proceed and say, how do we fix this?

(BEGIN VIDEO CLIP)

JEFF SEIBERT, SERIAL TECH ENTREPRENEUR: What I want people to know is that everything they’re doing online is being watched, is being tracked, is being measured. Every single action you take is carefully monitored and recorded: exactly what images you stop and look at, and for how long you look at them — oh, yes, seriously, for how long you look at them.

UNIDENTIFIED MALE: They know when people are lonely. They know when people are depressed. They know when people are looking at photos of their ex-romantic partners. They know what you’re doing late at night. They know the entire thing: whether you’re an introvert or an extrovert, what kind of neuroses you have, what your personality type is like.

UNIDENTIFIED FEMALE: They have more information about us than has ever been imagined in human history. It is unprecedented.

(END VIDEO CLIP)

SREENIVASAN: Tristan, you point out that this — like most people, we say, you know what, technology is just a tool. It could be a hammer, or it could be a bicycle. The Internet is what you make of it. But you point out in this film that it’s not like the other tools that we have sort of lying around the house or in our garage.

HARRIS: I’m so glad you brought this up, Hari. This is so critical. When you think of a hammer sitting there on your desk, there’s not a team of 1,000 engineers inside of that hammer with an A.I. pointed at you, trying to figure out what would get me to pick up that hammer in exactly the right way, at exactly the right time, and use it in exactly these specific ways to hammer those specific nails. That’s just a tool, right? A tool is just waiting to be used. And we used to live in a tools-based technology environment. I grew up on technology. I grew up on the Macintosh. As Steve Jobs used to say, the Macintosh is a bicycle for the mind. It’s just a tool for creativity, this engine. It didn’t want anything from you. Compare that to the world we’re living in now: most of the technology we’re using is something whose business model depends on manipulating our attention and getting us to use it in specific ways for very long periods of time. So, our default technology environment is a manipulation-based technology environment. And kids are growing up in that, and they will not have known anything different. That’s one of the things the film is trying to sound the alarm about: for the digital natives, this is what they’re growing up in.

SREENIVASAN: I can hear tech companies in the back of my head saying, listen, we’re not out here to try to destroy society. Look at all the amazing things that we have enabled, the communities of cancer survivors, the ice bucket challenge. And, essentially, we are not optimizing for anything but engagement or advertising, because we want to make money, right? How does it get from there to what seems to be optimized for this dystopian future, where everyone is living in their own little micro-bubble, and there is no shared truth?

HARRIS: Yes, it’s an interesting phrase, optimized for dystopia, because I think, as you said, the people I know who build these products have good intentions. I mean, I worked inside of Google for several years and tried to change these systems from the inside, unsuccessfully. And we got here innocuously. It started with a simple goal. If I’m YouTube, my goal was just to get people sharing videos, recommending videos, and watching videos. What harm could come from that? Then bad foreign actors come along, by the way, not just Russia, but China, Saudi Arabia, the UAE. And people start manipulating the algorithm. And they can start manipulating what people are believing, conspiracy thinking, et cetera. Alex Jones’ Infowars videos were recommended 15 billion times by YouTube. It’s really important that you grasp how deep these rabbit holes go. These are not small rabbit holes. These are massive, ocean-sized rabbit holes. You have flat-Earth conspiracy theory videos that were recommended hundreds of millions of times. Hundreds of millions of times. These are more recommendations than the combined traffic of “The New York Times,” “The Washington Post,” “The Guardian,” et cetera. Imagine you turn on the TV and the default programming on a 10-channel television is conspiracy thinking, extremism, why you should hate someone else, and it’s personalized to you. So, it isn’t just that you should hate some other person. It’s, we know who you will probably hate, and we will give you information that says, here’s evidence, good and reasonable evidence, about why they’re pretty bad people. And we have had that running on the default television screens for about 10 years. And you wonder, why does society look the way that it does? We have sent society through the washing machine of social media’s algorithmic personalization, and it was all an accident. We think of it kind of like digital fallout. You’re privately profiting, but the harms show up on the balance sheets of society, in everything from addiction, mental health problems, and the slow erosion of truth to worse journalism, shorter attention spans, clickbait capitalism, and conspiracy thinking. And that is the digital fallout from this extractive race for attention.

(BEGIN VIDEO CLIP)

RASHIDA RICHARDSON, NYU SCHOOL OF LAW: We are all simply operating on a different set of facts. When that happens at scale, you’re no longer able to reckon with, or even consume, information that contradicts the worldview you have created. That means we aren’t actually being objective, constructive individuals.

UNIDENTIFIED MALE: And then you look over at the other side, and you start to think, how can those people be so stupid? Look at all of this information that I’m constantly seeing. How are they not seeing that same information? And the answer is, they’re not seeing that same information.

(END VIDEO CLIP)

SREENIVASAN: You know, there’s something in the film where I think it’s Tristan who says that there’s a study showing that, basically, lies fly six times faster on social media than the truth does. So it almost seems like QAnon conspiracy-type stuff was built for this. I mean, it is growing at a pace that is alarming to most people who are watching it, but not understood by most people, who are realizing, wait, what is this again?

ORLOWSKI: Right. Right.

SREENIVASAN: And then, all of a sudden, you see 11 states where members of Congress, or people trying to become members of Congress, believe in this stuff.

ORLOWSKI: When you just sit with that fact for a second, that lies are spreading six times faster than truth, truth will never win. Truth will never win in that equation. We can spend as much time as we want doing very, very thoughtful, careful reporting on certain things, and then a lie machine just outpaces us. You just can’t keep up with it. And when you step back and look at the whole system, that’s a frightening reality.

HARRIS: I think we have to realize what happened when we decentralized who was producing the information in our attention economy. It used to be that people like you were producing it, right? We had content producers, journalists, video creators, people who were paid, had professions, and probably had a background in journalism ethics, to think about how to be stewards of the information environment, especially when they had mass broadcast capacity. We have granted those broadcast-level powers to people in their basements with no necessary training, who just might be very good at Internet marketing. And we have none of the same broadcast-level responsibilities. So, if you want to think about it a different way, look back to the period of yellow journalism. What happened with the attention economy with Twitter and Facebook is that they turned each of us into yellow journalists, because each of us says the most salacious thing, says very quickly what we think to be true. Even if we’re not there, we just sort of claim, well, clearly, those people over there in this city are doing this, whether it’s the Trump supporters or the Black Lives Matter supporters. No matter which side it’s for, you can make a claim, and it will spread six times faster. And it doesn’t have to be true. And so each of us has been converted into something that’s very dangerous.

SREENIVASAN: Jeff, given the number of people who work in tech and help design some of these systems, are we surprised that so many of them don’t let their own kids use them?

ORLOWSKI: Yes, that was a huge indicator for us. And that’s something that we have seen a number of times when interviewing people. We also learned that there are some schools in the Bay Area that are pretty much anti-tech schools, where the children aren’t using devices on a regular basis within the school. It just seemed very, very hypocritical to be exporting technology like this to the world while protecting your own children from it, and yet designing software that is explicitly tapping into youth, like YouTube for kids, or recognizing that there are these age limits but really letting it slide and letting people underage join the platforms. This needs to be addressed societally. To put the burden on parents, to say, oh, no, you have to be in control of your own child’s usage, you have to regulate their usage, you have to isolate them from their friends, you have to shift their entire social network yourself, is a huge, huge struggle for parents to have to deal with, when we are exporting this out at scale and designing things that are just bringing kids back and back and back.

SREENIVASAN: Tristan, thanks to the pandemic, there are a lot of parents who might have been very proud of how little screen time their kids got, who have had to just capitulate and say, OK, I need 15 minutes here, you can have my phone, right? What’s wrong with the argument that we’re going to adapt to this, that this is just another evolution, just like radios and televisions and smartphones, and kids are going to grow up to be fine people?

HARRIS: Well, first, just to meet your question at the felt sense: I’m not a parent, but I can’t imagine how hard it is to be a parent and to sit your little 5-, 6-, 7-, 8-, 9-year-old in front of a Zoom screen for hours a day. I think this probably eats at many parents, and all I can say is, I feel you; it’s an incredibly hard reality, because we’re not given a choice. That’s actually what we often say in our work: what makes current technologies inhumane is that we don’t have a choice. We’re forced to use a kind of infrastructure that doesn’t feel good, and that leads to these harms. If I’m a teenager, for example, and all my friends use Instagram, then even if I get off and say, oh, I’m done, it’s not about that, because if the only way my friends talk to each other is through Instagram, I’m suddenly excluded from my friendship network. And that’s one of the key things: it’s not just about what I do for me. The reason why this film, I think, is so important is that you can have a whole group of teenagers, a whole group of friends, a whole group of families see the film together, and have a conversation and say, do we want to move where we’re communicating as a group off of that manipulation-based platform and onto something more neutral? A good example is something like FaceTime. Notice that, when you use FaceTime just to make a video call and catch up with someone you love, Apple doesn’t try to send you 500 notifications and say, here are all the stars and the hearts, and notice this person commented, and now they’re typing dot, dot, dot. They don’t do that. And why is that? Because you’re the customer. You bought the phone, and FaceTime is a tool waiting to be used. So this is not an anti-technology conversation. I want to be very clear to parents: don’t be anxious just because there’s a slab of glass in your kid’s pocket. The slab of glass isn’t the harm. The harm is when this stuff is not designed for us, and it puts all of this digital fallout on the balance sheets of society. And to your question directly: developmentally, there are some very crucial human attachment needs, and children need to feel attachment. When the phone starts providing too many of those benefits, at a frequency and rate higher than you can get in reality, your phone starts to feel like a sweeter place to go than being with yourself.

ORLOWSKI: We often reference this phrase, it’s ripping apart the fabric of society. And how far is that going to go before we adapt and get used to it? I heard somebody in the tech industry say, well, we just have to wait for truth to completely break down, and you won’t trust anything, and we’re going to have deepfakes, and we’re going to shift into this new worldview where nobody trusts anything; and after we get to that point, people are going to start trusting just the people in their small circles, and that’s going to be our adaptation. I don’t know if we’re going to make it through that. So, I don’t know. I guess I just personally believe that there’s a need for shared truth and shared reality if we’re going to make political decisions about how we, as a country or as a society or as a world, should address fill-in-the-blank problem. We need a foundational grounding that we all share in order to talk about what we do from there. So, I don’t know if adaptation is really the main goal and hope here. We need to change these platforms.

SREENIVASAN: Tristan Harris, Jeff Orlowski, thanks so much for joining us.

ORLOWSKI: Thank you very much.

HARRIS: Thank you.

ORLOWSKI: Thanks.
