07.29.2019

Karim Amer and Jehane Noujaim on “The Great Hack”

Hari Sreenivasan sits down with the co-directors of “The Great Hack,” Karim Amer and Jehane Noujaim, whose new Netflix film takes a hard look at the ethics of big tech and how data is used to undermine democracies.


CHRISTIANE AMANPOUR: Russia poses a particular conundrum for the Trump administration given its election meddling. Last week, as we said, the former special counsel said that Moscow’s still at it, ramping up for the 2020 election. Its weapon, of course, is the Internet – a sometimes lawless space where citizens’ data is increasingly being collected without their knowledge or permission, and for often alarming reasons. The new Netflix film, “The Great Hack,” takes a hard look at the ethics of big tech data and how it’s being used to drive wedges of fear and hate to win elections and to undermine our democracies. Co-directors Karim Amer and Jehane Noujaim, who made “Control Room” and “The Square,” raise a red flag with our Hari Sreenivasan.

(BEGIN VIDEOTAPE)

HARI SREENIVASAN: Thanks for joining us. First a quick refresher course. You know, in a way your whole movie does this, but for people who might not have been paying attention to every news headline about Facebook and Cambridge Analytica, what did Cambridge Analytica do?

KARIM AMER, CO-DIRECTOR, “THE GREAT HACK”: Cambridge Analytica realized that your phone is now listening in on you, but that feeling of your phone listening in on you is evidence that you’re a lot more predictable and persuadable than you might think. How? Because you’re giving all of this digital footprint of yourself constantly into the system of surveillance capitalism. There is a voodoo doll, basically, of you that is very predictable, and it is guessing what you want to do next. And Cambridge Analytica, being a behavior change agency, used the most powerful behavior change agency – Facebook – to collect 5,000 data points on every American voter, and used those data points to find the key persuadable audience in swing states and target them with customized advertisements.

SREENIVASAN: All right, let’s take a look at a clip of your trailer.

(BEGIN VIDEO CLIP)

UNIDENTIFIED MALE: All of your interactions, your credit card swipes, web searches, locations, likes – they’re all collected in real time into a trillion dollar a year industry.

UNIDENTIFIED FEMALE: The real game changer was Cambridge Analytica. It worked for the Trump campaign and for the Brexit campaign. They started using information warfare.

UNIDENTIFIED MALE: Cambridge Analytica claimed to have 5,000 data points on every American voter.

UNIDENTIFIED FEMALE: I started tracking down all these Cambridge Analytica ex-employees.

UNIDENTIFIED MALE: Someone else that you should be calling to committee is Brittany Kaiser.

UNIDENTIFIED FEMALE: Brittany Kaiser, once a key player inside Cambridge Analytica, casting herself as a whistleblower.

BRITTANY KAISER: The reason why Google and Facebook are the most powerful companies in the world is because last year data surpassed oil in value. Data’s the most valuable asset on Earth.

(END VIDEO CLIP)

SREENIVASAN: Jehane, what makes this different than how campaigns have worked forever? They’ve had ways to target people. They’ve had negative ads against their opponents. How’s this different?

JEHANE NOUJAIM, CO-DIRECTOR, “THE GREAT HACK”: Well, in following Brittany, she talked a lot about working in the Obama campaign, where this was kind of – it was the very beginnings of using data, and the Obama campaign used data, but in a very rudimentary way at that point in time. This is different in that, because of the technological advances, people can be targeted. So people are no longer voting on issues. They are listening to ads that are targeted based on what the algorithm knows about them.

SREENIVASAN: Let’s take a look at a clip of Brittany describing how it worked.

(BEGIN VIDEO CLIP)

KAISER: Remember those Facebook quizzes that we used to form personality models for all voters in the U.S.? The truth is we didn’t target every American voter equally. The bulk of our resources went into targeting those whose minds we thought we could change. We called them the persuadables. They’re everywhere in the country, but the persuadables that mattered were the ones in swing states like Michigan, Wisconsin, Pennsylvania, and Florida. Now, each of these states was broken down by precinct. So you can say there are 22,000 persuadable voters in this precinct, and if you targeted enough persuadable people in the right precincts, then those states would turn red instead of blue.

(END VIDEO CLIP)

SREENIVASAN: The same firm, Cambridge Analytica, was also working on the Leave campaign. What were they doing?

AMER: So, this comes through the testimony of Brittany Kaiser, who is one of the central characters in the film. When we meet Brittany Kaiser, she’s in Thailand, and she has fled from her former life as a key member of this company, Cambridge Analytica, which had worked on both Brexit and the Trump campaign, as well as many other elections around the world. She finds evidence in her files that Cambridge Analytica worked on Brexit, and she provided that evidence to Parliament under parliamentary privilege in her testimony. And she also, in that testimony, explained that the information systems – the information technologies that were used by Cambridge were classified by the British government as weapons-grade information systems that were export controlled. That is what we’re talking about here. We’re not talking about just, oh well, everybody can just send an ad or two. No, we’re talking about taking military-grade technologies and applying them to a domestic population, without that population’s consent, during an election cycle.

NOUJAIM: So, it’s psyops. It’s basically the kinds of weapons-grade tools that were used abroad, whether to try to change the minds of potential kids joining ISIS or to change populations. Instead, those same tools were used on the British population.

AMER: And the American.

NOUJAIM: and the American population.

SREENIVASAN: Did they cheat? Is this cheating to use these technological tools today, to get your side to win a campaign?

NOUJAIM: Well, when you watch the film, you’ll see that that question is asked in Parliament, and Christopher Wylie answers that, yes, this is cheating. And the next question that’s asked of him is, well, did it really work? Did it really cause Brexit? Did it really help the Leave campaign? Did it really affect Trump being elected? And his answer was, well, when you have doping, you don’t ask, would the guy have come in first place anyway? He cheated. And so he should be eliminated. So, yes, according to the people that we followed, this was cheating.

AMER: Yes, I mean, I think it’s about deciding what’s for sale. What’s happened in the advent of this new surveillance capitalism system that we live in – which is the term coined by Shoshana Zuboff, the thought leader in this space – is that we have become the commodity, as David Carroll says in the film. And we, as commodities, have allowed all of our recordable behavior to be bought and sold and commoditized. Now, what happens when it goes into an election cycle is that you have allowed for the commoditization of the democratic process, because you can buy and sell nodes of people’s personalities and behavior and influence them quite effectively. And the problem with all of that is that it’s happening in darkness; there’s no transparency. We have no ability to know what ads people saw or who paid for them, and we have no understanding of that now. Facebook has the answers, but under the way in which our data is currently regulated in this country, we have no rights to our data, right? Like, we live in a country where we’ve allowed all of our e-mails to be read by technology platforms. We live in a country where minors’ data can be transacted. We have to start asking these questions about how much of ourselves is for sale, and what happens to the integrity of the democratic process when so much of it becomes commoditized.

SREENIVASAN: The U.K. Parliament said their election system is not fit for purpose. What does that mean?

NOUJAIM: It means that the election laws are not able to withstand the technological advances that have led to the tools used by Cambridge Analytica and other companies to intervene in elections.

AMER: I think what we’re witnessing is a moment where the balance of power has shifted. When you look at Facebook, for example, Facebook has over two billion constituents whose detailed information it has access to, but Facebook is not a democratic institution. Facebook doesn’t have any responsibility, necessarily, to all of us, the users. And Facebook has not taken any credit or, more importantly, responsibility for any of the wreckage sites that we’re seeing, both in the United States and abroad, whether it’s the Rohingya crisis or information being weaponized in other locations around the world.

SREENIVASAN: What’s a wreckage site, when you talk about it that way?

NOUJAIM: It’s a term we kind of made up actually.

AMER: Yes.

NOUJAIM: Because when we started making this film, it was invisible. It was, how do you make the invisible visible? This is something that’s happening in your brain and on the computer, but it wasn’t the makings of a visual movie. So we started to look for wreckage sites, and by wreckage sites we meant, this is what happened because of what’s happening with microtargeting. And we felt like the 2016 election was a wreckage site, and Brexit was a wreckage site.

AMER: And the purpose of the wreckage site is to show this vulnerability, which is what a hack is. A hack is the exploitation of a vulnerability. The hack that we found interesting was the vulnerability of our own minds – how we, as moral creatures, are increasingly being shaped by these amoral algorithms, how it would affect ourselves and our societies – and to leave with the question of who dictates the ethics of these algorithms. I don’t believe that anybody wakes up in Silicon Valley and says, hey, how am I going to wreck democracy today, right? I don’t think that’s an active conversation, just like I don’t think any oil executive wakes up and says, hey, how am I going to pollute today. But the reality is that we saw in the industrial age, where oil was the main commodity, there were externalities and spillover effects that have had major societal costs. In the same way, we are witnessing right now the externalities that have been caused by this love affair we’ve had with technology, where we assume that “move fast and break things” can only lead to growth and innovation. And now we’re at the reckoning point.

SREENIVASAN: There are a lot of people who watch this film and say, what’s going on with Brittany? Is she more contrite off camera? Because as someone who knows how campaigns work, how messaging works, it’s also possible to see the film and say, you know what, she’s playing these filmmakers, and by extension, us – that this is really just part of a slick redemption tour. At the very end, I kind of see her having regrets, but you guys spent a lot of time with her. What was going through her mind? Why did she want to speak to you, and how did that evolution happen, even while you were filming? Because you were filming during an active story.

NOUJAIM: We didn’t know what to think of her when we met her – similar, I think, to an audience watching. And our goal was to follow her and see how this journey unfolded. It’s incredible when you meet somebody who has been in the Obama campaign and then wrote the first contract for the Trump campaign.

SREENIVASAN: And pitched Trump.

NOUJAIM: Pitched Trump, worked on Brexit, had certain ties to Julian Assange, was investigated by Mueller. So in this invisible story, we had a human being who was going to take us into rooms that we would never otherwise get access to. And so our job as filmmakers was to follow that journey, and she allowed that – there was never a moment when she shut us out or asked us to turn off the camera. And that’s such a gift as a filmmaker.

SREENIVASAN: Yes.

NOUJAIM: But she’s led a complicated life. And we were excited to jump off the cliff with her, but she’s been in complicated situations and made complicated decisions.

AMER: And I think it’s never been more important than in these polarized times that we have stories that allow us to see how redemption is possible. And what redemption looks like, in our context for this film, is how people from different walks of life get together to tackle a problem like this. Here we have David Carroll, a professor, who decides to say, I’m not going to wait for the Mueller Report to figure out what did or didn’t happen. I’m going to start asking questions, and I’m going to ask a very simple question: do I have the right to my own data? Do I have the right to know what you know about me and how I was or wasn’t targeted? Right? And he goes on this quest where he decides to hold power accountable and ends up suing Cambridge Analytica.

SREENIVASAN: Let’s take a look at a clip of one of the other characters, David Carroll.

(BEGIN VIDEO CLIP)

UNIDENTIFIED MALE: I was teaching digital media and developing apps, so I knew that the data from our online activity wasn’t just evaporating. And as I dug deeper, I realized these digital traces of ourselves are being mined into a trillion dollar a year industry.

(END VIDEO CLIP)

SREENIVASAN: You know, what you might call surveillance capitalism, Silicon Valley might just call the cost of innovation. What is the danger, what is the threat, in that bigger picture, when our data is transacted?

NOUJAIM: Well, it’s a bad deal. We’ve made a bad deal, right? I mean, we think that this stuff is free – and yes, we’ve never paid anything to Facebook or Google – but at the same time, they’re the most successful companies that exist today. It goes back to the question of what is for sale, and it goes back to the question of consent and what we are consenting to. We’re just clicking those user agreements, right, without even reading them. We would never do that with a written contract, but we do this many times a day in order to make our lives a little bit easier. So in a way, we’re all complicit, but in that, we are becoming the commodity. And we have to ask ourselves how much of ourselves should be for sale.

AMER: I think there are two things that, to me, help frame this conversation. One is the word consent, which has never been more debated and redefined in society than now, right? And currently, the relationship with Silicon Valley – whereby you give up all your privacy as the admission fee to the connected world – is a non-consensual relationship. It is not a relationship where the user fully understands what is happening to them. It is an exploitative relationship, and the sooner we understand that, the better. The engineers of the future would not have the Valley without the open society, and we have to ask them: are they committed to building a world that allows ethics and innovation to work hand in hand, or do we just say that we have to have only technology and forget ethics? The problem with that, in my opinion, is that as important as technology has been to innovate and expand our capacity, we have always needed ethics to preserve our humanity.

SREENIVASAN: You’re taking a look at this specific use case, because it’s been publicized and we know about it. What about all the other companies that are trafficking in our data? What about the next election and the companies that are probably selling to campaigns right now?

NOUJAIM: Absolutely. 2020 is around the corner, and both sides of the aisle will be using this technology.

AMER: But I think the movie’s really about whether we can have a free and fair election ever again, which I think is important. I think why that’s particularly important to us, and maybe why the urgency in the film feels quite sharp, is because we come from Egypt. We’ve seen that democracy is not some God-given right, as some people may think in this country. Democracy is fragile, and democracy can be taken just as easily as it’s available. And when the core of the democratic process is under assault, it should not be a partisan issue. This should be a rallying cry for people from all sides of the aisle, and proof of that, actually, is that privacy is not a partisan issue. Privacy, time and time again, has support from both Republican and Democratic lawmakers. So I think we are in a moment of new awareness where people are asking more difficult questions, and there’s beginning to be an accountability movement. And I think we as citizens now have to take the opportunity to demand a new social contract. But that social contract is not going to be between us and the government, as it used to be. In this era that we live in, it’ll be between us, the government, and the tech platforms.

SREENIVASAN: Karim Amer, Jehane Noujaim, thanks so much for joining us.

NOUJAIM: Thank you so much.

AMER: Thank you so much for having us.

NOUJAIM: Thank you for having us.

About This Episode

Peter Navarro tells Christiane Amanpour about attempts to end the year-long trade war between the U.S. and China. Julia Ioffe and Husain Haqqani join the program to discuss significant developments concerning the United States in Russia and Afghanistan. Hari Sreenivasan speaks with the co-directors of “The Great Hack,” Karim Amer and Jehane Noujaim.
