WALTER ISAACSON: It’s Alex Stamos. A wonderful guy. He was chief security officer at Facebook as we went up to the 2016 election. This is not a job, in retrospect, that you wanted to have. And we talked about how the evidence started coming in that the Russians were both hacking our election and then using young kids to try to mess with our minds through Facebook. And the frightening thing about his interview was he said nobody’s trying to stop it. The Republicans have blocked any plans to try to halt this. Facebook is making some efforts, but not enough, and it’s too late to stop it for the midterms this year, but we’ve got to start working now if we don’t want the Russians to hack us in 2020. Here, listen to a little of what he had to say. You were chief security officer for Facebook in the 2016 election, which seems like a pretty tough job, especially in retrospect. When did you start getting the sense that something was going definitely awry, with Russians hacking our election?
ALEX STAMOS, CHIEF SECURITY OFFICER AT FACEBOOK: The actual direct hacking didn’t happen on Facebook. But what we did see, after the hacks against the DNC and Trump (inaudible) email, Colin Powell’s email, was them coming back and creating personas — fake personas of American individuals behind an organization that they called DCLeaks. They set up their own WikiLeaks, effectively.
ISAACSON: So when did you discover that people were setting up fake news sites to give disinformation?
STAMOS: So the — that (inaudible) activity was in the fall, and we shut it down. During the entire election there was this whole debate around fake news and a real question of who was behind it. But it wasn’t until after the election that we really dove into trying to figure out: is fake news mostly a government phenomenon, or something that’s financially motivated? And it turns out, as we figured out kind of in the spring of 2017, the majority of what people were calling fake news, the “Pope endorses Trump,” “Hillary has cancer” kind of stories, were actually driven by financially motivated spammers who were trying to create lots and lots of click-baity headlines to make money on ads.
ISAACSON: So in 2016, I remember people talking about Russia trying to influence our election with these things. You didn’t know about it?
STAMOS: We knew that they were pushing stories. We didn’t — at that point, did not know about the large cluster of activity that was found in 2017.
ISAACSON: But the people in the Obama administration, the Homeland Security people, all knew Russia was putting out — trying to meddle in the election through the Internet Research Agency. I’m surprised you didn’t know.
STAMOS: Well, to be clear, they were talking specifically around the hack and leak campaign and we got some information from the government around that. We got nothing from the government around the IRA.
ISAACSON: How could you not know that these were fake accounts coming from weird people in Saint Petersburg using proxy servers?
STAMOS: I think one of the issues was we were looking for the traditional kinds of hacking activity that we saw from the GRU. We were looking for people breaking into accounts, stealing data, spreading malware. The creation of groups whose entire job it is to look for organized propaganda activity — that didn’t exist either at Facebook or, honestly, anywhere in the industry at the time.
ISAACSON: (Inaudible). It was a large coordinated group of people who weren’t really who they said they were. It was clear these weren’t real Americans having real opinions.
STAMOS: It wasn’t clear. And nobody — there was a lot of talk about Russian activity, and there was obvious Russian activity around the DNC hacks and the Podesta hacks, but there was no body of data that we could use to find the IRA. That was never — whatever the government had, they didn’t share.
ISAACSON: So you don’t know — you can’t find out where — if large amounts of information are being pushed through Facebook, you can’t find out where it’s coming from and say, hey, that’s weird, this doesn’t sound right?
STAMOS: Well, nobody is sitting there approving these posts, right? Like, these platforms allow people to communicate in real time. What you can do is look later for whether there is activity that looks coordinated: are these accounts tied together in a weird way, or do you have geographical links? But those links are not always obvious. And it actually took a pretty large effort, which Facebook took on voluntarily and internally, to go find and stop that activity. But honestly, it was in the end a tiny portion of the overall discourse.
ISAACSON: It did seem to have an effect.
STAMOS: Well it has — it certainly has an effect.
ISAACSON: And it had an effect in which direction? What were they trying to do?
STAMOS: Well, again, there are two totally different —
(CROSSTALK)
ISAACSON: I’m talking about the IRA.
STAMOS: Right. The GRU campaign directly targeted Hillary Clinton. The IRA campaign is much more dispersed. It started before the election and it’s lasted afterwards. Their goal is to get people not to trust each other online in America. They want us to believe that our political opponents are the craziest — will hold the craziest, wildest versions of any argument. And so that’s why they will pretend to be, for example, a Black Lives Matter activist, and in doing so will push forward much more radical positions than anybody who is legitimately part of that movement. And then on the other side they’ll have a pro-police group that will specifically reference that group, saying, oh my god, look —
(CROSSTALK)
ISAACSON: But they’re both fake.
STAMOS: But they’re both fake.
ISAACSON: I’m going to ask a broader philosophical question, because we wandered into it.
STAMOS: Yes.
ISAACSON: If we were trying to fix the social media ecosystem that was supposed to bring us together and make us better, wouldn’t we try to find ways to have a little bit less anonymity? Allow people some freedom to maybe use a pseudonym, but make sure that, just as in the real world, if you really did something bad, you could be tracked down?
STAMOS: In most cases I think people can be tracked down under lawful process on most of the major sites. And you’re right, this is an interesting problem, in that you have a spectrum where more anonymity means more freedom but also a lot more abuse, versus sites with less anonymity. And I think Facebook is actually on the far side of less anonymity compared to almost any other major platform. You have the Twitters and the YouTubes in the middle, where people are pseudo-anonymous and don’t have to tie that account to a real identity. And at the most anonymous end you have the 4chans and 8chans, and you can see, from anybody who’s visited those sites, that they become pretty aggressive and hostile and poisonous communities very, very quickly.
ISAACSON: So we see then a spectrum where the more pure anonymity you have, the more hostile and aggressive people become. It’s just a sociological phenomenon we discovered with the invention of social networks. And then you get to a Facebook, which has a little bit less anonymity. Shouldn’t we have some way where we can have civil discourse that’s even more authenticated than Facebook is?
STAMOS: Maybe. You know, I think one of the things you’ve got to think about is we’re talking about this from a very U.S.-centric perspective, and something like 90 percent of Facebook users are not American, right? And we’ve got to be careful about — you know, here, you and I can post things on Facebook against our government under our real names and not face legal repercussions. That is not true for a huge chunk of people in the world.
ISAACSON: And you’ve said that you take some responsibility in retrospect. What do you feel responsible for that you wish you had done differently?
STAMOS: I wish we had had dedicated teams looking just for the propaganda activity. We were a security team; I’m the chief security officer; we were focused on information security. But it turns out that the vast majority of harm that happens online is not tied to technical security issues. It’s the technically correct use of our products to cause harm, and that includes disinformation, misinformation. So I do wish that we had constituted that team before the election, that we had seen the warning signs.
ISAACSON: Do you think Facebook and Twitter have good teams like that now?
STAMOS: I can’t speak to other companies. I think Facebook has built a really good team. The question of how much is enough — is it big enough, is it good enough — is always going to be a hard thing to answer. Because the truth is, we’re also reacting to what happened in 2016. There’s so much focus specifically on Russia, specifically on the 2016 U.S. election, that we’re kind of missing the big picture of what has happened since then. And the texture of how disinformation and misinformation have worked in elections since the U.S. election is actually quite different.
ISAACSON: You said when you get signs that something’s going awry, you call the government-
STAMOS: Yes.
ISAACSON: You call government agencies. I know if a missile hit the Facebook campus, you’d know exactly who to call. If a burglar came in, you’d call the local police. Is there one sort of 9-1-1 number you call in the U.S. government, or is it too diffuse, the way the U.S. government handles this?
STAMOS: Yes, I think this is actually a problem in the U.S.: we don’t have a single agency whose responsibility is to prevent cyber attacks against American assets, including this disinformation and misinformation, but also including traditional security threats. And effectively where we’ve ended up is the F.B.I. is playing that role. And there are a lot of really good, competent people there. But they are structured as an organization to investigate crimes after they happen and to indict people who are within the reach of the U.S. legal system. And the truth is, that’s just not how cyber attacks work. There’s nobody who’s really motivated to prevent things from happening, to build the relationships with the companies that can prevent the activity in the first place and perhaps in that case earn the opportunity for legal action.
ISAACSON: And do you think part of the problem is, as you said, that congressional Republicans, after all this happened, just said, OK, that was fine, we’re not doing anything about it?
STAMOS: Yes. As a country, we’re not going to be able to respond to this problem unless we all agree it’s a problem, right? Just imagine Pearl Harbor happens and half the country believes it was a hoax. There’s no way we could have mobilized the homeland and won World War II.
ISAACSON: Yes.
STAMOS: And that’s where it feels like we are right now, that we’re in a place where a significant percentage of the country believes that nothing happened in 2016. And to be honest, of all the people in government, I think congressional Republicans have some of the most responsibility for that, because they have slowed down investigations, they have reduced the (inaudible) people have. And it would be very, very powerful to see both Republicans and Democrats speaking with one voice that this is not something the United States is going to allow, and passing legislation to secure our elections and to give the proper authorities to regulate the online ad ecosystem. But they don’t do it, because they can’t get past the idea that that calls into question the election of 2016. And we’ve just got to say, OK, Trump’s president, he won, let’s move on and think about where we’re going. Because the other risk they’re running here is, I think, Republicans think that this is always going to be a good thing for them, countries getting involved in our election, and that’s not-
ISAACSON: Yes. And when North Korea, Iran and China do it and start electing Democrats-
STAMOS: Right, right. And the Russian playbook is out there. Anybody who is paying attention, who reads the reports that we put out, the reports the government’s put out, the indictments, can recreate the entire Russian playbook from the outside. And there’s nothing technically sophisticated there that the Chinese can’t do, or the Iranians, or the North Koreans.
ISAACSON: And you’ve gone to Capitol Hill to talk to some of these people a lot. I think on the Senate side, you have Senator Mark Warner and Senator Burr, a Democrat and a Republican, trying to work together, but not getting too far yet.
STAMOS: Yes-
ISAACSON: But at least they’re civil. On the House side, though, it’s much different, right?
STAMOS: Right. Yes. Yes, and you have to give the Senate intel committee credit that they have been able to accept the basic (inaudible) that we were attacked in 2016 and that they have to do something about it. But they haven’t done anything, right? I mean, there is that component: we are almost two years out, and there hasn’t been any actual legislation that’s come out. And there hasn’t — I think, honestly, as a country, what we’ve missed was the opportunity for something like a 9/11 commission. We really should’ve had a nonpartisan, nonpolitical-
ISAACSON: Yes.
STAMOS: Investigation. Because one of the benefits we had with the 9/11 commission is that there were lots of arguments about whose fault things were and over the legislation, but the basic set of facts in that document were generally agreed upon by the vast majority of the political-
ISAACSON: OK, so tell me what should be done. It’s probably a little bit too late for the 2018 midterms, is that right?
STAMOS: Yes. Yes. I think we’ve definitely missed our shot to collectively respond effectively in 2018. There are people in DHS doing work, there are people in the FBI doing work, there are people at the tech companies doing work. It’s not very well coordinated, and as a country we have not done the big-picture things that we need. When you think of all of the components of the online ecosystem that were weaponized, the component I think we need to be most concerned about is advertising, because that is a very powerful way to find and reach people who are vulnerable to your message. And that’s not just in the case of foreign influence. We have to be really careful about what we allow our politicians to do online versus traditional advertising.
ISAACSON: And so give me the Alex Stamos plan.
STAMOS: One thing we’ve got to look at is who’s responsible for the actual security of election infrastructure. DHS was granted power on the last day of the Obama administration…
ISAACSON: Yes, Department of Homeland Security, but they couldn’t get the states to buy in.
STAMOS: Exactly. And they’re working really hard, but they don’t have the power, they don’t have the resources, they have no authority.
ISAACSON: Is that partly because the Trump administration doesn’t want this?
STAMOS: I think one of the big issues right now is that there’s no cyber-security coordinator in the White House, and so as a result you have all the agencies trying to do their best. But part of the goal of the White House and the National Security Council is to knock those heads together and to get everybody going in the same direction, and that doesn’t exist. We need to push this responsibility to the states. There are a number of states that have central security teams that are highly competent; Colorado is probably the best example that I personally worked with. Every state needs to have a cyber-security center on elections that can then support all of the cities and counties. There are over 10,000 local election authorities. We can’t have 10,000 competent security teams, but we can have 50. And so we need to have federal legislation to support that, to give them access to intelligence, to do clearances for the people, because they’re going to need access to NSA and FBI data. But we’ve got to push that responsibility to the states. If the states want to run their own elections, they should also be held accountable for that.
ISAACSON: And what should Facebook and other social media giants do?
STAMOS: There needs to be more and more transparency. Now, this is where it gets difficult. Because those pages and those pseudo-anonymous Twitter accounts are also what drove democracy movements in Egypt, and are being used by the resistance in Thailand to fight the military junta, and are being used in Turkey, and so there are a lot of autocrats who would love to have that transparency. It’s just…
ISAACSON: And by the way, all of those movements failed in the end.
STAMOS: That’s right. And yes, that’s, I think, one of the sad — when we look back at the Arab Spring and all these other democracy movements, there have only been a couple of positive long-term stories. For the most part, the autocrats have — the empire has struck back, right? The autocrats have figured out how to first neutralize the risk from social media and then to weaponize it against those same movements.
ISAACSON: So what are you doing now, both to address this and to make sure average citizens, students at Stanford, can address it?
STAMOS: Yes, so I’ve joined Stanford as an adjunct professor, and one of the things I want to work on is solving that problem of how we build institutional memory in Silicon Valley of all the ways people have caused harm before, so that when you launch your product, you have thought adversarially, you have thought about the problems that Facebook or Google or Twitter or YouTube have faced, and you have put solutions in place so that you don’t have to scramble afterwards and make excuses like, “How could we possibly have known that this problem that’s existed for years would happen all over again?”
ISAACSON: And so I’m a student in your class and I’m saying, “Professor Stamos, it didn’t work out the way your generation thought. Social media didn’t help democracy around the world, it didn’t help democracy at home, it didn’t bring us together, and it led to a lot of bullying and divisiveness. What did you do wrong, and what would you do next time?”
STAMOS: I think what we have to do is we have to stop building products just for ourselves.
ISAACSON: Yes.
STAMOS: Silicon Valley has a serious diversity issue, as a lot of people know. A lot of people look like me; they come from backgrounds like mine: computer science degree from an elite school, you know, backgrounds that allow us to get into good colleges, access to capital. Those kinds of issues mean that a lot of people look the same and have the same blind spots. We have to first build diverse teams from day one, not staple on diversity way later. And the second is, when we build products, we have to think adversarially. We have to, from day one, not just think, “This is how I want my product to be used, this is how my parents will use it and my friends will use it.” You’ve got to think about all the bad guys having it. Even if 0.1 percent of the planet are people who really want to cause harm, that is tens of millions of people on these platforms, and you have to imagine from day one that those people exist, understand what they want, understand what they’ve done in the past, and think about your product from day one…
ISAACSON: Alex Stamos, thank you for joining us.
STAMOS: Yes, thank you, sir.
About This Episode
Christiane Amanpour interviews Bob Woodward, author of “Fear: Trump in the White House” and Associate Editor of The Washington Post, and Carl Bernstein, Watergate Journalist and CNN Political Analyst; Sarah Jessica Parker, Actress and Co-founder of SJP for Hogarth, and Fatima Farheen Mirza, Author of “A Place for Us”; Walter Isaacson interviews Alex Stamos, Former Chief Security Officer, Facebook.