08.13.2019

Caleb Cain and Kevin Roose on Radicalization and YouTube

Hari Sreenivasan delves into the perils of technology and the alt-right with Caleb Cain, who became radicalized watching YouTube, and columnist for the New York Times Kevin Roose, whose reporting explores how people like Caleb are groomed online.


CHRISTIANE AMANPOUR: Now, it's been two years since right-wing extremists descended on Charlottesville, Virginia in what became the largest eruption of white supremacist violence in a generation. Today, questions remain over why young people around the world are adopting far-right ideologies. After dropping out of college, Caleb Cain embraced white nationalism to find direction. But his journey into alt-right America did not happen at rallies or protests; it took place in his bedroom watching YouTube. “New York Times” journalist Kevin Roose has been exploring the alt-right universe online and reporting on how people like Caleb are groomed. And the two sat down with our Hari Sreenivasan to explain what it takes to embrace and then to reject extremism in this digital age.

(BEGIN VIDEO TAPE)

HARI SREENIVASAN: Kevin Roose, Caleb Cain, thank you both for joining us. Kevin, you wrote an article about the making of a YouTube radical that featured Caleb. And, Caleb, for people who haven't read that article, I kind of want to start with the basics. Where were you in life when this all started?

CALEB CAIN, RADICALIZED BY ALT-RIGHT VIDEOS: Yes. So, I had a lot of depression. I wasn't going to class. Dropped out of school. Ended up back at home in West Virginia. And really was like beaten up and depressed about that. I felt like I had failed. And I went to YouTube, a place that was comfortable for me, because I was spending a lot of time just lying in bed and watching stuff on the internet. And I eventually found self-help videos and videos about psychology and neuroscience on YouTube and I went down this rabbit hole of self-help and it led me to someone named Stefan Molyneux. And Stef's videos were like a spark to kind of jolt me out of that. What I didn't realize was all the ideology that came with that.

SREENIVASAN: I want to just get an explanation. Who is Stefan for viewers who may not know him?

CAIN: Yes. So, Stefan Molyneux is a Canadian YouTuber and broadcaster.

(BEGIN VIDEO CLIP)

STEFAN MOLYNEUX, CANADIAN BROADCASTER AND YOUTUBER: The only thing that stands between the left and its takeover and subsequent horrors in the West are white males, which is, of course — which is why white males must be so demonized.

(END VIDEO CLIP)

CAIN: And he made his money back in the '90s, I think, with a tech company. And basically, he transitioned from that world into doing a philosophy radio show.

SREENIVASAN: Yes.

CAIN: And it was a show — a mixture of him giving therapy lessons and also inducting people into libertarian ideology. And his transition has been very interesting. You know, we talk — I talk a lot about the libertarian to alt-right pipeline, right. And he followed that trajectory, you know, whether he radicalized himself or he's just chasing a fan base that's been pulled to the right. And so, he's got a whole career, a whole empire built on this philosophy show which is really just a propaganda outlet for, you know, right-wing beliefs, far-right beliefs in my opinion.

SREENIVASAN: So, is that something that you see? You`ve talked to a lot of people like him. What are the kind of common ingredients that these individuals have before they get into the rabbit hole?

KEVIN ROOSE, NEW YORK TIMES TECH COLUMNIST: Yes. I think that — I have talked to a lot of people who I would categorize as extremists, people who have sort of far-right ideology. And I'd say like 75 percent of them easily got started on YouTube. And they were — there are some patterns. You see people who are sort of young, who, you know, are not super comfortable where they're living, geographically, they don't have a ton of friends where they are. They're spending a lot of time on the internet. Maybe things aren't going so well for them economically or in their life. And there's this whole sort of network of YouTube creators who have gotten very good and very savvy at speaking to those people.

SREENIVASAN: So, you're personally struggling and you're looking to YouTube for self-help, which is admirable, right? You wanted to try to improve your situation. But what was it about these characters that you found online that resonated with you?

CAIN: I grew up around a lot of racists, you know, people that would openly say slurs and have ideas about people. But I always fought against that. And so, it was strange to me how I fell into these beliefs. What really drew me towards it, and what a lot of these people have in common, as Kevin was saying, these YouTube content creators, is they set themselves up as authority figures and, more importantly, they set themselves up as father figures. And a lot of people looked up to these people. Stef was a father figure to me. It's something that I was actually kind of conscious of during this whole period. Jordan Peterson, you know, I wouldn't put Jordan Peterson in the camp of the far-right, but Jordan Peterson is another figure like that. Even Jared Taylor and David Duke have this kind of old uncle vibe to them and I think people are drawn to that, a lot of young men that find themselves kind of distraught and lost in life. And they look for something to structure themselves. And they feel like in today's society, they don't have that. We don't have a lot of organized religion, we don't have a lot of, you know, bonds in our society. And so, they turn to people online that offer that to them through rhetoric and ideology.

SREENIVASAN: So, we've got two of those ingredients here: a kid that's lonely and looking for help, you've got these personalities that are compelling and offer them structure, and then comes YouTube. And really the algorithm that suggests the next video you should watch.

ROOSE: Yes. I think people have an idea that YouTube is just a place that hosts videos. But the core of YouTube is really this kind of recommendations algorithm that shows people videos that, you know, you might also like. If you're watching this video, you might like these five other ones, and then it autoplays after the video you're watching finishes. And that single algorithm is responsible for something like 70 percent of the time that people spend on YouTube. So, that's really the heart of YouTube, is this algorithm. And it's been through a lot of changes. So, it started off just sort of basically, you know, if you are watching one basketball video it will show you another basketball video.

SREENIVASAN: Sure.

ROOSE: But then Google started improving it using artificial intelligence and really put some engineering muscle behind it and really made it quite good at discovering how to keep people on YouTube for longer.

SREENIVASAN: Because the longer they stay the more ads they get to show you and that means the company makes money.

ROOSE: Exactly. So, they make more money the longer people watch. And in an effort to get people to stay on the site more, they started directing people through this algorithm down kind of these rabbit holes that were filled with long, you know, emotionally intense videos. They were driven by conflict and sort of ideological, you know, conversation and news. And so, they never meant to do this. This was not programmed into the algorithm, but it was a huge boost to people like Stefan Molyneux and some of the other creators that Caleb started watching. They were getting millions of views through this algorithm that learned to kind of detect what could keep people watching for the longest amount of time.
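[The dynamic Roose describes — ranking videos purely by how long they will keep a viewer watching — can be illustrated with a toy sketch. Everything below is hypothetical (the data, the names, the scoring rule); it is not YouTube's actual system, only a minimal demonstration of why watch-time-only ranking surfaces long, intense videos over short, benign ones:]

```python
# Toy sketch of an engagement-optimizing recommender, as described above.
# Hypothetical data and scoring rule -- NOT YouTube's actual algorithm.
# The point: ranking solely by expected watch time favors long, intense
# videos even when their click-through rate is lower.

def recommend(candidates, top_n=3):
    """Rank candidate videos by expected watch time alone."""
    # expected watch time = probability of a click * typical minutes watched
    scored = [(v["p_click"] * v["avg_minutes"], v["title"]) for v in candidates]
    scored.sort(reverse=True)
    return [title for _, title in scored[:top_n]]

candidates = [
    {"title": "Basketball highlights",         "p_click": 0.30, "avg_minutes": 4},
    {"title": "2-hour ideological livestream", "p_click": 0.10, "avg_minutes": 90},
    {"title": "Cooking tutorial",              "p_click": 0.25, "avg_minutes": 8},
]

print(recommend(candidates, top_n=2))
# the livestream ranks first (0.10 * 90 = 9.0) despite the lowest click rate
```

[In this sketch, nothing about the content matters to the ranking — only predicted minutes watched — which is the unintended boost Roose describes.]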

SREENIVASAN: So, as a creator, they could basically game it so that they understood what was valued and what would give them more views, so they start creating more content that way and then it just kind of feeds on itself.

ROOSE: Yes. Exactly. They learn that conflict sells on YouTube. They learn that certain ideologies are attractive to people on YouTube and they learn how to sort of talk about the things that people want to hear about. They are very smart at this. And they don't have a lot of competition because there aren't many people turning out, you know, hours and hours' worth of YouTube content every day. And for someone like Caleb who, you know, I think probably watched — at some point, you know, you were watching, what, six, seven, eight hours of YouTube a day?

CAIN: More than that.

SREENIVASAN: Wow.

ROOSE: So, somebody has to sort of fill that bucket and these people were more than willing to do that.

SREENIVASAN: So, for his story, you laid out your entire YouTube history over a period of years for him.

CAIN: That was very brave.

ROOSE: It is, right.

SREENIVASAN: You literally had an idea and a look into every single video that you watched.

CAIN: Yes.

SREENIVASAN: Didn't your family, your friends start to say, “Hey, man, what's going on? You're spending more time on your laptop than hanging out with us?”

CAIN: I mean, with my family, I'm disconnected from a lot of them and with my friends, I mean, the friends that I was talking to at the time, I have some — you know, you have internet friends and you have your — you know, your real life friends. And both of those mixtures were people that kind of shared my beliefs. So, when we would talk about stuff like it didn't seem weird that, oh, you know, that I watched a lot of YouTube or that I had these beliefs. And even the people — you know, the person I was dating at the time, I don't think that she was very fully aware of like what I was really, you know, watching and how much of it I was watching. So, no, nobody ever stopped and said, “Hey, like you're watching a lot of YouTube.”

SREENIVASAN: But what's the message that was piercing through to you? What made you connect to somebody there and say, “Yes, I want to see more of what this person has to say?”

CAIN: So, there's this concept of the red pill. And it's taken from the movie “The Matrix” where Neo lives in a fantasy world, a computer simulation. And he is offered the red pill to basically wake him up from the false reality and to see the world for what it really is. So, when you start — the way these ideas are presented to you is that these are the uncomfortable truths that you're not willing to deal with. The fact that there are racial disparities, there are income disparities, crime disparities and IQ disparities between races, and that it's a hierarchy, this is, you know, an objective truth that you have to deal with. Never mind all the — you know, the ways that people are, you know, kind of trained into things, you know, socialized into things, never mind economics and all that stuff. Something like race realism is explained to you that this is biology, this is the way it is, the differences are genetic, and if you want to be in line with reality, an objective truth, then you have to deal with the truth. You can't live off in liberal fantasy land.

SREENIVASAN: So, they're exposing you to the secret or a reality you haven't —

CAIN: Exactly.

SREENIVASAN: — been privy to yet but they`re showing you the light.

CAIN: It's a hidden knowledge and that's exactly what it feels like. It feels like you're going into this deep cave to discover hidden knowledge, except at the bottom of the cave is Nazi gold, not some, you know, universal truth. What's at the bottom of the rabbit hole is literally Nazi ideology, but it's not packaged that way. It's a slow drip. They slowly drip-feed you these ideologies and they tell you all along the way that you're doing the right thing, you're being objective, you're being logical. And it's a mixture of using narrative and rhetoric to slowly bring you to white nationalism.

SREENIVASAN: By the end of it, what was the most dangerous thing you were casually believing?

CAIN: Well, I would say the race realism was one of the more intense things. I believed that feminism, whether guided or unguided, was a plot to emasculate men. I thought the transgender movement, or however you want to describe it, was a plot to emasculate men. I thought that Muslims were invading Europe through the 2015 migrant crisis, that they were coming in en masse to invade the country and demographically replace people. At the bottom of it, I was listening to people like Jared Taylor, who I later learned goes back decades promoting, you know, white nationalist and white supremacist propaganda. So that was like the bottom of the barrel for me, from my experience.

SREENIVASAN: So Kevin, YouTube is going to say, “You know what, we are not in the business of picking winners and losers, we don't sit there and say let's amp up this video and let's vote this video down.” Is that accurate?

ROOSE: It is accurate to say that I don't think anyone is accusing YouTube of programming into their algorithm, you know, “boost this Nazi video.” I don't think anyone feels like that's what's going on behind the scenes. But what is going on behind the scenes is that the algorithm is learning what people want and what people will watch if you put it in front of them. And what we're learning is that that doesn't always bring out the best in people. People, you know, they want this sort of secret knowledge. If you give them a choice between two videos and one of them says the moon landing happened and one of them says here's why the moon landing didn't happen, which one do you think they're going to click on? So the algorithm in some ways is training itself on the biases of humans, which are toward conflict, which are toward conspiracy. But there were signs early on, which the company chose not to act on, that people like Alex Jones, people like Stefan Molyneux, people who are in this more extreme camp, were being amplified through this algorithm, and they didn't act on it until very recently.

SREENIVASAN: So what did they do?

ROOSE: So they've said that they've changed the recommendation algorithm slightly so that it's not recommending as many conspiracy theories. They have taken a lot of sort of blatant white nationalism off the platform, so they're trying to clean themselves up. But even as recently as this year, they have been making changes to the algorithm that are sort of refining it further. They're becoming better at determining what will keep people on YouTube for longer because, ultimately, that's the way they make money.

SREENIVASAN: But Caleb, what's the advantage that the far-right has that the far left doesn't? Why has that ideology spread so quickly and taken root in these places where the counterweight hasn't gotten up and challenged it?

CAIN: I think, like, with a lot of leftists that I talked to — we're talking past liberals at this point, further left than that, obviously — it's not an easy sell. It's not a message that you can sell so easily. The thing with the far right is it's dominating a certain subset of society right now, right. It's usually a straight white guy, young. They're 15, 16, 17, 18, 19, 20. And usually, most of them operate within that range of Generation Z to Millennials. And I think that you have a lot of compounding factors. You had the 2008 financial crisis hit. We have had a lot of cultural changes. And what the far right does that the far left doesn't do is the far right gives you something to strive towards. It gives you like this golden ideal in your head of self-improvement, of fixing your society, of having this, you know, grand vision for the future. Whereas the far left is more focused on deconstructing the problems we're facing right now. It doesn't have the same sort of narrative that I think pulls people into the far right.

ROOSE: I think one thing that I have sort of learned through talking to Caleb is the extent to which, like, the right has sort of captured this kind of countercultural idea. One thing that really stuck out to me about your story is that when you were in high school, before this sort of YouTube thing hit, you were really into sort of punk and, you know, Michael Moore documentaries and sort of like going against the grain of your high school. And I think, you know, that was during the Obama years. And now I think during the Trump years, like, there's this movement of people that have come to see being very conservative, being far-right, as sort of punk and edgy. It attracts, I think, a part of a lot of young men, you know, who see the establishment, you know, doing one thing and want to do the exact opposite.

SREENIVASAN: What was the final straw that made you realize you have gone too far and this is not for you?

CAIN: There wasn't one moment. Deradicalization is a process, but there were moments along the way. You know, one moment I remember is when I watched Natalie Wynn.

(BEGIN VIDEO CLIP)

NATALIE WYNN: I live in constant fear and fear is what freedom is all about.

(END VIDEO CLIP)

CAIN: The creator of the YouTube channel ContraPoints, she did a video, Deconstructing the Alt-Right or Decrypting the Alt-Right. And when she explained to me that cultural Marxism — a belief that I had held that communists were subverting our institutions, that they were invading Hollywood and invading academia to transform America into a communist country — when I learned that that was just a repackaged Nazi conspiracy theory, that it was the Jewish conspiracy, that Jews controlled the world, that was a watershed moment for me.

ROOSE: If you go through his YouTube history — and I did, I sifted through 12,000 of your YouTube videos — you can see that the thing that brought him out is not that, you know, some teacher from his school intervened or, you know, an adult in his life or a friend. It was people who understood the language and the culture of YouTube, who were getting into his algorithm by making videos on the same subjects with the same sort of style.

CAIN: Yes. Yes. There's no direction to the ideologies on YouTube. You can get pulled out of it just as easily as you could get pushed into it. It's all about chance. And that's kind of what freaks me out about it, is it's pretty random.

SREENIVASAN: When you see events like the one that just happened in El Paso, how does something like that fit into what you have just been through?

CAIN: It's just showing the ultimate manifestation of this stuff. It's showing the logical conclusion. If you believe in the great replacement, if you believe that white people are being displaced in their societies, and that brown people coming in are invaders, what's the logical conclusion of that? It's a conspiratorial way of thinking, and I think a massacre is like the logical conclusion for a conspiracy theorist.

SREENIVASAN: And is it working in that way?

ROOSE: Yes. I think there's a real danger here. I think for a long time the things that happened online were not taken as seriously because they were happening online. It was sort of like, that's just the Internet. And so for years now, these movements have been building traction, have been building support, have been ramping up their rhetoric and their ideology. And we're just now kind of looking and seeing, like, oh, maybe we should have paid attention to that when it was smaller, when it was gaining steam. Because now, it's a pattern where you see these young men who have been radicalized online going out and committing acts of mass violence.

SREENIVASAN: How do we fix this?

CAIN: We can game algorithms. We can demonetize and deplatform. We can do these things all day long, but white nationalism stems from problems in our society, problems that are deeply embedded in our society. Yes, racism is a thing that we have to fix, but it also comes down to material things. It comes down to: we need an economy that helps everyone. It's no coincidence that I was from West Virginia, in a disenfranchised community, and I fell into this. People gravitate to these things because they're searching for something. They're searching for identity. They're searching for comfortability. They're searching for community. And you have to offer that to people. I don't know how we offer that exactly, but I at least know that you can give people health care. You can give people education. You can give people access to opportunities so that they don't spend all their time behind a computer screen and get sucked into it. If these people had things in their lives, opportunities in their lives, if they were out in the world working and building a future for themselves and they felt like they had a future to look forward to, I don't think that they would need white nationalism.

SREENIVASAN: Caleb Cain, Kevin Roose, thank you both.

ROOSE: Thank you for having us.

(END VIDEO TAPE)

About This Episode

Lewis Lukens joins Christiane Amanpour to discuss a thinning line between foreign policy decisions and trade. Hari Sreenivasan delves into the perils of technology and the alt-right with Caleb Cain and Kevin Roose. Perri Peltz and Matthew O’Neill speak to Amanpour about their new documentary, “Alternate Endings: Six New Ways to Die in America.”
