A journo for The New York Times found a guy who claims he was radicalized by the far-right on YouTube, and he spent months writing about it. The timing of this article seems a tad convenient, given what happened earlier this week with Carlos Maza trying to shut down Steven Crowder.
He even wrote a thread about his article.
I’ve been working on a story for a few months that I’m excited to share.
It’s about a 21-year-old guy who was radicalized into the far-right, with help from his YouTube recommendations. https://t.co/7UojMUuwWR
— Kevin Roose (@kevinroose) June 8, 2019
This took him months?
Alrighty then.
I’ve been writing a lot about YouTube recently, but I’ve never articulated why. I think it’s the most important and least understood force in our culture and politics. More than Facebook, more than Twitter.
— Kevin Roose (@kevinroose) June 8, 2019
YouTube is where young people spend all their time, of course. But (as we saw this week!) it’s also a political battleground, a parallel media universe, and a test lab for some of the most powerful AI ever developed.
— Kevin Roose (@kevinroose) June 8, 2019
I’ve been interviewing right-wing extremists for years, and YouTube is (or was, I guess) the center of their universe. It’s where ideas are generated, and debates are won and lost. It’s where many of them got redpilled in the first place.
— Kevin Roose (@kevinroose) June 8, 2019
Redpilled.
Oh boy.
Nobody really knows how YouTube works. Even people who work there shrug when you ask why Video X or Channel Y blew up. It’s a black-box AI built by Google PhDs that figures out how to keep your attention, and convert it into money.
— Kevin Roose (@kevinroose) June 8, 2019
It’s all a plot!
After the Christchurch shooting, I wanted to try to figure out, as best I could, how YouTube works — both as a technology and out in the world, on people’s brains. Does it change them? If so, how? What does that process look like?
— Kevin Roose (@kevinroose) June 8, 2019
Or maybe, and just hear us out here, awful people who do awful things would do them even without YouTube … just thinking out loud.
That’s when I found Caleb Cain. He’s a guy from West Virginia who went looking for self-help videos on YouTube during a personal rough patch, and spent the next 4 years falling down a far-right rabbit hole.
— Kevin Roose (@kevinroose) June 8, 2019
Ever notice how they never go down a far-left rabbit hole?
Caleb sent me his entire YouTube history — 12,000 videos in all — so that I could reconstruct his journey from the left to the far-right and back. It’s a *fascinating* data set, and not at all what I expected.
— Kevin Roose (@kevinroose) June 8, 2019
One guy.
I also learned that a few years ago, YouTube retooled its AI not just to recommend videos users would watch, but to *change* what they watched, and steer them into new interests. The project was called Reinforce. It worked, well.
— Kevin Roose (@kevinroose) June 8, 2019
Reinforce is a classic example of an AI that would be fine in most cases (who cares if the cooking video fan gets steered into home DIY videos?) but in an environment with extreme politics, it can create gateways, and draw people in.
— Kevin Roose (@kevinroose) June 8, 2019
And according to Kevin, only the right is extreme in their politics.
Of course.
YouTube didn’t set out to amplify extremists, and it’s done a lot of work recently to reduce their influence. But over the last 7 years, it made a series of changes that played into the far-right’s strategy, and helped it go mainstream.
— Kevin Roose (@kevinroose) June 8, 2019
MWAHAHAHAHAHAHA!
We can’t believe he spent months writing this.
Anyway, there’s a lot more in here, including some incredible visuals by an amazing team of designers and data journalists. I hope it helps decipher the insane, tangled world of YouTube politics, and explain how we ended up here. (Sorry to thread! I’m done!)
— Kevin Roose (@kevinroose) June 8, 2019
Admit it, you rolled your eyes at least once reading this thread.
i cant believe people pay you to write, or to read, and this took you months, i guess you never listened to anyone you put on here.
— Dreamweaver1984 (@LexSportsCards) June 8, 2019
BS completely one sided. Get some perspective.
— Sean Peck (@Pecktec101) June 9, 2019
The visuals are literally misleading propaganda. The fact that this was completely missed by you makes one wonder how much research you bothered to put into this story.
— Keifer Jennings (@TheDailyKeef) June 8, 2019
He said it took him months.
Well, you start off by assuming the right is radical. I think most of the YouTube conservatives just played the system. They went viral through people on FB, Twitter, but you watch the video on YouTube cause it's where most people put a video.
— Nick Sewell (@Wsewell525) June 9, 2019
Cool.
Now do a story on people radicalized by reading nyt.
— U.S. ConstitutEOINalist (@mini_poli_me) June 9, 2019
Cool.