'Disgusting and Disturbing': Study Finds AI Being Trained With Explicit Images of Children

We're not even quite sure what to say about this, because it's horrible and horrifying.

AI is being 'trained' using explicit images of children and, well, words fail us:

From the AP:

Hidden inside the foundation of popular artificial intelligence image-generators are thousands of images of child sexual abuse, according to a new report that urges companies to take action to address a harmful flaw in the technology they built.

Those same images have made it easier for AI systems to produce realistic and explicit imagery of fake children as well as transform social media photos of fully clothed real teens into nudes, much to the alarm of schools and law enforcement around the world.

Until recently, anti-abuse researchers thought the only way that some unchecked AI tools produced abusive imagery of children was by essentially combining what they’ve learned from two separate buckets of online images — adult pornography and benign photos of kids.

This is why Skynet will turn on us.

And, frankly, we're Team Robot.

Yes. As serious as a heart attack.

It's fine. This is fine. Everything is fine.

Hard to argue with this sometimes, frankly.

Now and to the fullest extent of the law.

Very much so.

It's just terrible, isn't it?

We don't want to know how they got there. Too nightmarish to think about.

What you say when you don't know what else to say.

AI doesn't exist independently. It is created and trained by humans. So those images got there via human hands.

3,200 images. That's a lot.

We're going to go with repugnant, regardless.

It is abuse.

And it needs to be stopped.

An excellent point.

This is just awful.

The fact that some people don't seem shocked by this is also telling. They're horrified and disgusted, just not surprised.

WTH about sums it up.

Technology can be great, but its dark side is so dark, so vile, and so pervasive. As tech advances, keeping up with safety and protecting children has to be a priority.
