Elon’s “Nudify” Mess: How X Supercharged Deepfakes


About this title

On Christmas Eve, Elon Musk’s X rolled out an in-app tool that lets users alter other people’s photos and post the results directly in reply. With minimal safeguards, it quickly became a pipeline for sexualized, non-consensual deepfakes, including imagery involving minors, delivered straight into victims’ notifications. Renée DiResta, Hany Farid, and Casey Newton join Kara to dig into the scale of the harm, the failure of app stores and regulators to act quickly, and why the “free speech” rhetoric used to defend the abuse is incoherent. Kara explores what accountability could look like, and what comes next as AI tools get more powerful.

Renée DiResta is the former technical research manager at Stanford's Internet Observatory. She researched online CSAM for years and is one of the world’s leading experts on online disinformation and propaganda. She is also the author of Invisible Rulers: The People Who Turn Lies into Reality.

Hany Farid is a professor of computer science and engineering at the University of California, Berkeley. He has been described as the father of digital image forensics and has spent years developing tools to combat CSAM.

Casey Newton is the founder of the tech newsletter Platformer and the co-host of The New York Times podcast Hard Fork.

Questions? Comments? Email us at on@voxmedia.com or find us on YouTube, Instagram, TikTok, Threads, and Bluesky @onwithkaraswisher.

Learn more about your ad choices. Visit podcastchoices.com/adchoices