Victims of nonconsensual deepfake porn are using the laws of copyright to take back ownership of their likenesses, according to a new investigation.
In an analysis of copyright claims against websites known to share nonconsensual, digitally altered videos, WIRED discovered thousands of women (including streamers, gamers, and other popular content creators) levying complaints to Google demanding the content be taken down.
The publication documented more than 13,000 copyright claims (covering almost 30,000 URLs) against dozens of sites that populate Google's search results.
Victims are utilizing the Digital Millennium Copyright Act (DMCA), which is frequently weaponized to remove copyrighted music, videos, and other media from third-party sites (and personal pages) online. The DMCA has also been used on behalf of victims of image-based sexual abuse or "revenge porn," with cases citing personal authorship and the unauthorized use of images.
A deepfake creator's alteration or outright fabrication of original images does complicate the matter, imposing a higher burden of proof on victims claiming rights over intellectual property.
Google has previously addressed the spread of revenge porn and deepfakes with new policies and reporting procedures, including options to remove personal explicit images from search results and deepfake reporting systems involving the detection of both original and copied images. The company has also documented its efforts to flag and remove such content. According to Google's own data, around 82 percent of complaints resulted in URL removal. "For the biggest deepfake video website alone," WIRED reported, "Google has received takedown requests for 12,600 URLs, 88 percent of which have been taken offline."
The sheer number of confirmed violations has prompted online safety and copyright advocates to wonder why the websites are still allowed to remain up. "If you remove 12,000 links for infringement, why are they not just completely removed?" posed Dan Purcell, founder and CEO of piracy protection firm Ceartas, in a WIRED interview. "They should not be crawled. They're of no public interest."
The copyright strategy is a legal workaround for victims as government leaders crawl forward with proposed legislation that would criminalize the spread of "sexualized digital forgeries."
Known as the DEFIANCE (Disrupt Explicit Forged Images and Non-Consensual Edits) Act, the legislation also outlines a civil path for victims to sue the creators of deepfake images using their likeness.
"Victims of nonconsensual pornographic deepfakes have waited too long for federal legislation to hold perpetrators accountable. As deepfakes become easier to access and create — 96% of deepfake videos circulating online are nonconsensual pornography — Congress needs to act to show victims that they won’t be left behind,” wrote Congresswoman Alexandria Ocasio-Cortez upon the bill's introduction to the House.
In February, hundreds of AI leaders — joined by academics, researchers, artists, and even politicians — issued an open letter calling for the prioritization of deepfake legislation. The coalition called for a bill that would fully criminalize deepfake child pornography, establish criminal penalties for anyone knowingly involved in creating or spreading harmful deepfakes, and place requirements on software developers and distributors, Mashable's Meera Navlakha reported.
The letter cited the limits and inadequacies of current legislation to address deepfakes specifically, as well as the sheer increase in deepfake technologies and output. "Unprecedented AI progress is making deepfake creation fast, cheap, and easy. The total number of deepfakes has grown by 550 percent from 2019 to 2023," the coalition wrote.
Explicit deepfakes of celebrities are top of mind for many, following the spread of nonconsensual images of Taylor Swift on X and the recent discovery of deepfake porn ads using the likeness of actor Jenna Ortega.
But the problem is just as worrisome for non-famous individuals. Deepfake images are increasingly entering the social lives of young children and teens, prompting online child safety experts to call for preventative measures and heightened attention from parents.
In February, a group of California middle school students used deepfake technology to create and disseminate nude images of their classmates, just the latest such incident among minors, whose perpetrators and victims seem to be getting younger and younger. Court cases have offered victims inconsistent recourse, with few to no laws to guide them.
"Deepfake pornography is a form of digital sexual violence. It violates victims' consent, autonomy, and privacy," wrote Sexual Violence Prevention Association (SVPA)founder Omny Miranda Martone in support of the DEFIANCE Act. "Victims face increased risk of stalking, domestic abuse, loss of employment, damaged reputation, and emotional trauma."
If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative's 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.