US President Donald Trump has signed a bill criminalizing nonconsensual artificial intelligence-generated deepfake porn, which also requires websites to take down any illicit images within 48 hours.
Trump signed the bill into law on May 19. Known as the TAKE IT DOWN Act, its name is an acronym for Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks.
The bill, backed by first lady Melania Trump, makes it a federal crime to publish, or threaten to publish, nonconsensual intimate images, including deepfakes, of adults or minors with the intent to harm or harass them. Penalties range from fines to prison.
Websites, online services, and apps must remove illegal content within 48 hours and establish a takedown process.
Trump said in remarks given at the White House Rose Garden and posted to the social media platform Truth Social that the bill also covers “forgeries generated by artificial intelligence,” commonly known as deepfakes.
Melania Trump had directly lobbied lawmakers to support the bill, and said in a statement that the law is a “national victory.”
“Artificial Intelligence and social media are the digital candy of the next generation: sweet, addictive, and engineered to impact the cognitive development of our children,” she said.
“But unlike sugar, these new technologies can be weaponized, shape beliefs, and sadly, affect emotions and even be deadly,” she added.
Senators Ted Cruz and Amy Klobuchar introduced the bill in June 2024, and it passed both houses of Congress in April of this year.
US the latest to ban explicit deepfakes
There has been a growing number of cases in which deepfakes are used for harmful purposes. One of the more high-profile incidents saw deepfake-generated illicit images of pop star Taylor Swift spread rapidly across X in January 2024.
X briefly blocked searches for Taylor Swift’s name in response, while lawmakers pushed for legislation criminalizing the production of deepfake images.
Related: AI scammers are now impersonating US government bigwigs, says FBI
Other countries, such as the UK, have already made sharing deepfake pornography illegal, as part of that country’s Online Safety Act in 2023.
A 2023 report from security startup Security Hero found that the vast majority of deepfakes posted online are pornographic, and that 99% of the individuals targeted by such content are women.
Magazine: Deepfake AI ‘gang’ drains $11M OKX account, Zipmex zapped by SEC: Asia Express