US bill proposes to let victims sue over digitally fake sexual images


Pornographic AI-manipulated images referred to as deepfakes have grown in popularity


(Web Desk) - As AI-generated explicit images of Taylor Swift spark intense debate among policymakers, US lawmakers have proposed a bill that would let victims sue over digitally faked sexual images.

The ‘Disrupt Explicit Forged Images and Non-Consensual Edits’ (DEFIANCE) Act would add a civil right of action for intimate “digital forgeries” depicting an identifiable person without their consent, letting victims collect financial damages from anyone who “knowingly produced or possessed” the image with the intent to spread it.

The bill has been introduced by Senate Majority Whip Dick Durbin (D-IL), joined by Senators Lindsey Graham (R-SC), Amy Klobuchar (D-MN), and Josh Hawley (R-MO), reports The Verge.

“An identifiable individual who is the subject of a digital forgery may bring a civil action in an appropriate district court of the United States for relief against any person that knowingly produced or possessed the digital forgery with intent to disclose it, or knowingly disclosed or solicited the digital forgery,” the bill reads.

Under the bill, the term ‘digital forgery’ means any intimate visual depiction of an identifiable individual created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means, including by adapting, modifying, manipulating, or altering an authentic visual depiction, that appears to a reasonable person to be indistinguishable from an authentic visual depiction of the individual.

Pornographic AI-manipulated images, frequently referred to as deepfakes, have grown in popularity and sophistication since the term was coined in 2017.

Meanwhile, Microsoft has also introduced more protections to its AI text-to-image generation tool Designer, which users had been exploiting to create nonconsensual sexual images of celebrities.