A sweeping piece of legislation aimed at addressing the growing threat of AI-generated intimate imagery is now sitting on Governor Greg Abbott’s desk, awaiting his signature. Senate Bill 441, a bipartisan effort designed to establish civil liability for the creation and distribution of non-consensual “artificial intimate” visual material, follows closely on the heels of a new federal law targeting similar misconduct.
The legislation, authored by state Sen. Juan “Chuy” Hinojosa (D-McAllen) and carried in the House by state Rep. Suleman Lalani (D-Sugar Land), would hold individuals and companies accountable for producing or sharing digitally altered sexual content without consent — including imagery involving minors. The bill mandates that platforms implement streamlined systems allowing victims to report such content and receive a response within 24 hours.
“SB 441 complements, builds upon, and conforms to the recently enacted federal TAKE IT DOWN Act,” said Rep. Lalani, referring to the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act, co-authored by Sens. Ted Cruz (R-TX) and Amy Klobuchar (D-MN). Signed into law by President Donald Trump on May 19 after unanimous Senate passage, the federal law criminalizes the distribution of non-consensual intimate imagery created using AI and other technologies.
Unlike the federal legislation, which focuses on criminal penalties, SB 441 takes a civil liability approach. Victims would have up to ten years to file civil suits against creators, distributors, or even payment processors who facilitated the spread of such content. For minors, the statute of limitations would begin after they turn 18.
A high-profile Texas case helped push the issue to the forefront. Elliston Berry, a 14-year-old girl, was victimized when a classmate used AI to generate explicit images of her from innocent Instagram posts — and then circulated the content to thousands of students. Her story helped galvanize support for both the federal and state measures.
“Victims like Elliston should never have to face the trauma of seeing their likeness manipulated and shared without consent,” said Lalani. “This bill gives them a path to justice.”
Key provisions of SB 441 include:
A 72-hour takedown window: Websites must remove flagged non-consensual content within 72 hours of notice or face legal consequences.
Payment processor liability: Companies that facilitate financial transactions related to the distribution of such content may be held civilly liable.
Provenance data transparency: An amendment added by Rep. Lalani requires AI-generated images to retain origin metadata, helping trace responsibility and improve transparency.
Confidentiality protections: Victim identities will be protected in legal proceedings to help prevent further harm or stigmatization.
Although the bill enjoyed broad bipartisan support, its final version in the House received some pushback — 36 votes against — following confusion over the provenance amendment. Some lawmakers expressed concern over legal ambiguities, such as how courts would define “effective consent” and determine “intent to harm.”
State Rep. Giovanni Capriglione (R-Southlake), who supported the amendment, clarified that the provision aims to ensure that no one can “hide behind technology” or evade accountability by manipulating metadata.
SB 441 passed the Senate and House overwhelmingly, including unanimous votes in their respective criminal justice and judiciary committees. After conference committee negotiations resolved minor differences, the final version was sent to Governor Abbott on June 3.
If signed into law, SB 441 will take effect September 1, 2025.
“This legislation is about protecting people from modern-day exploitation using 21st-century tools,” said Hinojosa. “It’s a necessary step to defend privacy and dignity in an age where artificial intelligence is increasingly misused.”