
Published June 27, 2024 at 9:13 PM EDT

Introduction

The General Assembly has passed two significant bills that create penalties for the use of deep fakes, marking some of Delaware’s first laws regarding artificial intelligence.

House Bill 316

Sponsored by State Rep. Cyndie Romer, House Bill 316 introduces a new election crime: the use of deep fake technology to influence an election. Under this statute, distributing “deep fake” audio or visual depictions meant to harm a candidate or deceive voters within 90 days of an election is now a crime.

“We’re seeing videos out there of elected officials utilizing these deep fakes,” Romer says. “And we have two people in our state that are running for national office, so it’s not just a Delaware problem, this is becoming a national problem.”

Romer emphasizes that the bill targets distributors of such content, since identifying the creators could be nearly impossible, especially if they are based out of state or abroad.

House Bill 353

State Rep. Krista Griffith’s HB 353 imposes civil penalties for distributing AI-generated images of individuals in the nude or engaging in sexual conduct. When these images involve a minor, the penalties become criminal.

“People are tired of seeing, especially images of children, manipulated, and then spread,” Griffith says. “And even adults. It’s false information that is being sent out that damages and hurts people.”

Future Legislation

Griffith says more legislation on data privacy and AI technology is on the way, including measures to prevent bias and discrimination when AI is used in hiring decisions and criminal prosecutions.

Delaware Artificial Intelligence Commission

Griffith’s bill to establish the Delaware Artificial Intelligence Commission has also passed both chambers. This commission will make recommendations to the General Assembly and the Department of Technology and Information on AI utilization within the state.
