TEXAS VIEW: Deepfake nudes and AI porn torment teens

THE POINT: Ted Cruz is fighting back, but it shouldn’t take congressional pressure for victims to get relief.

Volleyball practice. Bible study. The English essay due next week. These are the things that once occupied 14-year-old Elliston Berry’s mind.

Then came Ellis’ rude awakening. One fall morning last year at her home outside of Fort Worth, she woke up to a flurry of missed calls and texts from her friends that sent her heart racing: Had she seen the nude photo of her circulating on Snapchat?

Ellis had never taken a naked photo of herself, much less sent one. But that didn’t matter.

A benign cruise photo she had posted on her private Instagram had been enough for a classmate to feed into a “nudification” tool — an app that, armed with artificial intelligence, can strip the clothes from a photo and create a realistic explicit image in seconds. With the same ease, the classmate, a 15-year-old boy, allegedly created several Snapchat accounts to befriend classmates and then bombarded them with fake nudes of Ellis and six of her friends.

“On Oct. 7, 2023, our lives were forever changed,” Anna McAdams, Ellis’ mom, tearfully testified to the Texas Senate Committee on Criminal Justice earlier this month. “My daughter’s innocence was shattered and her eyes were opened to the reality of how cruel a person can be.”

As if parents didn’t have enough to worry about, now it’s malicious “deepfakes” — videos, photos or audio digitally altered with AI. While often associated with fake photos and videos aimed at candidates vying for public office, more than 95% of deepfakes online are pornographic, and 99% of those target women or girls. Very few are published with consent.

More often than not, deepfake porn targets celebrities, including Taylor Swift and, more recently, Houston’s own Megan Thee Stallion. Federal and state laws have failed these celebrities, and they’ve failed to protect children, who have far less power, from being traumatized by this technology.

Last session, the Legislature passed at least three bills addressing the issue, including House Bill 2700, which expanded the definition of child pornography to encompass visual material that uses an actual child’s image, including content created using AI. But the law doesn’t go far enough, and we’re glad to see that Lt. Gov. Dan Patrick has charged senators with closing loopholes.

While Ellis worries that these nude photos will come back to haunt her each time she applies to a college or a job, the classmate who targeted her and her friends will never have to worry about a permanent record.

When her family reached out to the school and the sheriff’s office for help, officials at first told them that, because the cyberbully was a minor, they couldn’t reveal his identity. It wasn’t until Anna McAdams filed a Title IX complaint that the family found out who he was.

Even then, “the school was ill-equipped and law enforcement didn’t know what to do,” McAdams testified. She says the 15-year-old boy received in-school suspension, and then his parents transferred him to another school. The sheriff’s office charged him with distribution of harmful material to a minor, a Class A misdemeanor, but he was let off on probation, according to McAdams.

“At 18, his case will be expunged,” she said. “No one will ever know what he did.”

The case has inspired U.S. Sen. Ted Cruz, the ranking member of the U.S. Senate Commerce Committee, to introduce a bipartisan bill on Tuesday, June 18, called the TAKE IT DOWN Act. The bill treats the online world as interstate commerce, making it a criminal offense to publish, or threaten to publish, nonconsensual intimate images depicting real people, including computer-generated images, on social media. Cruz’s office explained that the penalty, three years of jail time if the victim is a minor or two years if the victim is an adult, would apply to both adults and minors who commit the offense.

The bill, if passed, would send a more powerful warning to kids considering creating or publishing nonconsensual deepfakes. More importantly, it would also give victims a message of hope: a way out from having those images follow them in perpetuity.

Cruz, who is also among the 70 senators co-sponsoring a comprehensive reform bill called the Kids Online Safety Act, included in the new bill a provision requiring websites to take down those nonconsensual images within 48 hours of a victim’s request. For McAdams, persuading Snapchat to remove the images had been like screaming into a void.

“Right now, if you happen to be a big, famous star like Taylor Swift, you can get the images pulled down,” Cruz told the editorial board last week. “But if you’re just a Texas teenager, Big Tech’s answer is typically ‘go jump in a lake.’”

One phone call from Cruz’s office was all it took for Snapchat to finally scrub the photos, but it shouldn’t take congressional pressure for victims to get relief from an explicit photo published without their say.

Parents don’t have to wait for regulations to catch up to technology to get informed about the issue and start sounding the alarm in local school communities. Invite your kids to a frank conversation about deepfakes. Extend that conversation to your school board. McAdams told us her daughter’s high school still doesn’t have a specific AI deepfake policy in place, despite at least three incidents. We understand that schools are often too under-resourced and overwhelmed by the sheer volume of potential online harms to pay sufficient attention to cases of AI misuse.

But urge your schools to heed Ellis’ warning: It can happen to anyone. Using something as benign as a LinkedIn headshot, a bad actor can upend your life. Without stronger laws, there’s nothing to stop him.

Houston Chronicle