A teenage victim of deepfake porn blamed Big Tech for what she went through, and called on the industry to take action, as she spoke at a roundtable with First Lady Melania Trump.
“I was just 14 years old when a classmate created AI nudes of me from an innocent social media picture,” Elliston Berry said. “I was 14 years old when I was violated all over social media. And I was just 14 years old when my innocence was stripped away.
“October 2, 2023, I woke up to messages from a friend notifying me that these photos of me were circulating on social media — pictures from a past Instagram photo with a nude body and my face attached, made from AI.
“I not only blame the classmate,” Berry added. “We need to hold Big Tech accountable to take action.”
Berry was one of several deepfake porn victims who attended the roundtable, which was aimed at urging the House to pass the TAKE IT DOWN Act, a bill that would crack down on sexually explicit deepfakes and revenge porn.
Berry hammered Big Tech platforms for not removing apps that people use to create deepfake porn like the images made of her.
“The boy used a Google app called DNGG that existed to remove articles of clothing from a photo, available for all ages for free. While that app is no longer available, thousands of other similar apps are available,” Berry said.
“Additionally, social media needs to be required to take action,” she added.