Earlier this year, the White House put out a call for voluntary cooperation from the tech sector to help stop "deepfake porn," which is quickly becoming a serious problem as artificial intelligence makes such illicit content easy to create and allows it to be shared rapidly across social media and other online platforms.

"As generative AI broke on the scene, everyone was speculating about where the first real harms would come. And I think we have the answer," Arati Prabhakar, the Biden administration's chief science adviser and director of the White House's Office of Science and Technology Policy, told The Associated Press in May.

The call from the White House came even before it was reported in June that deepfake porn production had increased 464% over the previous year.

Take It Down Act – Good First Step

Lawmakers have introduced legislation that could hold online platforms accountable for publishing and distributing nonconsensual AI-generated images that depict the faces of real individuals.

The Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks (TAKE IT DOWN) Act would criminalize the publication of nonconsensual intimate imagery (NCII), including AI-generated NCII such as deepfake pornography, and would require social media and similar websites to have procedures in place to remove such content upon notification from a victim.

"The Take It Down Act is a step in the right direction, and more will need to be done. Despite the existence of laws in nearly every state to protect against NCII, the effectiveness of these laws varies significantly," explained Theresa Payton, founder and CEO of cybersecurity provider Fortalice Solutions.

"The inconsistency leads to the victim suffering mental anguish as the imagery propagates nearly unabated," Payton told ClearanceJobs. "The burden lies squarely on the victim's shoulders, and social media platforms require laborious, tedious processes for victims to remove harmful images from the internet, perpetuating their trauma."

Layers of Legislation Could Be Required

Several states have also adopted legislation that could criminalize the publishing of such content, but lawmakers are struggling to keep up with advances in the technology.

"More needs to be done rather than relying on other federal laws that may address some of the conduct surrounding these crimes. There needs to be greater clarity as AI technology continues to evolve, making it easier to create revenge porn," suggested technology industry analyst Susan Schreiner of C4 Trends.

"Bills are advancing in several states, such as Illinois, Missouri, New Jersey, and Ohio. While several bills have been introduced in Congress, including the AI Labeling Act of 2023 and the DEFIANCE Act of 2024, none has moved out of committee," Schreiner told ClearanceJobs.

She further suggested that the popularity of singer Taylor Swift might finally move the needle; the spread of fabricated sexually explicit images of the pop star has been seen as a catalyst for greater policy action at the federal level.

“Given current politics and polarization that might be overly optimistic,” Schreiner continued.

Going After the Publishers

The technology to create convincing deepfakes isn't likely to go away, so it may instead be necessary to combat the publication of the content.

“Technology platforms and hosts should not allow the distribution of NCII,” said Payton. “They should be required to deploy automatic detection, removal, and blocking. Each technology platform would ideally have a governance committee that oversees the number of reported incidents and the average time to resolution. This should be required reporting to the public.”

This may not entirely curtail the problem, but it could be an important step in the right direction.

"When imagery slips through the automated detection and removal, the platforms should be required to provide a case number to the victim and, if the situation is prolonged, access to a contact center and a case manager to ensure that the images are removed completely," added Payton.

“The worst offenders, such as adult imagery hosts, use the law to their advantage to make takedowns extremely drawn out and costly for the victim,” Payton continued. “The new law should be explicit and include severe penalties for non-compliance.”

Peter Suciu is a freelance writer who covers business technology and cyber security. He currently lives in Michigan and can be reached at petersuciu@gmail.com. You can follow him on Twitter: @PeterSuciu.