AI Infiltrates Congress: Female Lawmakers Face Onslaught of Explicit Deepfake Attacks
A recent study has revealed a concerning trend targeting members of Congress: the creation and distribution of sexually explicit deepfakes, with a stark gender disparity. The American Sunlight Project (ASP) found more than 35,000 mentions of nonconsensual intimate imagery (NCII) depicting 26 lawmakers—25 women and one man—on deepfake websites. Most of this material was swiftly taken down after researchers notified the affected members.
Nina Jankowicz, an expert on online disinformation and the study’s author, emphasized the urgent need to address the growing prevalence of such attacks, especially against women and marginalized groups. “The internet has opened up many harms that disproportionately affect women,” she stated, highlighting the risks to their participation in politics and civic life.
Nonconsensual intimate imagery, often dubbed deepfake porn, can be generated using artificial intelligence or by superimposing individuals’ images onto existing adult content. The study found that gender played a decisive role: women lawmakers were 70 times more likely to be targeted than their male counterparts.
While ASP refrained from disclosing the names of the affected members of Congress to prevent further targeting, it alerted the offices of those identified and provided resources on managing online harassment. In many cases, the imagery was completely or almost entirely removed from the websites shortly after the initial report, though researchers cautioned that removal does not eliminate the risk of reposting.
Approximately 16 percent of female lawmakers in Congress have been targeted by AI-generated NCII, a finding that raises alarms about the mental health toll on victims. Jankowicz, herself a victim of deepfake abuse, pointed out how such attacks can disrupt public discourse, particularly for women and women of color.
Mental health impacts from image-based sexual abuse extend beyond high-profile figures, affecting everyday individuals, including minors. Reports of targeted harassment against high school girls have surfaced in several states, prompting mixed reactions from school officials and new warnings from the FBI on the legal implications of sharing such content.
The study also highlighted alarming self-censorship among young women, with nearly 41 percent reporting that they silence their voices online to avoid harassment. Research director Sophie Maddocks warned that this trend poses a significant threat to democracy and free speech.
Despite the increasing prevalence of AI-generated imagery, there is still no comprehensive federal legislation addressing the issue. While some states have enacted laws, these often lack criminal penalties strong enough to deter offenders. Jankowicz is hopeful that Congress will act, given that these harms directly affect lawmakers and ordinary citizens alike.
Recent political events have intensified the spotlight on image-based sexual abuse, notably the case of Susanna Gibson, a Virginia lawmaker whose intimate videos were released without her consent. Gibson has since launched initiatives aimed at combating this abuse and supporting women candidates facing similar threats.
The ASP is advocating for passage of the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024 (DEFIANCE Act), which would empower individuals to take legal action against those behind such harassment. However, concerns over free speech and how to define harm have complicated progress on the bill in Congress.
In the absence of robust legislative measures, the White House is seeking industry collaboration on solutions to image-based abuse. Skepticism remains, however, about the tech industry’s commitment to self-regulation, especially in light of past failures. “It’s alarming how easy it is for perpetrators to create this content,” Jankowicz concluded, emphasizing the broader message such attacks send to women about the potential fallout of speaking out.