VICTORY! Google Enhances Protections Against Deepfake/AI-Generated Pornography

Driven by feedback from survivors and advocates, Google has announced enhanced protections against deepfake and AI-generated pornography.

Story

In a significant stride towards combating image-based sexual abuse (IBSA), Google has announced major updates to its policies and processes to protect individuals from sexually explicit deepfake and AI-generated content. These changes, driven by feedback from experts and survivor advocates, represent a monumental victory in our ongoing fight against IBSA.

Understanding the Impact of Deepfake and AI-Generated Pornography

Computer-generated IBSA, commonly called “deepfake pornography” or “AI-generated pornography,” has become increasingly prevalent, posing severe threats to personal privacy and safety. With ever-greater ease and speed, technology can forge highly realistic explicit content that often targets individuals without their knowledge or consent. The distress and harm caused by these images are profound: they can severely damage reputations, careers, and mental health.

And it can happen to anyone. It can happen to you and any of the people you love. If a photo of your face exists online, you are at risk.

Key Updates from Google

Recognizing the urgent need to address these issues, Google has implemented several critical updates to its Search platform to make it easier for individuals to remove IBSA, including computer-generated IBSA, and to prevent such content from appearing prominently in search results. Here’s what’s new:

  • Explicit Result Filtering: When someone successfully requests the removal of an explicit deepfake/AI-generated image, Google will now also filter all explicit results on similar searches about that person. This helps prevent the reappearance of harmful content in related searches.
  • Deduplication: Google’s systems will scan for and automatically remove duplicate sexually explicit images that have already been successfully removed. This reduces the likelihood of recurring trauma for victims who previously had to repeatedly request removals of the same images.
  • Ranking Updates: Google is updating its ranking algorithms to reduce the visibility of deepfake/AI-generated pornography. By promoting high-quality, non-explicit content, Google aims to ensure that harmful material is less likely to appear at the top of search results.
  • Demotions: Websites with a high volume of removal requests will be demoted in search rankings. This discourages sites from hosting deepfake/AI-generated pornography and helps to protect individuals from repeated exposure to such material.

Join Us in Thanking Google!

Please take a few moments to sign the quick form below, thanking Google for listening to survivors and creating a safer Internet!

Thank Google!

Listening to Survivors: A Critical Element

One of the most commendable aspects of Google’s update is its foundation in the experiences and needs of survivors. By actively seeking and incorporating feedback from those directly affected by IBSA, Google has demonstrated a commitment to creating solutions that truly address the complexities and impacts of this form of abuse.

NCOSE arranged for Google to meet with survivors, and we are thrilled that the company has listened to their critical insights in developing these new features. We warmly thank these brave survivors for raising their voices to make the world a safer place for others.

We also thank YOU for your advocacy which helped spark this win! Over the years, you have joined us in numerous campaigns targeting Google, such as the Dirty Dozen List, which Google Search and other Google entities have been named to many times. This win is YOUR legacy as well!

A Step Forward, But More Work to Do

While these changes mark a significant victory, the fight against IBSA is far from over. Continued vigilance, innovation, and cooperation from tech companies, policymakers, and advocacy groups are essential to building a safer online environment. We must keep pushing for more robust measures and support systems for those affected by image-based sexual abuse.

ACTION: Call on Microsoft’s GitHub to Stop Facilitating IBSA!

Google is far from the only corporation facilitating computer-generated IBSA. In fact, there is one corporate entity that is at the root of almost all of this abuse: Microsoft’s GitHub.

Microsoft’s GitHub is the global hub for creating sexually exploitative AI tech. The vast majority of deepfakes and computer-generated IBSA originate on this platform owned by the world’s richest company. 

It’s time for Microsoft’s GitHub to stop fueling this problem and start fighting it instead!

Take 30 SECONDS to sign the quick action form below, calling on Microsoft’s GitHub to combat deepfake and AI-generated pornography.

TAKE ACTION!

ACTION: Urge Your Senator to Support the TAKE IT DOWN Act!

We also urgently need better legislation to combat IBSA. As it stands today, there is NO federal criminal penalty for those who distribute or threaten to distribute nonconsensual sexually explicit images. 

The TAKE IT DOWN Act seeks to resolve this appalling gap in the law.

The TAKE IT DOWN Act has already passed out of committee unanimously. Please join us in pushing it through the next steps!

Take action now, asking your Senator to support this crucial bill.

TAKE ACTION!

We encourage everyone to stay informed about IBSA, support survivors, and advocate for stronger protections and accountability from tech companies. Together, we can create a safer, more respectful digital world.

For more information and resources on combating image-based sexual abuse, visit our webpage here.