One of the most commendable aspects of Google’s update is its foundation in the experiences and needs of survivors. By actively seeking and incorporating feedback from those directly affected by IBSA, Google has demonstrated a commitment to creating solutions that truly address the complexities and impacts of this form of abuse.
NCOSE arranged for Google to meet with survivors, and we are thrilled that the company has listened to their critical insights in developing these new features. We warmly thank these brave survivors for raising their voices to make the world a safer place for others.
We also thank YOU for your advocacy which helped spark this win! Over the years, you have joined us in numerous campaigns targeting Google, such as the Dirty Dozen List, which Google Search and other Google entities have been named to many times. This win is YOUR legacy as well!
A Step Forward, But More Work to Do
While these changes mark a significant victory, the fight against IBSA is far from over. Continued vigilance, innovation, and cooperation from tech companies, policymakers, and advocacy groups are essential to building a safer online environment. We must keep pushing for more robust measures and support systems for those affected by image-based sexual abuse.
ACTION: Call on Microsoft’s GitHub to Stop Facilitating IBSA!
Google was far from the only corporation facilitating computer-generated IBSA. In fact, there is one corporate entity that is at the root of almost all of this abuse: Microsoft’s GitHub.
Microsoft’s GitHub is the global hub for creating sexually exploitative AI tech. The vast majority of deepfakes and computer-generated IBSA originate on this platform owned by the world’s richest company.
It’s time for Microsoft’s GitHub to stop fueling this problem and start fighting it instead!
Take 30 SECONDS to sign the quick action form below, calling on Microsoft’s GitHub to combat deepfake and AI-generated pornography.
ACTION: Urge Your Senator to Support the TAKE IT DOWN Act!
We also urgently need better legislation to combat IBSA. As it stands today, there is NO federal criminal penalty for those who distribute or threaten to distribute nonconsensual sexually explicit images.
We encourage everyone to stay informed about IBSA, support survivors, and advocate for stronger protections and accountability from tech companies. Together, we can create a safer, more respectful digital world.
For more information and resources on combating image-based sexual abuse, visit our webpage here.
In September 2023, the FBI issued a warning about a satanic pedophilic cult using Telegram as its main source of communication. This cult regularly extorted children as young as 8 years old into filming themselves committing suicide or self-harm, sexually abusing their siblings, torturing animals, or even murdering others. Members of the Telegram group would control their victims by threatening to share sexually explicit images of the children with their family and friends, or post the images online. Many members had the final goal of coercing the children to die by suicide on live-stream.
Telegram users would gain access to this group by sharing videos of the children they extorted, or videos of adults sexually abusing children.
Yet rather than taking much-needed steps to combat these crimes, Telegram provided a cover for them to continue unchecked. The truth is, Telegram’s very design seems built to invite and protect criminals and predators.
The company makes it incredibly difficult for law enforcement to investigate crimes occurring on the app. It uses end-to-end encryption in many areas of the platform—and for the areas not covered by end-to-end encryption, it uses distributed infrastructure. In Telegram’s own words, distributed infrastructure means that “data is stored in multiple data centers around the globe that are controlled by different legal entities spread across different jurisdictions … As a result, several court orders from different jurisdictions are required to force us to give up any data.”
The Stanford Internet Observatory concluded in a June 2023 report that Telegram implicitly allows the trading of CSAM in private channels. The researchers reached this conclusion because Telegram had no explicit policy against CSAM in private chats, no policy at all against grooming, and made no effort to detect known CSAM—and because they found CSAM being traded openly in public groups.
It is therefore no surprise that Telegram was noted as the #1 most popular messaging app used to “search for, view, and share CSAM” by almost half of CSAM offenders participating in a 2024 study.
These are only a couple of examples of the many ways Telegram designed its platform to shelter criminals and allow abuse to proliferate. You can read more about this here.
Telegram CEO Arrested in France … Where is U.S. Department of Justice?
Pavel Durov, the CEO of Telegram, was arrested in France this week as part of an investigation into the myriad crimes on the platform.
The only question now is: Why is the United States Department of Justice not engaged?
Please join us in urging the DOJ to investigate Telegram now! Take 30 seconds to complete the quick action below.