Adobe, Anthropic, Cohere, Common Crawl, Microsoft, and OpenAI have all pledged to take concrete steps to prevent their technologies from being used to create non-consensual intimate images (NCII) and child sexual abuse material (CSAM).
The commitments include "responsibly sourcing" training data, incorporating feedback, "stress testing" models to guard against misuse, and removing nude images from existing AI training datasets.
Notably, other major tech companies, including Apple, Amazon, Google, and Meta, are not participating in the initiative.
Source: Ferra
