Techno Blender
Digitally Yours.
Browsing Tag: CSAM

US and UK law enforcement are bracing for an explosion of AI-generated CSAM, and criticize Meta, the authorities' best partner for flagging CSAM, over E2EE

Eileen Sullivan / New York Times: US and UK law enforcement are bracing for an explosion of AI-generated CSAM, and criticize Meta, the authorities' best partner for flagging CSAM, over E2EE — Law enforcement officials are bracing for an explosion of material generated by artificial intelligence that realistically depicts children…

X plans to hire 100 content moderators to fill new Trust and Safety center in Austin

X’s head of business operations, Joe Benarroch, said the company plans to open a new office in Austin, Texas, for a team dedicated to content moderation. The “Trust and Safety center of excellence,” for which the company plans to hire 100 full-time employees, will focus primarily on stopping the spread of child sexual exploitation (CSE) material. X CEO Linda Yaccarino is set to testify before Congress on Wednesday in a hearing about CSE, and at the end of last week the platform published a…

Researchers found child abuse material in the largest AI image generation dataset

Researchers from the Stanford Internet Observatory say that a dataset used to train AI image generation tools contains at least 1,008 validated instances of child sexual abuse material (CSAM). The researchers note that the presence of CSAM in the dataset could allow AI models trained on it to generate new and even realistic instances of CSAM. LAION, the non-profit that created the dataset, said that it "has a zero tolerance policy for illegal content and in an abundance of caution, we are temporarily…

Stanford Report Suggests Mastodon Has Child Abuse Material Problem

A new report suggests that the lax content moderation policies of Mastodon and other decentralized social media platforms have led to a proliferation of child sexual abuse material. Stanford’s Internet Observatory published research Monday showing that decentralized sites have serious shortcomings when it comes to “child safety infrastructure.” Unfortunately, that doesn’t make them all that different from the majority of platforms on the mainstream internet…

Mastodon’s decentralized social network has a major CSAM problem

Mastodon has gained popularity over the past year as Twitter users looked for alternatives following Elon Musk’s takeover. Part of its appeal is its decentralized nature, which insulates it against the whims of billionaires who speak before they think. Unsurprisingly, though, what makes it so appealing has also proven to be a headache, making content moderation all but impossible. Researchers found 112 matches of known child sexual abuse material (CSAM) over a two-day period, with almost 2,000 posts using common hashtags related to abusive…

Discord bans teen dating servers and the sharing of AI-generated CSAM

Discord has updated its policy meant to protect children and teens on its platform after reports came out that predators have been using the app to create and spread child sexual abuse materials (CSAM), as well as to groom young teens. The platform now explicitly prohibits AI-generated photorealistic CSAM. As The Washington Post recently reported, the rise in generative AI has also led to the explosion of lifelike images with sexual depictions of children. The publication had seen conversations about the use of Midjourney…

Senators demand answers from Meta over how it handles CSAM on Instagram

A bipartisan group of senators has reportedly asked Meta to explain how it prevents child sexual abuse material (CSAM) from being shared among networks of pedophiles on Instagram. Lawmakers from the Senate Judiciary Committee also want to know how Instagram's algorithms brought users who want to share such content together in the first place. In a letter to the company, 10 senators, including committee chair Dick Durbin and Republican ranking member Lindsey Graham, reportedly said they were “gravely concerned…

Apple blasted over lack of child exploitation prevention days after scrapping controversial CSAM plan

The world's first-ever report from Australia's eSafety Commissioner says that companies including Apple are not doing enough to tackle child sexual exploitation on platforms like iOS and iCloud, just days after Apple announced plans to scrap a controversial child sexual abuse material scanning tool. The commission sent legal notices to Apple, Meta (Facebook and Instagram), WhatsApp, Microsoft, Skype, Snap, and Omegle earlier this year, requiring the companies to answer detailed questions about how they are tackling child…

Apple stops developing CSAM detection system for iPhone users

Last year, Apple announced that iCloud Photos would be able to detect child sexual abuse material (CSAM) in users’ photos by matching them against a database of known CSAM image hashes. Because the matching would happen on-device, Apple itself would not see the photos, but the plan still drew heavy criticism from privacy and security researchers. Now, after announcing the new Advanced Data Protection for iCloud, company executive Craig Federighi has confirmed that Apple will not roll out the CSAM detection system for iPhone…
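The hash-database matching Apple described can be illustrated with a deliberately simplified sketch. Real systems use perceptual hashes (Apple's design used NeuralHash) that tolerate resizing and re-encoding, plus cryptographic protections so matches stay private; the exact cryptographic hash and the stand-in database below are purely hypothetical illustrations, not how any production system works:

```python
import hashlib

# Hypothetical stand-in for a database of hashes of known-bad images.
# A real deployment would use perceptual hashes, not SHA-256, so that
# trivially re-encoded copies of an image still match.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True if this image's hash appears in the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

# Exact copies of a listed image match; any other bytes do not.
assert matches_known_database(b"example-known-image-bytes")
assert not matches_known_database(b"some-other-image")
```

The key property in the real system is that only the hash, never the photo, is compared against the database, which is why Apple framed the design as privacy-preserving even though critics disagreed.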

Pornhub Puts a Chatbot to Work to Help People Searching For CSAM

Pornhub, the world's most popular pornographic website, will start using AI to steer visitors away from content depicting child sexual abuse. The company has put a chatbot to work to address Child Sexual Abuse Material (CSAM): when users search for this kind of content, the chatbot will appear on their screens and warn them that they are looking for material they should not be seeking. To better understand the search phrases…