
Twitter Ran Ads on Accounts Promoting Child Sex Abuse




Twitter reportedly has a child porn problem that surpasses its ability to take down accounts advertising child sexual abuse content.
Photo: Justin Sullivan (Getty Images)

Reports have shown that Twitter has struggled to deal with a rash of accounts peddling child sex abuse material on its platform, and now advertisers have made clear they don’t want their ads associated with a platform that can’t police itself for non-consensual sexual material.

Reuters first reported Wednesday that major brands discovered their ads had been appearing alongside tweets soliciting child abuse material. The outlet said ads for some 30 advertisers, including the Walt Disney Company, Coca-Cola, Dyson, and NBCUniversal, appeared next to the profile pages of Twitter accounts active in selling and soliciting child sexual abuse material.

Reuters reported that the names of around 30 brands came up in research on online child sex abuse from cybersecurity firm Ghost Data. Some of those advertisers, including chemical company Ecolab, home tech maker Dyson, and car manufacturer Mazda, have already pulled their ads off the platform, according to the report.

A separate report from Business Insider said Twitter began informing advertisers Wednesday that it had discovered their ads running on these kinds of profiles, apparently after Reuters shared its findings with the company. The emails seen by Insider reportedly said Twitter has banned the accounts that violated its rules and is investigating how many people the ads may have reached.

In an email statement to Gizmodo, a Twitter spokesperson said, “We are working closely with our clients and partners to investigate the situation and take the appropriate steps to prevent this from happening in the future,” which apparently includes updates to detection and account-suspension methodologies. The spokesperson added that Twitter has suspended all the profiles it found peddling child sexual abuse content, though they did not reveal how many accounts were involved.

The company does publish a biannual transparency report. From July through December 2021, the company says, it suspended close to 600,000 accounts and moderated another 600,000 for child sexual exploitation.

But despite those numbers, the question remains: just how big is Twitter’s child sex abuse problem? After all, Twitter is no Pornhub, a site that has been repeatedly knocked for actively profiting off of child abuse. Pornhub has had such a hard time mounting any concerted effort against the rash of child sex abuse material that credit card companies have abandoned support for the site and top executives have quit.

Reuters cited Ghost Data’s finding that 70% of the 500 accounts it identified as dealing in child sexual abuse material were not taken down over a 20-day period in September. At least one of those accounts was soliciting sexual content from those “13+.” The accounts advertised their content on Twitter and then moved on to apps like Telegram or Discord to complete the actual transactions, sharing the content via Mega and Dropbox.

According to recent reports, Twitter has known about such accounts for quite a while. The Verge reported that earlier this year Twitter seriously considered launching its own version of OnlyFans, which would have let creators sell paid subscriptions for adult content. The initiative died after a dedicated team reported that Twitter consistently fails to police “child sexual exploitation and non-consensual nudity at scale.”

The report, based on internal documents and interviews with unnamed staff, notes that employees have been warning the company about its child porn problem for over a year. Creating such an OnlyFans-type operation could have resulted in the loss of advertising dollars as well.

Other sites like Reddit have also been cited in lawsuits for their failure to police underage sexual content, but Twitter has a lot riding on its advertising dollars.

Considering that 92% of Twitter’s 2021 revenue came from advertising, according to BusinessofApps data, Twitter may need a much stronger banhammer to keep advertisers happy.




