Corporations Pull Twitter Ads After Promoted Tweets Appeared Near Child Porn Requests

by Kyle Anzalone | Sep 28, 2022


Bryce Durbin / TechCrunch

An investigation conducted by the cybersecurity group Ghost Data found that advertisements from well-known companies appeared on Twitter accounts that promote child pornography, and several corporations have pulled their ads from the social networking site in response. Ghost Data identified hundreds of accounts sharing underage pornographic content that Twitter did not remove.

Disney, NBCUniversal, Coca-Cola and Scottish Rite Children’s Hospital were among 30 firms whose ads appeared on the criminal accounts, Reuters reported after reviewing Ghost Data’s analytics.

Several companies, including Dyson, Mazda, Ecolab, and Cole Haan, responded by suspending their Twitter advertisements. “We’re horrified,” David Maddocks, brand president at Cole Haan, told Reuters. He continued, “Either Twitter is going to fix this, or we’ll fix it by any means we can, which includes not buying Twitter ads.” One of the shoemaker’s promoted tweets appeared next to a tweet soliciting underage content.

The ads from major corporations appearing next to illegal content may be a result of Twitter’s failure to remove much of the child pornography posted on the site. “Ghost Data identified over 500 accounts that openly shared or requested child sexual abuse material over a 20-day period this month. Twitter failed to remove more than 70% of the accounts during the study period,” Reuters reported. 

Twitter responded to the report, saying it “has zero tolerance for child sexual exploitation” and is dedicating more resources to child safety. The company sent an email to several of its advertisers on Wednesday saying it had found ads on accounts linked to child sex abuse. The email was shared with Insider.

According to the outlet, Twitter told some advertisers it had suspended all ads on profiles that shared the illicit material. The company also said it had “updated its systems” in order to better detect accounts linked with child sexual abuse material and to prevent ads from being served next to this content.

However, the proliferation of child pornography on Twitter has been a well-known phenomenon for some time, and the company has a mixed record of protecting underage victims. In one infamous example, two children were blackmailed into producing sexual content that was posted on Twitter. According to a lawsuit filed by the survivors, even after they provided evidence of being underage, the company refused to remove the video. 

“We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time,” Twitter told the plaintiff in 2020, the lawsuit alleges. Commenting on the case, the National Center for Missing and Exploited Children said, “the facts, in this case, are especially egregious because the electronic service provider was aware of the child victims’ graphic sexual images and refused to remove the videos from the platform.” 

Last month, The Verge reported that Twitter rejected a proposal to monetize adult content on the site because of its inability to identify and remove underage content. “Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale,” an internal Twitter report found. 

Human trafficking survivor advocate Eliza Bleu agrees that Twitter has not done enough to address the exploitation of minors on its platform, and that the company may finally start losing advertiser money as a result. In a statement to the Libertarian Institute, Bleu said:

Twitter has knowingly refused to remove child sexual abuse material at scale for a very long time. Unfortunately, it’s gone unchecked for so long that going after Twitter’s advertisers is the Hail Mary before governments and those seeking power step in. Not only is the child sexual exploitation material devastating to the survivors of this crime, but this very real crime is also the perfect excuse for governments to step in, ultimately violating digital privacy rights. If the governments step in and have access to private messages, this will be harmful to political dissidents, activists, journalists, survivors and innocent citizens globally. 

I never blame Twitter for the initial sexual abuse. I blame them for refusing to remove child sexual exploitation material at scale. Twitter has had many priorities over the last few years. They prioritized the censorship of words over the removal of human rights violations committed against children. Perhaps when most of the advertisers pull away from Twitter, they will finally get the message. My only hope now is that they still have money left to compensate the two brave minor survivors suing the platform.

About Kyle Anzalone

Kyle Anzalone is news editor of the Libertarian Institute, opinion editor of Antiwar.com and co-host of Conflicts of Interest with Will Porter and Connor Freeman.
