A nonprofit was unable to promote several tweets that included the terms “illegal alien” and “criminal alien” on Twitter. The social media platform told the nonprofit the tweets violated its “Hateful Content” policy. The issue received significant media attention and Twitter reversed the decision, saying it was “made in error.”
The Center for Immigration Studies (CIS), which advocates a “low-immigration, pro-immigrant” policy, tried to start paid promotion campaigns on 11 of its tweets on Sept. 11, but Twitter refused to start the campaigns on four of them.
“A new video from the @DailyCaller showing illegal aliens pouring across the border reminds us why we need a wall,” reads one of the rejected tweets. “Technologies and adequate manpower are well and good, but the best defense is always to prevent individuals from entering in the first place.”
The other three tweets also included the legal terms “illegal alien” or “criminal aliens” and discussed law enforcement measures against illegal immigration.
Twitter users can purchase campaigns to promote their tweets more widely, beyond their followers.
The CIS reached out to Twitter and received what appeared to be a standardized reply:
“We’ve reviewed your tweets and confirmed that it is ineligible to participate in the Twitter Ads program at this time based on our Hateful Content policy. Violating content includes, but is not limited to, that which is hate speech or advocacy against a protected group.”
The CIS documented the issue in a Sept. 12 press release, asking, “Do these tweets illustrate ‘hateful content’, or is Twitter filtering content with a political bias?”
Multiple right-leaning media outlets picked up the story, and CIS Executive Director Mark Krikorian was interviewed by Fox News’ Tucker Carlson that night.
It was on the show that Krikorian learned that Twitter had reversed the decision to block the campaigns.
A Twitter spokesperson made a similar statement to The Epoch Times in a Sept. 13 email, saying, “this decision was overturned by our team and was made in error.”
“We enforce our rules dispassionately and judiciously but sometimes mistakes happen—both on the conservative and the liberal side,” the spokesperson said in the email, echoing the congressional testimony given by the platform’s CEO, Jack Dorsey, on Sept. 5.
But Twitter still hadn’t informed CIS of the reversal directly.
“No one has told us that,” Marguerite Telford, CIS director of communications, told The Epoch Times by phone on the afternoon of Sept. 13.
A Pattern of Bias
Twitter has repeatedly been caught suppressing conservative voices.
For several years, Twitter users have accused the company of shadowbanning—hiding a user’s content from other users without informing them.
One method of shadowbanning is the so-called “quality filter,” which removes affected accounts from the “latest” category of search results—unless the user manually switches the filter off. The filter then snaps back on after each search.
The Epoch Times previously reviewed dozens of Twitter accounts of Trump supporters and opponents that appeared to exhibit similar patterns of behavior. Only the Trump supporters were affected by the filter.
The Epoch Times wasn’t able to find any official announcement of this particular function.
In addition, several Republican congressmen had their Twitter accounts dropped from the platform’s search suggestion function. After media reports in July, Twitter corrected the situation, saying the omissions resulted from a function that improperly penalized accounts for the behavior of their followers. The function affected 600,000 accounts, both liberal and conservative, the company stated. Yet despite extensive testing, media outlets identified only four affected lawmakers, all Republican.
Twitter employees have previously told undercover reporters that the platform had unwritten rules and a company culture that at least condoned suppressing conservative voices.
Dorsey acknowledged in one of his testimonies that the algorithms Twitter uses to filter content may unintentionally become biased. He said the company is “very, very early” in its work on addressing the issue.
From The Epoch Times