
Johansen’s experience is common among pro-choice activist communities. Most people who spoke to WIRED said their content appeared to have been automatically removed by artificial intelligence, rather than being reported by other users.
Activists also worry that even if the content isn’t completely removed, its reach could be limited by the platform’s artificial intelligence.
While it’s nearly impossible for users to discern how Meta’s AI moderation is being implemented on their content, last year the company announced that it would de-emphasize political and news content in users’ News Feeds. Meta did not answer a question about whether abortion-related content is classified as political content.
Just as the abortion activists WIRED spoke with experienced varying degrees of moderation on Meta’s platforms, so did users in different regions around the world. WIRED attempted to post the same phrase, “abortion pills available by mail,” in English, Spanish, and Tagalog on Facebook and Instagram accounts in the UK, US, Singapore, and the Philippines. Instagram removed English-language posts of the phrase from the US, where abortion is now illegal in some states following a court ruling last week. But a post in Spanish in the US and one in Tagalog in the Philippines both remained up.
When posted in English from the UK, the phrase remained up on both Facebook and Instagram. When posted in English from Singapore, where abortion is legal and widely available, the phrase remained on Instagram but was flagged on Facebook.
Courtesy of Kenneth DiMalibot
Ensley told WIRED that Reproaction has been very successful with its Spanish- and Polish-language Instagram campaigns about abortion, and hasn’t encountered the problems that face the group’s English-language content.
“Meta, in particular, relies heavily on automated systems that are extremely sensitive to English and less sensitive to other languages,” said Katharine Trendacosta, associate director of policy and advocacy at the Electronic Frontier Foundation.
WIRED also tested Meta’s moderation using a Schedule 1 substance that is legal for recreational use in 19 states and for medicinal use in 37, posting the phrase “Cannabis is available by mail” in English on Facebook in the US. That post was not flagged.
“Content moderation using artificial intelligence and machine learning takes a long time to establish and a lot of effort to maintain,” said a former Meta employee familiar with the company’s content moderation practices, who asked not to be named. “As the situation changes, you need to change the model, but that takes time and effort. So when the world is changing rapidly, these algorithms often don’t work optimally and may perform less accurately during periods of drastic change.”
However, Trendacosta worries that law enforcement might also flag content for removal. In its 2020 transparency report, Meta noted that it “restricted access” to 12 items reported by US state attorneys general related to the promotion and sale of regulated goods and services, and 15 items reported by US attorneys general for alleged involvement in price gouging. All of the posts were later reinstated. “For the state attorneys general to be able to say to Facebook, ‘Take this stuff down,’ and for Facebook to do it, even if the posts eventually come back, is very dangerous,” Trendacosta said.