
However, mass reporting could at least flag certain content or accounts for review, said a former Meta employee who worked on its content moderation systems and policies, and who spoke to WIRED on condition of anonymity. The more often a certain type of content is flagged, the more likely the algorithm is to flag it in the future. But the AI may be less accurate for languages with less material to train on, such as Bulgarian, which the former employee said makes it more likely that human moderators have the final say on whether to remove a piece of content.
Meta spokesman Ben Walters told WIRED that Meta does not remove content based on the volume of reports. “If a piece of content doesn’t violate our community standards, no matter how high the number of reports is, it won’t result in removal of the content,” he said.
Some moderation issues may be the result of human error. “There’s going to be error rates, there’s going to be stuff that gets removed that Meta didn’t want to remove. That happens,” the former employee said. These errors are more likely in non-English languages. Content moderators typically have only a few seconds to review a post before deciding whether it stays online, and the speed at which they work is a measure of their job performance.
There is also the potential for bias among human moderators. “Even after the war in Ukraine began, the majority actually supported Russia,” Galev said. He said it is not unreasonable to think that some moderators might also hold these views, especially in a country with limited independent media.
“There is a lack of transparency about who is making the decisions and how they are being made,” said Ivan Radev, a board member of the European Journalists Association in Bulgaria, a journalism nonprofit. “This sentiment is fueling discontent in Bulgaria.” Such opacity breeds confusion.
The imbalance between the ability of coordinated campaigns to get content flagged and that of individuals or small civil society organizations, whose reports are passed to human moderators, has contributed to the impression in Bulgaria that Meta places pro-Russian content above pro-Ukrainian content.
More than half of Bulgaria’s 6.87 million people use Facebook, the country’s main social platform. Bulgaria has long been the target of Russian trolls and pro-Russian propaganda, especially since the war in Ukraine began. Both sympathetic local media and Russian disinformation operations have pushed the pro-Russian narrative, blaming NATO for the conflict.
Ezekiev, a member of BOEC, told WIRED that he was never given an explanation for why his content was removed or how the decision was made. “If you go against propaganda and talk about the war in Ukraine, your account could be suspended,” he said. The whole situation is made murkier, Ezekiev said, by Meta’s own lack of transparency around the review process.
It was this frustration that prompted BOEC to protest at Telus’ Sofia office and that has resulted in employees, who are themselves largely powerless, being harassed, although there is no evidence that any Telus moderators have strayed from Meta’s own directives.
In February, Bulgarian media reported that Telus would close its operations in the country and move work to Germany. “As part of the business integration, the work that Telus International has done for Meta in Sofia will be transferred to another of our sites,” said Telus spokeswoman Michelle Brodovich. “Telus International continues to work successfully with Meta, ensuring the highest level of professional standards.” The company did not say whether an investigation into its work in Bulgaria contributed to the decision.