Trad, whose own Facebook account had once been suspended after being spuriously reported for hosting extremist content, says that pro-Russian groups will often organize on Telegram and choose which accounts or posts to report in an effort to get them removed from Facebook. Some of these groups, according to Trad, operate from Russia, while others may be paid-for trolls from within Bulgaria, where labor is relatively cheap.
According to Todor Galev, director of research at the Center for the Study of Democracy, a European public policy think tank, the Atlantic Council’s Bulgarian Facebook page has been banned several times after being mass reported. He says the accounts of prominent pro-NATO and pro-EU journalists and media outlets have also been targeted.
“We suspect that Facebook relies mostly on algorithms for small markets like Bulgaria,” says Galev. “Because human moderation is very limited. There are only a few people working [on moderation] for Bulgaria.”
A former Meta employee who worked on its content moderation systems and policy, and who spoke to WIRED on the condition of anonymity, says, however, that mass reporting could at least get certain pieces of content or accounts flagged for review. And the more frequently a certain type of content is flagged, the more likely the algorithm will be to flag it in the future. But for languages like Bulgarian, where there is less material to train the algorithm and AI may be less accurate, the former employee says it is more likely that a human moderator would make the final call about whether or not to remove a piece of content.
Meta spokesperson Ben Walters told WIRED that Meta does not remove content based on the number of reports. “If a piece of content does not violate our Community Standards, no matter how high the number of reports is, it won’t lead to content removal,” he says.
Some moderation issues could be the result of human error. “There are going to be error rates, there are going to be things that get taken down that Meta did not mean to take down. This happens,” the former employee says. And these errors are even more likely in non-English languages. Content moderators are often given only seconds to review a post before deciding whether it will stay online, and the speed of their decisions is a metric by which their job performance is measured.
There is also a real possibility of bias among human moderators. “The majority of the population actually supports Russia even after the war in Ukraine,” says Galev, who adds that it’s not unreasonable to think that some moderators might hold these views too, particularly in a country with limited independent media.
“There’s a lack of transparency around who is deciding, who is making the decision,” says Ivan Radev, a board member of the Association of European Journalists Bulgaria, a nonprofit that put out a statement condemning Bird.bg’s posting of employee information. “This sentiment is feeding dissatisfaction in Bulgaria.” That opacity can breed confusion.
The imbalance between the ability of coordinated campaigns to get content flagged and that of individuals or small civil society organizations, whose reports go to human moderators, has helped create an impression in Bulgaria that Meta is prioritizing pro-Russian content over pro-Ukrainian content.