How war in Ukraine roiled Facebook and Instagram

Meta, which owns Facebook and Instagram, took an unusual step last week: It suspended some of the quality controls that ensure that posts from users in Russia, Ukraine and other Eastern European countries meet its rules.
Under the change, Meta temporarily stopped tracking whether its workers who monitor Facebook and Instagram posts from those areas were accurately enforcing its content guidelines, six people with knowledge of the situation said. That’s because the workers could not keep up with shifting rules about what kinds of posts were allowed about the war in Ukraine, they said.
Meta has made more than half a dozen content policy revisions since Russia invaded Ukraine last month. The company has permitted posts about the conflict that it would normally have taken down — including some calling for the death of President Vladimir Putin of Russia and violence against Russian soldiers — before changing its mind or drawing up new guidelines, the people said.
The Revenok family and their friends on a train in Kyiv after fleeing the besieged city of Chernihiv, Ukraine, on Tuesday, March 29, 2022. The first sign of progress emerged in peace talks between Ukraine and Russia on Tuesday as a deputy Russian defense minister said Russia would sharply “reduce military activity” near Kyiv, Ukraine’s capital, and the northern city of Chernihiv. (Ivor Prickett/The New York Times)
The result has been internal confusion, especially among the content moderators who patrol Facebook and Instagram for text and images with gore, hate speech and incitements to violence. Meta has sometimes shifted its rules on a daily basis, causing whiplash, said the people, who were not authorized to speak publicly.
The bewilderment over the content guidelines is just one way that Meta has been roiled by the war in Ukraine. The company has also contended with pressure from Russian and Ukrainian authorities over the information battle about the conflict. And internally, it has dealt with discontent about its decisions, including from Russian employees concerned for their safety and Ukrainian workers who want the company to be tougher on Kremlin-affiliated organizations online, three people said.

Dani Lever, a Meta spokeswoman, declined to directly address how the company was handling content decisions and employee concerns during the war.
After Russia invaded Ukraine, Meta said it established a round-the-clock special operations team staffed by employees who are native Russian and Ukrainian speakers. It also updated its products to aid civilians in the war, including features that direct Ukrainians toward reliable, verified information to locate housing and refugee assistance.
Russia Today employees work at the Kremlin-funded network’s studio in London, Feb. 22, 2017. Meta has limited Russia Today’s reach on its platforms. (Sergey Ponomarev/The New York Times)
Mark Zuckerberg, Meta’s chief executive, and Sheryl Sandberg, the chief operating officer, have been directly involved in the response to the war, said two people with knowledge of the efforts. But as Zuckerberg focuses on transforming Meta into a company that will lead the digital worlds of the so-called metaverse, many responsibilities around the conflict have fallen — at least publicly — to Nick Clegg, the president for global affairs.
Last month, Clegg announced that Meta would restrict access within the European Union to the pages of Russia Today and Sputnik, which are Russian state-controlled media, following requests from Ukraine and other European governments. Russia retaliated by cutting off access to Facebook inside the country, claiming the company discriminated against Russian media, and then by blocking Instagram.

This month, President Volodymyr Zelenskyy of Ukraine praised Meta for moving quickly to limit Russian war propaganda on its platforms. Meta also acted rapidly to remove an edited “deepfake” video from its platforms that falsely showed Zelenskyy yielding to Russian forces.

The company has made high-profile mistakes as well. It permitted a group called the Ukrainian Legion to run ads on its platforms this month to recruit “foreigners” for the Ukrainian army, a violation of international laws. It later removed the ads — which were shown to people in the United States, Ireland, Germany and elsewhere — because the group may have misrepresented its ties to the Ukrainian government, according to Meta.
Internally, Meta had also started changing its content policies to deal with the fast-moving nature of posts about the war. The company has long forbidden posts that might incite violence. But on Feb. 26, two days after Russia invaded Ukraine, Meta informed its content moderators — who are typically contractors — that it would allow calls for the death of Putin and “calls for violence against Russians and Russian soldiers in the context of the Ukraine invasion,” according to the policy changes, which were reviewed by The New York Times.
This month, Reuters reported on Meta’s shifts with a headline that suggested that posts calling for violence against all Russians would be tolerated. In response, Russian authorities labeled Meta’s activities as “extremist.”
Volunteers stack sandbags around the Princess Olga Monument in Kyiv, Ukraine, on Wednesday, March 30, 2022. Hopes that peace talks could ease Russia’s assault on Ukraine were dampened on Wednesday when local officials reported new attacks on the outskirts of Kyiv and the northern city of Chernihiv, two areas where Russia had vowed to sharply reduce combat operations. (Daniel Berehulak/The New York Times)
Shortly thereafter, Meta reversed course and said it would not let its users call for the deaths of heads of state.
“Circumstances in Ukraine are fast moving,” Clegg wrote in an internal memo that was reviewed by The Times and first reported by Bloomberg. “We try to think through all the consequences, and we keep our guidance under constant review because the context is always evolving.”
Meta has amended other policies as well. This month, it made a temporary exception to its hate speech guidelines so users could post about the “removal of Russians” and “explicit exclusion against Russians” in 12 Eastern European countries, according to internal documents. But within a week, Meta tweaked the rule to note that it should be applied only to users in Ukraine.
The constant adjustments left moderators who oversee users in Central and Eastern European countries confused, the six people with knowledge of the situation said.
The policy changes were onerous because moderators were generally given less than 90 seconds to decide whether images of dead bodies, videos of limbs being blown off or outright calls to violence violated Meta’s rules, they said. In some instances, they added, moderators were shown posts about the war in Chechen, Kazakh or Kyrgyz, despite not knowing those languages.
Lever declined to comment on whether Meta had hired content moderators who specialize in those languages.
Emerson T. Brooking, a senior fellow at the Digital Forensic Research Lab of the Atlantic Council, which studies the spread of online disinformation, said Meta faced a quandary with war content.
“Usually, content moderation policy is intended to limit violent content,” he said. “But war is an exercise in violence. There is no way to sanitize war or to pretend that it is anything different.”
Meta has also faced employee complaints over its policy shifts. At a meeting this month for workers with ties to Ukraine, employees asked why the company had waited until the war to take action against Russia Today and Sputnik, said two people who attended. Russian state activity was at the center of Facebook’s failure to protect the 2016 U.S. presidential election, they said, and it didn’t make sense that those outlets had continued to operate on Meta’s platforms.
People charge their smartphones at the main train station in Kyiv, Ukraine, on Monday, March 28, 2022. A steady trickle of people are still leaving the city on a daily basis, either heading further west or on to destinations in Europe. (Ivor Prickett/The New York Times)
While Meta has no employees in Russia, the company held a separate meeting this month for workers with Russian connections. Those employees said they were concerned that Moscow’s actions against the company would affect them, according to an internal document.
In discussions on Meta’s internal forums, which were viewed by The Times, some Russian employees said they had erased their place of work from their online profiles. Others wondered what would happen if they worked in the company’s offices in places with extradition treaties with Russia and “what kind of risks will be associated with working at Meta not just for us but our families.”
Lever said Meta’s “hearts go out to all of our employees who are affected by the war in Ukraine, and our teams are working to make sure they and their families have the support they need.”
At a separate company meeting this month, some employees voiced unhappiness with the changes to the speech policies during the war, according to an internal poll. Some asked if the new rules were necessary, calling the changes “a slippery slope” that was “being used as proof that Westerners hate Russians.”
Others asked about the effect on Meta’s business.
“Will the Russian ban affect our revenue for the quarter? Future quarters?” read one question. “What’s our recovery strategy?”
This article originally appeared in The New York Times.
