Adults or sexually abused minors? Getting it right vexes Facebook
Facebook is a leader among tech companies in detecting child sexual abuse content, which has exploded on social media and across the internet in recent years. But concerns about mistakenly accusing people of posting illegal imagery have resulted in a policy that could allow photos and videos of abuse to go unreported.
Meta, the parent company of Facebook, Instagram, Messenger and WhatsApp, has instructed content moderators for its platforms to “err on the side of an adult” when they are uncertain about the age of a person in a photo or video, according to a corporate training document.
Antigone Davis, head of safety for Meta, confirmed the policy in an interview and said it stemmed from privacy concerns for those who post sexual imagery of adults. “The sexual abuse of children online is abhorrent,” Davis said, emphasizing that Meta employs a multilayered, rigorous review process that flags far more images than any other tech company. She said the consequences of erroneously flagging child sexual abuse could be “life changing” for users.
While it is impossible to quantify the number of images that might be misclassified, child safety experts said the company was undoubtedly missing some minors. Studies have found that children are physically developing earlier than they have in the past. Also, certain races and ethnicities enter puberty at younger ages, with some Black and Hispanic children, for example, doing so earlier than Caucasians.
“We’re seeing a whole population of youth that is not being protected,” said Lianna McDonald, executive director of the Canadian Center for Child Protection, an organization that tracks the imagery globally.
Each day, moderators review millions of photos and videos from around the world to determine whether they violate Meta’s rules of conduct or are illegal. Last year, the company made nearly 27 million reports of suspected child abuse to a national clearinghouse in Washington that then decides whether to refer them to law enforcement. The company accounts for more than 90% of the reports made to the clearinghouse.
The training document, obtained by The New York Times, was created for moderators working for Accenture, a consulting firm that has a contract to sort through Facebook’s noxious content and remove it from the site. The age policy was first disclosed in the California Law Review by a law student, Anirudh Krishna, who wrote last year that some moderators at Accenture disagreed with the practice, which they referred to as “bumping up” adolescents to young adults.
Accenture declined to comment on the practice.
Technology companies are legally required to report “apparent” child sexual abuse material, but “apparent” is not defined by the law. The Stored Communications Act, a privacy law, shields companies from liability when making the reports, but Davis said it was unclear whether the law would protect Meta if it erroneously reported an image. She said lawmakers in Washington needed to establish a “clear and consistent standard” for everyone to follow.
Legal and tech policy experts said that social media companies had a difficult path to navigate. If they fail to report suspected illicit imagery, they can be pursued by the authorities; if they report legal imagery as child sexual abuse material, they can be sued and accused of acting recklessly.
“I could find no courts coming close to answering the question of how to strike this balance,” said Paul Ohm, a former prosecutor in the Justice Department’s computer crime division who is now a professor at Georgetown Law. “I don’t think it’s unreasonable for lawyers in this situation to put the thumb on the scale of the privacy interests.”
Charlotte Willner, who leads an association for online safety professionals and previously worked on safety issues at Facebook and Pinterest, said the privacy concerns meant that companies “aren’t incentivized to take risks.”
But McDonald, of the Canadian center, said the rules should err on the side of “protecting children,” just as they do in commerce. She cited the example of cigarette and alcohol vendors, who are trained to ask for identification if they have doubts about a customer’s age.
Representatives for Apple; Snap, the owner of Snapchat; and TikTok said their companies took the opposite approach of Meta, reporting any sexual image in which a person’s age was in question. Some other companies that scan their services for illegal imagery, including Dropbox, Google, Microsoft and Twitter, declined to comment on their practices.
In interviews, four former content moderators contracted by Meta said they encountered sexual images every day that were subject to the age policy. The moderators said they could face negative performance reviews if they made too many reports that were deemed out of policy. They spoke on the condition of anonymity because of nondisclosure agreements and concerns about future employment.
“They were letting so many things slide that we eventually just didn’t bring things up anymore,” said one of the former moderators, who described detecting images of oral sexual abuse and other explicit acts during his recent two-year tenure at Accenture. “They would have some crazy, extravagant excuse like, ‘That blurry portion could be pubic hairs, so we have to err on the side of it being a young adult.’”
The number of reports of suspected child sexual abuse has grown exponentially in recent years. The high volume, up from roughly 100,000 in 2009, has overwhelmed both the national clearinghouse and law enforcement officials. A 2019 investigation by The Times found that the FBI could manage its caseload from the clearinghouse only by limiting its focus to infants and toddlers.
Davis said a policy that resulted in more reports could worsen the bottleneck. “If the system is too filled with things that are not useful,” she said, “then this creates a real burden.”
But some current and former investigators said the decision should be made by law enforcement.
“No one should decide not to report a possible crime, especially a crime against a child, because they believe that the police are too busy,” said Chuck Cohen, who led a child exploitation task force in Indiana for 14 years.
This article originally appeared in The New York Times.