Leaked document indicates Facebook may be underreporting images of child abuse


A training document used by Facebook’s content moderators raises questions about whether the social network is under-reporting images of potential child sexual abuse, The New York Times reports. The document reportedly tells moderators to “err on the side of an adult” when assessing images, a practice that moderators have taken issue with but company executives have defended.

At issue is how Facebook moderators should handle images in which the age of the subject is not immediately obvious. That decision can have significant implications, as suspected child abuse imagery is reported to the National Center for Missing and Exploited Children (NCMEC), which refers images to law enforcement. Images that depict adults, on the other hand, may be removed from Facebook if they violate its rules, but aren’t reported to outside authorities.

But, as The NYT points out, there isn’t a reliable way to determine age from a photograph. Moderators are reportedly trained to use a more than 50-year-old method of identifying “the progressive phases of puberty,” but the methodology “was not designed to determine someone’s age.” And since Facebook’s guidelines instruct moderators to treat photos they’re unsure about as depicting adults, moderators suspect many images of children are slipping through.

This is further complicated by the fact that Facebook’s contract moderators, who work for outside firms and don’t get the same benefits as full-time employees, may have only a few seconds to make a determination and may be penalized for making the wrong call.

Facebook, which reports more child sexual abuse material to NCMEC than any other company, says erring on the side of adults is meant to protect users’ privacy and to avoid false reports that could hinder authorities’ ability to investigate actual cases of abuse. The company’s Head of Safety Antigone Davis told the paper that making false reports could also expose the company to legal liability. Notably, not every company shares Facebook’s philosophy on this issue. Apple, Snap and TikTok all reportedly take “the opposite approach” and report images when they are unsure of an age.
