Facebook is testing two new tools on the social network aimed at curbing the searching for and sharing of photos and videos that contain child sexual abuse.

“Using our apps to harm children is abhorrent and unacceptable,” said Antigone Davis, who oversees Facebook’s global safety efforts, in a blog post on Tuesday.

The move comes as the social network faces mounting pressure to combat this problem amid plans to end-to-end encrypt messages on Facebook Messenger and its photo service Instagram by default. This added layer of security means that messages can’t be viewed by anyone other than the sender and recipient, including Facebook and law enforcement officials. Child safety advocates have raised concerns that Facebook’s encryption plans could make it harder to crack down on child predators.

The first tool Facebook is testing is a pop-up that appears if you search for a term associated with child sexual abuse. The notice asks whether you want to continue and includes a link to offender diversion organizations. The pop-up also notes that child sexual abuse is illegal and that viewing these images can lead to consequences, including imprisonment.

Last year, Facebook said it analyzed the child sexual abuse content reported to the National Center for Missing and Exploited Children. The company found that more than 90% of the content was the same as or similar to previously reported content. Copies of six videos made up more than half of the child exploitative content reported in October and November 2020.

“The fact that only a few pieces of content were responsible for many reports suggests that a greater understanding of intent could help us prevent this revictimization,” Davis wrote in the blog post. A separate analysis by the company showed that users were sharing these images for reasons other than harming the child, including “outrage or in poor humor.”

The second tool Facebook said it was testing is an alert that appears if you try to share these harmful images. The safety alert warns that if you share this type of content again, your account may be disabled. The company said it’s using this tool to help identify “behavioral signals” of users who might be at greater risk of sharing this harmful content. This will help the company “educate them on why it is harmful and encourage them not to share it on any surface — public or private,” Davis said.

Facebook also updated its child safety policies and reporting tools. The social media giant said it will pull down Facebook profiles, Pages, groups and Instagram accounts “that are dedicated to sharing otherwise innocent images of children with captions, hashtags or comments containing inappropriate signs of affection or commentary about the children depicted in the image.” Facebook users who report content will also see an option to let the social network know that the photo or video “involves a child,” allowing the company to prioritize it for review.

During the coronavirus pandemic, online child sexual abuse images have increased, according to a January report by Business Insider. From July to September, Facebook detected at least 13 million of these harmful images on the main social network and Instagram.
