As platforms get better at detecting child abuse videos, they’re finding more of them

More than 41 million videos of child sexual abuse were reported to the National Center for Missing and Exploited Children in 2019, partly because the videos have become easier for technology platforms like Facebook and Snapchat to detect, The New York Times reports. Just five years ago, fewer than 350,000 videos were reported. A record-breaking 70 million total images and videos were reported to the center last year; many of them were reported more than once across multiple platforms as users shared the illegal content.
Facebook reported almost 60 million photos and videos, according to the Times, based on some 15.9 million reports. Facebook tells The Verge that not all of that content is considered “violating” and that only about 29.2 million items met that criterion.
Google reported 3.5 million total videos and photos in about 449,000 reports, and Imgur reported 260,000 photos and videos based on about 74,000 reports. It’s worth noting that the number of reports and the number of actual images discovered weren’t always proportional. For instance, Dropbox only made about 5,000 reports in 2019, but found more than 250,000 photos and videos, according to the Times. Apple was apparently one of the lower-reporting big companies, submitting tips on about 3,000 images and no videos. And Amazon was almost entirely absent from the list.
It makes sense that Facebook would have the highest number of reports: it’s the largest social media platform, with more than 2.3 billion users. And last August, Facebook open-sourced the algorithms it uses to identify child sexual exploitation and other graphic content so it can remove such content more quickly. Its place at the top of the list, in other words, may simply reflect that it’s doing more than anyone else to actively find this content.
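The tools Facebook open-sourced (PDQ for photos, TMK+PDQF for videos) are perceptual-hashing algorithms: they fingerprint a piece of media and compare that fingerprint against hashes of known abusive material, so that a resized or re-encoded copy still matches. Below is a minimal Python sketch of that hash-matching idea; the hash values, the threshold, and the function names are invented for illustration, and real PDQ hashes are 256 bits with more involved matching:

    # Illustrative sketch of perceptual-hash matching, the idea behind
    # tools like PDQ. All hash values and the threshold are made up;
    # real PDQ hashes are 256 bits and matching is more sophisticated.

    def hamming_distance(a: int, b: int) -> int:
        """Count the bit positions where two hashes differ."""
        return bin(a ^ b).count("1")

    # Hypothetical database of hashes of known illegal images.
    KNOWN_HASHES = {0b1011_0010_1110_0001, 0b0100_1101_0001_1110}

    # Hypothetical threshold: hashes this close are treated as the
    # same image, even after resizing or re-encoding.
    MATCH_THRESHOLD = 2

    def is_known_image(upload_hash: int) -> bool:
        """Flag an upload whose hash is near any known hash."""
        return any(
            hamming_distance(upload_hash, known) <= MATCH_THRESHOLD
            for known in KNOWN_HASHES
        )

    print(is_known_image(0b1011_0010_1110_0011))  # True: 1 bit off a known hash
    print(is_known_image(0b0000_0000_0000_0000))  # False: far from every known hash

This is why platforms rely on perceptual rather than cryptographic hashes: a cryptographic hash changes completely when a single pixel does, while a perceptual hash of a near-duplicate lands within a small Hamming distance of the original.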
“The size and expertise of our team, together with our sophisticated technology, have made us industry leaders in detecting, removing and reporting these images, and thwarting people from sharing them,” said Antigone Davis, Facebook’s global head of safety, in an emailed statement to The Verge, adding that the company “would continue to develop the best solutions to keep more children safe.”
But even with better detection of video content, it’s still not possible to fully map the scope of online videos of child sexual abuse. As the Times notes, some cloud storage services, including Amazon’s, don’t scan for illegal content. And content on Apple’s messaging app is encrypted, so Apple can’t scan it to find illegal material.
Privacy is at the heart of a debate over how to detect and remove this content without introducing unnecessary friction for users. Facebook is considering moving toward encryption, but it’s taking a lot of flak for it. And a draft bill to create a National Commission on Online Child Exploitation Prevention would reduce legal protections for websites while establishing rules for detecting and removing content that exploits children, potentially including limits on encryption.

Source: The Verge


Keywords:

Child abuse, Child sexual abuse, National Center for Missing & Exploited Children, Facebook, Snapchat, The New York Times, Google, Imgur, Dropbox, Apple Inc., Amazon.com, Social media, Open-source software, Cloud computing, Instant messaging, Privacy, Encryption, Law