Child sexual abuse material on the web has grown exponentially in recent years. In 2019, 69.1 million files were reported to the National Center for Missing & Exploited Children in the US, triple the level of 2017 and a 15,000% increase over the previous 15 years. A new AI-powered tool called Safer aims to stem the flow of abusive content, find the victims, and identify the perpetrators. The system uses machine learning to detect new and unreported child sexual abuse material (CSAM). Thorn, the non-profit behind Safer, says it spots such content with greater than 99% precision. Thorn built the tool for businesses…