Apple is about to announce a new technology for scanning individual users’ iPhones for banned content. While it will be billed as a tool for detecting child abuse imagery, details entering the public domain suggest its potential for misuse is vast.
The neural network-based tool will scan individual users’ iDevices for child sexual abuse material (CSAM), respected cryptography professor Matthew Green told The Register today.
Rather than using age-old hash-matching technology, however, Apple’s new tool – due to be announced today along with a technical whitepaper, we are told – will use machine learning techniques to identify images of abused children.
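For readers unfamiliar with the "age-old" approach Apple is reportedly moving away from, a minimal sketch of exact hash-matching follows. The digest values here are illustrative only; real deployments such as Microsoft's PhotoDNA use perceptual hashes, precisely because an exact cryptographic digest changes completely if even one byte of the image is altered.

```python
import hashlib

# Illustrative stand-in for a database of known-bad image digests
# (this value is simply SHA-256 of the bytes b"test").
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of raw image bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_hash(image_bytes: bytes) -> bool:
    # Exact matching: re-encoding, resizing, or flipping a single bit
    # defeats it, which is why perceptual hashing (and now, per this
    # report, machine learning) is used in practice.
    return sha256_of(image_bytes) in KNOWN_HASHES
```

The brittleness of exact matching is the key limitation: `matches_known_hash` flags a byte-for-byte copy but misses any trivially modified version of the same image.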