Today Apple announced a set of measures aimed at improving child safety in the USA. While well-intentioned, their plans risk opening the door to mass surveillance around the world while arguably doing little to improve child safety.
Among the measures, Apple announced that it will introduce “on-device machine learning” to analyse image attachments for sexually explicit material and warn users, and that it will begin scanning every photo stored in its customers’ iCloud accounts in order to detect child abuse imagery.
Apple, which has for years marketed itself as a global leader on privacy, is at pains to reassure users that these measures are “designed with user privacy in mind” – because, as the company explains it, all the processing is done on each user’s device. But make no mistake: no matter how Apple describes it, these measures undermine encryption and threaten everyone’s privacy and security.