WhatsApp Head Slams Apple Over Its Tools To Curb Child Abuse, Says Clear Violation Of Privacy

On Thursday, Apple confirmed plans to deploy new technology within iOS, macOS, watchOS, and iMessage that will detect potential child abuse imagery, and clarified crucial details of the ongoing project.

News Summary

Apple has said that other child safety groups were likely to be added as hash sources as the programme expands.

WhatsApp head Will Cathcart has slammed Apple over its plan to roll out photo-identification measures that would detect child abuse images in iOS photo libraries, saying the software can scan all of the private photos on a user's phone, which he called a clear violation of privacy.

Stressing that WhatsApp will not allow such a system to run on its platform, Cathcart said that Apple has long needed to do more to fight child sexual abuse material (CSAM), "but the approach they are taking introduces something very concerning into the world".
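
At a conceptual level, the detection at issue compares a fingerprint (hash) of each photo against a database of hashes of known abuse imagery supplied by child-safety organisations. The sketch below is a rough illustration of that idea only, not Apple's implementation: Apple's system reportedly uses a perceptual "NeuralHash" with blinded on-device matching, whereas this placeholder uses a plain SHA-256 digest, and every name and value in it is hypothetical.

import Foundation
import CryptoKit

// Illustrative sketch only. Apple's system reportedly uses a perceptual
// "NeuralHash" and blinded on-device matching; this placeholder uses a
// plain SHA-256 digest, and the hash database here is an empty stand-in.

/// Fingerprints of known abuse imagery, as supplied by child-safety groups
/// (placeholder: returns an empty set).
func loadKnownHashes() -> Set<String> {
    return []
}

/// Computes a hex fingerprint for the image bytes and checks it against
/// the database of known fingerprints.
func matchesKnownHash(imageData: Data, knownHashes: Set<String>) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let fingerprint = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(fingerprint)
}

// Example: a dummy photo never matches the empty placeholder database.
let known = loadKnownHashes()
let photo = Data([0x01, 0x02, 0x03])
print(matchesKnownHash(imageData: photo, knownHashes: known)) // prints "false"

The privacy debate centres on where and how such matching runs, since a hash comparison performed on-device still requires software capable of inspecting every photo in the library.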