Technology | Thursday 08.12.21

Apple’s Plan to Scan Your Photos

Apple is preparing to launch a system that scans iCloud Photos in the U.S. for child sexual abuse material (CSAM). Its NeuralHash process doesn't actually view your photos: it compares digital fingerprints of them against a database of known CSAM, and it is supposed to report an account to law enforcement only when it detects a "collection of known CSAM images." But privacy advocates are alarmed that a private company would build any kind of government backdoor into users' encrypted data.
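For the technically curious, here is a minimal sketch of the hash-matching idea in Python. It stands in a generic open-source perceptual hash (the imagehash library) for Apple's proprietary NeuralHash, and the database contents, distance cutoff, and 30-match reporting threshold are all illustrative assumptions, not Apple's actual parameters.

    from PIL import Image
    import imagehash

    # Hypothetical stand-in for the database of fingerprints of known CSAM
    # images. In Apple's system the hashes come from child-safety
    # organizations; the value below is a placeholder.
    KNOWN_HASHES = {imagehash.hex_to_hash("fedcba9876543210")}

    # Hypothetical reporting threshold: only a collection of matches,
    # not a single hit, triggers a report.
    MATCH_THRESHOLD = 30

    def library_exceeds_threshold(photo_paths):
        """Return True if enough photos match the known-hash database."""
        matches = 0
        for path in photo_paths:
            # Perceptual hash of the photo. Unlike a cryptographic hash,
            # it changes little when an image is resized or recompressed.
            h = imagehash.phash(Image.open(path))
            # A small Hamming distance means the photo is visually
            # near-identical to a known image.
            if any(h - known <= 4 for known in KNOWN_HASHES):
                matches += 1
        return matches >= MATCH_THRESHOLD

The key point the sketch illustrates is that the system never "looks at" photo content directly; it only checks whether fingerprints land close to entries in a pre-supplied list, which is exactly why critics worry about what else such a list could contain.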


