Friday 10.01.21
The Truth Behind Missing White Woman Syndrome
Apple is ready to launch a system to scan iCloud Photos in the U.S. for child sexual abuse material (CSAM). Its NeuralHash process doesn’t actually view your photos; it matches hashes computed on your device against a database of known images, and it’s supposed to flag only accounts holding “collections of known CSAM images,” which are then reported to the National Center for Missing & Exploited Children. But privacy advocates are alarmed that a private company would build any kind of government-accessible backdoor into users’ encrypted data.
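For a rough sense of the mechanism, here’s a minimal sketch of threshold-based hash matching, the general idea behind the design. Everything here is hypothetical illustration: NeuralHash itself is a proprietary neural-network perceptual hash (so visually similar images produce matching hashes), and Apple’s actual protocol uses private set intersection and threshold secret sharing so nothing is learnable below the threshold; the toy hash, names, and sample data below are stand-ins, not Apple’s implementation.

```python
import hashlib
from typing import Iterable, Set

def toy_image_hash(image_bytes: bytes) -> str:
    """Hypothetical stand-in for a perceptual hash. A cryptographic digest
    like SHA-256 does NOT match visually similar images the way NeuralHash
    is designed to; it's used here only to keep the sketch self-contained."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of hashes of known CSAM images. In Apple's published
# design this database ships to the device in blinded (encrypted) form.
KNOWN_HASHES: Set[str] = {
    toy_image_hash(b"known-image-1"),
    toy_image_hash(b"known-image-2"),
}

# Apple's published design flags an account only after roughly 30 matches,
# which is what "collections of known CSAM images" refers to.
MATCH_THRESHOLD = 30

def count_matches(photos: Iterable[bytes]) -> int:
    """Count how many of a user's photos hash-match the known database."""
    return sum(1 for p in photos if toy_image_hash(p) in KNOWN_HASHES)

def should_flag_account(photos: Iterable[bytes]) -> bool:
    """An account becomes reportable only once matches cross the threshold;
    individual photos below it are never surfaced."""
    return count_matches(photos) >= MATCH_THRESHOLD
```

The threshold is the point privacy advocates fixate on: the math keeps single matches invisible, but the pipeline itself, once built, could in principle be pointed at any hash database a government supplies.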