Apple drops plan to scan phones for CSAM

Last August, Apple announced a controversial plan: it would scan the iPhones of US users for known child sexual abuse material (CSAM), using image hashes supplied by the National Center for Missing and Exploited Children (NCMEC). These hashes would be matched against the hashes of images on a user’s phone, and if at least 30 matches were found, the account would be reported to the relevant authority after review. After the plan drew widespread criticism and opposition, Apple suspended it. Now the company has abandoned the idea of scanning phones for CSAM altogether and has instead expanded its “Communication Safety” feature, which parents and caregivers can opt into through a family iCloud account to keep children’s communications safe, issuing warnings intended to prevent and reduce the creation of new CSAM.
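As a rough illustration of the threshold idea described above, the sketch below counts how many on-device image hashes appear in a known-hash list and only flags an account once at least 30 matches accumulate. This is a simplified conceptual model, not Apple's actual design, which used a perceptual hash (NeuralHash) and cryptographic techniques so matches were only revealed past the threshold; all names here are illustrative.

```python
# Conceptual sketch of threshold-based hash matching (not Apple's implementation).

MATCH_THRESHOLD = 30  # reports were only to be generated after ~30 matches


def count_matches(device_image_hashes, known_csam_hashes):
    """Count how many on-device image hashes appear in the known-hash set."""
    known = set(known_csam_hashes)
    return sum(1 for h in device_image_hashes if h in known)


def should_flag_for_review(device_image_hashes, known_csam_hashes):
    """Flag an account for human review only once the match threshold is crossed."""
    return count_matches(device_image_hashes, known_csam_hashes) >= MATCH_THRESHOLD
```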

This article is republished from: https://www.solidot.org/story?sid=73591