The pushback against Apple's plan to scan iPhone photos for child exploitation images was swift and apparently effective.
Apple said Friday that it is delaying the previously announced system that would scan iPhone users' photos for digital fingerprints that indicated the presence of known Child Sexual Abuse Material (CSAM). The change is in response to criticism from privacy advocates and public outcry against the idea.
"Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material," a September 3 update at the top of the original press release announcing the program reads. "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
Announced in August, the new feature for iOS 15 would have checked photos in an iPhone user's photo library — on the device before sending the photos to iCloud — against a database of known CSAM images. If the automated system found a match, the content would be sent to a human reviewer, and ultimately reported to child protection authorities.
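The matching step described above can be sketched in miniature. Apple's actual system used a perceptual hash called NeuralHash with additional cryptographic safeguards; the snippet below is not that system — it is an illustrative stand-in that uses an ordinary SHA-256 digest as the "fingerprint" and a plain Python set as the database of known images, purely to show the compare-against-a-blocklist idea.

```python
import hashlib

# Hypothetical blocklist of fingerprints for known images.
# The real system uses a perceptual hash (NeuralHash), which tolerates
# resizing and re-encoding; a cryptographic digest like SHA-256 does not.
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint (here, just a SHA-256 hex digest) for an image."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint appears in the blocklist."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

print(matches_known(b"known-image-bytes"))  # True
print(matches_known(b"an-ordinary-photo"))  # False
```

In the announced design, a match would not immediately trigger a report: flagged content first went to a human reviewer, and only then to child protection authorities.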
The fact that the scanning happened on the device alarmed both experts and users. Beyond it being generally creepy that Apple would have the ability to view photos users hadn't even sent to the cloud yet, many criticized the move as hypocritical for a company that has leaned so heavily into privacy. Additionally, the Electronic Frontier Foundation criticized the ability as a "backdoor" that could eventually serve as a way for law enforcement or other government agencies to gain access to an individual's device.
"Even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor," the EFF said at the time.
Experts who had criticized the move were generally pleased with the decision to do more research.
Others said the company should go further to protect users' privacy. The digital rights organization Fight for the Future said Apple should focus on strengthening encryption.
While other companies, like Google, scan cloud-based photo libraries for CSAM, and the overall goal of protecting children is obviously a good one, Apple thoroughly bungled the rollout of this product, with privacy concerns justifiably overshadowing the intended purpose. Better luck next time, folks.