
Apple delays plan to check iPhones for child abuse images


The pushback against Apple's plan to scan iPhone photos for child exploitation images was swift and apparently effective.

Apple said Friday that it is delaying the previously announced system that would scan iPhone users' photos for digital fingerprints that indicated the presence of known Child Sexual Abuse Material (CSAM). The change is in response to criticism from privacy advocates and public outcry against the idea.

"Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material," a September 3 update at the top of the original press release announcing the program reads. "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."



Announced in August, the new feature for iOS 15 would have checked photos in an iPhone user's photo library — on the device before sending the photos to iCloud — against a database of known CSAM images. If the automated system found a match, the content would be sent to a human reviewer, and ultimately reported to child protection authorities.
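For readers curious about the mechanics, the sketch below shows the general flow of on-device matching in simplified form. It is not Apple's implementation: the announced system relied on a perceptual "NeuralHash" and cryptographic private set intersection, whereas this sketch uses an ordinary SHA-256 digest, a plain in-memory set, and hypothetical names (KNOWN_FINGERPRINTS, scan_before_upload) purely to illustrate the check-before-upload idea.

```python
# Minimal illustrative sketch of on-device fingerprint matching.
# NOT Apple's actual design: a SHA-256 digest and a plain set stand in
# for the perceptual hash and encrypted database described in the announcement.
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints for known images (hex digests).
KNOWN_FINGERPRINTS: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def fingerprint(photo_path: Path) -> str:
    """Digest of the photo's raw bytes; a stand-in for a perceptual hash."""
    return hashlib.sha256(photo_path.read_bytes()).hexdigest()

def scan_before_upload(photo_paths: list[Path]) -> list[Path]:
    """Return photos whose fingerprints match the known database.

    In the announced design, matches would be escalated to human review
    rather than acted on automatically.
    """
    return [p for p in photo_paths if fingerprint(p) in KNOWN_FINGERPRINTS]

if __name__ == "__main__":
    flagged = scan_before_upload(list(Path("photos").glob("*.jpg")))
    print(f"{len(flagged)} photo(s) flagged for review")
```

The key point the sketch captures is the one critics objected to: the comparison runs on the user's device, before anything reaches the cloud.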


The fact that the scanning happened on the device alarmed both experts and users. Beyond it being generally creepy that Apple would have the ability to view photos users hadn't even sent to the cloud yet, many criticized the move as hypocritical for a company that has leaned so heavily into privacy. Additionally, the Electronic Frontier Foundation criticized the ability as a "backdoor" that could eventually serve as a way for law enforcement or other government agencies to gain access to an individual's device.

"Even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor," the EFF said at the time.

Experts who had criticized the move were generally pleased with the decision to do more research.

Others said the company should go further to protect users' privacy. The digital rights organization Fight for the Future said Apple should focus on strengthening encryption.

While other companies, like Google, scan cloud-based photo libraries for CSAM, and the overall goal of protecting children is obviously a good one, Apple thoroughly bungled the rollout of this feature, with privacy concerns justifiably overshadowing the intended purpose. Better luck next time, folks.

Topics: Apple, Cybersecurity, iPhone, Privacy
