Photo: Anton_Ivanov (Shutterstock)

Apple has officially killed one of its most controversial proposals ever: a plan to scan iCloud images for signs of child sexual abuse material (CSAM).

Last summer, Apple announced that it would be rolling out on-device scanning, a new feature in iOS that used advanced tech to quietly sift through individual users' photos for signs of abusive material. The feature was designed so that, should the scanner find evidence of CSAM, it would alert human technicians, who would then presumably alert the police.

The plan immediately inspired a torrential backlash from privacy and security experts, with critics arguing that the scanning feature could ultimately be repurposed to hunt for other kinds of content. Having such scanning capabilities in iOS was a slippery slope toward broader surveillance abuses, critics alleged, and the general consensus was that the tool could quickly become a backdoor for police.

At the time, Apple fought hard against these criticisms, but the company ultimately relented and, not long after it initially announced the new feature, said that it would "postpone" implementation until a later date.

Now, it looks like that date will never come. On Wednesday, amid announcements for a bevy of new iCloud security features, the company also revealed that it would not be moving forward with its plans for on-device scanning. In a statement shared with Wired, Apple made it clear that it had decided to take a different route:

"After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021.
We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all."

Apple's plans seemed well-intentioned. CSAM's digital proliferation is a major problem, and experts say it has only gotten worse in recent years. An effort to solve this problem was obviously a good thing. That said, the underlying technology Apple suggested using carried surveillance dangers that made it the wrong tool for the job.
Apple Officially Ditches Plan to Scan iCloud for Child Abuse Images