You can find the current article at its original source at https://www.rt.com/usa/533878-apple-csam-delay/

Apple DELAYS controversial plan to scan iPhones for child abuse images following privacy backlash
Apple has announced it will “take additional time” in the coming months to work on plans for flagging child sexual abuse material (CSAM), amid concerns from activists and rights groups over censorship and privacy issues.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” an Apple spokesperson said in a statement on Friday.
The delay follows a controversial announcement that was immediately met with calls from civil rights groups, including the American Civil Liberties Union (ACLU), to abandon the plans.
Apple’s technology would scan photos and conversations for CSAM using a system the company has claimed would still protect individual privacy, because it neither identifies the overall contents of a picture or conversation nor requires Apple to take possession of either – though many critics have voiced their doubts.
The system uses a database of reference ‘image hashes’ to recognize specific content to be flagged, though security experts have warned that such technology could be manipulated, or that innocent images could be misidentified.
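In rough terms, that kind of matching can be pictured as in the minimal Python sketch below. It assumes an ordinary cryptographic hash (SHA-256) standing in for Apple’s proprietary perceptual hash (“NeuralHash”), and all data and names are invented placeholders; it illustrates the database lookup, not the similarity matching the real system relies on.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Hex digest standing in for a perceptual image fingerprint."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of known flagged images
# (in the real system these would be supplied by child-safety organizations).
known_flagged = {image_hash(b"<bytes of a known flagged image>")}

def is_flagged(image_bytes: bytes) -> bool:
    """Match an image against the database by comparing fingerprints only;
    the picture itself is never inspected or transmitted."""
    return image_hash(image_bytes) in known_flagged

print(is_flagged(b"<bytes of a known flagged image>"))  # True: exact copy
print(is_flagged(b"<same image, one pixel altered>"))   # False with SHA-256
```

A perceptual hash, unlike SHA-256, is designed to still match such near-duplicates – which is also what makes collisions and adversarial manipulation the concern the experts describe.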
Even Apple employees have reportedly expressed concerns about the detection technology, worrying that it could be used to work around encryption protections, that it could easily misidentify and flag some photos – or even that some governments could exploit it to find other material. Apple maintains that it will refuse any requests from governments to use the system for anything other than child abuse images.
“iMessages will no longer provide confidentiality and privacy to those users through an end-to-end encrypted messaging system in which only the sender and intended recipients have access to the information sent,” read a letter from a coalition of more than 90 activist groups to Apple CEO Tim Cook on the potential changes.
The exact timeline for the current delay is unknown, but the new detection system was originally intended to be in use sometime this year.
If you like this story, share it with a friend!