Apple has issued a statement announcing its decision to delay the detection technology that would have scanned users' iOS devices for child sexual abuse material.
When the details of the child-protection technology first emerged, concerns were raised, especially by privacy groups. Users took the issue very seriously: petitions circulated urging Apple to abandon the plan.
“The company must go further than just listening and drop its plans to put a backdoor into its encryption entirely.”
“The enormous coalition that has spoken out will continue to demand that user phones – both their messages and their photos – be protected and that the company maintains its promise to provide real privacy to its users.”
Apple said it had heard the negative feedback and decided to delay the system update that would have included the new technology.
The NeuralHash technology is the key component of the system: it would have scanned photos before they were uploaded to iCloud Photos. Potentially prohibited images would have been compared against a database of known material maintained by the National Center for Missing and Exploited Children (NCMEC), an organization dedicated to child protection.
If matches had been found, the account would have been banned immediately and the violation reported to law enforcement.
Apple claimed that the technology is “extremely” accurate, with an account being incorrectly flagged no more than once in a trillion cases. The technology giant had earlier said the system would also identify slightly edited photos by comparing them to the originals.
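To give a rough sense of how hash-based matching can tolerate small edits, the sketch below uses a classic "average hash" over an 8x8 grayscale image and Hamming-distance comparison. This is only a minimal illustration of the general perceptual-hashing idea, not Apple's actual NeuralHash (which uses a neural network and cryptographic safeguards); all function names and the distance threshold here are illustrative assumptions.

```python
# Simplified illustration of perceptual-hash matching, NOT Apple's NeuralHash.
# A perceptual hash maps an image to a short bit string such that visually
# similar images (e.g. slightly edited copies) get nearby hashes.

def average_hash(pixels):
    """Hash an 8x8 grayscale image (list of 64 ints, 0-255) to a 64-bit int.

    Each bit is 1 if the corresponding pixel is brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    h = 0
    for p in pixels:
        h = (h << 1) | (1 if p > mean else 0)
    return h

def hamming_distance(a, b):
    """Number of bit positions where the two hashes differ."""
    return bin(a ^ b).count("1")

def matches_database(image_hash, known_hashes, max_distance=4):
    """Flag a match if the hash is within max_distance bits of any known hash.

    A small positive threshold is what lets slightly edited copies still match."""
    return any(hamming_distance(image_hash, k) <= max_distance
               for k in known_hashes)
```

In this toy version, changing a single pixel shifts only a couple of bits, so the edited image still matches its original, while an unrelated image lands far away in Hamming distance.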
The update had been planned for release by the end of the year.