
HIGHLIGHTS

  • Apple wants to make an informed decision before rolling out the child protection feature that was announced last month.
  • The Cupertino giant said in a fresh statement that it would not roll out the feature without making improvements to it based on feedback.
  • Apple had announced that its new feature would scan users’ photos for child sexual abuse material.

Apple wants to make an informed decision before rolling out the child protection feature that was announced last month. The Cupertino giant said in a fresh statement that it would not roll out the feature without making improvements to it based on feedback. Apple had announced that its new feature would scan users’ photos for child sexual abuse material. However, the move was severely criticised by privacy advocates, who argued that it breaches users’ privacy and could be exploited by governments.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” an Apple spokesperson told The Verge in a statement.

Apple had previously said that it wants to protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of Child Sexual Abuse Material (CSAM). While Apple’s intention to work in this area was laudable, its approach was rather unexpected. Apple had said that it would introduce new technology in iOS and iPadOS to detect known CSAM images stored in iCloud Photos. In effect, this means Apple would inspect the iCloud photo libraries of flagged users and report instances to the National Center for Missing and Exploited Children (NCMEC).

“Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices,” Apple said in a blog post.
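
As a rough illustration of the general idea described in that statement, and not of Apple’s actual NeuralHash or private-set-intersection design, a minimal Python sketch of on-device hash matching might look like the following; the hash function, the local hash database and the folder name are all hypothetical stand-ins.

    # Illustrative sketch only -- not Apple's actual NeuralHash or
    # private-set-intersection protocol. It shows the general idea in the
    # statement above: photos are hashed on the device and compared against
    # a locally stored database of known-image hashes, instead of being
    # scanned in the cloud. Hash function, database and folder name are
    # hypothetical stand-ins.

    import hashlib
    from pathlib import Path

    # Hypothetical local database of known-image digests. In Apple's
    # description, this is an unreadable (blinded) hash set provided by
    # child safety organizations.
    KNOWN_IMAGE_HASHES: set[str] = set()

    def image_digest(path: Path) -> str:
        """Stand-in for a perceptual hash: digest the photo's bytes on-device."""
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def matches_before_upload(photo_paths: list[Path]) -> list[Path]:
        """Return photos whose digests appear in the local known-hash database."""
        return [p for p in photo_paths if image_digest(p) in KNOWN_IMAGE_HASHES]

    if __name__ == "__main__":
        flagged = matches_before_upload(list(Path("Photos").glob("*.jpg")))
        print(f"{len(flagged)} photo(s) matched the on-device hash database")

The sketch uses an exact cryptographic digest for simplicity; Apple’s described system relies on perceptual hashes and a blinded database so that matches can be found without the device learning the database contents.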

Apple’s move to limit the spread of child sexual abuse material was criticised not only by privacy and security experts but also by WhatsApp head Will Cathcart. In a series of tweets, he explained that he would never adopt a system like Apple’s to curb the spread of CSAM. He alleged that Apple’s new surveillance system could easily be used to scan private content for anything Apple or a government decides it wants to control, and he raised several questions about the system as a whole.

Author: India Today
