Apple Delays Feature That Scans iPhones and iPads for Child Sexual Abuse Material Amid Public Concern


The tech company announced a rollout of iOS technology that would scan users' iCloud photos for child sexual abuse material, but has delayed the plan with no new date.

Apple released a statement last month sharing its plans to incorporate iOS technology that would scan customers' photos on their iPhones and iPads, citing the need for child protection, but has opted to delay the changes amid public concern.

According to its release, the company wants "... to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM)."

The technology has drawn pushback from many over potential privacy concerns. The Electronic Frontier Foundation released a statement denouncing the upcoming changes and supporting a petition against Apple, saying, "...it's impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children."

“As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.”

Apple has since announced it is delaying its rollout of the new features.

The company added an update to its release on Sept. 3, saying, "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
