Internal Apple memo addresses public concern over new child protection features

As per a Reuters report, scores of Apple employees have expressed their concerns about the company’s child protection feature.


Apple’s decision to scan users’ iPhones for child sexual abuse content has kicked up a storm. Privacy advocates and even WhatsApp head Will Cathcart have voiced their opposition to Apple’s move to scan people’s phones for child sexual abuse content.

Now it is being reported that Apple’s own employees are unhappy with the company’s decision to roll out a feature that would scan users’ personal pictures.


As per the Reuters report, scores of Apple employees expressed their concerns about the company’s child protection feature. The report reveals that the employees sent close to 800 messages to an internal Apple Slack channel discussing the company’s move.

The employees were worried that the feature could be exploited by repressive governments such as China: it could be used to track content unrelated to child sexual abuse, effectively allowing Apple to be pressed into spying on people at a government’s insistence.


Apple employees who requested anonymity told Reuters that a group of them had created the Slack channel primarily to discuss Apple’s new child protection feature. However, some of the employees argued that Slack was not the best place to discuss upcoming policies.

Interestingly, there were also employees who supported Apple’s decision and believed that the CSAM detection feature would “crackdown on illegal material.”

Apple may succeed in slowing the spread of child sexual abuse material, but its approach has certainly raised privacy concerns. WhatsApp head Will Cathcart had earlier revealed in a series of tweets that he would never adopt Apple’s system for WhatsApp.


“Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world. Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven’t shared with anyone.

That’s not privacy,” Cathcart said in a tweet. He added that there has never been a mandate to scan the private content of all desktops, laptops, or phones globally for unlawful content.

Apple had said in its blog that new technology in iOS and iPadOS will allow it to detect known CSAM images stored in iCloud Photos. Once detected, the CSAM will be reported to the National Center for Missing and Exploited Children (NCMEC).
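At its core, detecting “known” images means comparing a hash of each photo against a database of hashes of previously identified material. Apple’s actual system uses a perceptual hash (NeuralHash) with on-device matching and cryptographic safety vouchers; the sketch below is a deliberately simplified illustration that substitutes a plain SHA-256 exact-match lookup, with hypothetical names and placeholder data throughout.

```python
import hashlib

# Simplified, hypothetical sketch of known-image matching.
# Real-world systems (including Apple's) use perceptual hashing so that
# re-encoded or resized copies still match; SHA-256 here only matches
# byte-identical files.

# Placeholder set standing in for a database of known-image hashes.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def is_known_image(image_bytes: bytes) -> bool:
    """Return True if this image's hash appears in the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(is_known_image(b"known-image-bytes"))  # True
print(is_known_image(b"some-other-image"))   # False
```

The exact-match limitation is why perceptual hashing matters in practice: a single changed byte defeats SHA-256 matching, whereas a perceptual hash is designed to survive minor transformations of the image.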

The feature will be rolled out in the United States first, and Apple will later expand it to other countries.
