Apple Preparing Its Employees to Answer CSAM-Related Questions


Earlier this month, Apple announced controversial child-safety features. Despite the good intentions, the new CSAM detection features have drawn a lot of backlash from industry experts. According to a new report from Bloomberg, Apple is circulating an internal memo that prepares employees to answer the CSAM-related questions customers might ask them.

The CSAM detection feature has drawn a lot of backlash. Edward Snowden went so far as to say that the feature would turn iPhones into “iNarcs.” Even Apple’s own employees have questioned the company’s decision. Amid the uproar, Apple released an internal memo to address the concerns and answered some important privacy-related questions in a detailed FAQ.

Now, according to the Bloomberg report, Apple is preparing its retail and support employees to field CSAM-related questions from customers. The company has asked employees to review the FAQ it published earlier this week.

The Cupertino giant circulated a memo, which reads:

You may be getting questions about privacy from customers who saw our recent announcement about Expanded Protections for Children. There is a full FAQ here to address these questions and provide more clarity and transparency in the process. We’ve also added the main questions about CSAM detection below. Please take time to review the below and the full FAQ so you can help our customers.

Earlier today, Apple software chief Craig Federighi cleared up some of the confusion surrounding CSAM detection. He explained that there is a threshold, and only after it is crossed will Apple be alerted to CSAM content in someone’s iCloud account: the review system must match about 30 images before Apple is notified. Moreover, Federighi said that Apple will use a database of image hashes sourced from multiple child protection organizations, and that the process will include an independent auditor.
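The threshold idea Federighi described can be illustrated with a minimal sketch. Note this is a hypothetical simplification, not Apple’s actual NeuralHash or private set intersection implementation; the function and variable names are made up for illustration, and real matching happens on perceptual hashes with cryptographic safeguards:

```python
# Hypothetical sketch of threshold-based flagging: an account is only
# surfaced for human review once the number of matches against a
# known-image hash database crosses a threshold.

MATCH_THRESHOLD = 30  # the figure Federighi cited


def should_flag(photo_hashes, known_hashes):
    """Return True only if enough photo hashes match the known database."""
    matches = sum(1 for h in photo_hashes if h in known_hashes)
    return matches >= MATCH_THRESHOLD
```

The point of the threshold is that a single accidental match (a hash collision on an innocent photo) never triggers a report; only a sizable collection of matches does.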


Apple’s child-safety features have been a matter of concern ever since their announcement. Some say it’s fine for the company to scan photos for child abuse material, while others say it’s a breach of their privacy. How do you feel about Apple scanning iCloud Photo Libraries for CSAM? Do you think this is a breach of your privacy? Drop a comment and let us know your thoughts!

[Via Bloomberg]