Apple promises that its child abuse scanner will not be made into a surveillance weapon

Apple has released a public briefing on its new tool designed to combat child sexual abuse material (CSAM) on its platforms. While the feature aims to do the right thing, it has drawn significant privacy concerns since Apple announced it last week.

Apple’s new response doubles down on its commitment to the tool and tries to allay some of those concerns.

The first concern was that the content-scanning tool could be extended to material beyond CSAM. Apple now promises that this will not happen.

In the statement, Apple says its CSAM detection tool is designed “to locate known CSAM images stored in iCloud Photos.”

It rules out any possibility of the company using the tool to scan for other kinds of content in iCloud.

Apple promises to hold to this commitment even if a government demands that the company use the tool for any other purpose.

It says it will “refuse any such demands” from the state, just as it has refused past “government-mandated changes that degrade the privacy of users.”

Apple’s statement comes in response to criticism from experts over how the tool could be used.

For those who don’t know, Apple’s new CSAM tool will algorithmically scan user photos being uploaded to iCloud Photos to detect known child abuse material.

Safety checks will also extend to iMessage and Siri searches. If a photo matches the hashes of known CSAM, it will be flagged by Apple for further review.
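To make that hash-matching idea concrete, here is a minimal, illustrative sketch in Python. It uses a simple “difference hash” compared by Hamming distance as a stand-in for Apple’s proprietary NeuralHash; the known-hash values and the matching threshold are made up for illustration, and none of this reflects Apple’s actual on-device implementation, which matches blinded hashes against an encrypted database.

```python
# Illustrative only: a simple "difference hash" (dHash) compared by Hamming
# distance, standing in for Apple's proprietary NeuralHash. The known-hash
# values below are made-up placeholders.
from PIL import Image  # requires Pillow


def dhash(image_path: str, hash_size: int = 8) -> int:
    """Build a 64-bit perceptual hash from horizontal brightness gradients."""
    img = Image.open(image_path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Count of differing bits between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical database of hashes of known flagged images.
KNOWN_HASHES = {0x8F3C1A2B4D5E6F70, 0x00FF00FF00FF00FF}


def matches_known(image_path: str, max_distance: int = 4) -> bool:
    """Flag a photo if its perceptual hash is close to any known hash."""
    h = dhash(image_path)
    return any(hamming(h, k) <= max_distance for k in KNOWN_HASHES)
```

Because a perceptual hash tolerates small changes to an image, near-duplicates of a known photo still match, which is the whole point of this class of scanning.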

It is the first time Apple has scanned user data of this kind directly on users’ devices, and critics have since raised many concerns about how the tool could be used.

The first worry was that Apple’s intentions for the tool could change in the future, though the company has now ruled that possibility out.


Other concerns involve the tool simply getting it wrong, as has happened with moderation systems on several social media platforms. How accurate Apple’s system will be is not yet known, and there is a chance that some completely harmless files get flagged by Apple’s tool as CSAM.

Then there is the danger of threat actors, who could abuse the hashing method Apple uses to craft harmless media whose hashes match those of known CSAM.

A completely harmless media file could then land a person in trouble with the authorities.

Apple is now pushing back on both of these concerns. It says the system is designed to be extremely accurate, with the likelihood of incorrectly flagging a given account at “less than one in one trillion per year.”

On top of that, every CSAM match will be manually reviewed by a human before any report is made.
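Apple’s published description also involves a threshold of matches that must be crossed before any human review happens. The sketch below illustrates that gating idea only; the threshold value, the Account class, and its methods are hypothetical, not Apple’s actual code.

```python
# Illustrative sketch of the threshold-plus-human-review safeguard described
# above. The threshold value and all names here are hypothetical.
from dataclasses import dataclass

MATCH_THRESHOLD = 30  # hypothetical number of matches before review


@dataclass
class Account:
    account_id: str
    match_count: int = 0

    def record_match(self) -> None:
        """Register one hash match reported for this account."""
        self.match_count += 1

    def needs_human_review(self) -> bool:
        """Only accounts past the threshold are ever shown to a reviewer."""
        return self.match_count >= MATCH_THRESHOLD


# Example: a single match never reaches a reviewer on its own.
acct = Account("user-123")
acct.record_match()
assert not acct.needs_human_review()
```

The point of the threshold is that an isolated false positive, on its own, should never put an account in front of a reviewer.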

Even so, it is easy to see how your files could end up passing before both human and machine eyes.

Exactly how this plays out will only be known once Apple rolls the feature out to users’ devices later this year with iOS 15 and iPadOS 15.

Also see: The iPhone 13 may include a new video portrait mode and other camera-related enhancements
