WhatsApp will not adopt Apple's child abuse image scanning on iPhones, citing privacy

Apple's latest move to fight child sexual abuse has sparked controversy online, and not everyone agrees with the tech giant.

WhatsApp's CEO, for example, has now expressed concern about Apple's new approach.

WhatsApp chief executive Will Cathcart made clear in a recent series of tweets that the messaging app will not adopt Apple's new feature once it launches.

Cathcart says the approach is "wrong" and a setback for people's privacy all over the world.

For those who don't know, Apple has announced a new tool that will scan photos uploaded to iCloud by iPhone and iPad owners for child sexual abuse material (CSAM).

The feature will roll out with iOS 15 and iPadOS 15 later this year and will use NeuralHash to compare photos against known CSAM images. You can check out how Apple's new tool will work here.
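For readers curious how hash-based image matching works in general, here is a minimal, illustrative sketch. To be clear, this is not Apple's NeuralHash (whose model, hash format, and thresholds are not public); it uses the open-source imagehash library, and the file names and threshold are purely hypothetical.

```python
# Illustrative perceptual-hash matching, NOT Apple's NeuralHash.
# Requires the third-party "Pillow" and "imagehash" packages.
from PIL import Image
import imagehash

# Hypothetical set of perceptual hashes of known flagged images.
known_hashes = {
    imagehash.phash(Image.open("known_flagged_image.jpg")),
}

def matches_known_image(photo_path: str, max_distance: int = 4) -> bool:
    """Return True if the photo's perceptual hash lies within a small
    Hamming distance of any hash in the known set (a near-duplicate)."""
    photo_hash = imagehash.phash(Image.open(photo_path))
    return any((photo_hash - known) <= max_distance for known in known_hashes)

print(matches_known_image("uploaded_photo.jpg"))
```

The key point is that only compact hashes are compared, never the raw photos themselves; Apple's system is described as working on a similar hash-comparison principle, though with its own model and safeguards.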

Naturally, the announcement of this feature was enough to divide the digital world into two groups – those who support it and those who do not.

The case for support is clear: Apple has good intentions for the feature. The flip side is the concern that this could change at any time, and that the tool could be repurposed for something far more dangerous that ends up destroying user privacy.

In his recent tweets, Cathcart emphasized this. He noted that Apple's new anti-CSAM feature "can scan all the private photos on your phone – even photos you haven't shared with anyone." That, as he puts it, "is not privacy."

He goes on to make a series of points against the feature. Among them, he says the system could be used to scan any kind of private content that Apple or a government wants.

He also asks how the tool will be handled across Apple's global markets, where the laws on the subject differ.

Finally, Cathcart highlights the harm that could follow if the tool were compromised by malicious actors.

The concerns raised by Cathcart are serious but not new. Several experts have raised similar concerns with Apple’s planned monitoring system since it was announced last week.

At the moment, there seem to be no answers to these legitimate questions about Apple's intentions for the new tool.

The simple reason is that on-device scanning this deep is unheard of. Additionally, Apple will use the feature across its services, including Messages and Siri.

That amounts to broader monitoring of your content. In simple words, what you do in these apps can be inspected.

In Apple's defense, the company will not be looking at this content itself.

Instead, it will use the NeuralHash system to match images against hashes of known child sexual abuse material. The big counterargument, however, is that such hashing systems are prone to mistakes.

When that happens, a completely unrelated image can be marked as CSAM and sent to Apple servers.
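Sticking with the illustrative perceptual-hash sketch above (again, not Apple's actual system; the file names and threshold are hypothetical), a false positive of this kind would simply be two unrelated images whose hashes happen to fall within the match threshold:

```python
# Illustrative only: how an unrelated photo could trip a hash-based match.
from PIL import Image
import imagehash

hash_a = imagehash.phash(Image.open("vacation_photo.jpg"))        # user's own photo
hash_b = imagehash.phash(Image.open("known_flagged_image.jpg"))   # known hash entry

distance = hash_a - hash_b  # Hamming distance between the two hashes
if distance <= 4:  # example threshold; a real system would tune this carefully
    print(f"False positive: flagged at distance {distance} despite unrelated content")
```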

In short, the tool could end up reaching far more of your device's content than the specific CSAM it is meant to find.

Cathcart may not have been the first to voice these concerns, but his public acknowledgment of their potential consequences carries real weight in the industry.

It also sends a clear message that if Apple does roll out such scanning of private content, not everyone will accept it without raising their voice.
