WhatsApp head Will Cathcart slams Apple over its new plan to detect child abuse photos on iPhones

Apple on Friday announced that it will launch a new program to detect images of child sexual abuse on iPhones.

The Cupertino giant said the decision was taken to curb the proliferation of child sexual abuse material (CSAM).

However, Apple’s new move has not gone down well with privacy advocates, including WhatsApp head Will Cathcart.

In a series of tweets, Cathcart said Apple’s approach was wrong and would seriously undermine people’s privacy around the world.

Apple is widely considered one of the strongest companies when it comes to user safety and privacy.

So it was unsettling to hear Apple announce that it will scan users’ iCloud photo storage for child abuse material.

While Apple may be successful in reducing the prevalence of CSAM, the move may come at a cost to privacy and user safety.

In his thread, the WhatsApp head explained why WhatsApp would not adopt Apple’s system.

“Apple has been fighting CSAM for a long time, but the approach they are taking introduces something very concerning into the world.

Instead of focusing on making it easier for people to report shared content, Apple has developed software that can scan all private photos on your phone – even photos you’ve never shared with anyone.

That is not privacy,” Cathcart tweeted. He said there has never been a mandate to scan the private content of all desktops, laptops, or phones around the world for illegal content.

Cathcart suspects Apple’s new scanning system could easily be used to scan private content for anything a company or government decides it wants to control. He also raised several questions about the whole system.

“What will happen when spyware companies find a way to exploit this software? Recent reporting has shown the cost of vulnerabilities in iOS software as it stands.

What happens when someone finds a way to exploit this new system? There are a lot of problems with this approach, and it is troubling to see Apple act without engaging the experts who have long documented their technical and broader concerns about it,” he added.

However, when announcing the new program, Apple said that it does not learn anything about images that do not match the known CSAM database.

The Cupertino giant also stated that Apple “cannot access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account.”
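For context on how such a scheme works at a high level: each photo is hashed and compared against a database of known-CSAM hashes, and an account is only flagged once its match count crosses a threshold. The Python sketch below is a minimal illustration of that threshold logic only; the hash function, database entries, names, and threshold value are all hypothetical stand-ins. Apple’s real system uses a perceptual hash (NeuralHash) plus cryptographic protocols that this sketch does not model.

```python
# Minimal conceptual sketch of threshold-based hash matching.
# NOT Apple's implementation: names and values here are hypothetical.
import hashlib

# Stand-in database of known-image hashes (placeholder entries).
KNOWN_CSAM_HASHES = {"placeholder_hash_a", "placeholder_hash_b"}

# Hypothetical threshold; Apple has not published the exact number.
MATCH_THRESHOLD = 30

def image_hash(image_bytes: bytes) -> str:
    # Placeholder: a real system would use a perceptual hash so that
    # near-identical images match; SHA-256 only matches identical files.
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(photos: list[bytes]) -> int:
    # Count photos whose hash appears in the known-hash database.
    return sum(1 for p in photos if image_hash(p) in KNOWN_CSAM_HASHES)

def account_exceeds_threshold(photos: list[bytes]) -> bool:
    # Per Apple's description, nothing about matched images is
    # accessible until the per-account match count crosses the threshold.
    return count_matches(photos) >= MATCH_THRESHOLD
```

In the described design, the below-threshold case is what the cryptography protects: no individual match is revealed to the server until the account as a whole crosses the line, which this plain-Python sketch can only approximate.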
