After much criticism, Apple says it will only hunt for images flagged in multiple countries

Apple has been in the news for weeks over its child safety policies. The company announced a few days ago that it would release a new feature that scans iCloud Photos for images of child sexual abuse.

The announcement drew the ire of lawmakers and privacy advocates, who strongly criticized the move.

However, Apple has now released a statement saying its system will only hunt for images that have already been flagged by child-safety clearinghouses in multiple countries.

According to a Reuters report, Apple said a threshold of 30 matching images must be reached on a person's account before the system alerts the company; only then does a human reviewer examine the matches and decide whether to report the account to the authorities.

Apple said it would start with a threshold of 30, but that the number could be lowered over time.

“Before the threshold is exceeded, the cryptographic construction does not allow Apple servers to decrypt any match data, and does not permit Apple to count the number of matches for any given account.

After the threshold is exceeded, Apple servers can only decrypt vouchers corresponding to positive matches, and the servers learn no information about any other images.

The decrypted vouchers allow Apple servers to access a visual derivative, such as a low-resolution version, of each matching image,” Apple explained in a lengthy technical paper.
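The property Apple describes, that nothing is decryptable until at least 30 matches exist, is typically built on threshold secret sharing. The sketch below is not Apple's actual construction (which uses a more elaborate protocol); it is a minimal Shamir secret sharing example illustrating the core idea that a decryption secret only becomes recoverable once a threshold number of shares (one per matching image) is available. All names here are illustrative.

```python
import random

# A large prime used as the field modulus (any prime larger than the secret works).
PRIME = 2**61 - 1

def make_shares(secret, threshold, num_shares):
    """Split `secret` so that any `threshold` shares reconstruct it,
    while fewer shares reveal nothing about it."""
    # Random polynomial of degree threshold-1 with the secret as the constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, num_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # Modular inverse via Fermat's little theorem (PRIME is prime).
        total = (total + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return total

# Example: a key split with threshold 30 — 29 vouchers are useless,
# but any 30 of them recover the key.
key = 123456789
shares = make_shares(key, threshold=30, num_shares=40)
print(reconstruct(shares[:30]) == key)
```

In the real system the server holds one such share per flagged image; the per-image content only becomes readable once enough shares accumulate, which is why Apple can claim it cannot even count matches below the threshold.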

The Reuters report also revealed that Apple acknowledged it had handled communications around the upcoming technology poorly. However, the company has not disclosed whether it changed any of its policies in response to the criticism.

The Cupertino giant has confirmed that the technology is still in development and that changes are expected before the final release.

Earlier, it was reported that Apple's own employees were unhappy with the child safety features; employees had sent nearly 800 messages to an internal Slack channel discussing the company's move.

Workers were concerned that the feature could be exploited by oppressive governments, such as China's, which could pressure Apple through government requests to repurpose the system to track content beyond child sexual abuse material.

Apple had previously stated on its blog that the feature would launch in the United States first, with expansion to other countries to follow later.
