
How Apple Will Scan Photos Saved On iPhone For Child Abuse

By Sam Brad

Aug 6, 2021

Apple has implemented NeuralHash, a technology it claims will allow it to find evidence of crimes, including child pornography, on users’ iPhones, a move that has already drawn criticism from experts.

The system was first revealed by cryptography expert Matthew Green, and just a few hours later Apple confirmed the launch of a tool capable of scanning the content stored on an iPhone for photographs depicting the abuse of minors. Apple could thus find out whether a user has such content and report it to the authorities.

To allay privacy fears, Apple has stated that the program does not need to copy files to external servers; instead, it uses the power of the iPhone’s chip to perform the analysis locally. In this way, Apple intends to demonstrate a stronger commitment to privacy, since the files never leave the device.

In addition, Apple says the system is effectively optional, since only the photos and files uploaded to iCloud, its cloud storage service, will be analyzed; in fact, it goes so far as to claim that its method is more private than other cloud services, which are required by US law to analyze files uploaded to their servers.

Apple, by contrast, does that work on the user’s phone, and only if a series of requirements are met is the file uploaded to a server for an employee to check.

However, that has not been enough to convince experts like Green, who have already pointed out several worrying details about this method. For starters, there is the problem of false positives; no matter how good an algorithm is, it is bound to fail a certain percentage of the time.

In those cases, files flagged as suspicious by the algorithm would have to be sent to a server for an employee to examine and confirm that they are actually illegal; in other words, a stranger could end up viewing a user’s private photos and files.

But even if the algorithms are the best on the market, the system has the potential to be misused by governments and law enforcement agencies. That is because the detection method is based on comparing “hashes”: strings of numbers generated by applying a mathematical operation to the files. The logic is that if two files have the same hash, they are identical.
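
As a rough illustration of the idea, and assuming nothing about Apple’s actual implementation, the following Python snippet uses the common SHA-256 function: identical inputs always produce the same hash, while any change produces a different one. (NeuralHash itself is a perceptual hash, designed so that visually similar images also match, not just byte-identical ones.)

```python
import hashlib

# Two byte-for-byte identical "files" and one that differs slightly.
original  = b"example photo bytes"
duplicate = b"example photo bytes"
modified  = b"example photo bytes!"

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(duplicate).hexdigest()
h3 = hashlib.sha256(modified).hexdigest()

assert h1 == h2  # identical files -> identical hashes
assert h1 != h3  # any difference -> a different hash
```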

Apple holds a list of hashes of known illegal or criminal files, initially obtained from public sources, which the algorithm can compare against the files stored on the iPhone.
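
Putting the pieces together, here is a minimal sketch of how such an on-device check might work, including the “requirements” mentioned earlier before anything is escalated. Every name here is hypothetical, the blocklist is left empty for illustration, and a cryptographic hash stands in for Apple’s perceptual NeuralHash:

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of hashes of known illegal files.
KNOWN_BAD_HASHES: set[str] = set()

# Illustrative threshold: nothing is escalated for human review
# until several matches have accumulated on the device.
REVIEW_THRESHOLD = 5

def file_hash(path: Path) -> str:
    """Stand-in for the on-device hashing step."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_photo_library(photos: list[Path]) -> list[Path]:
    """Compare each photo's hash against the blocklist locally.

    Returns the files to flag for human review, but only once the
    match count crosses the threshold; below it, nothing is reported
    and no file leaves the device.
    """
    matches = [p for p in photos if file_hash(p) in KNOWN_BAD_HASHES]
    return matches if len(matches) >= REVIEW_THRESHOLD else []
```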

The problem with this approach is that its usefulness, and whether it is used responsibly, depends largely on who decides what goes onto the list; Green, for example, points out how powerful this tool would be in the hands of an authoritarian government, which could create a blacklist of files that citizens are not allowed to have on their phones.

This is not such a far-fetched scenario. Countries like China have already developed methods to analyze files transmitted over the Internet, and governments like that of the United States have repeatedly signaled their intention to mandate “back doors” in end-to-end encryption in the name of “protecting the public, especially minors.”

A system like the one Apple has developed could be used to obtain the content of communications once they have been decrypted on the user’s phone, and the fear is that the company could end up providing this capability to governments, whether it wants to or not.

For now, Apple’s new system has been implemented only in the US. In the European Union, systems that analyze the files of cloud-service users are prohibited under new rules that have already affected services such as Facebook, so it is not yet clear whether Apple will try to bring this technology to Spain as well.

Sam Brad

Sam Brad is a writer and poet. He graduated from the University of Florida with a degree in journalism and has around 12 years of experience in news and media.
