Apple may bring a tool to scan iPhone photos for child abuse content


Apple has always made its tools and apps stand out. Now, Apple is reportedly developing a tool that would scan the iPhone photo library to detect child sexual abuse material (CSAM), including media related to child pornography. The tool has not been officially announced.

However, the report says it will run on the user's device, that is, on the client side. The tool will compare photos against a database of specific hashes and report matches directly to Apple's servers. The idea behind introducing the tool is to carry out these checks on the user's device itself. Apple says the approach will protect users' privacy, but it is not clear whether the system could be misused in one way or another.
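Apple has not published any implementation details, so purely as an illustration, a client-side hash check of the kind the report describes might look like the Swift sketch below. The hash database, the function name, and the use of SHA-256 are all assumptions made here for demonstration; a real system would more likely use a perceptual hash, since an exact cryptographic hash would miss resized or re-encoded copies of an image.

```swift
import CryptoKit
import Foundation

// Hypothetical on-device database of known hashes (a real list would
// be supplied by a child-safety organization, not hard-coded).
let knownHashes: Set<String> = [
    // SHA-256 of empty input, included so the example below matches.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

// Hash a photo's raw bytes and check the result against the known list.
// SHA-256 is only a stand-in for whatever (likely perceptual) hash
// function the real system would use.
func photoMatchesKnownHash(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Example: empty data hashes to the entry above, so this prints "match: true".
print("match:", photoMatchesKnownHash(Data()))
```

Only matches, not the photos themselves, would need to be reported under such a scheme, which is presumably how a client-side design could be framed as privacy-preserving.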

Cybersecurity expert Matthew Daniel Green, an associate professor at the Johns Hopkins Information Security Institute in the US, tweeted about the new Apple tool, writing that the tool under development "could eventually be a key ingredient in adding surveillance to encrypted messaging systems."

Green's statement also describes Apple's plan to launch the client-side system to detect child abuse images on the iPhone. He adds: "the way Apple is doing this launch, they are going to start with non-E2E photos that people have already shared with the cloud. So it doesn't hurt anyone's privacy. But you have to ask why anyone would develop a system like this if scanning E2E photos weren't the goal." In other words, the system would initially cover only photos that are not end-to-end encrypted, which users have already uploaded to the cloud.

The new tool is said to be built with privacy in mind. However, its exact scope is yet to be confirmed, as Apple has released no details. According to Green's tweet, an announcement may come within the week.
