Fury at Apple’s plan to scan iPhones for child abuse images and report ‘flagged’ owners to the police


Privacy campaigners have expressed fears that Apple’s plan to scan iPhones for child abuse images will be a back door to accessing users’ personal data – and could easily be adapted to spot other material.

A trio of new safety tools have been unveiled in a bid to protect young people and limit the spread of child sexual abuse material (CSAM), the tech giant said.

While the measures are initially only being rolled out in the US, Apple plans for the technology to soon be available in the UK and other countries worldwide.

But Ross Anderson, professor of security engineering at Cambridge University, has branded the plan ‘absolutely appalling’.

Meanwhile Alec Muffett, a security researcher and privacy campaigner who previously worked at Facebook and Deliveroo, described the proposal as a ‘huge and regressive step for individual privacy’.

Mr Anderson said: ‘It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of our phones and laptops.’

The new Messages system will show a warning to a child when they are sent sexually explicit photos, blurring the image, reassuring them that it is OK not to view it, and pointing them to helpful resources.

Parents using linked family accounts will also be warned under the new plans.

As an extra precaution, children will also be told that if they do choose to view the image, their parents will be sent a notification.

Similar protections will be in place if a child attempts to send a sexually explicit image, Apple said.

Among the other features is new technology that will allow the company to detect known CSAM images stored in iCloud Photos and report them to law enforcement agencies.

It will be joined by new guidance in Siri and Search which will point users to helpful resources when they perform searches related to CSAM.

The iPhone maker said the new detection tools have been designed to protect user privacy and do not allow the tech giant to see or scan a user’s photo album.

Instead, the system will look for matches, securely on the device, based on a database of ‘hashes’ – a type of digital fingerprint – of known CSAM images provided by child safety organisations.

This matching will only take place when a user attempts to upload an image to their iCloud Photo Library.
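To make the ‘hash’ matching idea concrete, here is a minimal, hypothetical sketch in Python. Apple’s published design uses an on-device perceptual hash (NeuralHash) combined with cryptographic matching techniques rather than a plain digest; the SHA-256 fingerprint and the names KNOWN_HASHES, fingerprint and should_flag below are illustrative stand-ins, not Apple’s implementation.

```python
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known CSAM images, as would be
# supplied by child safety organisations. In Apple's real system these are
# perceptual NeuralHash values; placeholder strings are used here.
KNOWN_HASHES: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def fingerprint(image_path: Path) -> str:
    """Return a hex digest acting as the image's 'hash' fingerprint.

    SHA-256 is used only to keep this sketch self-contained; a real
    perceptual hash would also match re-encoded or resized copies.
    """
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def should_flag(image_path: Path, known_hashes: set[str]) -> bool:
    """Check the on-device fingerprint against the known-image list.

    Only the fingerprint is compared, never the photo itself, so a
    non-matching image reveals nothing about its content.
    """
    return fingerprint(image_path) in known_hashes

# Illustrative use: run the check only when a photo is about to be
# uploaded to iCloud Photos, mirroring the behaviour described above.
if __name__ == "__main__":
    photo = Path("example.jpg")
    if photo.exists() and should_flag(photo, KNOWN_HASHES):
        print("Match found against known-image database")
```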

Source: Dailymail.co.uk
