SILICON VALLEY: Apple criticised for system that detects child abuse

SILICON VALLEY: Apple is facing criticism over a new system that finds child sexual abuse material (CSAM) on US users’ devices.

The technology will search for matches against known CSAM before an image is stored in iCloud Photos.
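As a rough illustration of this kind of pre-upload matching, the sketch below checks an image’s fingerprint against a set of known fingerprints. This is only an analogy under stated assumptions: Apple’s actual system uses a perceptual hash and cryptographic matching protocols, not the plain SHA-256 lookup shown here, and all names are hypothetical.

```python
import hashlib

# Hypothetical illustration only: a stand-in database of fingerprints
# of known images. Apple's real system matches perceptual hashes, so
# visually similar images map to the same fingerprint; a cryptographic
# hash like SHA-256 does not have that property.
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    # Derive a fixed-length fingerprint from the raw image bytes.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known(image_bytes: bytes) -> bool:
    # An image is flagged only if its fingerprint is already in the set;
    # the check runs before the image would be uploaded.
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

print(matches_known(b"known-image-bytes"))  # True
print(matches_known(b"some-other-photo"))   # False
```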

But there are concerns that the technology could be expanded and used by authoritarian governments to spy on their own citizens.

WhatsApp head Will Cathcart called Apple’s move “very concerning”.

Apple said that new versions of iOS and iPadOS – due to be released later this year – will have “new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy”.

The system will report a match, which is then reviewed by a human. Apple can then take steps to disable a user’s account and report the user to law enforcement.

The company says that the new technology offers “significant” privacy benefits over existing techniques, as Apple only learns about users’ photos if they have a collection of known child sexual abuse material in their iCloud account.
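The “collection” behaviour described above can be sketched as a simple threshold check: a single match reveals nothing about an account, and only a collection of matches triggers escalation. The threshold value and names below are illustrative assumptions, not figures reported in this article.

```python
# Hypothetical sketch of the collection threshold described above.
# The value 30 is illustrative only; the article gives no number.
MATCH_THRESHOLD = 30

def escalate_for_review(matched_image_count: int) -> bool:
    # Escalate an account for human review only once it holds a
    # collection of matched images, not on any single match.
    return matched_image_count >= MATCH_THRESHOLD

print(escalate_for_review(1))   # False: an isolated match stays private
print(escalate_for_review(45))  # True: a collection triggers review
```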

But WhatsApp’s Mr Cathcart says the system “could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable”.

He argues that WhatsApp’s system to tackle child sexual abuse material has reported more than 400,000 cases to the US National Center for Missing and Exploited Children without breaking encryption.

The Electronic Frontier Foundation, a digital rights group, has also criticised the move, labelling it “a fully-built system just waiting for external pressure to make the slightest change”.

But some politicians have welcomed Apple’s development.

Sajid Javid, UK Health Secretary, said it was time for others, especially Facebook, to follow suit.

US Senator Richard Blumenthal also praised Apple’s move, calling it a “welcome, innovative and bold step”.

“This shows that we can protect children and our fundamental privacy rights,” he added.
