SILICON VALLEY: Apple criticised for system that detects child abuse
SILICON VALLEY: Apple is facing criticism over a new system that finds child sexual abuse material (CSAM) on US users’ devices.

The technology will search for matches of known CSAM before the image is stored onto iCloud Photos.

But there are concerns that the technology could be expanded and used by authoritarian governments to spy on their own citizens.
WhatsApp head Will Cathcart called Apple’s move “very concerning”.

Apple said that new versions of iOS and iPadOS – due to be released later this year – will have “new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy”.

The system will report a match, which is then manually reviewed by a human. Apple can then take steps to disable a user’s account and report it to law enforcement.

The company says that the new technology offers “significant” privacy benefits over existing techniques, as Apple only learns about users’ photos if they have a collection of known child sexual abuse material in their iCloud account.
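The matching process described above – comparing fingerprints of photos against a database of known CSAM, and only flagging an account once matches pass a threshold – can be sketched in simplified form. This is an illustrative toy, not Apple’s implementation: the real system uses a perceptual hash (“NeuralHash”) and cryptographic techniques such as threshold secret sharing, whereas the function names, the plain SHA-256 stand-in, and the threshold value below are assumptions made for clarity.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Plain cryptographic hash used here as a stand-in for a
    # perceptual hash; a real perceptual hash also matches
    # visually similar (e.g. resized or re-encoded) images.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints supplied by a child-safety body.
KNOWN_HASHES = {fingerprint(b"known-image-1"), fingerprint(b"known-image-2")}

# Illustrative threshold: the account is only escalated for human
# review once the number of matches reaches this value.
MATCH_THRESHOLD = 2

def scan_before_upload(images: list[bytes]) -> bool:
    """Return True if the photo set should be escalated for manual review."""
    matches = sum(1 for img in images if fingerprint(img) in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD
```

In this sketch, a single match reveals nothing on its own; only a collection of matches crossing the threshold triggers review, which mirrors the privacy property Apple describes.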
But WhatsApp’s Mr Cathcart says the system “could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable”.

He argues that WhatsApp’s own system for tackling child sexual abuse material has reported more than 400,000 cases to the US National Center for Missing and Exploited Children without breaking encryption.

The Electronic Frontier Foundation, a digital rights group, has also criticised the move, labelling it “a fully-built system just waiting for external pressure to make the slightest change”.

But some politicians have welcomed Apple’s development.

Sajid Javid, UK Health Secretary, said it was time for others, especially Facebook, to follow suit.

US Senator Richard Blumenthal also praised Apple’s move, calling it a “welcome, innovative and bold step”.

“This shows that we can protect children and our fundamental privacy rights,” he added.



