WhatsApp condemned Apple’s new child safety tools as a “very concerning… surveillance system”, even as governments around the world applauded the company’s decision to proactively search for illegal images of child sexual abuse.
The stand-off sets up a battle between other tech platforms and the officials calling on them to adopt similar tools.
An Indian government official told the Financial Times on Friday that he “welcomed” Apple’s new technology, which set “a benchmark for other tech companies,” while an EU official said the tech group had designed a “pretty elegant solution”.
US Senator Richard Blumenthal called Apple’s new system an “innovative and bold step”.
“It’s time for others – especially Facebook – to follow their example,” tweeted Sajid Javid, UK Health Secretary and former Home Secretary.
However, Apple’s rivals in Silicon Valley are reportedly incandescent over its system to scan photos on US users’ iPhones before they are uploaded to iCloud, which will launch as part of the next version of iOS.
“This approach introduces something very concerning into the world,” said Will Cathcart, head of WhatsApp. “This is a surveillance system, built and operated by Apple, that could very easily be used to scan private content for anything they or a government decides it wants to control. It is troubling to see them act without engaging experts.”
“We will not adopt it at WhatsApp,” he added.
The enthusiastic response from lawmakers will only intensify concerns raised by the security and privacy community that Apple has set a dangerous precedent that could be exploited by repressive regimes or overzealous law enforcement.
Facebook-owned WhatsApp, the messaging apps Telegram and Signal, and Google with its Android operating system are already being asked to replicate Apple’s model.
“To say that we are disappointed by Apple’s plans is an understatement,” India McKinney and Erica Portnoy of digital rights group the Electronic Frontier Foundation said in a blog post. “Apple’s compromise on end-to-end encryption may appease government agencies in the US and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.”
Jennifer Granick, surveillance and cybersecurity counsel for the American Civil Liberties Union’s Speech, Privacy, and Technology Project, added: “… our phones.”
Political pressure on tech companies has grown around the world in recent months to allow governments to access encrypted content, including messages, photos and videos.
Indian Prime Minister Narendra Modi’s government recently passed laws requiring technology platforms such as WhatsApp to trace the source of unlawful messages, breaking end-to-end encryption. WhatsApp is currently locked in a legal battle with the government in an attempt to block the new rules.
Last October, in an open letter signed by the “Five Eyes” countries plus Japan and India, officials including UK Home Secretary Priti Patel and then US Attorney General William Barr urged “industry to address our serious concerns where encryption is applied in a way that wholly precludes any legal access to content”.
They cited child abuse as one of the reasons they believed tech companies should develop alternative ways to give authorities access to content on devices, saying there was “a growing consensus across governments and international institutions that action must be taken”.
Critics have expressed skepticism about Apple’s promise to limit itself to searching for child abuse images. “I hate going down a slippery slope, but I look at the slope, and governments around the world are covering it with oil, and Apple just pushed its customers over the edge,” said Sarah Jamie Lewis, cryptography researcher and executive director of the Canadian NGO Open Privacy.
While there is as yet no US law requiring Apple to search for such material, its move comes as the UK and the EU prepare new legislation – the Online Safety Bill and the Digital Services Act – that would further oblige tech companies to limit the spread of child sexual abuse material, among other forms of harmful content.
Apple’s decision to press ahead with its own individual system, rather than join cross-industry negotiations with regulators around the world, has annoyed its Silicon Valley neighbours – especially after many of them united to support its 2016 legal fight against the FBI over access to a terrorist suspect’s iPhone.
“Some of the reactions I’ve heard from Apple’s competitors are that they are incandescent,” Matthew Green, a security professor at Johns Hopkins University, said during an online video discussion with Stanford University researchers on Thursday.
Alex Stamos, the former Facebook security chief who is now director of the Stanford Internet Observatory, said in the same discussion that Apple “doesn’t care at all that everyone else has been trying to find that delicate international balance”. “Obviously there will be immediate pressure on WhatsApp,” he added.
An Apple executive on Thursday acknowledged the fury its move had sparked in an internal memo. “We’ve seen many positive responses today,” wrote Sebastien Marineau in a note obtained by the Apple blog 9to5Mac. “We know some people have misunderstandings, and many are worried about the implications, but we will continue to explain and detail the features so people understand what we’ve built.”
Facebook and Google have yet to comment publicly on Apple’s announcement.
Apple has already been criticised by some for not doing more to prevent the circulation of abusive content, especially on iMessage. Because the iPhone’s messaging app is end-to-end encrypted, the company cannot see photos or videos exchanged between its users.
Messages exchanged between two senior Apple engineers, which were produced as evidence in the iPhone maker’s recent legal battle with Epic Games, suggest that some within the company believed it could do more.
In the exchange, which dates from early last year and was first uncovered by the Tech Transparency Project, Eric Friedman, head of Apple’s Fraud Engineering Algorithms and Risk unit, suggested that, compared with Facebook, “we are the greatest platform for distributing child pornography”.
“We have chosen not to know in enough places where we really cannot say” how much child pornography might be present, Friedman added.
Additional reporting by Stephanie Findlay in Delhi, Valentina Pop in Brussels and Hannah Murphy in San Francisco