Apple replacing industry-standard end-to-end encrypted messaging system with infrastructure for surveillance and censorship — iPhone as spyPhone
You should now assume you have ZERO privacy with Apple services like iCloud and everything else Apple offers. Coming soon to macOS also?
This is NOT about child abuse images; that’s propaganda to get most users to accept the spyware infrastructure. Later, all sorts of social credit scoring add-ons can be made. Presumably Apple will be working hand-in-hand with Communist China first, where human life has no value. This technology will be welcomed with open arms by the brutes in the CCP, who will quickly make expanding it for CCP purposes a condition of doing business in China.
This is about an infrastructure which can be put to use for any and all of your data. It doesn’t matter what Apple claims it is limited to doing now. What matters is that this is a general purpose capability. And your system resources will be used to scan data, without your consent.
This change is a MOAB for goodwill towards Apple—this single action reduces all of Apple’s past security and privacy claims to meaningless twaddle. But that should have been obvious already to anyone paying attention to Apple PR.
This new infrastructure is a backdoor which makes a mockery of security and privacy.
As a practical matter, it is incredibly short-sighted too: annihilate user trust for the short-term benefit of catching a few sickos. But only technology-ignorant child-abusers will fail to turn off iCloud photo syncing, which at the moment is what the Apple system counts on. Everyone else gets spied on. Like burning down a barn to eliminate the rats, who mostly flee elsewhere.
It’s hard to believe that Tim Cook is this dumb.
Fury at Apple's plan to scan iPhones for child abuse images and report 'flagged' owners to the police after a company employee has looked at their photos
Data privacy campaigners are raging today over Apple's 'appalling' plans to automatically scan iPhones and cloud storage for child abuse images and nudity, accusing the tech giant of opening a new back door to accessing personal data and 'appeasing' governments who could harness it to snoop on their citizens.
...There are concerns that the policy could be a gateway to snoop on iPhone users and could also target parents innocently taking or sharing pictures of their children because 'false positives' are highly likely. But Apple insists there is a one-in-one-trillion probability of a false positive.
Others fear that totalitarian governments with poor human rights records could, for instance, harness it to convict people for being gay if homosexuality is a crime.
While the measures are initially only being rolled out in the US, Apple plans for the technology to soon be available in the UK and other countries worldwide.
Ross Anderson, professor of security engineering at Cambridge University, has branded the plan 'absolutely appalling'. Alec Muffett, a security researcher and privacy campaigner who previously worked at Facebook and Deliveroo, described the proposal as a 'huge and regressive step for individual privacy'.
Mr Anderson said: 'It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of our phones and laptops.'
Campaigners fear the plan could easily be adapted to spot other material.
Greg Nojeim of the Center for Democracy and Technology in Washington, DC said that 'Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship.'
...Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple's algorithm and alert law enforcement. 'Researchers have been able to do this pretty easily,' he said of the ability to trick such systems.
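Green’s warning rests on how perceptual hashing works: unlike a cryptographic hash, it is deliberately tolerant of pixel-level changes so that resized or re-encoded copies of an image still match. The sketch below is a toy “average hash” (NOT Apple’s NeuralHash; all values are illustrative) showing that a slightly altered image yields an identical hash — the same looseness that researchers exploit to engineer images that collide with a targeted hash:

```python
# Toy illustration (not Apple's NeuralHash): an "average hash" shows why
# perceptual matching tolerates pixel changes -- the property that lets
# re-encoded copies match also makes engineered collisions feasible.

def average_hash(pixels):
    """Hash an 8x8 grayscale image (a list of 64 intensity values):
    each bit records whether that pixel is brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

original = [(i * 3) % 200 for i in range(64)]   # stand-in "image"
tweaked  = [p + 3 for p in original]            # small uniform perturbation

# The perturbed image still produces the exact same hash -- a "match".
assert average_hash(original) == average_hash(tweaked)
```

Because matching is fuzzy by design, an attacker does not need a byte-identical file — only an image whose hash lands close enough to a database entry, which is what makes the framing attack Green describes plausible.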
MPG: technology like this is guaranteed to be expanded, and to be abused. Heck, if Google and Facebook can get away with it, Apple apparently wants to one-up them with direct intrusion on formerly private data.
by Edward Snowden, August 25, 2021
...Why is Apple risking so much for a CSAM-detection system that has been denounced as “dangerous” and “easily repurposed for surveillance and censorship” by the very computer scientists who’ve already put it to the test? What could be worth the decisive shattering of the foundational Apple idea that an iPhone belongs to the person who carries it, rather than to the company that made it?
Apple: "Designed in California, Assembled in China, Purchased by You, Owned by Us."
The one answer to these questions that the optimists keep coming back to is the likelihood that Apple is doing this as a prelude to finally switching over to “end-to-end” encryption for everything its customers store on iCloud—something Apple had previously intended to do before backtracking, in a dismaying display of cowardice, after the FBI secretly complained.
... optimists are wrong: Apple’s proposal to make their phones inform on and betray their owners marks the dawn of a dark future, one to be written in the blood of the political opposition of a hundred countries that will exploit this system to the hilt. See, the day after this system goes live, it will no longer matter whether or not Apple ever enables end-to-end encryption, because our iPhones will be reporting their contents before our keys are even used.
I can’t think of any other company that has so proudly, and so publicly, distributed spyware to its own devices—and I can’t think of a threat more dangerous to a product’s security than the mischief of its own maker. There is no fundamental technological limit to how far the precedent Apple is establishing can be pushed, meaning the only restraint is Apple’s all-too-flexible company policy, something governments understand all too well.
MPG: with YouTube and Facebook deciding what is acceptable thought, Apple seemingly wants to one-up them right on your iPad, iPhone, and coming soon to a macOS near you.
Opinion: We built a system like Apple’s to flag child sexual abuse material — and concluded the tech was dangerous
by Jonathan Mayer and Anunay Kulshrestha, August 19, 2021
...We wrote the only peer-reviewed publication on how to build a system like Apple’s — and we concluded the technology was dangerous. We’re not concerned because we misunderstand how Apple’s system works. The problem is, we understand exactly how it works.
...Our system could be easily repurposed for surveillance and censorship. The design wasn’t restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.
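The authors’ point — that the client-side pipeline is agnostic to what the database contains — can be made concrete in a few lines. This is an illustrative sketch, not anyone’s actual code; `csam_hashes` and `dissident_hashes` are hypothetical databases invented for the example:

```python
# The matching logic never inspects what the blocklist represents:
# swapping one database for another changes what gets flagged,
# with zero changes to the code running on the user's device.

def flag_content(item_hashes, blocklist):
    """Return the user's content hashes that appear in a database."""
    return [h for h in item_hashes if h in blocklist]

user_photos = ["h1", "h2", "h9"]            # hashes of a user's photos

csam_hashes      = {"h7", "h8"}             # hypothetical database A
dissident_hashes = {"h2", "h9"}             # hypothetical database B

print(flag_content(user_photos, csam_hashes))       # prints []
print(flag_content(user_photos, dissident_hashes))  # prints ['h2', 'h9']
```

The user sees identical behavior in both cases — nothing in the client reveals which database was loaded, which is exactly why the authors warn the person using the service “would be none the wiser.”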
A foreign government could, for example, compel a service to out people sharing disfavored political speech. That’s no hypothetical: WeChat, the popular Chinese messaging app, already uses content matching to identify dissident material. India enacted rules this year that could require pre-screening content critical of government policy. Russia recently fined Google, Facebook and Twitter for not removing pro-democracy protest materials.
We spotted other shortcomings. The content-matching process could have false positives, and malicious users could game the system to subject innocent users to scrutiny. We were so disturbed that we took a step we hadn’t seen before in computer science literature: We warned against our own system design, urging further research on how to mitigate the serious downsides. We’d planned to discuss paths forward at an academic conference this month.
That dialogue never happened. The week before our presentation, Apple announced it would deploy its nearly identical system on iCloud Photos, which exists on more than 1.5 billion devices. Apple’s motivation, like ours, was to protect children. And its system was technically more efficient and capable than ours. But we were baffled to see that Apple had few answers for the hard questions we’d surfaced.
China is Apple’s second-largest market, with probably hundreds of millions of devices. What stops the Chinese government from demanding Apple scan those devices for pro-democracy materials? Absolutely nothing, except Apple’s solemn promise. This is the same Apple that blocked Chinese citizens from apps that allow access to censored material, that acceded to China’s demand to store user data in state-owned data centers and whose chief executive infamously declared, “We follow the law wherever we do business.”
...But make no mistake that Apple is gambling with security, privacy and free speech worldwide.
MPG: Ugghhh. Enjoy your SpyPhone.