A German journalist’s union has demanded that the European Commission step in over Apple’s CSAM tools, believing that the system will be used to harvest contact information and perform other intrusions.
Apple’s CSAM tools, intended to help fight the spread of illegal images of children, have courted controversy throughout August, as critics proclaim them to be an affront to privacy. The latest group to speak out about the supposed threat is, oddly, journalists in Germany, Austria, and Switzerland.
Journalist union DJV, which represents writers in Germany, believes that Apple "intends to monitor cell phones locally in the future." In a press release, the union calls the tools a "violation of the freedom of the press," and urges the EU Commission and the Austrian and German federal interior ministers to take action.
According to public editors association AGRA spokesman Hubert Krech, Apple has introduced "a tool with which a company wants to access other user data on their own devices, such as contacts and confidential documents," which the group believes violates GDPR rules.
Frank Überall, chairman of the DJV, adds that it could be the first step of many. "Will images or videos of opponents of the regime or user data be checked at some point using an algorithm?" Überall asks.
ORF editors council spokesman Dieter Bornemann offers a bleaker outlook, suggesting a government could check for images that could be evidence the user is involved in the LGBT community. It is also feared that totalitarian states could take advantage of the system's supposed capabilities.
The group also dismisses the claim that the system will apply only in the United States, pointing out that most European media outlets have correspondents in the country. "What begins in the USA will certainly follow in Europe as well," the DJV states.
Misplaced concern
While the worry of having smartphones snooped on by governments and security agencies can be well-founded in some cases, as with the Pegasus spying scandal, the DJV appears to be overreaching with its claims about Apple's CSAM tools.
This is in part due to the nature of Apple's CSAM system itself. One part involves checking hashes of images stored in iCloud Photos against a database of known CSAM image hashes, rather than examining the images themselves.
The second part is an on-device machine learning system for child accounts that have access to iMessage, one that doesn’t compare against CSAM databases. In that element, the system doesn’t report to Apple, only to the parental Family Sharing manager account.
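The hash-matching element described above can be illustrated with a deliberately simplified sketch. This is not Apple's implementation: the real system uses a proprietary perceptual hash (NeuralHash) and cryptographic protocols such as private set intersection, so the plain SHA-256 digests and local hash set below are purely hypothetical stand-ins to show the general idea that only hashes, not image contents, are compared.

```python
import hashlib

# Hypothetical database of known-bad image hashes. In reality, Apple's database
# is derived from NCMEC-provided CSAM material and is not stored in plaintext.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"flagged sample image bytes").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-hash set.

    Only the digest is compared; the image content itself is never inspected.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

# An ordinary photo produces a hash not in the database, so nothing matches.
print(matches_known_database(b"ordinary holiday photo"))   # False
print(matches_known_database(b"flagged sample image bytes"))  # True
```

The key property this illustrates is that an unmatched image reveals nothing about its contents to the matching system, which is why hash-based detection differs from the "Apple is scanning my photos" characterization.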
Following the initial outcry from the public and critics, and a warped perception that the system could be used by governments for surveillance, Apple has attempted to set the record straight about the tools, with evidently limited success.
Apple privacy chief Erik Neuenschwander explained the CSAM detection system has numerous elements to prevent a single government from abusing it. Apple has also published support documents explaining the system in more detail, what it does, and how it is kept safe from interference.
Apple SVP of software engineering Craig Federighi said on Friday that the company was wrong to release the child protection features at the same time, which led to a "jumbled" and "widely misunderstood" assessment of the system.
“I grant you, in hindsight, introducing these two features at the same time was a recipe for this kind of confusion,” said Federighi. “It’s really clear a lot of messages got jumbled up pretty badly. I do believe the soundbite that got out early was, ‘oh my god, Apple is scanning my phone for images.’ This is not what is happening.”