Privacy on the iPhone: Apple's Plan to Go After Child Abusers Could Impact You Too
A recent development in the digital world has pitted two of its biggest pillars against one another: keeping people safe online and protecting people's privacy.
You might have heard about Apple's latest effort to combat child exploitation and abuse. No one faults Apple for trying to fight these crimes. However, security and privacy experts have raised concerns about the way Apple plans to tackle the problem.
What makes the situation interesting is that Apple has long promoted itself as a strong supporter of privacy and security, which is why its latest announcement has taken many people by surprise. The new technology for iPhones, iPads and Macs is designed to find images and videos of child exploitation stored on those devices.
On 5 August, Apple announced that a new feature would be part of its latest software updates: macOS Monterey, watchOS 8, iPadOS 15, and iOS 15. The feature can find images and videos of child exploitation on a person's device by comparing images on a phone, laptop, or tablet against a database of already-known child exploitation content maintained by the National Center for Missing and Exploited Children. If an unspecified number of matches is found, Apple is notified and may then take a closer look at the situation.
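The matching process described above can be illustrated with a deliberately simplified sketch. This is not Apple's actual system, which uses perceptual "NeuralHash" fingerprints and cryptographic protocols so that neither side learns about non-matches; the threshold value, function names, and use of a plain cryptographic hash here are all illustrative assumptions. The sketch only shows the basic idea: fingerprint each image, count matches against a database of known fingerprints, and trigger review only past a threshold.

```python
# Simplified illustration of threshold-based fingerprint matching.
# NOT Apple's real system: Apple uses perceptual hashing (NeuralHash)
# and private set intersection; this sketch uses a plain SHA-256 hash,
# which only matches byte-identical files.
import hashlib

MATCH_THRESHOLD = 3  # hypothetical value; Apple has not published its threshold


def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. A real system must match visually
    # similar images, not just byte-identical ones.
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(device_images: list[bytes], known_hashes: set[str]) -> int:
    # Count how many on-device images appear in the known-content database.
    return sum(1 for img in device_images if fingerprint(img) in known_hashes)


def should_notify(device_images: list[bytes], known_hashes: set[str]) -> bool:
    # Review is triggered only once the number of matches reaches the threshold.
    return count_matches(device_images, known_hashes) >= MATCH_THRESHOLD
```

In the real design, the privacy argument rests on the threshold: a handful of accidental matches alone is supposed to reveal nothing, and only crossing the threshold makes any account reviewable.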
Apple states that it designed this process to protect the privacy of its users: devices are scanned, but Apple is notified only if a particular number of matches is found. Experts in the privacy and security field argue otherwise. People agree unanimously that combating child exploitation is a good thing; the concern is that this level of monitoring creates a situation where Apple could use the technology for other purposes.
In response to the announcement, almost 100 policy and rights groups signed an open letter to Apple warning against the technology. In the letter, the group wrote, ‘Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.’ Signatories include the American Civil Liberties Union, the Electronic Frontier Foundation, the Center for Democracy & Technology, and Privacy International. Alongside these groups, people who worked on this kind of scanning technology also point out that the potential for misuse is deeply worrying.
In an article jointly written for the Washington Post, Jonathan Mayer and Anunay Kulshrestha, respectively an assistant professor and a graduate researcher at Princeton, said, ‘We're not concerned because we misunderstand how Apple's system works. The problem is, we understand exactly how it works. Apple is making a bet that it can limit its system to certain content in certain countries, despite immense government pressures. We hope it succeeds in both protecting children and affirming incentives for broader adoption of encryption. But make no mistake that Apple is gambling with security, privacy and free speech worldwide.’
The situation is certainly a difficult one to navigate. Nobody disputes that the cause is a good one: far more needs to be done to combat child exploitation and the sharing of such content. The balance to be struck is over the right way of fighting these crimes. Is it acceptable to infringe on people's privacy? And once it has been done for one cause, what is to say it won't be done for others? The debate has also turned to Apple itself and whether the company is still as committed to privacy as it once was. For a long time it has proudly stated that its devices are built to protect privacy. Time will tell whether this is the start of a new direction for the business.