Over the years, Apple has acquired the image of being the big tech company that respects and fights for our privacy. The company has even been perceived as saying no to the U.S. government when asked to hand over customer data for criminal investigations. Has this perception been accurate? No, it hasn't. Apple, like other tech companies, has always worked with governments around the world to comply with such requests. The difference often lies in the details. Is there a warrant? Is the requested data in Apple's customer records? Is this data on a machine owned by Apple? Or does Apple have to break into a customer's device to obtain it?
We've become used to the idea of our data being scanned, and there are good things to say about that. Since the nineties, our data has been scanned whenever we publish it with the intent of spreading it, for instance by publishing music on our computers to be shared with others through a peer-to-peer music sharing network. If we do this with public-domain or copylefted music, this is fine. If we do this with copyrighted material, it is not.
The same holds true in both the physical and the digital world. If you put files on the computers of a cloud service provider, you know your files will be scanned, just as when you rent physical storage space you know you subject yourself to certain terms that prohibit you from using the space for illegal activities. You know that you run the risk of a drugs or ammunition inspection. Usually this is done by trained dogs sniffing outside the storage area, under the guidance of trained and selected personnel of a government agency. Nobody goes through the personal belongings you have stored there; the least invasive method is chosen. With cloud storage this is usually done by a scanning algorithm of some sort. It's in the terms, and you have the choice of subjecting yourself to them, or not.
Although Apple advertises the system as only scanning images when they enter or leave your device to or from Apple's cloud messaging or image backup service, the practical reality is that, because of end-to-end encryption, the files are scanned on your device. There is little to no difference between where and when Apple will actually scan the files and simply scanning the files stored on your device.
Currently the plan is to scan images, and only to scan them for Child Sexual Abuse Material (CSAM). The detection algorithm used is perceptual hashing. This is a technique, in Apple's case backed by machine learning, in which an image is scanned for a certain number of characteristics, producing a set of numbers that represents how strongly each of these features is present. Two very different photo files of the same subject, in the same clothing, with the same background and from a similar angle can return the same perceptual hash. Resizing, cropping or greyscaling will most probably not change this hash.
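To make the idea concrete: Apple's NeuralHash is a proprietary, neural-network-based perceptual hash, so we cannot show its internals. The sketch below instead uses a much simpler "average hash", purely as an illustration of why resized, recoloured or lightly edited copies of an image tend to produce the same or a very similar hash, while comparison is done by counting differing bits.

```python
# Minimal average-hash (aHash) sketch, for illustration only. This is NOT
# Apple's NeuralHash; it only shows the general mechanism of a perceptual
# hash: reduce an image to a few coarse characteristics, turn them into
# bits, and compare hashes by how many bits differ.
from PIL import Image


def average_hash(path: str, hash_size: int = 8) -> int:
    # Shrink to a tiny greyscale thumbnail; this discards the detail that
    # resizing, cropping at the edges or greyscaling would otherwise change.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)

    # One bit per pixel: brighter than the average or not.
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    # Number of differing bits; a small distance means "perceptually similar".
    return bin(a ^ b).count("1")


# Hypothetical usage (file names are placeholders): two derivatives of the
# same photo usually stay within a few bits of each other, while unrelated
# images differ in many bits.
# original = average_hash("photo.jpg")
# derivative = average_hash("photo_resized_greyscale.jpg")
# print(hamming_distance(original, derivative))
```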
Of course most people are appalled by sexual child abuse, as are we. It is truly horrific what some children, unfortunately not just a handful but many, many children, have to go through in their youth. The abuse is incredibly harmful and has severe consequences well into adulthood. It literally destroys lives.
However, fighting this is not the job of a public company. No public company should be doing warrantless surveillance. It is only acceptable when they scan their own machines, their own storage and the network they directly own. Scanning their own hardware, their own storage and their own equipment keeps the company free from participating in illegal conduct. That is not warrantless surveillance, but what Apple has announced is.
Detecting and preventing child abuse is the job of highly trained and selected people working for government. In any well-functioning justice system, this is an open, monitored and evaluated process, with safeguards against malicious intent and wrongful outcomes.
Besides the fact that law enforcement and surveillance are not within the rights and role of a public company, Apple's technical brief is unclear about many things. It reads more like a marketing pamphlet than an open and clear-cut source of information. It is unknown how any of this is implemented; the implementation is proprietary and closed source. Apple speaks of a threshold that first needs to be reached, after which the hashes are checked by an Apple employee. What this threshold entails is unclear, as are the training, dedication and background of this human factor. There is no accountability if something goes wrong. There are no safeguards outside of Apple, and no safeguards against expanding the scanning to other types of imagery beyond a demonstrably weak statement: we won't, trust us.
What Apple has announced opens the door to scanning for things like MacBook diagrams or photos of chips Apple does not want us to be able to purchase for repairs, or to scanning for pictures of protesters, activists, ethnic groups, et cetera.
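Because the implementation is closed, the announced threshold can only be reasoned about hypothetically. The sketch below is purely illustrative: the hash values, the threshold number and the function name are assumptions, and Apple's whitepaper actually describes a cryptographic scheme (threshold secret sharing and private set intersection) rather than a plain counter. It only shows the basic shape of "count matches, then escalate to human review" that the brief gestures at.

```python
# Hypothetical sketch of "reach a threshold before human review" logic.
# All values here are made up; the real threshold and mechanism are not public.
from typing import Iterable

KNOWN_HASHES = {0x1A2B3C4D, 0x5E6F7A8B}  # placeholder database of known hashes
REVIEW_THRESHOLD = 30                    # assumed value; the real one is unknown


def should_trigger_human_review(image_hashes: Iterable[int]) -> bool:
    # Count how many of a user's image hashes appear in the database;
    # only when the count reaches the threshold would a reviewer be involved.
    matches = sum(1 for h in image_hashes if h in KNOWN_HASHES)
    return matches >= REVIEW_THRESHOLD
```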
In a functioning Western democratic society, this process is completely different from what Apple describes in its whitepaper. What is legal and illegal is well documented and available for anybody to read, including criminals, so they can change their ways instead of being tricked into getting caught. These laws are formed after political debate, with multiple safeguards built in. The people have the ability to intervene and criticize. The judicial system is also open in its proceedings; it is documented, and both political and public debate are possible. The process of criminal investigation is subject to rules. At certain points during the process it is open and documented for experts; at other points it is open to public and political debate. This is necessary to prevent human rights from being violated, and it is key both to improving crime reduction and to preventing wrongful prosecution. It is not a perfect system, but at least we can change things that go wrong and hold people accountable for their actions. With Apple's system, none of this holds true.
Perceptual hashing is a very powerful tool. But a tool is only a good tool when used correctly. When it is used in a closed, unreviewed environment by a public company with financial interests other than crime fighting, this tool can become a weapon of destruction and horror.
Even with trained and selected personnel, employed by an impartial body instituted by a government, this can go horribly wrong. Take, for instance, the practice of swatting, where a prank call is made to report an armed robbery or a hostage situation. With this move from Apple, real child abusers can potentially evade detection by, for instance, replacing the face of a child with a flower pot, or by cropping images in just the right way. At the same time, innocent people can get into serious trouble. What happens when Apple flags someone for child sexual abuse imagery and offers only the option to appeal? Families can fall apart, leaving the children victims of fighting parents. The owner of an Apple device can get into serious trouble and lose his or her family, job, friends and more, just because the children frequently played with a phone while bathing. Of course that is a bad idea and any parent should prevent such a thing from happening; parents should educate their kids about abuse and about the correct use of a smartphone. However, it is impossible to always prevent things from happening unnoticed. Another scenario would be to deliberately plant images that reproduce a known CSAM hash, just to ruin somebody's life over a dispute or out of jealousy.
The effectiveness can also be debated. Perceptual hashing is only effective against already known material; it is an after-the-fact measure. What Apple has announced does not prevent new child sexual abuse; it only prevents recirculation of already known material. Recirculation is already countered by law enforcement scanning files spread through peer-to-peer networks and by cloud services scanning the material people try to store with them. The real problem lies with highly skilled criminals who distribute these materials as a paid service and know how to evade detection. Governments around the world work daily to take these networks down. Apple is not contributing to that cause, nor is it Apple's place to do so.
The other origin of child abuse lies at home, with family and very close friends. Apple's scanning does nothing against this abuse. Fathers, mothers, uncles, aunts and friends of the family can still harm children in many ways without any image of it ever being taken or added to the database of hashes.