Don’t let them pull the Aircloak over our eyes
Big business wants our personal information so that it can make even more money. Privacy laws say it cannot have our personal data, but they do not define anonymized data as ‘personal’. Business-friendly privacy regulators, such as the UK’s Information Commissioner, specifically declare that anonymized personal data falls outside the data protection laws, and is therefore fair game, up for grabs by all and sundry. Academics reply: hang on, in the modern age it is impossible to truly anonymize personal data; a determined adversary can always de-anonymize it. The issue becomes even more befuddled with the introduction of pseudonymized data, which is more easily unwrapped still; and the Information Commissioner would dearly like to, and probably will, declare pseudonymized personal data equally unprotected by law and equally up for grabs. Meanwhile, big business doesn’t want any anonymization at all, and is lobbying government ferociously to get what it wants.
It’s a mess; and there is no imminent solution.
Enter Aircloak, a company and product spun out of computer-science research at the Max Planck Institute. It tries to turn the argument on its head: if we cannot truly anonymize personal data, or at least cannot agree that we can, then let’s keep the data raw but anonymize the statistical analysis performed on it. Big business can therefore analyse the full data set, yet receives only results that cannot identify any individual. If this can be done, it will almost certainly be acceptable to the world’s privacy legislators.
Aircloak accepts data into the system, stores it in encrypted form, allows only authorized users to query it, and returns results that (in theory) cannot identify individuals. To build trust in the software and to ‘guarantee’ that it is free of backdoors, the company releases it as open source, so that anyone can analyse and verify it. It then uses trusted-computing remote attestation to ensure that the software actually running is exactly the software that was verified.
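The idea of answering queries while refusing to expose individuals can be sketched in a few lines. This is a toy illustration only, not Aircloak’s actual algorithm: it assumes a hypothetical minimum group size (a k-anonymity-style threshold) and adds simple random noise to the counts it does release.

```python
import random

K_THRESHOLD = 5  # hypothetical minimum group size; the real logic is far more elaborate

class AnonymizingQueryLayer:
    """Toy sketch: answers count queries over raw data, but suppresses
    results covering fewer than K_THRESHOLD individuals and perturbs
    the rest with small random noise."""

    def __init__(self, records, seed=0):
        self.records = records          # the raw, un-anonymized data never leaves this layer
        self.rng = random.Random(seed)  # seeded only so the sketch is repeatable

    def count(self, predicate):
        matches = sum(1 for r in self.records if predicate(r))
        if matches < K_THRESHOLD:
            return None                        # too few people: refuse to answer
        return matches + self.rng.randint(-2, 2)  # release a noisy count

# Invented demo data: 100 patient-like records
records = [{"age": 30 + i % 40, "smoker": i % 3 == 0} for i in range(100)]
layer = AnonymizingQueryLayer(records)

print(layer.count(lambda r: r["smoker"]))                     # large group: noisy count near 34
print(layer.count(lambda r: r["age"] == 30 and r["smoker"]))  # tiny group: suppressed (None)
```

The analyst sees aggregate statistics; a query narrow enough to single out one person returns nothing at all. Whether such a layer can resist determined, repeated probing is precisely the question at issue.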
This sounds as if it should be acceptable, but it is not. Criminals, whether in government, in business or in organized crime, will always find ways round, over or through any defence. Anything made by software can be unmade by software. Open source does not guarantee security: just think of Heartbleed, which sat in plain sight in OpenSSL for two years. And government agencies are brilliant at finding ways round defences; consider, for example, the hardware on which the system will run.
The prize is huge — complete, raw, big data personal information — and any intelligence agency would be in dereliction of its duty if it did not try to get it.
The real solution to this problem is simple: ban all and any collection of personal data unless the user specifically and unequivocally allows it in the informed knowledge that anyone and everyone will subsequently have access to it for any purpose.
Big business will throw its hands up in horror at such a suggestion, claiming that unfettered access to everything is either necessary or massively beneficial to society. So let’s consider three business cases: personal data used for law enforcement, for targeted advertising, and for medical research.
Law enforcement and intelligence agencies claim that access to communications big data (including content) enables them to locate criminals and terrorists and to prevent crime and acts of terror. But they already have that access: all they need do is go to a judge with a reasonable argument that a specific target is a threat to society, and they will be granted access to that target’s communications. I don’t think anybody objects to this; judicial oversight (though not by some secret and unaccountable court like FISC) is all we need.
The marketing companies claim that collecting personal data allows advertisers to deliver targeted (that is, relevant) adverts to internet users. They claim this is a benefit to the user. But they go further, suggesting that without the ability to target, advertisers will desert the internet and the free internet will collapse. The argument, in short, is that advertisers need to be able to target.
This is codswallop. Advertisers could do, and did do, basic targeting even before the internet, but it was based on context: sports goods were advertised on the sports pages, and shoes on the fashion pages. Not being able to display a shoe advert only to those readers known to be looking for shoes did not stop advertisers from advertising. So why should it now? Shoes can be advertised on fashion sites, and sports goods on sports sites. It’s the same thing.
The UK’s care.data scheme is the perfect example (see Care.data, pseudonymised data and the ICO). The NHS wants to collect every English patient’s full and intimately personal health data into a single database, to be sold to researchers for medical research. Nobody doubts that one of the main customers for this data will be Big Pharma, the drug companies. The NHS argument is that this will result in the better and faster development of new drugs, for the benefit of all.
Here’s a passage from Waterstones’ blurb for Ben Goldacre’s book Bad Pharma. Goldacre, an author, broadcaster, campaigner, medical doctor and academic, is highly respected.
…companies run bad trials on their own drugs, which distort and exaggerate the benefits by design. When these trials produce unflattering results, the data is simply buried. All of this is perfectly legal. In fact, even government regulators withhold vitally important data from the people who need it most. Doctors and patient groups have stood by too, and failed to protect us. Instead, they take money and favours, in a world so fractured that medics and nurses are now educated by the drugs industry. The result: patients are harmed in huge numbers.
Consider this. Big pharma is not interested in our health — it is solely interested in its own profits. When independent research began to show that Chinese herbs are more efficacious than western drugs, it campaigned for and succeeded in getting them banned. Do we really want to hand our personal health information to such people? I think not.
The reality is that none of these organizations want our data for our benefit — they want it to increase their profits at any cost to society. They make up arguments that do not stand scrutiny — they argue that because something is technologically possible, it is necessary.
Big business controls government — consider the way in which Monsanto has infiltrated the relevant US agencies and now seems to control the UK government as well (see: Defra battles to keep public in the dark over GM industry influence on policy and media — GeneWatch); and how the US telecoms companies have got inside the FCC and reversed its position on net neutrality.
The only solution to the personal privacy problem is to forbid the collection of any personal data. Where it is part of the service provided — such as that provided by GPs — the data must remain with the GP, and the GP must be held responsible for keeping it safe.
Aircloak is a dangerous development because it offers the suggestion that things can be different. They cannot.