The matter-of-fact way in which big companies seem to think they own, and have a right to, private and personal information about us is worrying. The following paragraph is lifted verbatim from a Juniper Research blog today:
The next step is to combine this with, say, healthcare data achieved through large-scale remote patient monitoring, to achieve a more accurate picture of the individual through knowledge of that individual’s peers. It may not be an exaggeration to say that we may be on the edge of a new era where individual circumstances are routinely informed through precise data analysis of a data “cloud”.
The DNA of Big Data
If you're thinking "don't worry, the new EU General Data Protection Regulation (GDPR) will keep our privacy safe," think again. On Friday Ross Anderson attended a GDPR lobbying meeting in London. He has thoughtfully published his notes in the Cambridge University Light Blue Touchpaper blog:
There were about 100 people present, of which only 5 were from civil society. Most were corporate lobbyists: good-looking, articulate and impressive, but pushing some jaw-dropping agendas. For example the lovely lady from the Association of British Insurers found it painful that the regulation might ban profiling that was unfair or discriminatory.
How Privacy is Lost
It’s worth reading in full – but I’m afraid it doesn’t get any better. And when you add the more direct lobbying of companies like Google and Facebook, I think we can confidently predict that the GDPR that emerges at the end – if it survives at all – is going to be vastly weaker than the one that started out last year.
I came across this news announcement. It’s an occupational hazard for journos. But this one caught my eye. It was about a new product – or more specifically, a ‘hostile vehicle mitigation product’.
I immediately thought of the press release I got from Europol on Thursday, in which the latest statistics:
…show how the total number of terrorist attacks and related arrests in the EU significantly increased in 2012, in contrast to previous years. This and other findings in the report describe a threat from terrorism that remains strong and varied in Europe.
This new hostile vehicle mitigation product is probably related, I conjectured – some new unmanned remote-controlled robot able to locate under-chassis bombs, remove and defuse them; or perhaps the latest drone able to locate, track and take out suspicious vehicles with a single targeted strike.
No. It’s a bollard. A hostile vehicle mitigation product is a strategically placed bollard.
Clearly I need to upgrade my automatic overhype detection, filtration, isolation and elimination system – the delete button.
Consider the Communications Bill. That’s the bill that will supposedly allow the intelligence agencies to catch serious criminals and terrorists. But the only people it won’t catch are serious criminals and terrorists. And the only thing it will do is allow the government to know who you are talking to and where you go on the internet at all times and as you do it – at a huge cost to the public purse. (Incidental #1: The public purse is not government money. It is your money. The government doesn’t have any money. It is therefore taking your money to pay for a system to spy on you.)
Obnoxious as this is, it is not in itself undemocratic. You can argue that in a democracy, the electorate votes for a government and gives it the authority to make decisions without further reference to the electorate. (Incidental #2: I believe that this is a misinterpretation of democracy developed and promoted by governments. I believe that in a democracy, the government is always subservient to the will of the people.)
But what is quite incredible is the experience of Conservative MP Dominic Raab. As a member of parliament he is being asked by the government to vote in favour of this bill. A fundamental part of the spying process will be the filter. ISPs are going to be asked to keep complete records of our communications and browsing. That will be a national database of everything, albeit spread across the different ISPs. Technically, not a problem – it’s a national database from a government that promised to ‘roll back the database state.’ The filter is the mechanism by which the agencies can get to what they want – that is, it is effectively a private government search engine for our emails.
Quite reasonably, Raab wanted to know more about what he was being asked – no, told – to vote for on our behalf. All he wants to know is the advice given to the Home Office to justify the filter. The Home Office said no. So with true Yorkshire grit (he can kiss goodbye to any government preferment in the future) he issued a freedom of information request. But again the Home Office said “no, national security issues, don’t you know old boy.”
So he referred it to the Information Commissioner. The Information Commissioner has requested more information from the Home Office so that he can make a ruling on whether the refusal of the FoI request is justified. The Home Office has just over 20 working days from now to respond or face potential legal action for what amounts to contempt.
My bet is that the Home Office will respond, but we won’t know how, because the Information Commissioner will agree that it is in national security interests to withhold the information. His only alternative is to side with the people, upset the government and kiss goodbye to his knighthood – just like Raab. I will be delighted and will beg his forgiveness for besmirching his noble position if he sides with the people. I doubt that I will have to.
But step back and think about this. The Home Office is demanding that our elected representatives simply do what they're told, with no understanding or knowledge of what exactly they're doing. That, I fear, is democracy in 21st Century Britain: we elect people to do what the government wants, which is what big business and the secret services want. What the electorate wants is irrelevant.
Statistics – don’t you just love ‘em?
“91% of people trust business to keep data safe despite rise in breaches” is the headline announcement from Varonis today.
“Only 3% say that their data is very secure with social networks and 11% say the same about online retailers,” concluded the Economist Intelligence Unit earlier this month.
Now I’m no mathematician, but this adds up to just one thing for me: don’t believe anything anyone tells you – and that includes me. Make up your own mind because everyone else has an agenda that may not be in your best interests.
Hint: always err on the side of distrust; you may be pleasantly surprised, but you won’t be disappointed, and you’ll almost certainly be right.
When the sea is calm and the sun is warm, it’s tempting to visit new and strange places. But if the sea is rough and full of danger, it’s safer to sit still, avoid rocking the boat, and do whatever the captain says. So when the weather’s good, the captain will warn about sharks outside, leaks within, and the coming tempest: so best sit still and do what we say. Fear is a remarkable tool for suppressing dissent.
But fear only works in the short-term – it has to be continually renewed for maximum effect. A better solution is to permanently shackle the crew so it cannot turn aside for those new and strange places. But how do you do that? The crew has to shackle itself, willingly. The answer is very simple; increase the fear level and tell the people that shackling will keep them safe.
The fear bit is easy. You don't need to do anything; just recognize the opportunity and accentuate it. So when some idiot creates a video that is unacceptable, don't ban it, just sit back, watch the effect and stoke the fires. When you make a diplomatic mistake and provoke the only possible response from a dangerous state, don't defuse it, just sit back and stoke the fires. And a few false flags will do no harm: you can foil them at the last moment to protect the crew. Hell, if one or two go wrong, a little collateral damage is no great price – just sit back and stoke the fires by locking down the entire city.
With waters this choppy, just offer the shackles, call them CISPA, and the crew will chain themselves to the oars and do whatever you tell them. Fear is a wonderful tool if you know how to use it.
Last month Bruce Schneier made an interesting comment:
I personally believe that training users in security is generally a waste of time and that the money can be spent better elsewhere… If four-fifths of company employees learn to choose better passwords, or not to click on dodgy links, one-fifth still get it wrong and the bad guys still get in.
On Security Awareness Training
My favourite riposte comes from Ira Winkler:
That argument basically says that if the bad guy gets in, all security countermeasures are irrelevant. By that measure, we should abandon security as a whole, since all countermeasures have and will fail.
Arguments Against Security Awareness Are Shortsighted
But Schneier has a point – training clearly isn’t working since (according to Trend Micro) more than 90% of successful APT attacks start from a spear-phishing success. But Winkler also has a point – all [technical and human] countermeasures have and will fail. Does that mean we should just give up on security in general and awareness training in particular?
Clearly not. Surely the solution is not to abandon what isn’t good enough, but to improve it until it is good enough. The question then becomes how do we make security training more efficient? Since the majority of breaches start from a phishing or spear-phishing attack, then phishing is where we should start. But if traditional awareness training isn’t working, perhaps we need to think of something new.
Wombat Security Technologies thinks it has the answer: simulated attack training. In a nutshell, this involves phishing your own staff. This has two huge advantages. First, it teaches through experience rather than through lectures (and the practical always sticks better than the theoretical): if somebody falls for a phish and is sent to a benign destination with a company 'gotcha' message, he or she won't want it to happen again. Second, it is measurable: it allows the company to gauge the success of its training scheme.
If 20% of staff fall for one phish (the figure will likely be more than 80% to start with), and then 25% fall for the next one, then clearly there is something wrong with the overall training package, and it needs to be re-evaluated. More likely, however, the number of victims will steadily decrease over time. Repeat victims can then be pulled out for more targeted training; and serial repeat victims can be assigned the gardening detail.
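The measurement side of this is simple enough to sketch. The snippet below is a minimal illustration of the idea, not any vendor's actual product: it assumes a hypothetical record format of (recipients, clickers) sets per campaign, computes the click rate for each simulated phish, and flags employees who fall for more than one.

```python
from collections import Counter

def campaign_stats(campaigns):
    """Compute per-campaign click rates and identify repeat clickers.

    `campaigns` is a list of (recipients, clickers) pairs, where each
    element is a set of employee identifiers -- a hypothetical record
    format chosen for illustration.
    """
    # Fraction of recipients who clicked, campaign by campaign.
    rates = [len(clicked) / len(sent) for sent, clicked in campaigns]
    # Count how many campaigns each employee fell for.
    counts = Counter(e for _, clicked in campaigns for e in clicked)
    repeat = {e for e, n in counts.items() if n > 1}
    return rates, repeat

# Two simulated campaigns sent to the same five employees.
c1 = ({"ann", "bob", "cid", "dee", "eva"}, {"ann", "bob", "cid", "dee"})
c2 = ({"ann", "bob", "cid", "dee", "eva"}, {"bob"})
rates, repeat = campaign_stats([c1, c2])
print(rates)   # [0.8, 0.2] -- a falling rate suggests the training is working
print(repeat)  # {'bob'} -- candidate for more targeted training
```

A rising rate between campaigns is the signal that the training package needs re-evaluating; a shrinking repeat-clicker set is the signal that targeted follow-up is working.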
Wombat has published a new report based on the practical experience of several CSOs from major companies: A Security Officer Debate: Are simulated phishing attacks an effective approach to security awareness and training? It is well worth reading to see how simulated attack training works in practice, and what steps you need to take to get it started.
PS. Note that these are CSOs. Schneier is a CTO.