Archive for the 'Government' Category

Mar 18 2014

NSP Microcast – RSAC2014 – Utimaco

I spent a few minutes with the CEO of Utimaco, Malte Pollman, at RSAC this year.  Malte explains why Hardware Security Modules are important to the Internet's web of trust, and why lawful interception is not in conflict with that web of trust.  As with all my interviews at RSAC, I asked Malte how the last year's worth of spying revelations have affected his company and him personally.  Also, I have a problem pronouncing the company name, which for the record is you-tee-make-oh.

NSPMicrocast-RSAC2014-Utimaco

No responses yet

Mar 15 2014

NSP Microcast – BSidesSF with Trey Ford

I caught Trey Ford right after his talk at the BSides Conference in San Francisco last month to talk about the efforts he's making on behalf of Rapid7 and the security community.  It may be a sign that we're a maturing industry when we've got folks like Trey traveling to Washington, DC to talk to lawmakers about how what they do affects our lives.  And, as with all my interviews this year, I asked Trey how the revelations about our government have affected his personal as well as his professional life.  Check out his site at Password123.org.

NSPMicrocast – BSidesSF – Trey Ford

No responses yet

Mar 07 2014

You have been identified as a latent criminal!

This afternoon, while I ate lunch, I watched a new-to-me anime called Psycho-Pass.  The TL;DR summary of the show is a future where everyone is chipped and constantly monitored.  If their Criminal Coefficient climbs too high, they're arrested for the good of society.  It doesn't matter whether they've committed a crime or not: if the potential that they will commit one exceeds the threshold set by the computer, they're arrested, or killed if they resist.  Like many anime, it sounds like a dystopian future that could never happen.  Except when I got back to my desk, I saw Bruce Schneier's post, Surveillance by Algorithm.  And once again what I thought was an impossible dystopian future seems like a probable dystopian present.

As Bruce points out, we already have Google and Amazon suggesting search results and purchases based on our prior behaviours online.  With every search I make, they build up a more detailed and accurate profile of what I like, what I'll buy and, by extension, what sort of person I am.  They aren't using people to do this; there's an extensive, thoroughly thought-out algorithm that measures my every action to create a statistically accurate profile of my likes and dislikes, in order to offer up what I might like to buy next based on what I've purchased in the past.  Or there would be, if I didn't purposefully share an account with my wife in order to confuse Amazon's profiling software.

Google is a lot harder to fool, and they have access to a lot more of the data that reveals who I really am, what I've done and what I'm planning to do.  They have every personal email, my calendar, my searches; in fact, about 90% of what I do online either goes directly through Google or is indexed by Google in some way, shape or form.  Even my own family and friends probably don't have as accurate a picture of who I really am behind the mask as Google could build, if it chose to create a psychological profile of me.  You can cloud the judgement of people, since they apply their own filters that interfere with a valid assessment of others, but a well-written computer algorithm takes the biases of its numerous coders and tries to even them out, creating an evaluation that's closer to reality than most people could manage.

It wouldn't take much for a government, whether the US, the UK or any other, to start pushing for an algorithm that evaluates the mental health and criminal index of every user on the planet and alerts the authorities when something bad is being planned.  Another point Bruce makes is that this wouldn't even be considered 'collection' by the NSA, since they wouldn't necessarily have any of the data until an alert had been raised and a human began to review it.  It would begin as something seemingly innocuous, probably wrapped in the same logical fallacies that governments already use to justify 'protection mechanisms': “We just want to catch the paedophiles and terrorists; if you're not a paedophile or terrorist, you have nothing to fear.”  After all, these are the exact phrases that have been used time and again to create any number of organizations and mechanisms, including the TSA and the NSA itself.  And they're all the more powerful because there is a strong core of truth to them.

But what they don't address are a few fatal flaws in any such system based on a behavioural algorithm.  First of all, inclination, or even intent, doesn't equal action.  Our society established long ago that the thought of doing something isn't the same as doing it, whether the thought is well-intentioned or malign.  If I mean to call my mother back in the US every Sunday, the thought doesn't count unless I actually follow through and do so.  And if I want to run over a cyclist who's slowing down traffic, it really doesn't matter unless I nudge the steering wheel to the left and hit them.  Intent to commit a crime is not the same as the crime itself, until I start taking the steps necessary to carry it out, such as purchasing explosives or writing a plan to blow something up.  If we ever start allowing algorithms to mark people as potential criminals, and treating them as such before they've committed a crime, we'll have lost something essential to the human condition.

A second problem is that the algorithms are going to be created by people.  People who are fallible and biased.  Even if the individual biases are compensated for, the biases of the surrounding culture are going to be evident in any tool that's used to detect thought crimes.  This might not seem like much of a problem if you're an American who agrees with mainstream American values, but what if you're not?  What if you're GLBT?  What if you have an open relationship?  Or like pain?  What if there's some aspect of your life that falls outside what the mainstream of our society considers acceptable?  Almost everyone has some part of their life they keep private because it doesn't meet societal norms on some level.  It's a natural part of being human and fallible.  Additionally, actions and thoughts that are perfectly innocuous in the US can become serious crimes if you travel to the Middle East, Asia or Africa, and the reverse is true as well.  Back to the issue of sexual orientation, we only have to look at the recent Olympics and the laws passed in Russia targeting non-heterosexual orientation.  We have numerous examples of laws passed in the US that were only later judged unfair by more modern standards, Prohibition being one of the most prominent.  Using computer algorithms to uncover people's hidden inclinations would have a disastrous effect on individuals and on society as a whole.

Finally, there are the twin ideas of false positives and false negatives.  If you've ever run an IDS, WAF or any other type of detection and blocking mechanism, you're intimately familiar with the concepts.  A false positive is an alert that erroneously tags something as malicious when it's not.  It might be that a coder used a string you've written into your detection signatures and it gets flagged by your IDS as an attack.  Or it might be a horror writer looking up some dreadful technique that the villain in his latest novel is going to use to kill his victims.  In either case, it's relatively easy to identify a false positive, though a false positive from a behavioural algorithm has the potential to ruin a person's life before everything is said and done.
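To put rough numbers on why false positives matter so much when the thing being detected is rare, here's a back-of-the-envelope sketch in Python.  Every figure in it is invented purely for illustration; the point is the ratio, not the specific values.

```python
# Invented numbers, purely to illustrate the base-rate problem with
# scoring an entire population for "latent criminality".
population     = 300_000_000   # people being scored by the algorithm
actual_threats = 1_000         # genuinely dangerous people (assumed)
detection_rate = 0.99          # chance a real threat raises an alert
false_alarm    = 0.01          # chance an innocent person raises an alert

true_positives  = actual_threats * detection_rate
false_positives = (population - actual_threats) * false_alarm

print(f"real threats flagged: {true_positives:,.0f}")
print(f"innocents flagged:    {false_positives:,.0f}")
print(f"chance a flagged person is a real threat: "
      f"{true_positives / (true_positives + false_positives):.3%}")
```

Even with a detector that's right 99% of the time, roughly three million innocent people get flagged for every thousand real threats, and fewer than one flag in three thousand points at an actual threat.  Every one of those other flags is a person whose life the system has just marked for scrutiny.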

Much more pernicious are false negatives.  A false negative is when your detection mechanism fails to catch an indicator and therefore never alerts you.  False negatives are much harder to find and understand, because you don't know whether you're failing to detect a legitimate attack or there are simply no malicious attacks to catch.  It's hard enough to detect and understand false negatives when dealing with network traffic, but when you're dealing with people who are consciously trying to avoid displaying any of the triggers that would raise alerts, false negatives become much harder to spot and the consequences become much greater.  A large part of spycraft is avoiding any behaviour that would alert other spies to what you are; the same applies to terrorists or criminals of any stripe with a certain level of intelligence.  The most successful criminals are the ones who make every effort to blend into society and appear just like every other successful businessman around them.  The consequence of believing your algorithms have identified every potential terrorist is that you stop looking for the people who might be off the grid for whatever reason.  You learn to rely too heavily on the algorithm to the exclusion of everything else, a consequence we've already seen.

So much of what goes on in society is a pendulum that swings back and forth as we adjust to changes in our reality.  Right now we have a massive change in technologies that allow for surveillance far exceeding anything that's ever been available before.  The thought that the pendulum might swing to the point of a chip in every person's head that tells the authorities when our thoughts turn a little too nasty is a far-fetched scenario, I'll admit.  But the thought that the NSA might have a secret data center in the desert running a complex algorithm over every packet and phone call made in the US and around the world to detect potential terrorists or criminals isn't.  However well-intentioned the idea might be, the failings of the technology, the failings of the people implementing it and the impact on basic human rights and freedoms aren't just things that should be considered; they're issues facing us right now that must be discussed.  I, for one, don't want to live in a world of “thought police” and “Minority Report”, but that is where this slippery slope leads.  Rather than our Oracle being a group of psychics, it might be a computer program written by … wait for it … Oracle.  And if you've ever used Oracle software, that should scare you as much as anything else I've written.

 

No responses yet

Feb 10 2014

The Day We Fight Back

Published under Government, Privacy

I have mixed feelings about The Day We Fight Back.  I think it's a necessary movement; I think our governments have lost their way and are becoming more fascist every day.  I believe we need to rein in what our law enforcement agencies can and should do.  But I have no illusions that a banner on a website and a series of blog posts are going to do anything to change that.  Still, we have to start somewhere.  I guess I'm just becoming (more) cynical as I grow older.

No responses yet

Jan 23 2014

But first, BSides…

I’m looking forward to this year’s pilgrimage to San Francisco.  Not that it’s ever been a pilgrimage before, since I lived 60 miles away, but now that I live near London, it’s a much longer trip.  I’ll be arriving in San Francisco a few days early for a couple of reasons.  The first is to visit my family and friends in the Bay Area, who I haven’t seen since I moved away.  The second reason is to attend BSides SF on Sunday and Monday.  Which, in many ways, is also a visit to friends I haven’t seen since moving.

Let's assume for a second that you've never attended a BSides event.  It's community-led, it's free, and each one is unique.  BSides SF is being held in the DNA Lounge, which has been a fixture in San Francisco for as long as I can remember.  Think of a funky, grungy, dark underground bar.  Then add in a couple of hundred hackers, security devotees and a few people who happened to find their way into the event with little or no idea of what's going on.  The talks range from first-time speakers (something that's strongly encouraged) to some of the best speakers in the field who want to step outside the confines of a business conference and talk about things that aren't quite politically correct.  Finally, add in a healthy dose of chaos and an even healthier sprinkling of community and you have some idea of what BSides is.  But unless you actually attend, my description is never going to adequately capture the true energy of the event.

I make no bones about it: for me, conferences are about meeting the people there, not about the talks.  However, the talks at BSides tend to take a higher priority than they do elsewhere.  While some of the talks are a bit rougher than those at conferences you pay for, the fact that people are speaking with unfiltered passion more than makes up for it.  And a number of the talks simply couldn't be given at a corporate event.  I'm looking forward to Morgan Marquis-Boire's (aka @headhntr) talk, even though he hasn't publicly stated what it'll be about yet.  Morgan has worked on uncovering a number of government surveillance schemes around the globe, so anything he's chosen to talk about has to be interesting.  Along the same lines, Christopher Soghoian's talk about living in a post-Snowden world is a must for me, even though I often find myself disagreeing with what Chris says publicly.  What can I say, privacy has always been a favorite topic of mine and has never been more in need of open, public discussion.

I'm also looking forward to seeing three of my friends on one panel: Jack Daniel, Wendy Nather and Javvad Malik, discussing how to talk to an analyst, or rather how not to talk to an analyst.  Javvad gave an excellent PK (20 slides, 20 seconds per slide) talk at RSA EU covering all the horrible slides he sees again and again as an analyst.  The trio will be entertaining at the very least, and I might even learn a little about talking to analysts myself.  Ping Yan's talk on using intelligence looks interesting and has potential relevance to my day job, so I'm going to try to find a seat for that one as well.  And I have to support my podcast co-host Zach Lanier, even though I usually understand about half of what he's presenting on any given occasion.

There are other interesting talks as well; if I can sit through the ones I've already mentioned, it'll probably be the most I've seen at one conference in a long time.  I have a pretty short attention (Squirrel!) span, and I'd rather be talking with the presenters than simply listening to them passively.  I'll have a mic and my Zoom H4, so it's entirely possible I'll be able to get a few of them to spend a few minutes doing exactly that.  Which means I can share the conversations with you as well.

 

No responses yet

Jan 19 2014

Prepping for RSA

It's funny.  There are two distinct groups of invites I get to meet with companies at the RSA conference: the early invites from companies that are hungry for coverage, any coverage, and the last-minute invites from companies who didn't get as many interviews as they'd like and are looking to fill their last one or two slots with second (or third [or fourth]) tier 'press' such as myself.  A few invites fall somewhere in the middle, but they still tend to gravitate towards one of those two ends of the spectrum.  And it makes setting up a schedule for RSA extremely hard sometimes, since I tend to want to leave one or two slots open for the last-minute invites I find interesting.

Speaking of interesting, I think the most interesting story of the conference will be the boycott by a few speakers and the reasons behind it.  I wonder how many of the company representatives I speak with are even going to be aware of the fact that a boycott is happening and if it will affect them in any way.  As I’ve said before, I’m not really in support of the boycott, but I understand the reasons a number of professionals are supporting it and I think they have every right to.  So asking other attendees and sponsors how they think the boycott has affected them should get some interesting responses.

In any case, now it's time to start responding to the invitations I've already received and figure out how to fit everything in alongside my professional duties.  In many years past, I've created microcasts throughout the conference, something that's incredibly hard to find the time and energy to do.  Last year I mostly abandoned them, but I think I'm going to try doing microcasts again.  I reserve the right to drop them if time doesn't allow, though.

No responses yet

Dec 08 2013

Will limits work?

Published under Government, Privacy

A number of tech giants are petitioning the US federal government to put limits on the surveillance powers of agencies such as the NSA.  Specifically, eight organizations, led by Microsoft and Google, are stating that the government's spying machinery is putting them in a bad business position by eroding the trust that the public and other companies place in the systems they build.  Here in Europe that's definitely true; with each new revelation of phone tapping and metadata collection, it only becomes harder and harder for businesses and users to trust.  But the real question is: even if the laws are changed to make the wholesale collection of data harder, will that put a check on the organizations who see it as their mandate to protect the public from 'terrorists' no matter what the cost?

I could go on for pages about the problems with the current attitudes of law enforcement, about the problems with justifying all this spying by invoking the specter of terrorism, about the potential for abuse, about the cost in capital and human time to use this data, and the lack of effectiveness of wholesale data collection.  And I want to, but it wouldn’t do much good.  Most people have already made up their minds on the subject, our agencies are addicted to the power this surveillance gives them, and most people are ignorant as to the danger the wholesale capture of data can create.  If the last point were even slightly wrong, we wouldn’t be giving companies our data by the bucketload in order to share pictures of our cats and kids.

I believe in due process, the rule of law and constraints on government power. And I think we’re at a point in history where most of that has been thrown out the window, using a witch hunt as an excuse.  Changing the laws won’t make it any better; either the laws will be written by the very agencies we’re trying to limit, with plenty of loopholes designed to let them keep doing what they’re doing, or the laws will be ignored and circumvented until we have a new leak that sets off another round of … the same exact thing.  I’m pretty pessimistic on the subject.

Can changes in law lead to a reform of the system?  Yes, they can, but the question is, will they?  In the short term, I think meaningful change is impossible, in part because the system in the US is too drunk on its own power.  In the long term, if the public will is strong, then we might see changes.  We've had McCarthy and Hoover and Nixon; we've made it through dark times before, but it took a long time to recover from each of them.  The world will survive another round of abused power, but the question is where we'll end up as a worldwide population.  Probably with fewer liberties, forever.

No responses yet

Dec 04 2013

Everyone’s moving to PFS

Last month I wrote about Perfect Forward Secrecy (PFS) for the Akamai corporate blog.  But if you'd asked me two months earlier what PFS was, you would have seen me madly scrambling for Google to find out more about it.  And I'm not alone; before this summer only a few deeply technical engineers had heard of PFS, and almost everyone else had either never encountered it or dismissed it as an unnecessary burden on their servers.  Except the NSA managed to change that perception over the summer.

Now most companies are looking at PFS, or looking at it again.  In a nutshell, PFS is a method used with SSL that negotiates a temporary key to protect each browser session's keys and then discards that key from memory afterward.  You can use phrases like 'ephemeral elliptic curve Diffie-Hellman', but the important part is that PFS encrypts SSL communications in a way that doesn't rely on the server's master key to protect your traffic; a new key is created every time.  This means that even if the master key is somehow compromised, it doesn't give access to all the traffic for that SSL certificate; the attacker must crack each and every session individually.  Which means you need a lot more computing power at your disposal to crack more than a few conversations.
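If you're curious whether a site you use has already made the switch, one quick check is to look at which cipher suite your connection actually negotiates.  Here's a minimal sketch using Python's standard ssl module; the hostname is just an example, and the prefix check is a rough heuristic rather than a definitive audit.

```python
import socket
import ssl

def check_forward_secrecy(host, port=443):
    """Connect to a TLS/SSL server and report whether the negotiated
    cipher suite uses an ephemeral (forward-secret) key exchange."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            name, version, bits = tls.cipher()
            # ECDHE/DHE suites negotiate fresh keys per session, so a
            # stolen server key can't decrypt traffic recorded earlier.
            ephemeral = name.startswith(("ECDHE", "DHE"))
            print(f"{host}: {name} ({version}, {bits}-bit secret) -> "
                  f"forward secrecy: {'yes' if ephemeral else 'probably not'}")

check_forward_secrecy("www.google.com")   # example host only
```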

PFS is a good idea we should have adopted some time ago, but it has a downside: it adds a fair amount of server overhead.  Having to view our own governments as the enemy, though, has given tech companies around the globe the impetus to make the change.  Google is moving towards encrypting all traffic by default, with PFS as part of that effort.  Facebook has moved in the same direction, with PFS also a critical piece of the protection puzzle.  And Twitter.  And Microsoft.  And … you get the picture.  Companies are moving to PFS across the board because it gives them something they can point to when telling users that they really do care about securing end-user communications.
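On the server side, enabling PFS mostly comes down to preferring the ephemeral key-exchange suites.  As a rough illustration only (not how any of the companies above actually deploy it), here's what that looks like with Python's ssl module; the certificate and key file names are placeholders.

```python
import ssl

# A sketch of a server-side TLS context that insists on ephemeral key
# exchange. The file names below are placeholders, not real paths.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile="server.crt", keyfile="server.key")

# Offer only ECDHE/DHE suites so every session negotiates its own keys;
# compromising server.key alone won't decrypt previously recorded sessions.
context.set_ciphers("ECDHE+AESGCM:ECDHE+AES:DHE+AESGCM:!aNULL:!MD5")
```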

I have to applaud these companies for taking this step, but even more, I have to hand it to Google, Yahoo, Facebook and Microsoft for challenging the current status quo of National Security Letters and the secrecy they entail.  There are more questions than answers when it comes to how NSLs are being used, whether they're necessary and whether they're even something a country like the US should be allowing.  Technology is great and it'll help with some of the problems we're just starting to understand, but the only long-term changes will come if we examine the current issues with the NSA and other agencies slurping up every available byte of data for later analysis.  Changes to the laws probably won't stop anything immediately, but we have to have the conversation.

Using PFS is just a start toward what will be fundamental changes to the Internet.  Encryption everywhere has to become an integral part of the Internet, something privacy boffins have been saying for years.  It may be too late for this to be an effective measure, but we have to do something.  PFS makes for a pretty good first step.

No responses yet

Nov 25 2013

Two more years of Snowden leaks

Published under Cloud, Government, Privacy, Risk

I've been trying to avoid NSA stories since this summer, really I have.  I get so worked up when I start reading and writing about them, and I assume no one wants to read my realistic/paranoid ranting when I get like that.  Or at least that's what my cohosts on the podcast have told me.  But one of the things I've been pointing out to people since this started is that there were reportedly at least 2,000 documents in the systems Edward Snowden took with him to Hong Kong.  There could easily be many, many more, but the important point is that we've only seen stories covering a very small number of those documents so far.

One of the points I've been making to friends and coworkers is that, given how few of those documents we've seen released, we have at least a year more of revelations ahead of us, more likely two or more.  And apparently people who know agree with me: “Some Obama Administration officials have said privately that Snowden downloaded enough material to fuel two more years of news stories.”  This probably isn't what many businesses in the US that are trying to sell overseas, whether they're Cloud-based or not, want to hear.

These revelations have done enormous damage to the reputation of the US and of American companies; according to Forrester, the damage could be as much as $35 billion in lost revenue over the next three years.  You can blame Mr. Snowden and Mr. Greenwald for releasing the documents, but I prefer to blame our government (not just the current administration) for letting its need to provide safety to the populace, no matter what the cost, override everything else.  I don't expect everyone to agree with me on this and don't care if they do.  It was a cost calculation that numerous people in power made, and I think they chose poorly.

Don’t expect this whole issue to blow over any time soon.  Greenwald has a cache of data that any reporter would love to make a career out of.  He’s doing what reporters are supposed to do and researching each piece of data and then exposing it to the world.  Don’t blame him for doing the sort of investigative reporting that he was educated and trained to do.  This is part of what makes a great democracy, the ability of reporters (and bloggers) to expose secrets to the world.  Democracy thrives on transparency.

As always, these are my opinions and don’t reflect upon my employer.  So, if you don’t like them, come to me directly.

No responses yet

Nov 04 2013

Attacking the weakest link

Published under Cloud, Government, Hacking, Privacy, Risk

I spend far too much time reading about government spying on citizens, both in the US and abroad.  It's a job hazard, since it impacts my role at work, but it's also what I'd be researching and reading about even if it didn't.  The natural paranoia that makes me a good security professional also feeds the desire to know as much as possible about the people who really are spying on us.  You could almost call it a healthy paranoia, since even things I never would have guessed have come to pass.

But every time I hear about someone who's come up with a 'solution' that protects businesses and consumers from spying, I have to take it with a grain of salt.  A really big grain of salt.  The latest scheme comes from Swisscom, a telecommunications company in Switzerland that wants to build a datacenter there to offer cloud services in an environment safe from spying by the US and other countries.  The theory is that Swiss law offers many more protections than the laws of other countries in the EU and the rest of the world, and that these legal protections would be enough to stop the data at rest (i.e. data stored on a hard drive in the cloud) from being captured by spies.  The only problem is that even the Swisscom representatives admit it's only the data at rest that would be protected, not the data in transit.  In other words, the data would be safe while sitting still, but when it enters or leaves Swiss space, it would be open to interception.
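To make that at-rest versus in-transit distinction concrete, here's a minimal, deliberately naive sketch contrasting data that's only encrypted once it lands on a provider's disks with data encrypted before it ever leaves your machine.  It uses the third-party cryptography package, the key handling is hand-waved, and none of it reflects what Swisscom is actually building; it's just an illustration of the gap.

```python
# Requires the third-party 'cryptography' package (pip install cryptography).
from cryptography.fernet import Fernet

document = b"meeting notes, draft contracts, the usual sensitive stuff"

# Provider-side "encryption at rest": the document crosses the network in
# whatever form the transport gives it and is only encrypted once it
# reaches the provider's disks.
uploaded_plain = document

# Client-side encryption: the provider, and anyone tapping the path, only
# ever sees ciphertext. The key stays with you (keeping it safe is the
# hard part, and is hand-waved here).
key = Fernet.generate_key()
uploaded_encrypted = Fernet(key).encrypt(document)

print(uploaded_plain[:25])       # readable to anyone on the wire
print(uploaded_encrypted[:25])   # opaque without the key
```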

It was recently revealed that the NSA doesn’t need to get to the data at rest, since they simply tap into the major fiber optic cables and capture the information as it traverses the Internet.  Their counterparts here in the UK do the same thing and the two organizations are constantly sharing information in order to ‘protect us from terrorists’.  Both spy organizations have been very careful to state that they don’t get information from cloud providers without court orders, but they haven’t addressed the issue of data in motion. 

So while the idea of a Swiss datacenter built to protect your data has a certain appeal, the reality is that it wouldn't do much to help anyone keep their data safe, unless you're willing to move to Switzerland.  And even then, it wouldn't help much; this is the Internet, and you never know exactly where your data will route on its way to its destination.  If it left Swiss 'airspace' for even one hop, that might be enough for the spy agencies to grab it.  And history has proven that GCHQ, at least, is willing to compromise the data centers of its allies if that helps it get the data it believes it needs.

No responses yet
