REEP Key Ceremony

The key ceremony for the REEP service took place on 2014-05-18 after the REFEDS meeting in Dublin, Ireland.

I witnessed this ceremony and was convinced that the key attached to this post as a self-signed X.509 certificate was generated during the ceremony within the hardware security module in Sweden that will be used by the REEP service to sign metadata served by it. To certify this, I have generated a detached signature file for reep.pem using my PGP key.

To the extent that you trust me to have taken care while witnessing the ceremony, you may find that validating my signature on reep.pem gives you some comfort that metadata documents signed by the private key associated with reep.pem are, indeed, legitimate outputs of the REEP service.

As an aside about the ceremony itself, proof that a particular computational event has occurred in a particular way is almost impossible in a world of networking and virtual machines. We've known this for a long time: the paranoia goes back at least as far as Ken Thompson's Reflections on Trusting Trust. We're not quite living in The Matrix, but the evidence of one's senses doesn't really go very far towards absolute proof. So what the other witnesses and I did during the ceremony — all we could do, really — was gain confidence by asking questions, taking photographs of the steps and trying to think of ways to validate them. For example, I was later able to verify that the pkcs11-tool command being used was indeed the one which would be installed on a system running 64-bit Ubuntu 12.04. Unless, of course, Leif foresaw that trick and subverted the md5sum command as well. It's turtles all the way down.
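For what it's worth, the check itself is simple: hash the installed binary and compare the digest against one computed from a trusted copy of the package. A minimal sketch in Python (which file you hash, and where the trusted digest comes from, are up to you):

```python
import hashlib

def md5_of(path: str) -> str:
    """Hex MD5 digest of a file, hashed in chunks as md5sum does."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()
```

Comparing `md5_of("/usr/bin/pkcs11-tool")` against the digest published for the distribution package is the whole trick; of course, that only moves the trust question one step along.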

New Roots

I run a simple X.509 Certification Authority for internal systems, and certain external systems used by clients (the majority of external systems use commercial certificates). From 2011-01-02, this CA will use a new root certificate:

The SHA1 fingerprint for this certificate is:

  • 34:6E:CB:19:25:15:E7:94:ED:AF:A4:F1:C4:79:BF:92:C5:8B:3C:D5
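To check a certificate against a published fingerprint, you can use `openssl x509 -in cert.pem -noout -fingerprint -sha1`, or compute it yourself: the fingerprint is just the SHA-1 digest of the certificate's DER encoding, formatted as colon-separated hex. A sketch in Python:

```python
import base64, hashlib, re

def sha1_fingerprint(pem: str) -> str:
    """Colon-separated SHA-1 fingerprint of the DER bytes inside a PEM certificate."""
    body = re.search(r"-----BEGIN CERTIFICATE-----(.*?)-----END CERTIFICATE-----",
                     pem, re.S).group(1)
    der = base64.b64decode("".join(body.split()))
    return ":".join(f"{byte:02X}" for byte in hashlib.sha1(der).digest())
```

If the value this produces for the downloaded certificate matches the fingerprint above, you have the certificate I published.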

For reference, the previous root certificate is here:

The last certificate issued under the old root certificate expires on 2011-01-23.


Tiger Team

If you're at all interested in physical security as well as computer security (or, alternatively, if you find it interesting to think about security systems as opposed to just components of those systems), a new TV show called Tiger Team might be worth a look.

The idea is pretty self-explanatory if you've heard of the concept of a tiger team elsewhere: this is a "reality" show in which the heroes break real-world security systems using a combination of technology, brass neck and dumpster diving. Rather like Mission: Impossible but without Peter Graves and (so far) without the rubber masks. What's not to like?

Unfortunately, I can't see any evidence that this series will be shown anywhere here in the UK, but you can stream the pilot episode from the cable channel's web site, at least for now. It's interesting to watch the ways in which the target's (fairly good) security fails when approached in the right way, and the presentation isn't too grating even for my sensitive British ears. Some of what you see is obviously re-enactment, but I guess that's "reality" TV for you.


Dual_EC_DRBG Back Door?

Bruce Schneier reports that one of the pseudo-random number generators in the recently released NIST Special Publication 800-90 (.pdf) appears to include something that looks awfully like an intentional back door:

What Shumow and Ferguson showed is that these numbers have a relationship with a second, secret set of numbers that can act as a kind of skeleton key. If you know the secret numbers, you can predict the output of the random-number generator after collecting just 32 bytes of its output. To put that in real terms, you only need to monitor one TLS internet encryption connection in order to crack the security of that protocol. If you know the secret numbers, you can completely break any instantiation of Dual_EC_DRBG.

It's possible that this is accidental; if it is deliberate, the prime suspects are the NSA, who have been pushing to get this algorithm adopted for some time. So much for the usual outsider's paranoia about how the evil TLA might be compromising our cryptography for their own nefarious ends. That's not the scary part, though; the really scary part is the thought that perhaps that isn't what is going on:

If this story leaves you confused, join the club. I don't understand why the NSA was so insistent about including Dual_EC_DRBG in the standard. It makes no sense as a trap door: It's public, and rather obvious. It makes no sense from an engineering perspective: It's too slow for anyone to willingly use it. And it makes no sense from a backwards-compatibility perspective: Swapping one random-number generator for another is easy.

Shumow and Ferguson's presentation (.pdf) is short, and although there are some squiggly letters in it you don't need to understand the mathematics of elliptic curves to follow the argument.
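If you want a feel for the mechanism without the full machinery of the real generator, here is a toy analogue on a deliberately tiny curve; the constants, seed and curve are made up for illustration and have nothing to do with the NIST parameters. Whoever knows the hidden relationship d between the two public points can recover the internal state from a single output and predict everything that follows:

```python
# Toy analogue of the alleged Dual_EC_DRBG trapdoor on the tiny curve
# y^2 = x^3 + 2x + 2 mod 17 (group order 19). Not the real parameters.
p, a, b = 17, 2, 2
G = (5, 1)

def inv(k):
    return pow(k, p - 2, p)          # modular inverse (p is prime)

def add(P1, P2):                     # affine point addition
    if P1 is None: return P2
    if P2 is None: return P1
    (x1, y1), (x2, y2) = P1, P2
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                  # P + (-P) = point at infinity
    if P1 == P2:
        lam = (3 * x1 * x1 + a) * inv(2 * y1) % p
    else:
        lam = (y2 - y1) * inv((x2 - x1) % p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def mul(k, P):                       # double-and-add scalar multiplication
    R = None
    while k:
        if k & 1:
            R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

d = 7                                # the "skeleton key", known only to whoever chose the constants
Q = G
P = mul(d, Q)                        # published alongside Q; the relation P = d*Q stays secret

def step(s):
    """One round of the generator: returns (next state, one output word)."""
    return mul(s, P)[0], mul(s, Q)[0]

s0 = 3                               # secret internal state
s1, r1 = step(s0)                    # the attacker observes only the output r1...
_, r2 = step(s1)                     # ...and wants to predict this next output

# Attack: lift r1 back to a curve point, multiply by the skeleton key d to
# recover the next internal state, then predict the next output exactly.
for y in range(p):
    if (y * y - (r1 ** 3 + a * r1 + b)) % p == 0:
        recovered = mul(d, (r1, y))[0]
        assert recovered == s1                  # state recovered from one output
        assert mul(recovered, Q)[0] == r2       # prediction matches reality
```

The real generator truncates its output and works on a 256-bit curve, which makes the lifting step slightly more work, but the 32 bytes Schneier mentions are enough for exactly this kind of state recovery.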

I look forward to seeing how this one plays out.

(Via Schneier on Security.)

Insecurity Excuse Bingo

In the wake of the Californian voting machine review, Matt Blaze and Jutta Degener invite us to play Security Public Relations Excuse Bingo:

  • We read Schneier's book
  • La, la, la we're not listening
  • You'll be hearing from our lawyers
  • No one would ever think of that
  • Our proprietary encryption algorithms prevent that
  • … and so on ad nauseam

(Via Matt Blaze.)

Ranum on Codependence

Marcus Ranum has started podcasting. The second episode in his Rear Guard podcast is a short but nicely put together rant explaining the parlous state of computer security today in terms of a dysfunctional relationship between practitioners and their organisations:

It's clear that security will be exactly as bad as it can possibly be while still allowing senior managers to survive. Whenever it gets across that line — worse than it can possibly be — there will be a brief fire-drill in order to duct tape things back together again until next time.

Last week a friend remarked, after hearing one of my long rants on an unrelated subject, that I had a very cynical view of the situation. "Thank you", I replied, quite seriously. Marcus Ranum has a very cynical view of the security landscape: not completely without rays of hope, but nevertheless aware that a lot of bad things happen out of pure unenlightened self-interest.


Firefox Cipher Suites

When your browser connects to a web site protected by transport layer security of some kind (usually by accessing an https:// URL) there's a negotiation between the two parties. Each party (browser, server) comes to the negotiation with a list of cipher suites that it is prepared to use, and the result is that one of these suites is chosen for the connection.
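As a present-day convenience (not the packet-snooping approach this post describes), Python's ssl module will list the cipher suites an OpenSSL-backed client context would offer in its ClientHello, which is one side of the negotiation above; Firefox's own list, built on NSS rather than OpenSSL, will differ:

```python
import ssl

# Enumerate the cipher suites a local OpenSSL-backed TLS client would
# advertise, analogous to the list a browser sends in its ClientHello.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
for suite in ctx.get_ciphers():
    print(suite["name"], suite["protocol"])
```
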

Recently I ran into a situation where Firefox 2.0 wasn't connecting to a site that Firefox 1.5 had no problems with. It's pretty hard to figure out from its documentation which cipher suites Firefox is prepared to use, so I decided to determine the answer directly by snooping on the negotiation part of the protocol.

Read on for method and results.

Alice and Bob... and Bruce

I couldn't resist this T-shirt design from the people who bring us Everyone Loves Eric Raymond and Bruce Schneier Facts.

Obviously this is only going to be funny to (a) a very particular kind of nerd with (b) a very particular sense of humour. I suspect I'm not the only member of both sets, though.


In Real-World Passwords, Bruce Schneier analyses a corpus of passwords retrieved from a phishing attack on the MySpace social networking site.

The good news is that it's clear that users are slowly becoming more aware of the security risks of bad password choice. The bad news is that things haven't got all that much better, really. Schneier's punchline:

We used to quip that "password" is the most common password. Now it's "password1." Who said users haven't learned anything about security?

These days, it's hard for me to muster much enthusiasm for any security solution that involves a lot of user education. As well as the apathy factor and the dancing pig factor, we're fast outrunning the ability of even the most well-educated user to keep up with the bad guys. I include myself with the mass of the bewildered in this respect, as evidenced by my previous post on remembering secure passwords.

The longer term answer to these problems has to involve a move away from relying solely on inherently weak technologies like passwords and towards technologies like multi-factor authentication and federated identity systems. If we don't have to rely on the human brain's limited ability to remember lots of secure (and therefore inherently hard to remember) passwords, we might stand a fighting chance of building secure systems.
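In the meantime, one small stopgap is to stop asking brains to do a machine's job: have software generate the password and a password manager remember it. A minimal sketch using Python's secrets module (length and alphabet are illustrative choices, not a recommendation):

```python
import secrets, string

# Draw a 16-character password from the OS CSPRNG; nothing here is
# guessable from the user's pets, birthdays or favourite bands.
alphabet = string.ascii_letters + string.digits + string.punctuation
password = "".join(secrets.choice(alphabet) for _ in range(16))
print(password)
```
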


UK Federation Launched

Today was the official launch of the UK Federation, or the UK Access Management Federation for Education and Research to give its Sunday name. This is a huge deal for everyone involved, myself included: some people have been working towards this point since around 2000 (I'm a relative newcomer, only having put a couple of years into it so far).

In the longer term, this will be a fairly important system for many more people: after all, the UK Federation is a federated identity framework for the whole of the UK education and research sectors, which I'm told involve perhaps 18 million people. If we do our job well over the next few years, though, the best case is that, like all good infrastructure, it will simply sink below the point where people even notice it. That's a hard job, and we've only just started on it.

