Wednesday, 28 September 2011

Public Key Cryptography

Brave New Cipher

As an avid reader of Scientific American in the 1970s, and of Martin Gardner's Mathematical Games column in particular, I was lucky enough to be schooled in the basics of public key cryptography just as it was being invented. Well, two years after its 1975 inception, to be accurate. First through Martin's column, which in the August 1977 issue became the first to publicize the PKC method worldwide; and subsequently, in the same journal, from the writings of Rivest, Shamir and Adleman (RSA) themselves.

It was an exciting time for anyone of a mathematically sympathetic imagination. For the first time in history, the question of the practicability of devising a cipher, to be encoded and decoded rapidly by computer, used repeatedly without changing the key, and unbreakable by sophisticated cryptanalysis, received an answer and a proof; and the answer was affirmative.

Martin's original article can be read online.
The number theory behind the scheme is simple and elegant, as we hope from such things. But that very beauty of theory and expression can distract the student from the equally important subject of housekeeping.

Secure Public Keys

This is the main message in Eric Lippert's recent article, Keep it secret, keep it safe. As an example, most of us will be aware that Alice and Bob each have one private and one public key, based upon a certain mathematical "trapdoor function", namely prime factorisation. Each party must ensure that his or her private key remains known only to him or herself. What's perhaps less obvious is that their initial exchange of public keys must also be made via a guaranteed authentic channel - one in which the keys cannot be substituted in transit.

Why? Surely the strength of the scheme rests on the fact that eavesdropper Eve can do nothing with full knowledge of these keys? True enough, but Eve is not the only shadowy figure in the telephone exchange room.

Historical note: in cryptography discussions such as these, the cast of characters (Alice, Bob, Eve, etc.) is mostly drawn from Bruce Schneier's seminal book, Applied Cryptography.

Eric uses Mallory to denote this malicious, meddlesome, man-in-the-middle, although Trudy (for intruder) is possibly more common. If Mallory can intercept the initial exchange of public key information, then he may modify these keys in transit, replacing them with his own public key.

Any message from Alice will be encrypted using her private key, and Bob's public key. But now Mallory can intercept such messages, and replace them with alternatives containing his own subversive propaganda, encrypted using his own private key, and Bob's public key. Bob will then in turn decrypt the bogus message using his own private key and Mallory's public key, which he still believes to be Alice's; and so all will appear to be well. Oops!
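The substitution is easy to see with numbers in hand. Here's a toy sketch in Python, using textbook RSA with tiny primes and keeping only the signature half of the exchange. It's wholly insecure and purely illustrative; the names and values are mine, not from Eric's article:

```python
# Textbook RSA with tiny primes -- a toy, utterly insecure, for illustration.
def make_keypair(p, q, e):
    n = p * q
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)          # modular inverse (Python 3.8+)
    return (e, n), (d, n)        # (public key, private key)

def rsa(x, key):
    # "Encrypting" and "decrypting" are the same modular exponentiation.
    k, n = key
    return pow(x, k, n)

alice_pub, alice_priv = make_keypair(61, 53, 17)
mallory_pub, mallory_priv = make_keypair(71, 67, 13)

# Mallory intercepts the key exchange: the key Bob receives,
# believing it to be Alice's, is really Mallory's.
key_bob_thinks_is_alices = mallory_pub

# Alice signs her message (42) with her private key...
alice_sig = rsa(42, alice_priv)
# ...but Mallory discards it and signs his own message (13) instead.
forged_sig = rsa(13, mallory_priv)

# Bob verifies with "Alice's" key -- and the forgery checks out.
recovered = rsa(forged_sig, key_bob_thinks_is_alices)
assert recovered == 13           # Bob accepts Mallory's message as Alice's
```

Nothing in the mathematics has failed here; the keys worked exactly as advertised. The failure is entirely in the housekeeping of the initial exchange.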

An Opportune Time

This is a particularly urgent time to be reminding ourselves of the basic mechanics of the system. We have recently seen Certificate Authorities become compromised, appear to issue shedloads of bogus certificates, and even go bust (DigiNotar). Ironically, RSA itself has become the victim of one of the most wide-ranging breaches in the history of the industry, its SecurID token system - and thereby the security of its targeted (Lockheed Martin) and collateral clients - being utterly compromised. Meanwhile SSL itself, in its most common implementation (TLS 1.0/SSL 3.0), has been hacked by a cookie-thieving BEAST.

It's only by understanding the detailed operation of PKC that we'll be able to discern the implications of breaches and failures such as these. But the important details are not in the pretty mathematics, they're higher up in the wider system, where the real, exploitable vulnerabilities reside. They're in the arena of public/private key housekeeping. As Eric puts it, solve the key management problem first!

Related articles:
Authenticity of Web pages comes under attack
Hackers impersonating 'trusted' Web pages
Secure online payment system requires end-to-end encryption
Pic: Adi Shamir, 2009. Source: Wikipedia.

Monday, 26 September 2011

Risk Shift

The Trouble With Politics?

In the LA Times just over a week ago, there was an interesting report about a recent study of teenage drivers.
Basically, the situation in all the states (California being no more than a particular case in point) has been, for over a decade, that the newest drivers - which is to say, 16-17 year olds - have been subjected to various restrictions on their ability to (a) carry passengers, and (b) drive at all during certain times of day. Certain states have adopted other restrictions too, such as bans on driving unsupervised, or on using mobile phones at the wheel; but those two have been the main and most popular ones in every state since the first of these so-called "graduated driver licensing programs" was introduced in Florida in 1996.

What the study has revealed is that the anticipated - and indeed observed - reduction in fatal accidents involving 16-17 year old drivers, has been "almost exactly matched" by a corresponding increase in fatalities involving 18-19 year olds. In other words, the carnage has merely been pushed two years into the future.

(According to my reading of the evidence, the increase is actually closer to half the earlier reduction, but we'll let that go for now.)

Hot Potato

It's obvious with hindsight, that what these programs have done is to mask the immaturity and inexperience of new drivers for a year or two. By preventing them from being exposed to certain driving situations and environments, the restrictions have of course done nothing to prepare them for such eventual exposure.

Politically, this has become a very awkward truth. During the first couple of years of every such program, public officials have been able to point to a real reduction in young driver related fatalities, and boast with confidence that their policies have been saving lives. Now that it turns out in general not to have been the case, what are the chances that these programs will be cancelled overnight?

Absolute zero, obviously. But for more than just the usual reasons, i.e. short elected terms, and blissful ignorance of the actual evidence. This will not become just another case of badly thought out laws surviving through inertia. For if these programs were discontinued, there would be a bow wave of doubled fatalities lasting for as long as the initial reduction, namely two years, as today's pre-existing 18-19 year olds meet the newly liberated 16-17 year old wave of fresh idiots.

Obviously there's a lot that can be done, in terms of education and training, to defuse these unintended consequences and arrive at a program that does make sense. Still, it's an interesting study, and a good reminder of how shortsighted, wilfully or otherwise, we and our politicians can be.

Saturday, 24 September 2011

It Will Be A Good Day

Six For Saturday

▲ What you get when you play Charlotte Hatherley's White video backwards.

▲ They're all good days.

▲ 2005 electro with a dark tone, superbly conveyed by Helen Marnie's understated delivery.

▲ Flautist Kate Holmes (no, not that one) is also "Client A" in CLIEИT, where Charlotte Hatherley once appeared as "Client C" (and when you read that Charlotte has two elder sisters, Abigail and Beatrice, a heartbreaking little narrative springs to mind).

▲ Bars of seven measures from Poland's greatest export! Love the funky piano @ 3:50.
Yes I know there aren't "measures" in a musical "bar". It's a pub gag.

▲ One of my favourite pop songs - Carla J Easton (Futuristic Retro Champions). Nuff said.

Friday, 9 September 2011


Don't you just hate those Microsoft developers?*

You've just started replacing that fusty old, string-based SQL Builder with a modern, class-based model, featuring fully composable queries, query clauses, and expressions, all the way down to basic arithmetic. Suddenly, they bring out LINQ.

Some time later, recognising the monadic algebra behind the new pattern-based query comprehension keywords, you decide to harness it to implement the continuation monad, which will allow you to write asynchronous code in a purely synchronous paradigm. Suddenly, they announce async/await.
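For a flavour of that trick, here is a minimal sketch in Python rather than C#, with names entirely of my own invention: a trampoline drives a generator through a callback-based API, so the asynchronous workflow reads as straight-line synchronous code - essentially what the continuation monad buys you, and what async/await later baked into the language.

```python
# A toy continuation-style trampoline: before async/await, generators
# could be driven as coroutines, so that callback-based asynchronous
# code reads synchronously. All names here are hypothetical.

def fake_async_fetch(value, callback):
    # Stands in for a callback-based asynchronous API; for the sake of
    # the demo it simply calls back immediately with value * 2.
    callback(value * 2)

def run(gen):
    # The trampoline: resume the generator with each callback result.
    def step(result):
        try:
            request = gen.send(result)
        except StopIteration:
            return
        fake_async_fetch(request, step)
    step(None)

def workflow(results):
    # Reads like synchronous code; each yield is morally an "await".
    a = yield 10        # "await" fake_async_fetch(10) -> 20
    b = yield a + 1     # "await" fake_async_fetch(21) -> 42
    results.append(b)

out = []
run(workflow(out))
assert out == [42]
```

The workflow function never mentions a callback, yet every yield suspends it until the "asynchronous" operation completes - which is precisely the effect the keywords deliver, only as a first-class language feature.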

No matter how good, how revolutionary your design, the fact remains that you just can't beat a good keyword. There's simply no better or more powerful implementation, nothing tidier than a well conceived and designed primary language feature. This observation highlights the need to know thoroughly, completely and perfectly, the nature of the feature under implementation.

Unless your main business is writing compilers, your greatest enemy is the client specification, perhaps represented by the design engineer, stating non-goals like there is no requirement to compare or subtract two fields. Because you know that there will be. And when that requirement is finally unearthed, you will already have been forced down the Agile pipe, and have implemented a horribly hobbled subset of the operations that would have been required to express properly a well-formed algebra - of queries, or asynchronous tasks, or whatever.

The sad truth is that many of us lack the basic mathematical skills to appreciate the sea change wrought on a design by a proper, complete, closed and proven algebra. And while the scope for such effort and expenditure by application developers continues to shrink - a good job too, given the increasing power and usability of production compilers and other tools - it remains imperative to be ready to identify such situations when they do occur, and to be prepared to design the solution along veins of sound algebraic principles.


Eric Lippert's latest blog article is a beautiful exercise in such pure, abstract, algebraic thinking, and brilliantly displays its payoff. He asks: what, in C#, is a type? The naive answers most of us would propose have all by now been discounted, with the aid of counterexamples, in the predecessor article. A type is not a name with a set of values. Neither can it be defined as a set of assignment rules. Still less is a type some kind of predicate for determining which, among an infinite set of values, belong to it. Eric demonstrates the strictly formal application of mathematical logic at work in his conception when he makes essential use of Russell's Paradox in proving the inadequacy of these attempts!

A type, in the mind of compiler writer Eric, is nothing more than an abstract mathematical entity obeying certain algebraic rules. He follows this definition with a good analogy based on Giuseppe Peano's axiomatic definition of the natural numbers. Turning back to his equally axiomatic type definition, he then shows how its axioms can be used to construct compile-time proofs of runtime type safety (temporarily ignoring wrinkles like the cast operator, or unsafe array covariance).
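Peano's axioms, roughly: zero is a natural number, every natural number has a successor, and nothing else is one; everything further - addition, say - is defined from those facts alone. A toy encoding in Python, my own illustration rather than anything from Eric's article:

```python
# A toy Peano-style encoding of the naturals: zero is a number, and
# every number has a successor. Addition is defined purely from the
# axioms -- a + 0 = a, and a + S(b) = S(a + b).

class Zero:
    pass

class Succ:
    def __init__(self, pred):
        self.pred = pred

def add(a, b):
    if isinstance(b, Zero):
        return a                  # a + 0 = a
    return Succ(add(a, b.pred))   # a + S(b) = S(a + b)

def to_int(n):
    # Convenience only: collapse a Peano numeral to a Python int.
    return 0 if isinstance(n, Zero) else 1 + to_int(n.pred)

two = Succ(Succ(Zero()))
three = Succ(two)
assert to_int(add(two, three)) == 5
```

The point of the analogy is that nothing about "numberness" is appealed to anywhere: the behaviour falls out of the axioms alone, just as Eric's types behave by virtue of their algebraic rules rather than any intrinsic meaning.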

But perhaps the best illustration of the power of the abstract analytical approach is seen when he dips into Gödel's undecidability results, and identifies areas where such proofs cannot be constructed reliably. The lesson here is that still, the formal analysis rigorously controls the progress toward a compiler solution; this time accurately and unambiguously delineating the "holes in the road" where practical workarounds must be applied, or where - in measurably unrealistic, and perfectly predictable situations - a compiler failure can be tolerated.

Expression too complex to analyse.

Pic: Giuseppe Peano, from Wikipedia. Not actually a Microsoft dev.
*I just love those Microsoft developers!

Thursday, 8 September 2011

Project Security Review #1

Threat Modelling

Now we're really moving! It might only have been a 90 minute session last thing in the working day, but yesterday, we finally managed to convene our very first, "formal", Threat Modelling meeting.

  • Good representation - included people from across the spectra of projects, departments, job descriptions and levels - including director!
  • Refresher coverage of threat / vulnerability terminology and STRIDE classifications.
  • Brief look at the latest SDL Threat Modelling software - well, at least it installed and ran OK - but we reverted to a big whiteboard (thanks Colin!), and directed most of our efforts to:
A Game Of Cards

Yes, we finally got to take the wrapper off Adam Shostack's deck, and play a round of Elevation Of Privilege. Now when I say "a round" I really mean "one trick", as it took the best part of an hour to get just those seven cards played. That also meant that we really concentrated on just Tampering vulnerabilities for the whole session, since it's built into the rules of the game that you must start with that suit.

And that was one of the main lessons learned from this exercise. The card game exists only to facilitate and drive forward the brainstorming exercise. It dislodges little hints of system vulnerabilities, encouraging their discussion in a welcoming, improv-style, "Yes, and..." atmosphere. Anything that impedes this process needs to be removed, even when that involves abandoning or changing certain rules of the game.

Now I just have to get that Data Flow Diagram tidied up and entered into the Threat Modelling tool, in time for the imminent follow-up...

Sunday, 4 September 2011

Hannes Bok

The Futurian Artist in Chief

Inveterate name-dropper Fred Pohl's first-hand reminiscences on the early history of SF, memories from the earlier half of the previous century, continue to amaze me with their depth and clarity. Meanwhile his blog sparkles too with pithy one-liners and snarky, short-form political comment. ♥

This weekend's treat: part one of a biographical sketch of Hannes Bok (real name Wayne Woodard), the only member of the pro-crashed fan club Futurians of New York ever to win a Hugo for illustrative drawing and painting. The story covers his fortuitous early association with one aspiring writer named Ray Bradbury, and the role played by the first ever World Science Fiction Convention (WorldCon).

Update (Sep 7):
  1. Futurian Artist in Chief
  2. The story with the unhappy ending