Posts filed under ‘authentication’

UK: Raising the breach barrier, again

When HMRC (Her Majesty’s Revenue and Customs) lost personal information of nearly half the UK population, I called it “mind boggling”. I also thought that it would be the last time I’d write about data breaches. What could top that?

Never underestimate the Brits. They’ve now pushed the bar even higher.

All it took was a flash drive found in the car park of a pub, The Orbital. It held user names and hashed passwords for Government Gateway accounts- the Government Gateway provides centralised authentication to important online services such as tax returns. Worse, the flash drive also held the source code, security software, and a step-by-step guide to how the Government Gateway works. And it belonged to Daniel Harrington, an IT analyst at Atos Origin, the company that manages the Government Gateway.

The flash drive was lost about two weeks ago. Daniel must have just started to believe that his prayers had been answered and the flash drive was lost forever. No such luck. Tellingly, it was turned in to a newspaper (The Mail on Sunday) rather than given back to the government.

The point isn’t that the flash drive was lost. What was all that data doing on it in the first place? The Prime Minister is pointing the finger at Atos Origin, which in turn is fingering Daniel for breaching operating procedures. Really? It sounds exactly like Chancellor Alistair Darling pointing to a junior official in the HMRC case. It really shouldn’t be so easy to evade accountability.

Why was the flash drive unencrypted? The passwords were hashed but, throw enough resources at them, and they shouldn’t be that hard to break. And it’s impossible to say how many copies of the flash drive’s contents may be in circulation.
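To see why hashed passwords offer thin protection once the file is out, consider a minimal dictionary-attack sketch. It assumes the passwords were stored as unsalted SHA-1 hashes- purely an assumption, since the Gateway’s actual scheme was never disclosed- and the leaked digest is a stand-in:

```python
import hashlib

# Stand-in for the leaked dump: one unsalted SHA-1 digest.
# (Assumption: unsalted, fast hash. The real scheme wasn't disclosed.)
leaked_digests = {hashlib.sha1(b"letmein").hexdigest()}

# Precompute hashes for a wordlist of common passwords,
# then recover any leaked password by simple lookup.
wordlist = ["password", "123456", "letmein", "qwerty"]
precomputed = {hashlib.sha1(w.encode()).hexdigest(): w for w in wordlist}

for digest in leaked_digests:
    if digest in precomputed:
        print("recovered:", precomputed[digest])  # prints "recovered: letmein"
```

With no salt and a fast hash, the attacker pays the hashing cost once per candidate word, for every account at once- which is why “throw enough resources at it” is no exaggeration.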

Some will use this to question the UK’s plan for a National Identity Card. Others will again proclaim the death of passwords. Yet others will cry that it’s the tip of the iceberg- who knows how many other unreported breaches of this magnitude are happening around the world? I’m sure at least a few will wonder what if it had been biometric templates.

Me, I mourn the blows to trust in government and online services all over the world. And the frightening reality that past lessons are simply being ignored, taking us ever closer to a tipping point.

November 3, 2008 at 11:17 pm 1 comment

Semantic Web & OAuth

I must confess that for a long time I never got this semantic web thing. Now, with the zeal of the recently converted, I see possibilities everywhere.

Part of the reason it took time was an automatic reaction against something being called Web 3.0 (or is it 4.0?). I’m still trying to really understand Web 2.0. Learning about the next big thing could always wait.

Another reason was how early enthusiasts described the semantic web. Calling it the machine readable web doesn’t even begin to make sense.

As far back as 1999, Tim Berners-Lee in Weaving the Web said, “I have a dream for the Web [in which computers] become capable of analyzing all the data on the Web – the content, links, and transactions between people and computers. A ‘Semantic Web’, which should make this possible, has yet to emerge, but when it does, the day-to-day mechanisms of trade, bureaucracy and our daily lives will be handled by machines talking to machines. The ‘intelligent agents’ people have touted for ages will finally materialize.”

Now that’s visionary. Even today, I’m barely beginning to understand that vision.

Thankfully, and perhaps ironically, the very Web 2.0 service Slideshare has some presentations that explain things in a way that we mere mortals can understand. My first picks are the two presentations from Freek Bijl- the first covers the basics and the second the technologies. Another is from Marta Strikland, called The Evolution of Web 3.0. It has a great Web 3.0 Meme Map on slide 15 and a comparative list of Web 2.0 and 3.0 on slide 27.

Being more of a graphics person, the final aha came from the one below, thanks to Project 10x (also worth looking at is the original Semantic Social Computing presentation from Mills Davis).

With the semantic web also comes a whole new set of acronyms. A starter list: RDF, SPARQL, SWRL, XFN, OWL, and OAuth. OAuth, being the authentication-related one, is particularly interesting.

OAuth is described as “An open protocol to allow secure API authentication in a simple and standard method from desktop and web applications.” The basic promise is attractive- access to data while still protecting the account credentials. That has the advantage of not requiring people to give up their usernames and passwords to get access to their data. OAuth is a much-improved version of closed proprietary protocols such as Flickr’s API. Importantly, it has support for non-browser access such as desktop applications and mobile services.
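The mechanism that delivers that promise is request signing: the application proves it holds a shared secret without ever sending a password. A rough sketch of the HMAC-SHA1 signing step at the heart of OAuth 1.0- simplified from the spec, omitting nonces, timestamps, and header formatting, so it is illustrative rather than compliant:

```python
import base64
import hashlib
import hmac
from urllib.parse import quote


def oauth_signature(method, url, params, consumer_secret, token_secret=""):
    """Simplified OAuth 1.0-style HMAC-SHA1 signature (illustrative only)."""
    enc = lambda s: quote(str(s), safe="")
    # Normalise the request parameters: percent-encode, sort, join.
    normalised = "&".join(f"{enc(k)}={enc(v)}" for k, v in sorted(params.items()))
    # Signature base string: METHOD & encoded-URL & encoded-params.
    base_string = "&".join([method.upper(), enc(url), enc(normalised)])
    # Signing key combines the two secrets; the password itself never travels.
    key = f"{enc(consumer_secret)}&{enc(token_secret)}".encode()
    digest = hmac.new(key, base_string.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()


sig = oauth_signature("GET", "http://example.com/photos", {"file": "cat.jpg"}, "kd94hf93k")
print(sig)
```

The URL, parameters, and secrets above are made-up examples. The design point is that the service can verify the signature with its copy of the secret, so access can be granted and revoked without the user’s password ever changing hands.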

So, what are the practical applications of the semantic web? Within the government space, a clear winner is being able to automate the collection of data from multiple government websites and search, filter, or otherwise manipulate the result.

As a simple example, if all government websites had the contact details of their media contact using hCard, it would be easy to have an always up-to-date list that can be displayed, indexed, searched, loaded into an address book, mapped, etc. Even as a relatively simple first step, this would be a big step forward for government.
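To make that concrete, here is a minimal sketch of scraping such an hCard with nothing but the standard library. The contact details and markup are hypothetical, and a real scraper would need to handle nested elements and multiple cards per page:

```python
from html.parser import HTMLParser


class HCardExtractor(HTMLParser):
    """Minimal hCard scraper: records the text of any element whose
    class attribute names one of a few hCard properties."""

    PROPS = {"fn", "org", "tel", "email"}

    def __init__(self):
        super().__init__()
        self.current = None
        self.card = {}

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class", "").split()
        hit = self.PROPS.intersection(classes)
        if hit:
            self.current = hit.pop()

    def handle_data(self, data):
        if self.current:
            self.card[self.current] = data.strip()
            self.current = None


# Hypothetical media-contact hCard a government site might publish:
html = """<div class="vcard">
  <span class="fn">Jane Doe</span>, <span class="org">Ministry of Examples</span>
  <span class="tel">+64 4 123 4567</span>
</div>"""

parser = HCardExtractor()
parser.feed(html)
print(parser.card)
```

Because the markup carries its own semantics, the same few lines work against any site that publishes hCard- which is exactly the always-up-to-date contact list scenario above.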

September 3, 2008 at 11:43 pm Leave a comment

Elusive SSO

I’ve been a fan of usability guru Jakob Nielsen’s regular update (Alertbox) for a long time. It’s admirable how he keeps re-emphasising the fundamentals again and again.

I suspect that half the reason I read the updates so regularly is the futile hope that somehow- maybe by osmosis- his common sense approach will percolate into my subconscious and lead to better outcomes for the online services I’m involved in.

Jakob Nielsen would no doubt laugh at such nonsense, throw up his hands, and demand that I user test to objectively determine that one way or another.

Anyway, his latest piece is on enterprise portals. That is not an area that I often venture into but he had some stuff about single sign-on (SSO) that caught my eye:

“Single sign-on is the Loch Ness monster of the intranet world: People hear about it and even believe it exists, but they’ve yet to see it for real…In our initial research 5 years ago, it was already clear that single sign-on could dramatically improve user productivity and satisfaction, as well as immensely reduce support costs.”

“Our second round of research confirmed single sign-on’s potential — and its elusiveness… True single sign-on was and is extraordinarily rare… We can only conclude that it’s very difficult to achieve, despite its promise.”

What’s true of the enterprise is even more so outside it, for the Internet.

The benefits and business case for enterprise SSO are undoubtedly great. But for the Internet? That’s an area that I personally struggle with, notwithstanding that SSO is the original use case for federation and, to some extent, can be provided by OpenID (provided the person has logged on to the OpenID Provider).

Now, Internet SSO does mean convenience. It surely is a good thing to log on once and then be able to do whatever a person wants across the Internet without logging in again.

What worries me are the security and privacy implications. They aren’t that big a deal within an enterprise context but they are on the Internet. And, for government online services on a national scale, even more so.

From a security perspective, it’s about the loss of keys to the kingdom- passwords are just too easy to compromise. Now, if passwords were used appropriately (i.e. only where there is a low level of identity-related risks) then the consequences from a compromised password wouldn’t be too bad. But, realistically, passwords today protect far too much and a compromised password can be a widespread disaster for the person.

Then, there’s privacy. Using the same username & password to do everything (or lots of things) raises the possibility of aggregating information and building profiles.

So is Internet SSO a good thing? Yes, provided it is implemented in a secure and privacy-protective manner. Problem is, can that be achieved in an economical manner (that rules out advanced crypto) for the Internet?

July 15, 2008 at 11:16 pm 1 comment

Just what is ‘identity’?

Although most of us find ‘identity’ intuitively easy to define, it turns out that a precise and generally accepted definition of the term is far from easy to pin down.

The first question of course is whether it’s even worth the effort to try and get a precise definition. I think the answer is ‘yes’ for several reasons.

First, identity involves personal information and people expect that government collects and holds their personal information in a secure manner with their privacy appropriately protected.

Secondly, people need to prove who they are many times during a day. While people typically need to do that with government only occasionally, for a government agency it is of critical everyday importance to have confidence in the identity of the person it is dealing with. For example, an agency needs to be sure that government services are being delivered to the right person. Another example is ensuring that the right person has access to their own personal information such as health records or tax records.

On the one hand, people want convenient access to their information and government services. On the other hand, government as a whole has to manage the identity-related risks and ensure that the taxpayer’s money is spent well.

Finally, consider this quote from a recent report by Sir James Crosby to the UK Government, “… those countries with the most effective ID assurance systems and infrastructure will enjoy economic and social advantage, and those without will miss an opportunity. There is a clear virtuous circle. The ease and confidence with which individuals can assert their identity improves economic efficiency and social cohesion…”.

Looking around, both in New Zealand and overseas, we saw that most of the focus was on ‘digital identity’ and ‘user-centric identity’. Also, ‘identity management’ is typically defined in technology terms such as ‘authentication’ and ‘authorisation’. And yet, all of these still don’t answer the fundamental question of just what ‘identity’ is in the first place.

To help get us a better insight into the thinking of the academic world and the approaches taken in some other countries, we turned to Victoria University of Wellington. Professor Miriam Lips, with the help of her student Chiky Pang, has now completed her report Identity Management in Information Age Government (PDF, 557 KB) and we have published it on the e-government website.

It turns out that our question has a variety of answers. However, the report does validate our current approach: one useful way to look at identity is to consider that people have a single, unique identity but many context-dependent partial identities or personas. The result is more onion-like than linear, so that operating at the outer layers of the onion may not have any connection at all with the unique core:

Another interesting insight from the report is the move to an informational definition of identity from a document-based definition. The impact of the Information Age is to make it increasingly necessary for governments to consider identity information- its collection, verification, storage, maintenance, and disposal- rather than just the issue and use of identity documents.

As we look at these issues in finer and finer detail, it remains important to not lose sight of the basics. Such as, people own and control their own identity while government’s role is to manage their identity information well. And, the need to put theory into practice.

So that in the future, when Bill and Jessica want to return home to New Zealand, they have one less thing to worry about.

[Original post at]

July 9, 2008 at 7:52 pm Leave a comment

Authenticating the Queen’s subjects

I’m just back from attending eGovernment 2008 in Canberra. For me, the big draw was an opportunity to attend a three hour workshop focussed on the UK’s Government Gateway. I sure wasn’t disappointed- the insights into the Government Gateway were quite an eye opener.

Attending the conference also led me to reflect on how online authentication is working for the Queen’s subjects in the UK, Australia, and New Zealand. It’s quite fascinating how each reflects a different approach and is very much a product of its time.

First, Australia. Still very PKI focussed, as in standard X.509 certs in the user’s computer. There are some good intentions from the federal policy body AGIMO (Australian Government Information Management Office) to move on to solutions that work for people (not computers) but the mindset of the average government official is definitely digital certs.

A good example of this focus is the success of VANguard. VANguard’s authentication service is probably best described as an authentication broker whose main function is to allow for interoperability of digital certs issued by various CAs. This is a good step so that businesses (it’s mostly business-focussed) can use the same digital cert with multiple RPs. It’s a back-end hub so that various front-ends and portals, such as bizgate in South Australia, can draw on its functionality. Still, it has all the limitations inherent in the old PKI designs.

It’ll be interesting to see how AGIMO’s proposed National e-Authentication Framework will differ from their existing AGAF (Australian Government e-Authentication Framework) which is separate for businesses and individuals.

Back to the UK’s Government Gateway. From the outside, so much of the focus has been on the UK’s plans for a national identity card that people, including me, can’t distinguish the good stuff they have done and are continuing to do in the online authentication space from the bad. Jim Purves, Head of Product Strategy in the Cabinet Office gave terrific insights into the chequered history of the Gateway as well as plans going forward.

The Gateway is very privacy-protective and very focussed on providing authentication and SSO for the UK Government’s online services. They are introducing SAML 2 soon, though that comes with the downside of having to keep supporting all the current protocols. They’ve had some significant funding challenges in the past but now have “strategic investors” from within government, so the future is bright. Trust and confidence in the Gateway is at an all-time high.

Purely speculative on my part but I think they’ve got a big cloud on the horizon- when the national identity card folks come calling. That could potentially lead to a fundamental change in approach. That’s the unfortunate steamrolling impact of the national identity card. Also interesting how they handle pan-European interoperability but, with a strong Liberty Alliance foundation, I imagine they are well placed to handle that.

So, how does NZ stack up? The proper comparison is with the GLS or Government Logon Service (which will be re-branded igovt later this year). There’s no doubt that the GLS is the most privacy-protective of the lot and has all the right moving bits.

Once the IVS or Identity Verification Service and then GOAAMS or Government Online Attribute Assertion Meta System is added to igovt, then it’s a whole new ballgame for NZ.

But, there is clearly one area that the GLS should look at- adding a web services (ID-WSF) capability in addition to the current browser re-direct (ID-FF). That will provide many new opportunities off the same infrastructure, such as acting as an authenticating receiver for XML messages. The UK’s Government Gateway currently does that for all electronic tax filings direct from standard tax and accounting packages.

All in all, interesting times and much thinking…

July 2, 2008 at 11:45 pm 1 comment

ID Conference coverage

Had a look around to see the media coverage sparked off by the Identity Conference in Wellington. Given the wide range of things covered, I thought it would provide a good indicator for what the media thinks is news-worthy about identity.

1. The Dom Post was at its in-your-face best, making the Privacy Commissioner’s call for protecting your ‘digital shadow’ the number one news story (first page, top left). Digital information about people is the “new currency” so maybe it made a good replacement for the usual pessimistic economic lead.

On another note, her full presentation includes, “So should the responsibility to manage identity fall to the public or private sector? Who would you rather have handling your identity? Is it as simple a question as whether we have Microsoft or SSC? I am, of course, being flippant, but the public sector cannot afford to assume it has natural dominion. It is a case of gaining, and then maintaining, New Zealanders’ trust. Identity-driven systems must reflect the multiplicity of modern New Zealand. Those systems must give people options, flexibility and control.”

2. Across at the NZ Herald, Peter Griffin blogged (The search for Identity 2.0) about Dick Hardt’s presentation. Good choice, but I do wish savvy tech folks understood the difference between identification and authentication. Otherwise we’re going to keep getting some pretty weird conclusions, like the need for government-issued photo ID cards to access online services. I sometimes wonder if people take the cards metaphor too far.

3. Still with Peter Griffin but this time in his role as a news reporter, is Identity thieves sharpen their act. The story covers most of the dangerous downsides of the Internet. One particular quote from Dean Winter of TradeMe caught my eye, “Who in New Zealand do we go to and say we’ve identified a botnet?… We get a fantastic response from the hosts of some of these fraudulent networks. But it is still standing at the bottom of the cliff.”

Eve Maler’s obviously found the time and a decent enough broadband connection in Wellington to post her thoughts, Everyday identity and human-centered design. She has a link to her presentation as well as to Don Norman’s inspiring usability work from the 80s, which continues to be so relevant.

Varied coverage reflecting the varied perspectives of the Conference…

May 1, 2008 at 10:58 pm Leave a comment

Worth keeping an eye on…

… how the Identity Governance Framework (IGF) continues to evolve. There’s a recent Liberty webcast by Phil Hunt of Oracle New Standards to Protect Privacy Through Governing Policy to get a good feel for the state of play.

… how CardSpace and U-Prove integration pans out. Paul’s conjectured integration is food for thought. So is the comment on his post by Christian Paquin (now part of Microsoft’s Identity and Access Group) that “One design goal (at least, for me) will be to minimize the integration changes for all participants involved in the data flow.”

… how identity-based encryption continues to progress. Interesting article in The Register about a research paper released at the Eurocrypt 2008 conference describing a new cryptographically strong “primitive” that advances functional encryption. Functional encryption tries to simplify things over PKI by allowing data to be encrypted using attributes directly tied to the recipients.

… the fascinating discussions at Liberty’s Privacy Summit. An interesting recent presentation by Sun’s Robin Wilton is a good example which gives a good overview of the ‘Ladder’, ‘Onion’ and ‘Silo’ models.

April 24, 2008 at 9:40 pm 3 comments

Why igovt?

For some time now, we’ve been aware of a paradox: we are building and operating user-centric services but use government-centric language to describe them. The launch of the igovt website is a small yet important step towards changing that.

Take the Government Logon Service (GLS) as an example. According to our website, which is intended for an audience of government agencies, “In a nutshell, the GLS is an all-of-government shared service to manage the logon process for online services of participating agencies.”

The very name, description, and use of a Three Letter Acronym are so government-centric. What does an average person, say a student who just wants to check his/her account online, make of this? Do we really want to try and explain to people what a “logon” is?

There is of course logic in using government-centric language, especially in the early days of a new service for which there are few, if any, precedents and mental models. Describing as accurately as possible what a service does from a functional perspective allows for precision. It helps external experts and interest groups get an in-depth understanding of what the service does and, sometimes more importantly, what it doesn’t.

But it is more than choice of language alone. It’s also about perspective.

Protecting privacy has been a major driver for the all-of-government authentication services. An important way of designing in privacy is the separation of who a person is (identity) from what they do online (activities) so that data aggregation and building profiles of people aren’t possible. Two different government departments operate two different services based on their respective strengths.

This world-leading approach has been highly acclaimed by privacy experts. Yet, from the view of a person or organisation interested in getting better and quicker government services, it just means more complexity that they have to try and understand and overcome to get to what they are really interested in- the service they want.

The second issue therefore is that people don’t want to integrate and coordinate government’s services; they want government to do that. This desire is reflected at a strategic level in the Development Goals for the State Services. At an everyday level, it means that we had to find a way for our privacy-protective design to be presented to people as a single, integrated online service without diluting the design itself.

And, it was apparent that the time to act was now, before the Identity Verification Service was launched and before future authentication services further increased complexity.

The result is igovt. It is not “just another brand” but, over time, will represent a significant shift. A shift to using user-centric language; a shift to government integrating multiple online services from multiple government agencies for people without any dilution of security and privacy protection; a shift to making it easier and more convenient for people and organisations to get government’s services.

Though there are many models we can learn from, there aren’t any tried and trusted models that we can simply adopt. It is therefore neither possible nor appropriate to try and make the shift in one giant leap. Instead, it’s more of a journey from inside-out thinking to outside-in, learning along the way.

The next step in this journey is to re-brand and re-describe GLS as the first igovt service.

[Original post at]

April 23, 2008 at 10:02 pm Leave a comment

Quenching authentication expectations

Today I went through the whole hot-expectations-followed-by-quenching-reality to re-learn some authentication basics.

First, hot expectations were set off by furious thinking triggered by an article in today’s Dom Post, Snapper to make a splash.

The article says that, “From June, people will be able to pay for bus tickets and everyday items in shops using the Snapper card, a stored-value smartcard that NZX-listed infrastructure investor Infratil hopes will become more widely used than conventional eftpos cards… Cardholders will be able to top up Snapper cards by credit card over the Internet by clipping their card to a USB device that plugs into a computer… Snapper will also be available in the form of a USB stick, that can be plugged directly into a computer and topped up…Snapper is similar in concept to London’s Oyster cards and the Octopus card in Hong Kong.”

Now the last sentence should have rung loud warning bells but it didn’t. Maybe it was Monday morning and the coffee hadn’t kicked in. Whatever.

Instead, the old brain took off. This could be the key for ubiquitous two-factor authentication in NZ, right? After all, if everyone was carrying a smartcard that could be plugged into a computer, or a smartcard in a USB form factor, surely that would make the perfect second factor for two-factor authentication? Nothing extra to carry around; no extra reader costs; and familiarity with everyday use. Perfect. Heaven. Nirvana.

And then, the quenching reality.

The warning bells finally penetrated this evening. A quick Google search confirmed what the subconscious was trying to tell me. The Oyster card- or more precisely, smartcards based on the MiFare Classic chip- uses MiFare Crypto 1, a lightweight stream cipher which researchers had cracked, concluding, “The security of this cipher is therefore close to zero.”

There are lots of articles about this, including The Register about the Dutch cards, One billion RFID cards vulnerable to hacks (engadget), and Bruce Schneier’s blog.
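Part of the problem is that Crypto 1 uses a key of only 48 bits, so even naive brute force is within reach- and the published attacks are far faster still, because they exploit structural weaknesses rather than searching the keyspace. A back-of-envelope calculation, assuming a (hypothetical) rate of a billion key guesses per second on dedicated hardware:

```python
# Back-of-envelope: exhausting Crypto 1's 48-bit keyspace by brute force.
# The guess rate below is an assumed figure for dedicated hardware, not a
# benchmark; real attacks on the cipher are much faster than brute force.
keyspace = 2 ** 48
guesses_per_second = 10 ** 9
seconds = keyspace / guesses_per_second
print(f"{keyspace:,} keys, about {seconds / 86400:.1f} days at 1e9 guesses/s")
```

A worst case measured in days, against a card meant to sit in millions of pockets for years, is what “close to zero” security looks like in practice.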

The critical question of course is whether or not Snapper is based on MiFare Classic chips. There is no publicly available information that I could find which confirms or denies this. It is “similar in concept” but might be based on a stronger version of the chip. That is something I’ll have to find out.

But the point is that reality can be very effective in quenching authentication expectations. And, like quenching, one emerges stronger and remembers authentication basics, such as always check for known vulnerabilities before getting too excited about any authentication method.

April 21, 2008 at 11:16 pm 6 comments

igovt public consultation

There were so many insights from attending focus groups during the igovt public consultation that it’s hard to pick just one. Certainly, one that made a lasting impression was a lady with a disability who spoke emotively about how the service would make a huge positive difference for her in getting services from government. For her, the notion of having to prove who she was once to government and then being able to choose to use the Internet to verify her identity- both across government and the private sector- was compelling.

So, what was the igovt public consultation all about?

Late last year the Department of Internal Affairs, with the support of the State Services Commission, consulted with people about igovt. Specifically, the consultation was about the Identity Verification Service, one of the two igovt services.

The details and context for the service have evolved since the previous public consultation in 2003. It was therefore important to seek the views of the public about key aspects of the proposed service before the service design was finalised.

Public consultation was also essential for continuing the transparency that has been a hallmark of developing igovt services. In particular, for services based on policy principles such as opt-in and acceptability, it is important to check with people that the service design has resulted in a service that is indeed of value to them.

The public consultation process asked people to get information from the website and send in submissions. At the planning stage for the consultation it was clear that we needed to be more proactive to get deeper and wider participation.

Twenty-one focus groups were therefore held in eight places across New Zealand- Whangarei, Manukau, Tokoroa, New Plymouth, Porirua, Westport, Christchurch, and Invercargill. The workshops were three hours long and included a demonstration of how the service would work. It turned out that the demonstration was critical in helping people understand the service and thereby give well-informed responses.

I was personally present at a few of these workshops to do the demonstration and also answer any questions about the service that people had. For me, it was an immensely rewarding experience. To get firsthand insight into people’s views is far richer and meaningful than getting it from a report.

The public consultation report (PDF) has now been received and published with a Summary Report.

[Original post at]

April 17, 2008 at 10:42 pm Leave a comment


This blog is no longer updated. See the About page for more info. I'm currently active on Twitter.
