Apparently one of the reasons states have been reluctant to publish legal material online is concern over how relying parties can tell whether the material is authentic and has not been tampered with.
In an attempt to address this concern, a model law called the Uniform Electronic Legal Material Act (UELMA) has been proposed, the text of which states, at a high level, that this must be addressed:
“An official publisher of legal material in an electronic record that is designated as official under Section 4 shall authenticate the record. To authenticate an electronic record, the publisher shall provide a method for a user to determine that the record received by the user from the publisher is unaltered from the official record published by the publisher.”
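To make that requirement concrete, the weakest possible reading is a published digest the user can recompute themselves. Here is a minimal standard-library sketch (the record bytes are hypothetical); note that a bare digest only demonstrates integrity, which is why real solutions sign the digest so that authenticity can be checked as well:

```python
import hashlib

# Hypothetical: the official record as published, and the copy a user received.
official_record = b"Section 1. This Act may be cited as the Example Act."
received_record = b"Section 1. This Act may be cited as the Example Act."

# The publisher announces a digest of the official record out-of-band.
published_digest = hashlib.sha256(official_record).hexdigest()

# The user recomputes the digest over what they received and compares.
unaltered = hashlib.sha256(received_record).hexdigest() == published_digest
print(unaltered)  # True only if the received copy matches the official record
```

Any change to a single byte of the received copy changes the digest, so the comparison fails; what a digest alone cannot tell you is who published it, which is the gap signatures fill.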
UELMA, which was proposed in 2011, has been enacted into law in 12 states (including California, Colorado, Connecticut, Delaware, Hawaii, Idaho, Illinois, Minnesota, Nevada, North Dakota, Oregon, and Pennsylvania). With that said, it looks like it may be yet another unfunded mandate, in that there doesn’t appear to be much activity in the way of publishing signed data.
As with most US laws, UELMA doesn’t specify how one would meet this requirement, but the most obvious way would be to publish these documents as PDF files and sign them using PAdES. In many cases (especially legal text) this would be the ideal solution, given how easy it is to both apply and verify signatures thanks to the broad support of the standard.
But why is there such broad support for this standard? It’s simple: the EU takes a totally different approach to the problem of specifying what makes a “legal” electronic signature than we do. The US basically doesn’t specify any format or requirements for signatures, while the EU specifies four allowable formats (each with different use cases), of which PAdES is one.
But why did they choose four formats and not just one? That is easy: while a signed PDF may be a great way to make content accessible and verifiable to people, it is not a good solution for structured data that will be parsed by machines. For these machine-readable cases the Europeans rely on CAdES, XAdES, and ASiC, which are better signature formats for machine-readable data.
Since the US doesn’t specify how one should address this problem, a non-profit called US Open Data is advocating a solution they helped develop called Data Seal, a web application that sits on top of PGP to verify files, to be used for all of the above cases.
In my opinion this is a bad approach; here are just a few reasons:
- PGP is 24 years old and has a wonderful mix of usability and interoperability issues that have not been solved in a meaningful way (though there are many who are trying [like Data Seal], even many of these supporters now see it as a lost cause).
- Dependence on what is today, in essence, a single-vendor commercial solution (even if it is based on an open standard and is open source) means that if these tiny vendors go out of business there is no practical way for “real users” to verify the authenticity of the documents/data.
- Legal documents need to be verifiable long into the future, and this approach does not consider the concept of long-term signature verification (time-stamping, crypto-periods, etc.).
- Pushing for the adoption of a single machine-readable signature format (PGP) across the board, at the expense of providing a human-readable solution that is easy to use and verify, is a short-sighted and bad tradeoff.
- The world is getting smaller, and interoperability is more important today than ever. If we’re going to adopt a different way of solving the same problem than a large majority of the globe, it should provide sufficient material benefits to offset the interoperability and accessibility impacts such a decision carries with it.
I could even argue that, as architected, Data Seal doesn’t actually meet the UELMA requirements: UELMA requires that the solution preserve the data and make it permanently available, but this solution does not provide a way for the signatures themselves to be verified long-term.
Anyway, all of this is an interesting side-effect of the US approach to legislation. We try to allow innovation by not over-specifying how the market solves a problem, while the EU tends to be overly specific and restrictive, which tends to hurt innovation. I am almost always a fan of the US approach, as governments move much, much slower than the market and tend to create structural barriers to innovation. With that said, I think interoperability is a case where standards are needed, and when it comes to how governments publish and authenticate documents there should be a standard.
I just had a conversation on Twitter with Waldo Jaquith of US Open Data about my post. It seems the post was not received well. In the end he seemed to suggest that I was saying yet another standard was needed vs. using PGP.
To be clear, I am suggesting that Data Seal is the one proposing yet another standard, a standard (plain ol’ PGP) that does not solve any problem the existing solutions do not already solve; in fact, it solves less.
The standards in use for document authentication have been around and in use since 1999, with over a decade of interoperability and international case law behind them. As a result, there are numerous commercial and open-source solutions built on them as well.
You want an open-source signing and verification solution? There is SignServer. Maybe you want a library you can use to build a web-based solution on? There is jsrsasign. Or maybe you want to build your own custom solution in Python like they did; there’s a module for that too.
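To give a flavor of what such a custom Python approach looks like, here is a rough sketch using the widely used `cryptography` package: it generates a throwaway self-signed certificate and produces a detached CMS/PKCS#7 signature, the format family that CAdES builds on. This is a toy under stated assumptions, not a CAdES profile; a real deployment would use a CA-issued certificate and add the long-term pieces (timestamps, revocation data) discussed above:

```python
import datetime

from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.hazmat.primitives.serialization import pkcs7

# Throwaway key and self-signed certificate (a real publisher would use a CA-issued cert).
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "Example Official Publisher")])
now = datetime.datetime.now(datetime.timezone.utc)
cert = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)  # self-signed: subject == issuer
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + datetime.timedelta(days=365))
    .sign(key, hashes.SHA256())
)

# Detached CMS/PKCS#7 signature over the (hypothetical) record bytes.
record = b"hypothetical official legal record"
signature = (
    pkcs7.PKCS7SignatureBuilder()
    .set_data(record)
    .add_signer(cert, key, hashes.SHA256())
    .sign(serialization.Encoding.DER, [pkcs7.PKCS7Options.DetachedSignature])
)
print(len(signature) > 0)  # DER-encoded CMS SignedData blob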
So, to be clear, my post above is not about them providing a turnkey web page for verifying a signature. It is that they are encouraging adoption of yet another digital signature format for document/file authentication, one that doesn’t consider some very fundamental problems, like how you verify the signature long-term.
Additionally, based on the post they made, they seem to be framing their solution as an alternative to PDF signing, which is supported natively in a bunch of solutions where verification of the signatures is built into the viewing experience. If they were instead saying “hey, here is a solution that augments that for machine-readable files when there isn’t a standard available,” it would be different.
Just imagine a world where 50 states each adopt their own format, each county within each state chooses its own thing, and of course each agency in the federal government does its own thing too.
I like what Waldo and the US Open Data initiative are doing and I hope they keep it up, but I hope they re-think their advocacy strategy to one that takes standards, interoperability, and long-term signature verification into account.