Signed, Auditable, Offline-Tolerant, PQ Secure QR Codes

A few months ago I wrote about what it would take to make a QR code verifiable in a post-quantum world. That post was mostly conceptual. In this post I want to explore what it would look like to build one that is genuinely verifiable: not just signed, but auditable, offline-tolerant, and ready for a post-quantum world. A conversation with Bruno Couillard last week nudged me to write down the thoughts I had been carrying about exactly that.

The design draws heavily on the draft for Merkle Tree Certificates, which is working through the IETF right now. MTC is aimed at TLS, but the core insight is that you can replace per-certificate signatures with compact Merkle inclusion proofs against a periodically updated signed root, and that insight translates directly to QR codes once you think carefully about the offline constraint. If you haven’t read it, the draft is at datatracker.ietf.org/doc/draft-davidben-tls-merkle-tree-certs.

The result of applying that idea to the QR problem is MTA-QR, a working implementation of what I’ve been calling Merkle Tree Assertions for QR codes. The demo is live at mta-qr.peculiarventures.com, and the full source is at github.com/PeculiarVentures/mta-qr-demo. There are Go and TypeScript implementations, a browser-only demo that generates and verifies without any backend, and an interoperability test matrix that exercises all three signing algorithms against both runtimes in every combination.

To be clear, this isn’t a production-ready library, but building it helped me identify things I had missed while whiteboarding it in my head.

The size problem is real but solvable

The original post flagged signature size as the central constraint. An ML-DSA-44 signature is 2,420 bytes. A Version 40 QR code at high ECC holds about 1,273 usable bytes. Those two numbers don’t fit in the same sentence without a solution.

The solution is separating what goes in the QR from what you need to verify it. The QR carries the assertion content, a Merkle inclusion proof, and coordinates pointing to a signed checkpoint. The checkpoint itself contains the issuer signature, lives outside the QR, and gets cached on the verifier’s device, typically during a charge cycle before the device ever sees a QR code. Once cached, verification is fully offline.
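A minimal sketch of that split, with invented field names rather than the demo’s actual wire format: the QR payload carries the assertion, the inclusion proof, and coordinates, while the issuer signature lives only in the separately cached checkpoint.

```go
package main

import "fmt"

// QRPayload is a hypothetical layout for what travels inside the QR.
// Field names here are illustrative, not the demo's encoding.
type QRPayload struct {
	Assertion      []byte     // bearer claims
	InclusionProof [][32]byte // Merkle sibling hashes
	LeafIndex      uint64     // where this assertion sits in the log
	CheckpointID   [32]byte   // which cached checkpoint verifies the proof
}

// Checkpoint travels out of band and is cached on the verifier.
type Checkpoint struct {
	TreeSize  uint64
	RootHash  [32]byte
	Signature []byte // e.g. ML-DSA-44 (2,420 bytes) — never inside the QR
}

func main() {
	// Worst-case proof in this design is eight 32-byte hashes.
	p := QRPayload{InclusionProof: make([][32]byte, 8)}
	fmt.Println(len(p.InclusionProof) * 32)
}
```

Everything size-sensitive stays in `QRPayload`; everything signature-heavy stays in `Checkpoint`.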

The proof is the interesting part. A two-level tiled Merkle tree, with an inner batch tree and an outer parent tree, caps the total proof at eight hashes regardless of how large the log grows. Eight hashes is 256 bytes. That’s the ceiling, forever. The QR version stays fixed. The code never gets denser as the issuer accumulates millions of entries.
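The tiled two-level layout changes how proofs are fetched, not how they verify. Verification itself is the standard transparency-log inclusion check; a sketch following the RFC 9162 algorithm with its domain-separated hashing (0x00 for leaves, 0x01 for interior nodes):

```go
package main

import (
	"crypto/sha256"
	"fmt"
)

// leafHash and nodeHash use the RFC 9162 domain separation.
func leafHash(data []byte) [32]byte {
	return sha256.Sum256(append([]byte{0x00}, data...))
}

func nodeHash(l, r [32]byte) [32]byte {
	buf := append([]byte{0x01}, l[:]...)
	return sha256.Sum256(append(buf, r[:]...))
}

// verifyInclusion recomputes the root from a leaf and its sibling path.
func verifyInclusion(leafIndex, treeSize uint64, leaf [32]byte, proof [][32]byte, root [32]byte) bool {
	if leafIndex >= treeSize {
		return false
	}
	fn, sn := leafIndex, treeSize-1
	r := leaf
	for _, p := range proof {
		if sn == 0 {
			return false
		}
		if fn%2 == 1 || fn == sn {
			r = nodeHash(p, r)
			for fn%2 == 0 && fn != 0 { // realign after a right-edge merge
				fn >>= 1
				sn >>= 1
			}
		} else {
			r = nodeHash(r, p)
		}
		fn >>= 1
		sn >>= 1
	}
	return sn == 0 && r == root
}

func main() {
	// Three-leaf tree: root = node(node(a,b), c).
	a, b, c := leafHash([]byte("L0")), leafHash([]byte("L1")), leafHash([]byte("L2"))
	root := nodeHash(nodeHash(a, b), c)
	// The proof for leaf 2 is a single sibling: the left subtree's hash.
	fmt.Println(verifyInclusion(2, 3, c, [][32]byte{nodeHash(a, b)}, root))
}
```

A proof is just sibling hashes, so its length is the tree depth, which the tiling caps at eight.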

In practice, a Mode 1 QR carrying bearer claims and a Merkle inclusion proof fits comfortably within a Version 10 to 15 code at medium ECC, well under 500 bytes total. ML-DSA-44 doesn’t appear in the QR at all. The issuer signature lives in the checkpoint that the verifier fetched during its last charge cycle.

ML-DSA-44 won’t fit in a single QR in Mode 0, the fully embedded mode where the signature is in the QR itself. Mode 0 is the bootstrap mode: it works on air-gapped verifiers, on paper QR codes printed before any checkpoint infrastructure exists, and for scenarios where prefetch is operationally impractical. It’s not a niche failure case; it’s the starting condition for any new deployment. Mode 0 with PQC will require waiting for NIST to finalize smaller-signature algorithms, or accepting larger QR codes. Mode 1 is the practical path to PQC today.

Offline tolerance is mostly a framing problem

There’s a habit of treating offline verification as binary, either the device has connectivity at scan time, or it doesn’t. That framing creates a false constraint.

Every verifier with a battery has a window where it is stationary, connected, and idle. That’s when it charges. Fetching a checkpoint during a charge cycle is trivially cheap compared to everything else happening during that window. The relevant question isn’t whether the device has connectivity at scan time. It’s whether the assertion being scanned was issued before the verifier’s last checkpoint fetch.

For the common case, the answer is yes. A concert ticket issued last week, a prescription filled this morning, a badge issued at enrollment, all of these predate the verifier’s cached checkpoint by hours or days. Verification is fully offline because the relevant checkpoint was already there.

The narrow failure case is an assertion issued and scanned within the same charge cycle, before any checkpoint fetch. That falls back to a single cache-miss network call, which then covers every subsequent scan of the same batch. One round trip, then fully offline for the rest of the operational period.
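The control flow is simple enough to sketch. The names below (`Verifier`, `checkpointFor`) are illustrative, not the demo’s API; the point is that the cached checkpoint handles the common case and a single fetch handles the miss.

```go
package main

import (
	"errors"
	"fmt"
)

// Checkpoint is a stand-in; issuer signature verification is elided here.
type Checkpoint struct {
	TreeSize uint64
	RootHash [32]byte
}

type Verifier struct {
	cached *Checkpoint                       // refreshed during charge cycles
	fetch  func(uint64) (*Checkpoint, error) // nil when offline
}

// checkpointFor returns a checkpoint covering treeSize, hitting the
// network only when the cached one predates the assertion.
func (v *Verifier) checkpointFor(treeSize uint64) (*Checkpoint, error) {
	if v.cached != nil && v.cached.TreeSize >= treeSize {
		return v.cached, nil // common case: fully offline
	}
	if v.fetch == nil {
		return nil, errors.New("assertion newer than cached checkpoint and no connectivity")
	}
	cp, err := v.fetch(treeSize)
	if err != nil {
		return nil, err
	}
	v.cached = cp // one round trip now covers every later scan of this batch
	return cp, nil
}

func main() {
	calls := 0
	v := &Verifier{
		cached: &Checkpoint{TreeSize: 100},
		fetch: func(n uint64) (*Checkpoint, error) {
			calls++
			return &Checkpoint{TreeSize: n}, nil
		},
	}
	v.checkpointFor(42)  // issued before the last fetch: cache hit
	v.checkpointFor(150) // same-cycle assertion: the one cache miss
	v.checkpointFor(149) // covered by the refreshed cache
	fmt.Println(calls)
}
```

Three scans, one network call, and the call happens only for the same-charge-cycle edge case.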

Witnessing is where the transparency guarantee actually lives

The issuer’s signature proves the assertion came from a specific key. That’s useful, but it doesn’t prevent a compromised issuer from presenting different views of the log to different verifiers. Split-view attacks are subtle and hard to detect after the fact.

Witnesses solve this. A witness cosigns a checkpoint only after verifying it extends the previous one they saw, establishing a consistency guarantee across the full history of the log. Once multiple independent witnesses have cosigned a checkpoint, the issuer cannot retroactively rewrite or fork the log without those witnesses catching it.

The witness protocol comes from c2sp.org/tlog-cosignature, the same infrastructure underpinning the transparency.dev witness network. I worked on that witness network during my time at Google, so it was never far from my mind when designing this. Connecting MTA-QR to it means the issuance of every assertion can be monitored by parties with no relationship to the issuer. That’s the difference between a signed QR and an auditable one.

The implementation uses Ed25519 for witness cosignatures regardless of what algorithm the issuer uses for checkpoints. That isn’t a design choice I made; it’s what the spec requires. It means an issuer can use ML-DSA-44 for the checkpoint signature while the witness infrastructure stays on stable, widely deployed Ed25519 keys. The two concerns are separated cleanly, and that separation matters. The quantum threat to the issuer signature and the operational threat to the witness network are different problems on different timelines.

What I had wrong in the original post

The earlier post mentioned UOV and SQISign as especially promising for QR codes because of their smaller signature sizes. That framing isn’t wrong exactly; smaller signatures do help with the size constraint, and both algorithms are genuinely interesting work. But the NIST competition covering them isn’t finished, which means neither is practical for anything you’d want to deploy or standardize against today. More importantly, once you separate the checkpoint from the payload, signature size matters only for the checkpoint, which isn’t size-constrained anyway. The Merkle structure removes the problem that UOV and SQISign were addressing. They may still have a role in Mode 0 once the standards are settled, but they’re not the lever that makes the design work.

What’s still missing

The spec has a revocation mechanism based on index ranges that a verifier checks at scan time, but the format for distributing and authenticating those revocation lists isn’t fully defined yet. This is the most operationally significant open item. An unsigned revocation list is vulnerable to a stale-list attack at the network layer. An adversary who can delay or suppress list delivery can extend the validity of a revoked assertion. The natural fix is issuer-signed lists using the same key that signs checkpoints, but that format isn’t written yet. Until it is, revocation is a weak link in any deployment that takes revocation seriously.
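To make the index-range idea concrete, here is one hypothetical shape such a list could take. Every name below is invented, since the post notes the format isn’t defined yet; the `Issued` counter illustrates one way to defeat stale-list replay, and signature verification is elided.

```go
package main

import (
	"fmt"
	"sort"
)

// RevocationList is a hypothetical issuer-signed structure; the real
// format is an open item in the spec.
type RevocationList struct {
	Ranges    [][2]uint64 // sorted, non-overlapping [start, end] leaf-index ranges
	Issued    uint64      // monotonic freshness counter against stale-list replay
	Signature []byte      // same issuer key that signs checkpoints (check elided)
}

// Revoked binary-searches the sorted ranges for the assertion's leaf index.
func (l *RevocationList) Revoked(index uint64) bool {
	i := sort.Search(len(l.Ranges), func(i int) bool { return l.Ranges[i][1] >= index })
	return i < len(l.Ranges) && l.Ranges[i][0] <= index
}

func main() {
	l := &RevocationList{Ranges: [][2]uint64{{10, 20}, {42, 42}, {100, 199}}}
	fmt.Println(l.Revoked(15), l.Revoked(21), l.Revoked(42))
}
```

Ranges compress well when revocations cluster (a stolen batch, a compromised issuance window), which is presumably why the spec chose them over per-index lists.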

Type 0x02 key assertions, where the QR proves possession of a private key rather than just embedding bearer claims, are defined in the log entry format but the challenge-response protocol isn’t specified. Two implementations can’t interoperate on key assertions without it.

The C2SP tlog-checkpoint format needs registrations for ECDSA and ML-DSA before those algorithms can interoperate with standard tlog-checkpoint parsers. Ed25519 is fully specified today. ECDSA and ML-DSA work in the reference implementation but aren’t interoperable with external tooling yet. This is a practical blocker for adoption by anyone not using the reference implementation, and it’s the right next conversation to have with the C2SP and MTC communities.

Try it

The browser demo runs entirely in-page with no backend. It generates Ed25519 or ML-DSA-44 keys in your browser, issues assertions, builds the Merkle tree, produces QR codes, and runs the full 15-step verification trace. The tamper panel lets you flip proof bytes, corrupt the TBS, zero the proof, or truncate the payload, and watch exactly which verification step catches each failure. It’s a useful way to build intuition for what the protocol is actually checking and why each step is there.

The repo is at github.com/PeculiarVentures/mta-qr-demo. Pull requests welcome, especially on the open items.
