How to tell DV and OV certificates apart

Introduction

There are in essence three kinds of SSL certificates: Domain Validated, Organization Validated and Extended Validation. I am not going to write about the differences here; there are already hundreds of articles on that topic on the Internet.

What I think has not been given sufficient coverage is how one is able to look at a certificate and determine what type it is.

One would think that this would be easy; in theory, if nothing was explicitly stated it would be a Domain Validated certificate (since that is the weakest validation), otherwise someone would put something in the certificate making it clear that the certificate was either Organization Validated or Extended Validation.

Unfortunately it's not that simple; the main issue is the historic lack of coordination within the CA industry.

Each Certificate Authority (CA) has its own practices relating to how it marks its certificates, so with the existing deployed certificates there is no single rule or approach that can be used to definitively know what level of validation was done for a given certificate.

Thankfully it looks like this problem is getting better thanks to the adoption of the Baseline Requirements, but in the meantime we have to make do with heuristics.

Deterministic Approach

Today the only way to know with confidence that a certificate is of a specific type is to know the practices of each CA.

In X.509 the way an issuer is supposed to express something like this is via the Certificate Policies extension, which is defined in RFC 5280.

This allows a CA to express a unique identifier (an OID) in its certificates that maps to a document describing the practices associated with that certificate. This identifier can be used programmatically to make trust decisions about a certificate or to differentiate the user interface in an application based on what type of certificate is being used.

This is exactly how browsers today can tell if a certificate is an Extended Validation (EV) certificate. In essence they have some configuration that says “I trust GlobalSign to issue EV certificates, when a certificate is presented to me from them that has this policy OID show the EV user experience”.

The Baseline Requirements use the same approach, defining identifiers for Domain Validated and Organization Validated certificates:

Type                      Policy Identifier
Domain Validated          2.23.140.1.2.1
Organization Validated    2.23.140.1.2.2

 

Having these identifiers takes us a long way towards our goal of deterministic evaluation of certificate issuance policy. That said, not all CAs have adopted them, which is technically acceptable since the Baseline Requirements do allow a CA to use its own Policy Identifiers instead.
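As an illustration (this is my own sketch, not something from the original post), here is roughly how you could check for these policy identifiers using the Python cryptography library; the file name is just a placeholder:

from cryptography import x509

# CA/Browser Forum Baseline Requirements policy identifiers
DV_OID = x509.ObjectIdentifier("2.23.140.1.2.1")
OV_OID = x509.ObjectIdentifier("2.23.140.1.2.2")

def br_policy_type(cert):
    """Return 'DV', 'OV' or 'unknown' based on the Certificate Policies extension."""
    try:
        policies = cert.extensions.get_extension_for_class(x509.CertificatePolicies).value
    except x509.ExtensionNotFound:
        return "unknown"
    asserted = {p.policy_identifier for p in policies}
    if DV_OID in asserted:
        return "DV"
    if OV_OID in asserted:
        return "OV"
    return "unknown"

with open("example.pem", "rb") as f:  # placeholder path
    cert = x509.load_pem_x509_certificate(f.read())
print(br_policy_type(cert))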

Heuristic Approach

Since the Baseline Requirements were only established this year it will take some time for the existing install base of certificates to be re-issued using the Policy Identifiers called out above. This doesn't mean you can't tell the certificates apart today, but it is quite a bit messier.

Here is some pseudo-code, provided to me as an example by a friend, that they used in one of their projects:

type = null;

if (cert is self-signed) then
     type = SS;   /* SS = Self-signed */
else if (cert was issued by a known "CA") then
     type = DV;   /* DV = Domain Validation */
else if (cert contains a known EV Policy OID) then
     type = EV;   /* EV = Extended Validation */
else if (cert "Subject O" and "Subject CN" are the same or
         "Subject OU" contains "Domain Control Validated") then {
     if (cert contains no Subject L, St or PostalCode) then
          type = DV;
}
else if (cert "Subject O" is "Persona Not Validated" and
         the cert's issuer was StartCom) then
     type = DV;

if (type is null)
     type = OV;

This logic is not comprehensive but should work well enough for most uses.
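If you want something closer to runnable code, below is my rough translation of the subject-based part of that heuristic using the Python cryptography library; it is a sketch under the same caveats as the pseudo-code, not the original project's implementation:

from cryptography import x509
from cryptography.x509.oid import NameOID

def subject_attr(cert, oid):
    """Return the first subject attribute value for the given OID, or None."""
    attrs = cert.subject.get_attributes_for_oid(oid)
    return attrs[0].value if attrs else None

def looks_domain_validated(cert):
    """Subject-based part of the heuristic above; not comprehensive."""
    o = subject_attr(cert, NameOID.ORGANIZATION_NAME)
    cn = subject_attr(cert, NameOID.COMMON_NAME)
    ou = subject_attr(cert, NameOID.ORGANIZATIONAL_UNIT_NAME)
    has_locality = any(
        subject_attr(cert, oid) is not None
        for oid in (NameOID.LOCALITY_NAME,
                    NameOID.STATE_OR_PROVINCE_NAME,
                    NameOID.POSTAL_CODE)
    )
    dv_style_subject = (o is not None and o == cn) or (ou is not None and "Domain Control Validated" in ou)
    if dv_style_subject and not has_locality:
        return True
    if o == "Persona Not Validated":  # StartCom-style marker
        return True
    return False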

Summary

Unfortunately today there is not a deterministic way to tell if a certificate was Domain or Organization Validated, that said things are changing and within a few years hopefully it will be possible.

In the mean-time there are heuristics you can use that help tell these types of certificates apart.

Windows XP and Name Constraints

Recently I blogged about how Windows XP processes Name Constraints a little differently than the RFC specifies. With the help of a friend I have put together a good set of examples of what would and would not work, which illustrate what it does.

Assuming our Subject was:

C = US;S = Washington;L = Kirkland;O = GlobalSign;CN = globalsign.com

 

And that our Constraint was:

Permitted

     [1]Subtrees (0..Max):

          DNS Name=globalsign.com

We would see different results when validating a certificate on XP than we would on a later version of Windows.

Notice we did not include any directoryName entries? That is supposed to mean there are no constraints on the directoryName. On Windows XP, however, if a certificate's subject contains a directory name there MUST be at least one directoryName entry in the constraint to match against, otherwise the certificate will not pass the Name Constraints check.

So if we instead made our constraint:

Permitted

     [1]Subtrees (0..Max):

          RFC822 Name=globalsign.com

     [3]Subtrees (0..Max):

          Directory Address:

          C = US

          S = Washington

          L = Kirkland

          O = Globalsign

Excluded=None

 

A certificate with any of the following subjects would match:

  • An empty DN, no RDNs
  • C = US
  • C = US;S = Washington
  • C = US;S = Washington;L = Kirkland
  • C = US;S = Washington;L = Kirkland;O = Globalsign
  • C = US;S = Washington;L = Kirkland;O = Globalsign;CN = globalsign.com
  • C = US;S =””;L = Kirkland;O =””;CN = globalsign.com

When XP processes the RDNs it starts with the first and progresses from there. You can’t skip an RDN. If an RDN is present it must match the entire RDN value or be empty.

As such the following wouldn’t match in our example:

  • S = Washington (Skipped the first RDN)
  • C = US;L = Kirkland (Skipped the second RDN)
  •  C = US;S = Washington;L = Kirkland;O = Globalsign Development Center (partial “O” value).
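For reference, here is a sketch of how a constraint like the one above could be encoded; it uses the Python cryptography library (not the tooling from the original post) and mirrors the permitted subtrees shown earlier:

from cryptography import x509
from cryptography.x509.oid import NameOID

# Permitted subtrees: an RFC822 name plus a directoryName entry, so that on XP
# there is a directory-name constraint available to match subject RDNs against.
permitted = [
    x509.RFC822Name("globalsign.com"),
    x509.DirectoryName(x509.Name([
        x509.NameAttribute(NameOID.COUNTRY_NAME, "US"),
        x509.NameAttribute(NameOID.STATE_OR_PROVINCE_NAME, "Washington"),
        x509.NameAttribute(NameOID.LOCALITY_NAME, "Kirkland"),
        x509.NameAttribute(NameOID.ORGANIZATION_NAME, "GlobalSign"),
    ])),
]
name_constraints = x509.NameConstraints(permitted_subtrees=permitted,
                                        excluded_subtrees=None)
# The extension would then be added to the CA certificate, for example via
# x509.CertificateBuilder().add_extension(name_constraints, critical=True).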

The prior blog post on this topic talks about how an enterprise can work around this behavior (by setting some registry keys), but a public certificate issuer can work around it too, for example by including just one RDN value in the constraint and ensuring the subordinate CA issues certificates containing that RDN value.

This way a site can have the flexibility it wants to change its directory structure without re-issuing the certificate containing the Name Constraints.

Ryan

Algorithms, key size and digital certificates

Introduction

On the surface digital certificates are not complicated: a third party (a certificate authority) verifies some evidence and produces a piece of identification that can be presented at a later date to prove that the verification has taken place.

As is usually the case, when we look a little deeper things are not that simple. In this case we have to care about a few other things, for example what are the qualifications of the third party, what are their practices and what cryptographic algorithms did they use to produce the digital certificate?

As an administrator using digital certificates, for example with SSL, these things can also have an impact on your operational environment: by using a certificate from a certificate authority you take dependencies on their practices and operational environment.

This is especially true when it comes to decisions relating to what cryptographic algorithms and key lengths are accepted and used by that third-party.

Thankfully you do not need to be a cryptographer to make good decisions on this topic; we just need to start with an understanding of the history, the future and then the considerations.

History

In recent history the industry has relied on two algorithms, the first being an encryption algorithm called RSA and the second a hash algorithm called SHA-1, both of which are now considered weaker due to advances in cryptanalysis.

RSA’s strength and performance is based on the size of the key used with it, the larger the key the stronger and slower it is.

These advances in cryptanalysis have driven the increase in key size used with this algorithm which in turn has increased the amount of computing power necessary to maintain the same effective strength.

The problem with this is that every time we double the size of an RSA key the decryption operations with that key become 6-7 times slower.

As a result of all of this, as of January 2011 trustworthy Certificate Authorities have aimed to comply with NIST (National Institute of Standards and Technology) recommendations by ensuring all new RSA certificates have keys of 2048 bits in length or longer.

Unfortunately this ever increasing key size game cannot continue forever, especially if we ever intend to see SSL make up the majority of traffic on the internet – the computational costs are simply too great.

That takes us to SHA-1. Hash algorithms take a variable amount of input and reduce it to a shorter, fixed-length output, the goal being to provide a unique identifier for that input. The important thing to understand is that hash algorithms are always susceptible to collisions, and advances in cryptanalysis have made it more likely that such a collision can be made.

The problem here is that there is no parameter to tweak that makes this problem harder for an attacker; the only way to address this issue is to change to a stronger algorithm to produce the hash.
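To make the fixed-length point concrete, here is a quick illustration using Python's hashlib (my example, not from the original post):

import hashlib

# Regardless of how large the input is, the digest length stays fixed.
print(hashlib.sha1(b"some input").hexdigest())    # 40 hex characters (160 bits)
print(hashlib.sha256(b"some input").hexdigest())  # 64 hex characters (256 bits)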

Future

For the last decade or so there has been slow and steady movement towards using two new algorithms to address these advances — SHA-2 and ECC.

ECC has the potential for significant performance benefits over RSA without reducing security, and SHA-2 comes in several output lengths (SHA-256, SHA-384 and SHA-512 being the most common), which both addresses the current risks and gives the algorithm some longevity.

Considerations

Our goal in configuring SSL is enabling users to communicate with us securely; to accomplish this we need to do it with the fewest hassles and lowest costs while complying with any associated standards.

Interoperability is the key that ensures the fewest hassles; if it were not for this we would simply switch to these new algorithms and be done with it. As is normally the case when it comes to security, this is where Windows XP rears its ugly head: SHA-2 was added to XP in Service Pack 3 and ECC in Windows Vista.

These facts set the adoption clock for these new algorithms; if you care about XP (about 30% of the Internet today) you can’t adopt ECC and SHA-2 in full for about 5 years.

This leaves us with RSA 2048 and SHA-1 which thankfully is broadly considered sufficient for the next decade.

Performance is always a concern as well; a 2048-bit RSA certificate used in SSL will result in around a 10% CPU overhead, not huge but something to keep in mind.

As mentioned previously we can’t forget compliance — whether it is the Payment Card Industry / Data Security Standards (PCI/DSS), Federal Information Processing Standards (FIPS) 140-2 or some other set of criteria you need to meet this always needs to be considered.

Conclusion

The decision of what algorithms and key lengths to use in your digital certificates depends on a number of factors including security, interoperability, performance and compliance. Each situation may require a different trade-off to be made, but as a rule of thumb if you stick with SHA-2 and RSA 2048-bit certificates and keys you should be fine for now.
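If you want to check what an existing certificate uses, here is a small sketch with the Python cryptography library (my example; the file name is a placeholder):

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa, ec

with open("example.pem", "rb") as f:  # placeholder path
    cert = x509.load_pem_x509_certificate(f.read())

# The hash used in the certificate signature, e.g. 'sha1' or 'sha256'.
print("Signature hash:", cert.signature_hash_algorithm.name)

key = cert.public_key()
if isinstance(key, rsa.RSAPublicKey):
    print("RSA key size:", key.key_size)
elif isinstance(key, ec.EllipticCurvePublicKey):
    print("ECC curve:", key.curve.name)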

 

Resources

[1]   BlueKrypt Cryptographic Key Length Recommendations

[2]   Recommendation for Key Management, Special Publication 800-57 Part 1 Rev. 3, NIST, 05/2011

[3]   Fact Sheet Suite B Cryptography, NSA, 11/2010

[4]   Worldwide Operating System Statistics, Stat Counter, 9/2012

[5]   RSA Algorithm, Wikipedia

[6]   RSA Key Lengths, Javamex

[7]   ECC Algorithm, Wikipedia

[8]   Performance Analysis of Elliptic Curve Cryptography for SSL, Sun

[9]   Using ECC keys in X509 certificates, UnmitigatedRisk

[10] Using SHA2 based signatures in X509 certificates, UnmitigatedRisk

[11] Payment Card Industry / Data Security Standards – PCI

[12] Federal Information Processing Standards 140-2 – NIST

XP and the undefined Name Constraints

So Qualified Subordination is super important; it's what really allows us to implement Least Privilege in PKI hierarchies.

This concept is implemented in Windows XP as of SP3 because there was a back-port of the Windows Vista certificate chain validation logic included in SP3.

With that said there is at least one difference between XP's chain validation logic and that of Vista, Windows 7 and Windows 8 in the way Name Constraints are processed.

More precisely, once XP sees a name constraint applied to a certificate it requires that only the names and name forms expressed in the Name Constraints extension be present in the certificates below it.

For example, let's say I restrict a CA to issue only for the DNS domain of example.com. Once this is put in the certificate I can no longer include “O=Example Company Name” in the subject of the certificates issued by that CA.

If I want the CA to be able to include that organization name in the certificates it issues I have to express that using the DirectoryName constraint.

This is not compliant with the RFC and was later changed so VISTA, Windows 7 and Windows 8 do not behave this way.

That said you can change the behavior in XP by tweaking the following registry key:

HKLM\Software\Policies\Microsoft\SystemCertificates\Root\ProtectedRoots

This is a bitmask represented as a REG_DWORD; it is defined in WinCrypt.h, and the following definitions will tell you how to tweak this one part:

#define CERT_PROT_ROOT_FLAGS_VALUE_NAME L"Flags"

// Set the following flag to disable checking for not defined name
// constraints.
//
// When set, CertGetCertificateChain won't check for or set the following
// dwErrorStatus: CERT_TRUST_HAS_NOT_DEFINED_NAME_CONSTRAINT.
//
// In LH, checking for not defined name constraints is always disabled.
#define CERT_PROT_ROOT_DISABLE_NOT_DEFINED_NAME_CONSTRAINT_FLAG 0x20

You could deploy this change via Group Policy if you did not want the default behavior; it's probably easier to just include the names you are willing to let the CA issue to, but changing the behavior in this way is an option for some.
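For completeness, here is a sketch (my own, assuming Python's winreg module and administrative rights) of OR-ing that bit into the Flags value rather than overwriting whatever else is set:

import winreg

KEY_PATH = r"Software\Policies\Microsoft\SystemCertificates\Root\ProtectedRoots"
DISABLE_NOT_DEFINED_NC = 0x20  # CERT_PROT_ROOT_DISABLE_NOT_DEFINED_NAME_CONSTRAINT_FLAG

# Open (or create) the policy key under HKLM and OR the flag into the existing bitmask.
with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_READ | winreg.KEY_SET_VALUE) as key:
    try:
        flags, _ = winreg.QueryValueEx(key, "Flags")
    except FileNotFoundError:
        flags = 0
    winreg.SetValueEx(key, "Flags", 0, winreg.REG_DWORD,
                      flags | DISABLE_NOT_DEFINED_NC)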

Hope this helps,

Ryan

Updated my script for Qualified Subordination testing

I did some testing with ECC and SHA2 today and as such decided to update my script for testing Qualified Subordination to make it easy to get certificates that use these algorithms.

There are now several configurable variables in makepki.bat:

  • key = possible values include RSA or ECC
  • rsasize = possible values include 1024, 2048, 4096
  • eccsize = possible values include secp256r1, secp384r1, secp521r1
  • hash = possible values include sha1, sha256, sha384, sha512

With these it's very easy to get test chains that include these algorithms. Have fun.

Ryan

Using ECC keys in X509 certificates

Recently the CAB Forum published a document called the Baseline Requirements for the Issuance and Management of Publicly Trusted Certificates.

This document was authored by both browsers and public CAs and is used by the browser vendors to mandate what minimum technical requirements need to be met for inclusion into their “Root Programs”.

One of the changes specified in this document is that subscriber certificates (aka SSL certificates) containing RSA keys must have a key length of at least 2048 bits. This is a change for a lot of CAs (GlobalSign made this change some time ago) and one that has implications for server operators.

Just take a look at the Crypto++ benchmarks to see how much more expensive 2048-bit RSA is. For most users this additional computational cost won't be an issue, but in some cases customers may need to increase the computing power they allocate for SSL establishment.

But what alternatives do you have? Well there is one: certificates with ECC keys. Using these has the potential to significantly decrease the computational cost of SSL negotiations (even over your old 1024-bit RSA certificate) but they come at a significant penalty – compatibility.
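If you do want to experiment with ECC, here is a minimal sketch of generating a P-256 key and a CSR to submit to your CA; it uses the Python cryptography library (not what the original post used), and the hostname and file names are placeholders:

from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.x509.oid import NameOID

# Generate a secp256r1 (P-256) private key and a certificate signing request.
key = ec.generate_private_key(ec.SECP256R1())
csr = (
    x509.CertificateSigningRequestBuilder()
    .subject_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "www.example.com")]))
    .sign(key, hashes.SHA256())
)

with open("ecc.key", "wb") as f:
    f.write(key.private_bytes(serialization.Encoding.PEM,
                              serialization.PrivateFormat.PKCS8,
                              serialization.NoEncryption()))
with open("ecc.csr", "wb") as f:
    f.write(csr.public_bytes(serialization.Encoding.PEM))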

ECC was not supported in Windows until Vista, which was released in 2007; this basically means 100% of the XP clients out there (around 29% of the browsers on the internet as of July 2012) would be unable to establish a session with your website if you switched exclusively to ECC.

This is important for more than just Internet Explorer users since even Chrome and Safari use CryptoAPI for certificate validation when on Windows.

This would mean these users would see a certificate error instead of your website.

That is pretty scary, so how long until we can use this more broadly? It's hard to say; there is a good article titled “The developers guide to browser adoption rates” that sheds some light, as do the historic gs.statcounter.com results. Based on these, unless there is a sudden change (which is possible, these machines are getting pretty old) I would assume that we have around 4-5 years of XP out there yet.

Hope this helps,

Ryan

Using SHA2 based signatures in X509 certificates

It’s been an exciting decade for cryptography; as a result we see smaller key sizes and weaker algorithms getting deprecated.

One driver of such things is the U.S. Federal Government, specifically NIST.

One example of this would be NIST Special Publication 800-131A which disallows the use of SHA1 after December 2013. What this means is if you are in the U.S. Federal Government or you work with them you may have to revise your technology strategy to use SHA2 in its place.

But what if you don’t have any policy mandate forcing you to do this switch? Well it’s a good idea but it has consequences too, namely compatibility.

You see SHA2 was published in 2001 so anything produced before then will not support it. The most notable example is Windows XP which as of July 2012 has about 29% presence on the Internet.

This is important for more than just Internet Explorer users since even Chrome and Safari use CryptoAPI for certificate validation when on Windows.

The good news is that XP SP3, which was released in 2008, added support for this new suite of hash algorithms, which raises the question: how many of those XP machines have SP3?

Unfortunately I don't have any public references that can answer this question, but let's say that 85% of all XP machines on the Internet have gotten this update (I have good confidence in this number); that means that 15% of those 29%, roughly 4% of all clients, would not be able to connect to your server over SSL if you used SHA2.

This would mean these users would see a certificate error instead of your website.

That is pretty scary, so how long until we can use this more broadly? It's hard to say; there is a good article titled “The developers guide to browser adoption rates” that sheds some light, as do the historic gs.statcounter.com results. Based on these, unless there is a sudden change (which is possible, these machines are getting pretty old) I would assume that we have around 4-5 years of XP out there yet.

Hope this helps,

Ryan

How to get your own OID arc

X509 uses Object Identifiers (OIDs) to uniquely identify things; for example, one assigns an OID to their Certification Practice Statement (CPS) so that it is possible to programmatically detect if a certificate meets a specific policy.

OIDs are managed as a namespace; this prevents “collisions”. As such one needs to request that an OID be assigned to them.

The “arc” part comes in when you get your OID: you can assign any numbers you want underneath it. For example, one might be assigned 1.1.1.1 and decide to “break” it up into chunks as follows:

  1. 1.1.1.1.2 – Documents
  2. 1.1.1.1.3 – Certificate Extensions
  3. 1.1.1.1.4 – Resource Identifiers

Underneath each of these you would assign unique numbers by appending a new number; for example 1.1.1.1.2.1 might be your CA's Certification Practice Statement (CPS).

So how do you get one of these OIDs? That's easy: it's the Internet Assigned Numbers Authority (IANA) who assigns these; they call them Private Enterprise Numbers. Getting one is easy enough, just fill out a web application form. To do that you will only need 7 pieces of information:

  1. Organization Name
  2. Organization Address
  3. Organization Phone
  4. Contact Name
  5. Contact Address
  6. Contact Phone
  7. Contact Email

Remember the idea is that the information you provide here will be used for people to reach you if they want to ask questions about these things you have uniquely identified so choose the values wisely.

It can take up to 60 days to get one of these (although usually the application is processed in about one week).
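Keep in mind that IANA Private Enterprise Numbers live under the arc 1.3.6.1.4.1, so an assignment of, say, 12345 (a number made up for illustration) gives you the arc 1.3.6.1.4.1.12345. As a sketch, here is how an OID from such an arc could be referenced in a Certificate Policies extension using the Python cryptography library; the OID and URL are hypothetical:

from cryptography import x509

# Hypothetical: 1.3.6.1.4.1.12345 is your arc, .2.1 is the OID you picked for your CPS.
MY_CPS_OID = x509.ObjectIdentifier("1.3.6.1.4.1.12345.2.1")

policies = x509.CertificatePolicies([
    # A string qualifier is encoded as a CPS URI pointing at the document itself.
    x509.PolicyInformation(MY_CPS_OID, ["https://example.com/cps"]),
])
# This extension would then be added to issued certificates via
# x509.CertificateBuilder().add_extension(policies, critical=False).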

Once you have your object identifier you should register it on www.oid-info.com and/or www.alvestrand.no; this way it will be easily accessible to anyone seeking information about the owner of an object identifier.

Hope this helps.

Ryan

Government CAs in the Microsoft Root Program

Microsoft was the first Root program in a browser to have an open and transparent process for becoming a CA as well as the first to have public policy, audit and technical requirements that CAs must comply with.

Today, while the other browsers have joined in and even raised the bar significantly, Microsoft continues to operate its root program in an open and clear way.

One example of this is the list they publish of the companies who meet their requirements; you can see this list here.

There are a number of interesting things we can glean from this list; one of them is how many governments have their own certificate authorities.

For example, as of March 11, 2011 we know that there are a total of 46 government owned and operated “Root Certificates” in the Microsoft Root Program; these include:

Current CA Owner Country Thumbprint
Government of Austria, Austria Telekom-Control Commission Austria e7 07 15 f6 f7 28 36 5b 51 90 e2 71 de e4 c6 5e be ea ca f3
Government of Brazil, Autoridade Certificadora Raiz Brasileira Brazil 8e fd ca bc 93 e6 1e 92 5d 4d 1d ed 18 1a 43 20 a4 67 a1 39
Government of Brazil, Instituto Nacional de Tecnologia da Informação (ITI) Brazil ‎70 5d 2b 45 65 c7 04 7a 54 06 94 a7 9a f7 ab b8 42 bd c1 61
Government of Finland, Population Register Centre Finland fa a7 d9 fb 31 b7 46 f2 00 a8 5e 65 79 76 13 d8 16 e0 63 b5
Government of France France 60 d6 89 74 b5 c2 65 9e 8a 0f c1 88 7c 88 d2 46 69 1b 18 2c
Government of Hong Kong (SAR), Hongkong Post Hong Kong (SAR) d6 da a8 20 8d 09 d2 15 4d 24 b5 2f cb 34 6e b2 58 b2 8a 58
Government of Hong Kong (SAR), Hongkong Post Hong Kong (SAR) e0 92 5e 18 c7 76 5e 22 da bd 94 27 52 9d a6 af 4e 06 64 28
Government of India, Ministry of Communications & Information Technology, Controller of Certifying Authorities (CCA) India 97 22 6a ae 4a 7a 64 a5 9b d1 67 87 f2 7f 84 1c 0a 00 1f d0
Government of Japan, Ministry of Internal Affairs and Communications Japan 96 83 38 f1 13 e3 6a 7b ab dd 08 f7 77 63 91 a6 87 36 58 2e
Government of Japan, Ministry of Internal Affairs and Communications Japan ‎7f 8a b0 cf d0 51 87 6a 66 f3 36 0f 47 c8 8d 8c d3 35 fc 74
Government of Korea, Korea Information Security Agency (KISA) South Korea 5f 4e 1f cf 31 b7 91 3b 85 0b 54 f6 e5 ff 50 1a 2b 6f c6 cf
Government of Korea, Korea Information Security Agency (KISA) South Korea 02 72 68 29 3e 5f 5d 17 aa a4 b3 c3 e6 36 1e 1f 92 57 5e aa
Government of Korea, Korea Information Security Agency (KISA) South Korea f5 c2 7c f5 ff f3 02 9a cf 1a 1a 4b ec 7e e1 96 4c 77 d7 84
Government of Korea, Ministry of Government Administration and Home Affairs (MOGAHA) South Korea 63 4c 3b 02 30 cf 1b 78 b4 56 9f ec f2 c0 4a 86 52 ef ef 0e
Government of Korea, Ministry of Government Administration and Home Affairs (MOGAHA) South Korea 20 cb 59 4f b4 ed d8 95 76 3f d5 25 4e 95 9a 66 74 c6 ee b2
Government of Latvia, Latvian Post Latvia 08 64 18 e9 06 ce e8 9c 23 53 b6 e2 7f bd 9e 74 39 f7 63 16
Government of Latvia, Latvian State Radio & Television Centre (LVRTC) Latvia c9 32 1d e6 b5 a8 26 66 cf 69 71 a1 8a 56 f2 d3 a8 67 56 02
Government of Lithuania, Registru Centras Lithuania 97 1d 34 86 fc 1e 8e 63 15 f7 c6 f2 e1 29 67 c7 24 34 22 14
Government of Macao, Macao Post Macao SAR ‎89 c3 2e 6b 52 4e 4d 65 38 8b 9e ce dc 63 71 34 ed 41 93 a3
Government of Mexico, Autoridad Certificadora Raiz de la Secretaria de Economia Mexico 34 d4 99 42 6f 9f c2 bb 27 b0 75 ba b6 82 aa e5 ef fc ba 74
Government of Portugal, Sistema de Certificação Electrónica do Estado (SCEE) / Electronic Certification System of the State Portugal ‎39 13 85 3e 45 c4 39 a2 da 71 8c df b6 f3 e0 33 e0 4f ee 71
Government of Serbia, PTT saobraćaja „Srbija” (Serbian Post) Serbia d6 bf 79 94 f4 2b e5 fa 29 da 0b d7 58 7b 59 1f 47 a4 4f 22
Government of Slovenia, Posta Slovenije (POSTArCA) Slovenia ‎b1 ea c3 e5 b8 24 76 e9 d5 0b 1e c6 7d 2c c1 1e 12 e0 b4 91
Government of Slovenia, Slovenian General Certification Authority (SIGEN-CA) Slovenia 3e 42 a1 87 06 bd 0c 9c cf 59 47 50 d2 e4 d6 ab 00 48 fd c4
Government of Slovenia, Slovenian Governmental Certification Authority (SIGOV-CA) Slovenia 7f b9 e2 c9 95 c9 7a 93 9f 9e 81 a0 7a ea 9b 4d 70 46 34 96
Government of Spain (CAV), Izenpe S.A. Spain 4a 3f 8d 6b dc 0e 1e cf cd 72 e3 77 de f2 d7 ff 92 c1 9b c7
Government of Spain (CAV), Izenpe S.A. Spain ‎30 77 9e 93 15 02 2e 94 85 6a 3f f8 bc f8 15 b0 82 f9 ae fd
Government of Spain, Autoritat de Certificació de la Comunitat Valenciana (ACCV) Spain a0 73 e5 c5 bd 43 61 0d 86 4c 21 13 0a 85 58 57 cc 9c ea 46
Government of Spain, Dirección General de la Policía – Ministerio del Interior – España. Spain b3 8f ec ec 0b 14 8a a6 86 c3 d0 0f 01 ec c8 84 8e 80 85 eb
Government of Spain, Fábrica Nacional de Moneda y Timbre (FNMT) Spain 43 f9 b1 10 d5 ba fd 48 22 52 31 b0 d0 08 2b 37 2f ef 9a 54
Government of Spain, Fábrica Nacional de Moneda y Timbre (FNMT) Spain b8 65 13 0b ed ca 38 d2 7f 69 92 94 20 77 0b ed 86 ef bc 10
Government of Sweden, Inera AB (SITHS-Secure IT within Health care Service) Sweden 16 d8 66 35 af 13 41 cd 34 79 94 45 eb 60 3e 27 37 02 96 5d
Government of Switzerland, Bundesamt für Informatik und Telekommunikation (BIT) Switzerland ‎6b 81 44 6a 5c dd f4 74 a0 f8 00 ff be 69 fd 0d b6 28 75 16
Government of Switzerland, Bundesamt für Informatik und Telekommunikation (BIT) Switzerland ‎25 3f 77 5b 0e 77 97 ab 64 5f 15 91 55 97 c3 9e 26 36 31 d1
Government of Taiwan, Government Root Certification Authority (GRCA) Taiwan ROC f4 8b 11 bf de ab be 94 54 20 71 e6 41 de 6b be 88 2b 40 b9
Government of The Netherlands, PKIoverheid The Netherlands 10 1d fa 3f d5 0b cb bb 9b b5 60 0c 19 55 a4 1a f4 73 3a 04
Government of The Netherlands, PKIoverheid The Netherlands 59 af 82 79 91 86 c7 b4 75 07 cb cf 03 57 46 eb 04 dd b7 16
Government of the United States of America, Federal PKI USA 76 b7 60 96 dd 14 56 29 ac 75 85 d3 70 63 c1 bc 47 86 1c 8b
Government of the United States of America, Federal PKI USA cb 44 a0 97 85 7c 45 fa 18 7e d9 52 08 6c b9 84 1f 2d 51 b5
Government of the United States of America, Federal PKI USA ‎90 5f 94 2f d9 f2 8f 67 9b 37 81 80 fd 4f 84 63 47 f6 45 c1
Government of Tunisia, Agence National de Certification Electronique / National Digital Certification Agency (ANCE/NDCA) Tunisia 30 70 f8 83 3e 4a a6 80 3e 09 a6 46 ae 3f 7d 8a e1 fd 16 54
Government of Tunisia, Agence National de Certification Electronique / National Digital Certification Agency (ANCE/NDCA) Tunisia d9 04 08 0a 49 29 c8 38 e9 f1 85 ec f7 a2 2d ef 99 34 24 07
Government of Turkey, Kamu Sertifikasyon Merkezi (Kamu SM) Turkey 1b 4b 39 61 26 27 6b 64 91 a2 68 6d d7 02 43 21 2d 1f 1d 96
Government of Uruguay, Correo Uruguayo Uruguay f9 dd 19 26 6b 20 43 f1 fe 4b 3d cb 01 90 af f1 1f 31 a6 9d
Government of Venezuela, Superintendencia de Servicios de Certificación Electrónica (SUSCERTE) Venezuela ‎dd 83 c5 19 d4 34 81 fa d4 c2 2c 03 d7 02 fe 9f 3b 22 f5 17
Government of Venezuela, Superintendencia de Servicios de Certificación Electrónica (SUSCERTE) Venezuela ‎39 8e be 9c 0f 46 c0 79 c3 c7 af e0 7a 2f dd 9f ae 5f 8a 5c

 

With a closer look we see that these 46 certificates are operated by 33 different agencies in 26 countries.

 

Wikipedia tells us there are 207 governments, so apparently roughly 13% of them (26 of 207) operate their own globally trusted root.

 

Though I love to travel and consider myself a citizen of the world, I have never needed to communicate with any of these governments using their private PKIs, so I personally have marked them as “revoked” in CryptoAPI; I also manage which of the commercial root CAs I trust manually.

There are some other interesting observations we can glean from the Root Program membership as well; I will do more posts on these later.

Serving OCSP POST responses on a CDN

The other day I did a blog post on how we are using a CDN to front our OCSP services; the CDN we are using is CloudFlare, who is one of our partners.

In that blog post I mentioned that POST requests from an OCSP client would normally be a cache miss for a CDN, and this introduces additional latency in serving these responses.

Even though the response times we were getting included this additional latency the performance was still acceptable, but we wanted to do better, so a few weeks ago I went to the CloudFlare office and worked with them on making their service OCSP aware.

Specifically, we made it so that when they get an OCSP POST they can determine what cached response to return (for example from a prior GET) instead of going back to our responders to do that for them.

As of today this change has gone live. As you know I love numbers, which is why I have been publishing these repository performance numbers. What you will see if you look at those is that our worldwide average is right around 100ms; if we take out China and Australia that figure drops to around 60ms.

The problem with these numbers is they only test the GET variant of the protocol; that is because neither Pingdom nor Monitis let me simulate binary POSTs (which is what the OCSP POST variant looks like).

With that said we can extrapolate what the numbers look like fairly easily; from my current network location (which is slow) I ran the same test against both the GET and POST variants of the protocol.

What I want you to notice here is that over 15 requests when I do a POST to the CDN instance of our responder I get the same response time average as I do when I perform the same test with the GET variant.

What this tells us is that POST is performing exactly the same as GET, so we can safely say that in our case the performance numbers I have been publishing for GET are also accurate for POST.

What you will also see is that our own responder is slower to serve via POST; this is because it is designed around nonced OCSP requests and as such isn't optimized as much as it could be for caching them.

I should also note that our decision to put a CDN in front of our OCSP responder does not break clients that want to send nonced requests (no browsers do by default, by the way); these are simply treated as a cache miss.
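For reference, here is a sketch of how the cacheable GET variant of an OCSP request is formed (per RFC 5019 the request is DER encoded, base64 encoded and appended to the responder URL); the code uses the Python cryptography library and the function and variable names are mine:

import base64
import urllib.parse
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.x509 import ocsp

def ocsp_get_url(responder_url, cert, issuer):
    """Build the GET URL for an un-nonced OCSP request about `cert`.

    Because there is no nonce, every client asking about the same certificate
    produces the same URL, which is what makes the response cacheable by a CDN.
    """
    req = ocsp.OCSPRequestBuilder().add_certificate(cert, issuer, hashes.SHA1()).build()
    der = req.public_bytes(serialization.Encoding.DER)
    return responder_url.rstrip("/") + "/" + urllib.parse.quote(base64.b64encode(der).decode("ascii"))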

Ryan