How to do OCSP requests using OpenSSL and CURL

 

It's pretty easy; the OpenSSL and CURL manuals cover the individual pieces, but I thought I would put it all here in a single post for you.

First, in these examples I used the certificates from the http://www.globalsign.com site; I saved the www certificate to globalsign.com.cer and its issuer to globalsigng2.cer.

Next you will find a series of commands used to generate both POSTs and GETs for OCSP:

1. Create an OCSP request to work with; this will also produce a POST to the OCSP responder

openssl ocsp -noverify -no_nonce -respout ocspglobalsignca.resp -reqout ocspglobalsignca.req -issuer globalsigng2.cer -cert globalsign.com.cer -url "http://ocsp2.globalsign.com/gsextendvalg2" -header "HOST" "ocsp2.globalsign.com" -text
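
If you want to sanity-check the request you just created, OpenSSL can print it back out for you (this uses the file name from the command above):

openssl ocsp -reqin ocspglobalsignca.req -text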

2. Base64 encode the DER-encoded OCSP request

openssl enc -in ocspglobalsignca.req -out ocspglobalsignca.req.b64 -a

3. URL encode the Base64 blob after removing any line breaks (see http://meyerweb.com/eric/tools/dencoder/ for an encoder/decoder)
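
For example, on a Unix-like shell with Python 3 available, something like this should do both steps at once (a sketch; any URL encoder will do):

tr -d '\n' < ocspglobalsignca.req.b64 | python3 -c "import sys, urllib.parse; print(urllib.parse.quote(sys.stdin.read(), safe=''))"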

4. Copy the URL-encoded Base64 into the URL you will use in your GET:

http://ocsp2.globalsign.com/gsextendvalg2/{URL encoded Base64 Here}

5. Do your GET:

curl --verbose --url http://ocsp2.globalsign.com/gsextendvalg2/MFMwUTBPME0wSzAJBgUrDgMCGgUABBSgcg6ganxiAlTyqPWd0nuk87cvpAQUsLBK%2FRx1KPgcYaoT9vrBkD1rFqMCEhEhD0Xjo%2FV7lgq3ziGoWG69rA%3D%3D

 

If you like, you can also replay the request that was generated with OpenSSL as a POST:

curl --verbose --data-binary @ocspglobalsignca.req -H "Content-Type: application/ocsp-request" --url http://ocsp2.globalsign.com/gsextendvalg2
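
If you also want to capture and decode the response from that POST, something like the following should work (the output file name is just my choice):

curl --verbose --data-binary @ocspglobalsignca.req -H "Content-Type: application/ocsp-request" --output ocspglobalsignca-post.resp --url http://ocsp2.globalsign.com/gsextendvalg2

openssl ocsp -respin ocspglobalsignca-post.resp -text -noverify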

Hard revocation checking and why it's not here yet

If you follow discussions around X.509 and SSL you have likely heard that "revocation checking is broken"; you might even hear that it will never work and that we should therefore start over with a technology that isn't dependent on this concept.

There are some merits to these arguments, but I don't agree with the conclusion; I thought I would summarize what the problems are in this post.

Fundamentally, the largest problem is that, as deployed, all X.509 revocation technologies introduce a communication with a third party (the Certificate Authority).

This isn’t necessarily a deal breaker but it does have consequences, for example in the case of SSL:

  1. It can slow down the user’s experience.
  2. It introduces a new point of failure in a transaction.

These issues can be mitigated through intelligent deployments and engineering, but unfortunately this really has not happened; as a result, browsers have implemented what is called "soft-fail" revocation checking.

With soft-fail revocation checking, browsers ignore all conditions other than an authoritative "revoked" message. In the case of OCSP, that means if they reach the responder and it says "I don't know the status", or if they fail to reach the responder at all, they assume the certificate is "good".

This behavior is of course fundamentally flawed. The browsers say they have no choice (I disagree with this conclusion, but that's a topic for another post) other than to behave this way, but why?

The rationale is as follows:

  1. Revocation repositories are not reliable.
  2. Revocation repositories are slow.
  3. Revocation repositories are not always available (captive portals).
  4. Revocation messages are too large to be returned in time.
  5. There are too many revocation messages to be returned in time.

These are all legitimate concerns, ones that are unfortunately as true today as they were almost a decade ago.

They are not, however, insurmountable, and I think it's time we as an industry did something about it.

Using OpenSSL to create a test Qualified Subordination PKI hierarchy

The other day I posted about "Least Privilege and Subordinate Certificate Authorities"; that post talked about how you can delegate only a limited set of rights to a subordinate CA. I thought you might find the set of configuration files and batch files I put together to test these scenarios useful, so here you go.

I threw this together on a Windows machine that had the Shining Light OpenSSL distribution on it; it has the following batch files:

  • CleanPKI.bat – Remove all generated content
  • MakePKI.bat – Make a new PKI

Then there are the OpenSSL configuration files; most of this is straightforward, but for the parts that are not, check out the OpenSSL documentation.
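
To give a flavor of what is in them, here is a sketch of the kind of extension section these configuration files contain for a constrained subordinate CA (the section name and the example.com namespace are mine; see the x509v3_config documentation for the full syntax):

[ v3_constrained_subca ]
basicConstraints = critical, CA:true, pathlen:0
keyUsage = critical, keyCertSign, cRLSign
extendedKeyUsage = serverAuth
nameConstraints = critical, permitted;DNS:.example.com

This delegates the right to issue SSL server certificates for hosts under example.com and nothing else; the pathlen:0 also prevents the subordinate from creating further CAs.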

Least Privilege and Subordinate Certificate Authorities

One of the most fundamental design principles when designing a secure system is that of least privilege; in the case of CAs, one scenario where this can be applied is the subordination of another CA.

The application of this concept in this scenario is referred to as qualified subordination. It was first formalized in the IETF standards for X.509 in 1999 in RFC 2459 through the introduction of Basic Constraints (see section 4.2.1.10), Name Constraints (see section 4.2.1.11) and Policy Constraints (see section 4.2.1.12).

Unfortunately, broad product support did not begin to emerge until RFC 3280 was released in 2002.

The development and deployment of these concepts was primarily driven by the US Federal Government's deployment of PKI as a foundational technology for their security infrastructure. One of the many benefits of the government adopting these concepts was that NIST published a robust test suite to validate conformance with their interpretations of RFC 3280, which included extensive coverage of Qualified Subordination.

When these concepts are used together a Root CA is able to delegate the right to issue certificates to another CA while restricting them from creating other CAs or issuing certificates for names they are not authoritative for.

The Federal Bridge made extensive use of these concepts; it was able to do so through the mandate to use software that met the published guidelines. Adoption on the Internet, however, took much longer given the historically slow adoption rates for browsers; thankfully that has changed, and there is now sufficient browser support to deploy these restrictions.

In addition, Microsoft introduced another mechanism to restrict the scope for which a CA is trusted; they did this by treating the Extended Key Usage extension (see section 4.2.1.13) as a means to delegate only certain issuance capabilities to a Certificate Authority.

It accomplishes this by using the same logic specified in RFC 3280 for Certificate Policies (see section 4.2.1.5); more specifically, it assumes that when an issuer lists an Extended Key Usage (such as the one for S/MIME encryption) in a CA certificate, the issuer intended to restrict the usage of that CA to the EKUs present in the certificate. A simplified version of this logic was also adopted by OpenSSL for SSL certificates.
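
A concrete way to see the OpenSSL flavor of this behavior is to ask it to verify a chain for a specific purpose; if an intermediate does not carry the needed EKU, validation should fail (the file names here are placeholders):

openssl verify -CAfile root.pem -untrusted subca.pem -purpose sslserver server.pem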

Given that the Microsoft behavior is more restrictive than the behavior specified in RFC 3280, it does not break applications that do not support it, and it allows a CA to restrict behavior even further for clients that use the Windows certificate validation logic (nearly 70% of the deployed browsers today).

 

Client Compatibility

Most browsers and email clients support these concepts; unfortunately, however, not all of them support Name Constraints.

Despite that, they all do support honoring the RFC 3280 behavior for critical extensions (see section 4.2), which states:

A certificate-using system MUST reject the certificate if it encounters a critical extension it does not recognize

This means that by marking the Name Constraints extension critical, those implementations that do not support the concept will "fail closed". As a result, it can be used as an effective way to technically enforce that CAs are not trusted for names they are not authoritative for; it also means there will be cases where a CA may be authoritative but clients can't trust the certificates it issues.

This issue can be addressed by not marking the extension critical; when this is done, the clients that understand Name Constraints will continue to honor the policies expressed in it, and those that do not will simply ignore the extension.
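
In OpenSSL configuration terms, the difference between the two deployments is just the critical keyword (the example.com namespace here is mine):

nameConstraints = critical, permitted;DNS:.example.com

nameConstraints = permitted;DNS:.example.com

The first form fails closed on clients that do not understand the extension; the second is simply ignored by them.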

This is, of course, a trade of security for compatibility; with that said, it is one with far more positives than negatives.

Specifically, this approach means users of clients that do not support the extension are no worse off than they are without its use, while those with support get additional protection from cases where a subordinate CA has been compromised or is willfully issuing certificates it is not authoritative for.

With that said, support for Name Constraints is actually quite good as the following table illustrates.

 

Client | Honors Criticality | Supports Basic Constraints | Supports DNS Name Constraints | Supports RFC 822 Name Constraints | Supports Policy Constraints | Supports constrained EKU | Successfully enforces
IE [1] | Yes | Yes | Yes | N/A | Yes | Yes | Yes (Open)
Outlook [1] | Yes | Yes | Yes | Yes | Yes | Yes | Yes (Open)
Firefox [1] | Yes | Yes | Yes | Yes | Yes | No | Yes (Open)
Thunderbird [1] | Yes | Yes | Yes | Yes | Yes | Yes | Yes (Open)
Opera [1] | Yes | Yes | No [2] | No [2] | No [2] | Yes (SSL only) [3] | Yes (Closed)
Windows / Safari [1] | Yes | Yes | Yes | Yes | Yes | Yes | Yes (Open)
OSX / Safari [4] | Yes | Yes | No [5] | No [5] | No [5] | No | Yes (Closed)

 

What this table shows is:

  1. It is possible to rely on the Name Constraints extension as an effective enforcement technique if the extension is marked as critical.
  2. It is possible to rely on the Basic Constraints extension as an effective enforcement technique.
  3. In the case of Safari and Opera, this success is due to these browsers honoring the semantics for critical extensions rather than understanding the Name Constraints extension itself.

For customers this means that if you must interoperate with Opera or Safari (yes, even on iPad and iPhone), the use of a certificate with a critical Name Constraints extension in it will result in the certificate chain looking invalid.

Thankfully, according to StatCounter these represent less than 6% of all browsers on the Internet, and anecdotal evidence shows almost no use in the enterprise.

With that said, most environments' business requirements will not allow them to fail even for such a small number; in these environments, deploying Name Constraints as a non-critical extension will be required. Not 100% of the security benefits are realized with this approach, but it does significantly reduce the risk.

In such cases it is recommended that, once the remaining legacy clients that do not support Name Constraints have been replaced with more recent versions that do, the CAs be re-issued with the extension marked as critical.

 


[1] Tests on Windows were completed with Windows 7, IE 9.0, Outlook 2007, Safari 5.05, Opera 11.61, Firefox/Thunderbird 10.0.2.

[2] OpenSSL supports name constraints for both name forms as well as policy constraints; Opera has chosen not to enable these capabilities until demand is present. This work was done in OpenSSL in 2008 as part of a contract with Google.

[3] Opera uses OpenSSL, which supports restricting a CA from issuing valid SSL server certificates if its parent did not place the SSL EKU in its certificate.

[4] Tests on OSX were completed with Lion and Safari 5.05

[5] Safari on the Mac is tested against the PKITS tests, so Apple is aware of the deficiency in their validation logic; they have not publicly stated they will support these constraints, but we expect support in the future.

 

Server Compatibility

If you have a server that accepts or validates client certificates, you will also care about its support for validating certificates that have these constraints.

Each environment is a little different, and the number of server choices one sees in these cases feels limitless at times; as such, we are only able to provide more abstract guidance here.

In the case of Windows servers, such as IIS, the important factor is what version of Windows you are running, as the support for PKI is built into the Windows platform. Applications are most commonly built on this platform when they are designed for Windows, and this is always the case for Microsoft applications.

The concepts discussed here have all been supported since Windows Server 2003, though there were significant improvements in the 2008 release.

The net of the above is that if your server platform is built on this API, you gain support for these concepts; on other platforms it of course depends on which libraries they chose to use for certificate validation.
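
If you want to see how the Windows validation stack evaluates a particular chain, certutil will dump the results of chain building for you (a quick sketch; the file name is a placeholder):

certutil -verify server.cer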

 

SSL/TLS Deployment Best Practices

SSL/TLS seems simple: you go to a CA to prove who you are, they give you a credential, you install it on your server, you turn on SSL, and you are done.

Unfortunately, there is more to it than that. I recently had an opportunity to contribute to a Best Practices Guide (PDF) that aims to provide clear and concise instructions to help administrators deploy it securely.

The intention is to work on an advanced version of this document in the future that covers more details and advanced topics as well (think OCSP Stapling, SPDY, etc).

I hope you find it useful.

Leaving Microsoft, My Goodbye Letter

Here is my goodbye letter to all of the amazing people I worked with over the last decade:

It was December 2001 when I came back to Microsoft. I joined the team chartered to build security technologies into Windows; I could think of nowhere else I wanted to be. After all, what other technology company in the world had the opportunity to positively impact the security of so many?

In my time here I have had the honor of working with some of the best and brightest our industry has to offer, working on some of the largest and most ambitious software engineering challenges in the world.

I have had the opportunity to work on platforms for cryptography, public key infrastructure, smart cards, biometrics, network authentication and policy, network isolation, cloud authentication, document signing, code signing, secure boot, volume encryption, enthusiast user experience, helped secure the advertising platform and so much more.

All the while I had the honor (and responsibility) of representing Microsoft in standards forums, working closely with industry partners and leaders to deliver the technology that has laid the groundwork for the consumerization of IT we are experiencing today.

My time here has taught me more than I ever thought it would; as much as the experiences themselves made me better, my greatest lessons came from you. Sometimes these lessons were a result of the folks I worked with respectfully helping me grow, but in many cases they came from simply watching how easy you all make the stuff we do look.

For these lessons I want to thank you.

Ten years later, a new set of challenges are emerging; Certificate Authorities are being forced to re-evaluate how they do business as a result of Advanced Persistent Threats and emerging technologies changing the way trust is communicated on the Internet. These challenges, of course, also represent an opportunity.

As such, I have accepted a position with GlobalSign as their Chief Technology Officer, where I have an opportunity to re-think what it means to be a trusted third-party. My last day will be January 20th.

Please keep up the good work and don’t be a stranger,

– Ryan Hurst
rmh (at) unmitigatedrisk.com
http://unmitigatedrisk.com/
@rmhrisk on Twitter

How to clear the CryptNet cache in Windows 7

OK, so this is going to be geeky, and I wouldn't normally post stuff like this to my Facebook page, but for various reasons I can't post to my blog right now and I want to capture this somewhere.

So, in Windows there are several services related to cryptography, certificates and smart cards; these services are able to perform actions for the user and system in the background and enable application developers to do things in a least-privileged way.

One of the core services in these scenarios is the “Cryptographic Services” service; it does a bunch of things including the wire retrievals for CryptoAPI.

Specifically, it is the worker for CryptRetrieveObjectByUrl, which is used by Windows and other applications to gather the evidence necessary to validate certificates; such evidence includes intermediate certificates, CRLs, OCSP responses and a file commonly referred to as the Windows Certificate Trust List.

This API (at least in Windows 7) maintains a single, system-wide cache of the objects it has downloaded.

These files are kept in a hidden system folder called CryptnetUrlCache. In some cases you may want to test a scenario without relying on the cache; to do that you must flush the cache. The easiest way to do that is to open an administrative command prompt and run the following commands:

cd %WINDIR%\ServiceProfiles\LocalService\AppData\LocalLow\Microsoft\CryptnetUrlCache
attrib -s .\Content\*.*
del .\Content\*.*
attrib -s .\MetaData\*.*
del .\MetaData\*.*

cd %WINDIR%\SysWOW64\config\systemprofile\AppData\LocalLow\Microsoft\CryptnetUrlCache
attrib -s .\Content\*.*
del .\Content\*.*
attrib -s .\MetaData\*.*
del .\MetaData\*.*

cd %WINDIR%\System32\config\systemprofile\AppData\LocalLow\Microsoft\CryptnetUrlCache
attrib -s .\Content\*.*
del .\Content\*.*
attrib -s .\MetaData\*.*
del .\MetaData\*.*

cd %WINDIR%\ServiceProfiles\NetworkService\AppData\LocalLow\Microsoft\CryptnetUrlCache
attrib -s .\Content\*.*
del .\Content\*.*
attrib -s .\MetaData\*.*
del .\MetaData\*.*

 

Alternatively you can call this command:

certutil -URLcache * delete
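
And if you just want to see what is in the cache without deleting anything, the same switch will list the entries:

certutil -URLcache *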

 

No reboot is necessary; the next time a component calls the CryptRetrieveObjectByUrl API it will not be able to satisfy that request with the cached data and will be forced to go on the wire.

One of the functions the service offers is the automatic update of the root store; a way to validate that the cache is not being used is to:

  1. Remove all "Trusted Third-Party Certification Authorities" from the Computer Account's store using the Certificate Management console.
  2. Clear the cache as described above.
  3. Visit https://www.godaddy.com in IE.
  4. Open Event Viewer\Application.
  5. Sort on "Event ID" and find event 4097.

Since a new event log entry is created every time a root is added, you will see something that says "Successful auto update of third-party root certificate" in the event log; you will also see a few files in the directories you previously cleared.
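
If you prefer the command line to the Event Viewer UI, something like this should surface the same events (a sketch using wevtutil; adjust the count to taste):

wevtutil qe Application /q:"*[System[(EventID=4097)]]" /f:text /c:5 /rd:true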

This all tells you that a new wire retrieval took place and that the cache was not used.

You can, of course, also use tools like RegMon/FileMon, as well as network monitors, to infer much of the same.

 

Hope this helps someone someday,

Ryan

How to mitigate the risk of the DigiNotar *.google.com SSL certificate

Given the recent news relating to DigiNotar issuing a certificate to a malicious entity claiming to represent Google, it's probably most appropriate to cease trusting the DigiNotar root until the specifics of the issue have been identified.

As a practical matter they do little work outside the EU and are a very small player, so your experience on the Internet is not likely to be diminished as a result of not trusting them anyway.

That raises the question of how to do that. On the surface you might think all you need to do is remove the DigiNotar root from your root store, and in the case of Firefox and Opera that would do it (at least until they next patch and it gets added back in; that is, unless they nix it too).

In the case of IE and Chrome (which uses the Windows trust anchors) this is insufficient; there is a feature called "Automatic Root Update" (http://netsekure.org/2011/04/automatic-ca-root-certificate-updates-on-windows) that maintains the roots for you based on a policy that Microsoft maintains. When it's enabled, Windows will check with Windows Update as part of certificate validation to see if it should add a root to enable the path to build. You do not have to use this capability, but I would not recommend disabling it unless you are PKI savvy.

If that’s the path you follow, be sure to delete the root certificate from your Computer Accounts Third-Party Certification Authorities store also (if it happens not to be there, don’t fret if it isn’t that just means you have never encountered a certificate from them).

You also might want to check out a couple of posts that Nasko has done relating to managing your own certificate store, like this one (http://netsekure.org/2010/05/results-after-30-days-of-almost-no-trusted-cas/) and this one (http://netsekure.org/page/2/).

For everyone else all I would recommend is:

  1. Download the DigiNotar Root Certificate (http://www.diginotar.nl/files/Rootcertificaten/DigiNotar%20root%20CA2007.crt)
  2. Run mmc.exe
  3. Add the Certificate Management console
  4. Target it at the Computer Account certificate stores
  5. Add the DigiNotar Root Certificate to the "Untrusted Certificates" store
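
If you prefer the command line, the same thing can be done from an elevated prompt with certutil ("Disallowed" is the internal name of the Untrusted Certificates store; the file name is whatever you saved the certificate as):

certutil -addstore Disallowed "DigiNotar root CA2007.crt"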

Now this is a bit more draconian than you may strictly need, but until it's clear whether it was the root that was compromised, a subordinate, or their vetting practices, the right thing to do is not to trust any certificates from them.

As I said before, this is not likely to have any negative effect on your experience on the web, and it will protect you from the attacks this issue represents.

I should note that I am assuming (and you know what they say about assumptions) that this is the only DigiNotar root trusted by the browsers; I checked a few sources and it seems that is the case, but all of the CA trust programs do a poor job of publishing this stuff these days. When I ran the Windows program we maintained a KB article with the trusted CAs and their certificates in it; that doesn't seem to be the case any longer, sigh.

Good luck,

Ryan

P.S. Nasko has also done a good post on how to manage the root stores on Windows you can find it here (http://netsekure.org/2010/04/how-to-disable-trusted-root-certificates/)

P.P.S. I have verified that for sites that have been pinned, Chrome (only Chrome, and only for pinned sites) will flag these certificates. IMHO this is good, but you still need to remove the root to be safe in the other cases. (see: http://www.breitbart.com/article.php?id=CNG.f17dd620575edb02954a7f8f0971f63b.4c1&show_article=1)

P.P.P.S. Looks like all 3 major browsers have now untrusted DigiNotar:

http://www.microsoft.com/technet/security/advisory/2607712.mspx?pubDate=2011-08-29
http://blog.mozilla.com/security/2011/08/29/fraudulent-google-com-certificate/
http://code.google.com/p/chromium/issues/detail?id=94673

The Contrition of a Security Practitioner

The Encarta World Dictionary says that contrition is "the deep and genuine feelings of guilt and remorse". Having been involved in information security for 20 years now, I think I can sincerely say that many security practitioners would say this is how they feel about the early days of their careers.

Why, you ask? Well, in my case, I started my career doing work for large financial institutions and governments. Back then these sorts of customers often had a “security at any price” mantra. While one would need to assess the risk of a system to secure it, these sorts of customers would also plan to mitigate as many of the identified risks as possible.

For these customers this was not necessarily a bad approach, but that had more to do with what was at risk than it did with the approach being a sound one.

Today the world is a different place; security is something that even the smallest businesses need to consider. This change did not occur overnight. It was gradual and I guess this is where the contrition comes in.

You see, many applied the same approaches that worked with those financial and government customers to Fortune 500 and later Fortune 1000 companies. While in some cases this was appropriate, in most cases it was not.

The modern security practitioner needs to take a more holistic look at the business and platform they are servicing to understand its schedule and technological needs along with what the immediate business risks are.

Beyond that, the breadth of the role has changed and expanded. Security practitioners are now commonly responsible for Compliance, Reliability and Privacy, as well.

This puts the security practitioner in an interesting position; with this more complete view they can now help improve:

  • time to market, by recommending solutions that are risk-appropriate for the business;
  • engineering efficiencies, by identifying areas where work is being done inefficiently;
  • systems and processes, by identifying gaps and potential failure points that can negatively impact the business;
  • how teams allocate their scarce resources, by identifying opportunities where they’ll do the most good, based on risk vs. return.

This represents a significant shift from a decade or two ago, and requires the security practitioner to no longer simply be an outside expert but become part of the development team they support.

This is one of the reasons the Security Champion model is used in many teams here at Microsoft. While it has its challenges, as a member of the feature team a champion has the opportunity to develop and share the more holistic insights I called out above.

A good example of this is the application of cryptography to solve business problems. Cryptography is a powerful tool, but it’s often misapplied, introducing fragility and operational overhead that can be avoided; I think this is best summed up by this quotation:

If you think cryptography will solve your problem, then you don’t understand cryptography… and you don’t understand your problem. — Bruce Schneier

So, my ask of you is this: as an engineering manager, have a formal Security Assurance program for your team; as a software engineer, incorporate your security specialists early and often. They either have direct experience in the areas I discussed here or are in the position to bring those resources to your aid … to not only help you secure your offerings, but to do so in record time, as well.