
The Demise of 1024-bit Certificates

November 15, 2013 | Seth Shearer

Nearly everyone understands the need to use encryption to protect data both in transit and at rest, but I have found that there is some confusion about the strength of the key that is used to encrypt that data. Some of this confusion stems from the fact that we have been warned for so long that certain keys and certificates are not strong enough, yet the organizations that issue these certificates continue to allow us to acquire these weak encryption assets. This practice reminds me of the proverb of the boy who cried wolf. For nearly four years, the U.S. National Institute of Standards and Technology (NIST) has been telling us that we should be using 2048-bit keys, but we have collectively ignored those warnings. Now that the requirement to use 2048-bit certificates is upon us, many are decrying the fact that it is being foisted upon the industry without much warning. These people are forgetting that we have grown complacent with the constant warning cries, much like the people who ignored the boy crying wolf. Now that the danger is upon us, we are caught unaware and unprepared.

What’s the big deal anyway? Hackers may be able to use the increasing computational power available, either on their own physical hardware or through rented cloud computing hours, to overcome the encryption strength of algorithms and key sizes that were once deemed sufficient. A few years ago, a “brute force” attack that cracked an encryption algorithm was unthinkable, but such an attack is fast approaching the horizon of reality. In a brute force attack, a machine systematically tries all possible encryption combinations in an attempt to crack open the data. To understand how a simple brute force attack works, think of an old-fashioned, three-digit briefcase lock. If you wanted to, you could try rotating the wheels in a systematic fashion until you stumbled across the right combination to open the lock. You may discover the right combination of digits on the first try, or the hundredth, or the last possible combination, but you will discover the combination sooner or later.
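To make the briefcase-lock analogy concrete, here is a minimal Python sketch (my illustration, not from the original post) that enumerates all 1,000 three-digit combinations in order, exactly the way a brute force attack walks the key space:

```python
from itertools import product

def brute_force_lock(secret):
    """Systematically try every 3-digit combination, 000 through 999,
    and report how many attempts it took to hit the secret."""
    for attempt, combo in enumerate(product(range(10), repeat=3), start=1):
        if combo == secret:
            return attempt
    return None

# Worst case is all 1,000 combinations; on average, about half that.
tries = brute_force_lock((7, 4, 2))  # → 743
```

The point of larger key sizes is simply to make this search space astronomically larger: each additional key bit doubles the number of combinations an attacker must try.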

According to Moore’s Law, computational power doubles roughly every 18-24 months. This means that a key strength that was adequate a few years ago is rapidly approaching obsolescence. And if quantum computing continues to evolve and makes its way to the masses, even encryption techniques that we feel are adequate today may be rendered useless, and our algorithms will need to evolve to resist far more powerful attacks. Therefore, correcting a problem today, such as the move from 1024-bit to 2048-bit encryption, will not ensure that you will never have to do this again.
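The arithmetic behind that doubling is easy to sketch. The 18-month period below is one common reading of Moore's Law, an illustrative assumption rather than a precise constant:

```python
def compute_growth(years, doubling_months=18):
    """Multiplicative growth in attacker computing power after `years`,
    assuming one doubling every `doubling_months` months."""
    return 2 ** (years * 12 / doubling_months)

# Over a decade at an 18-month doubling period, an attacker's budget
# buys roughly 100x the compute it did when the key size was chosen.
factor = compute_growth(10)
```

Each doubling of attacker compute effectively shaves about one bit off a fixed key's security margin, which is why key-size minimums are ratcheted up over time rather than set once.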

The Certificate Authority/Browser Forum (CA/B Forum) seems to have sounded the death knell for the 1024-bit certificate. The forum has instructed Certificate Authorities (CAs) to support only 2048-bit certificates and larger by the end of 2013. Responding to this requirement, many CAs stated that they would revoke all active certificates below 2048 bits on October 1, 2013. Other CAs have been a bit more gracious in not overtly revoking them on that day, but all have stood by the CA/B Forum’s edict to require 2048-bit certificates by 1/1/2014. See the blog post “Gone in 60 Months” for additional information on this topic.

What is the result of this requirement? In short, the most visible action may be that individual Internet browsers will begin to disallow the use of certificates that are less than 2048 bits. For example, Mozilla provided an implementation timeline of December 31, 2013: “Soon after this date [December 31, 2013], Mozilla will disable the SSL and Code Signing trust bits for root certificates with RSA key sizes smaller than 2048 bits” [CA:MD5and1024]. And on September 24, 2013, Google Chrome’s team stated in a submission to the CA/B Forum that “in early 2014, Chrome will begin warning users who attempt to access sites with certificates issued by publicly-trusted CAs, that meet the Baseline Requirements' effective date, and with key sizes smaller than those specified in Appendix A [that is, less than 2048 bits]” [Upcoming changes to Google Chrome's certificate handling].
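The check the browsers describe boils down to comparing the bit length of a certificate's RSA public modulus against the 2048-bit floor. A minimal illustration in Python; the moduli below are contrived placeholders, not real keys:

```python
MIN_RSA_BITS = 2048

def key_is_acceptable(modulus: int, min_bits: int = MIN_RSA_BITS) -> bool:
    """An RSA key's nominal strength is the bit length of its public
    modulus; browsers reject anything under the minimum."""
    return modulus.bit_length() >= min_bits

# Contrived stand-ins: the smallest odd 1024-bit and 2048-bit integers.
weak   = (1 << 1023) | 1   # bit_length() == 1024 → rejected
strong = (1 << 2047) | 1   # bit_length() == 2048 → accepted
key_is_acceptable(weak)    # → False
key_is_acceptable(strong)  # → True
```

In practice the modulus comes from parsing the certificate's public key, but the decision rule itself is just this comparison.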

“What effect will this new requirement have on my organization?” is a question I have fielded with ever-increasing frequency. Most people ask this question with a sense of confidence, clearly thinking that there will be little or no impact. Unfortunately, my experience and research have shown that this will not be the case. Even a cursory review of some of the most common banking, retail, airline, software-vendor, and even government sites will yield a plethora of 1024-bit certificates.

This problem becomes even more difficult to tackle because organizations incorrectly assume that they can rely on their internal and external CAs to help them identify all their weak certificates. If the only certificates in use were those generated by a CA, then this would be a reasonable approach. However, almost any software application that we install and most hardware devices on which we rely have certificates, and a huge portion of these certificates are 1024 bits. These certificates are found in email systems, virtualization platforms, routers and switches, printers, databases, telecom equipment, and almost all other systems and devices we rely upon in our daily operations. I’m certainly not trying to be an alarmist and say that the world will come to a screeching halt on 1/1/2014 à la Y2K. There is a real possibility, however, that users will have a particularly bad experience when they see browser warnings on the websites where they transact business. Perhaps network administrators will be unable to log in remotely to manage the network virtualization infrastructure, or even to perform simple tasks such as adjusting wireless access points.

The first step in remediating this issue is really quite simple: we must create a comprehensive inventory of all certificates on all of our devices. The second step is also relatively easy: we must continuously monitor the entire network to ensure that “weak” certificates are not introduced into the environment as new applications come online and new devices are installed. The third step is a bit more involved: we must move quickly to replace all weak certificates identified in the first step. When more weak certificates are discovered on the network during continuous monitoring (the second step), they must be replaced immediately. It becomes more challenging when these certificates are found on hardware devices such as printers and telecom equipment, yet no weak certificate or key should be ignored. Only with constant vigilance, until the industry from top to bottom fully complies with the requirement to use 2048-bit certificates, will we be able to eradicate these weak keys and certificates.
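The three-step workflow can be sketched in a few lines. This is a simplified illustration with made-up hostnames and a bare-bones record type; a real inventory would pull key sizes from actual certificate scans:

```python
from dataclasses import dataclass

MIN_BITS = 2048

@dataclass
class Certificate:
    host: str       # system or device where the certificate was found
    key_bits: int   # RSA key size in bits

def inventory_weak_certs(certs):
    """Step 1: from a comprehensive inventory, flag every certificate
    whose key falls below the 2048-bit minimum."""
    return [c for c in certs if c.key_bits < MIN_BITS]

def monitor_new_certs(newly_discovered):
    """Step 2: as new applications come online and new devices are
    installed, run the same check on each newly discovered certificate."""
    return inventory_weak_certs(newly_discovered)

# Step 3: everything flagged goes on the replace-immediately list,
# printers and telecom equipment included.
fleet = [Certificate("mail.example.com", 1024),
         Certificate("www.example.com", 2048),
         Certificate("printer-3.corp", 1024)]
weak = inventory_weak_certs(fleet)   # two certificates need replacement
```

The value of separating steps one and two is that the same weakness check runs both as a one-time sweep and as an ongoing gate on anything new entering the environment.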
