Monday, 17 November 2025

How to avoid Q-Day?

IEC E-Tech article

According to the 2024 Microsoft Digital Defense Report, there are an estimated 600 million cyber attacks every day around the world, and things could be about to get significantly worse. Rapidly developing quantum computing technology is poised to render most current methods of encryption redundant.

At the same time, emerging post-quantum cryptography (PQC) promises a new defence that would, in theory, protect not just critical national assets but all digital systems from quantum attacks. The race is on to build worldwide resilience before what most pundits call Q-Day – the point at which quantum computers can break today’s encryption and digital security collapses.

Cryptography’s role in our digital societies

Business and infrastructure are under constant attack from both cyber criminals and nation-state actors. Multinational car manufacturers, European airports, Japanese weather forecasters and Australian pension funds are among those to have been hacked this year, with consequences that could be financially disastrous and, in the case of airports, fatal.

“Cryptography has become the lynchpin of our digital society,” says Robert Oates, associate director at a UK-based cyber security specialist. “It provides three key services: keeping data confidential; guaranteeing [the website] you are connected to is the one you think it is; and preventing messages from being tampered with. The internet would not be the place it is today if you could no longer trust who you were speaking to or you couldn't guarantee security of payment details when we transact online. Cryptography is fundamental to the way our society works.”

Algorithmic systems like RSA, a form of asymmetric encryption named after its creators Rivest, Shamir and Adleman, and Elliptic Curve Cryptography (ECC), employed in various digital signature schemes, are foundational to securing modern digital infrastructure – from military communications to medical records, financial information and private emails. Yet these are the systems most vulnerable to being broken by quantum computing.

RSA and ECC rely on classical computing, in which a digit (or bit) is either 0 or 1, and on mathematical problems – factoring very large numbers in the case of RSA, and the elliptic curve discrete logarithm problem in the case of ECC – that classical machines cannot solve in any practical amount of time. Because a quantum bit (qubit) can exist in a superposition of states, effectively holding several values at once, a sufficiently large quantum computer running Shor’s algorithm could, in theory, solve those problems and crack conventional encryption in a fraction of the time. “A sufficiently capable quantum computer would be able to sift through a vast number of potential solutions to these problems very quickly, thereby defeating current encryption,” warned the U.S. Department of Commerce’s National Institute of Standards and Technology (NIST) last year.
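
To see why the classical route is hopeless, consider a deliberately naive Python sketch that factors a toy RSA-style modulus by trial division. The numbers here are minuscule compared with the 2048-bit moduli used in practice, where this brute-force search would take longer than the age of the universe – the very assumption that Shor’s algorithm overturns. This is an illustration only, not how real attacks (or real key sizes) work.

```python
def factor(n: int) -> tuple[int, int]:
    """Return a pair of non-trivial factors of an odd composite n by trial division."""
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2
    raise ValueError("no factor found: n is prime")

# Toy "public modulus": the product of two million-scale primes.
# Real RSA moduli are 2048 bits (about 617 decimal digits).
p, q = 999983, 1000003
n = p * q
print(factor(n))  # (999983, 1000003) -- instant here, hopeless at real key sizes
```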

Crucially, RSA and ECC are also asymmetric systems, in which one key is public and used to encrypt data while another key is private and used for decryption. “It was always assumed that it would take millions of years to calculate the private key from the public key,” Oates explains. “The crux of the threat with quantum is that it can calculate the private key in hours. That's a real problem because you put your public key out there, and suddenly anyone can read messages intended for you. Quantum can do it in a few hours, not because it's faster, but because it works in a fundamentally different way to classical computers.” The fact that a cryptanalytically relevant quantum computer (CRQC) capable of cracking conventional cryptography has yet to be built does not undermine the widely held conviction that, at some point, one will be.
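
For readers who want to see that public/private split in practice, here is a minimal sketch using the widely available Python cryptography package (a tooling choice assumed here purely for illustration, not something the article prescribes): anyone can encrypt with the public key, only the private-key holder can decrypt, and it is precisely that private key which a CRQC could derive from the public one.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Generate an RSA key pair: the public key can be shared with anyone,
# the private key must stay secret.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Standard OAEP padding for RSA encryption.
oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Anyone holding the public key can encrypt...
ciphertext = public_key.encrypt(b"card ending 4242, amount 99.99", oaep)

# ...but only the private-key holder can decrypt. A CRQC running Shor's
# algorithm could recover the private key from the public one and read this too.
assert private_key.decrypt(ciphertext, oaep) == b"card ending 4242, amount 99.99"
```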

There are differences of opinion as to when this might happen, but government agencies are taking no chances. The US federal agency NIST suggests such capability could exist within a decade, “threatening the security and privacy of individuals, organizations and entire nations”. The UK’s National Cyber Security Centre has tasked organizations with having a plan for becoming quantum safe by 2028, migrating high-priority systems by 2031 and upgrading all security by 2035 to be resilient to the threat from “future large-scale, fault-tolerant quantum computers”. The EU has set out a similar timeline.

Julian Van Velzen, Head of the Quantum Lab at a French multinational IT company, says, “RSA could be broken by quantum almost overnight, were a suitable computer to be developed. It's just a matter of time. The risk of a CRQC being ready by 2035 is maybe 50/50 but you need to prepare now.” The scale of the upgrade is the biggest issue. “All cryptography has to be replaced,” Van Velzen says. “It’s in hardware. It's in software. It's in operating systems, databases and VPNs. It's like Y2K [the millennium bug], but, since 2000, the internet has exploded in complexity and therefore exploded in cryptography. The challenge is huge.”

The risk is real today. Criminals are believed to be intercepting encrypted data and storing it for potential decryption with quantum computers in the future. So-called “harvest now, decrypt later” attacks pose a problem for everybody, says Daryl Flack, a partner at one of the IT security companies supporting the UK government on the first phase of national PQC migration. “Any malicious actor who's listening in now could capture all of this traffic. It may be encrypted for now, but once you get a quantum relevant computer, you can effectively reverse engineer the math.”

Post-quantum cryptography

Perhaps surprisingly, the solution to the quantum threat is to rewire every existing cryptographic system with new algorithms that are still based on classical mathematics. Mathematicians have been working on post-quantum cryptography (PQC) for nearly a decade. NIST led the charge by running a competition to find new algorithms resistant to quantum computing, whittling a field of 69 candidate algorithms down to three, which it published as the first finalized PQC standards in 2024.

“We know that quantum computers can do very specific computations very well, but there are plenty of things that they are not good at,” says Van Velzen. “PQC is designed in such a way that even quantum compute won't be able to break it. Maybe one day we'll devise a new quantum algorithm that will break PQC, but the design process has been such that it's believed to be resilient against quantum computers.”

Of the PQC algorithms published by NIST, one set is based on structured lattices and another on hash functions. Oates explains, “We're swapping out the problems at the heart of cryptography with algorithms that we have confidence are difficult to break for both quantum and classical computers. We are fairly confident – because you can't really prove this – that quantum computers will struggle to solve these [lattice and hash-based] problems.”
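
To give a flavour of the hash-based family, here is a self-contained Python sketch of a Lamport one-time signature – an illustrative ancestor of the hash-based scheme NIST standardized, not the standard itself. Its security rests only on the one-wayness of the hash function, a property quantum computers are not known to break.

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    """One-way hash; its preimage resistance is all this scheme relies on."""
    return hashlib.sha256(data).digest()

BITS = 256  # we sign the 256-bit SHA-256 digest of the message, bit by bit

def keygen():
    # Secret key: two random 32-byte values per digest bit (one for 0, one for 1).
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(BITS)]
    # Public key: the hashes of those secret values.
    pk = [(H(s0), H(s1)) for s0, s1 in sk]
    return sk, pk

def digest_bits(message: bytes):
    d = H(message)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(BITS)]

def sign(sk, message: bytes):
    # Reveal, for each digest bit, the secret value matching that bit.
    return [sk[i][b] for i, b in enumerate(digest_bits(message))]

def verify(pk, message: bytes, signature) -> bool:
    # Hashing each revealed value must reproduce the published half of the key.
    return all(H(signature[i]) == pk[i][b] for i, b in enumerate(digest_bits(message)))

sk, pk = keygen()
sig = sign(sk, b"quantum-safe hello")
print(verify(pk, b"quantum-safe hello", sig))  # True
print(verify(pk, b"tampered message", sig))    # False
```

A Lamport key pair can safely sign only one message; practical hash-based schemes combine very large numbers of such one-time keys under a single public key, but the underlying idea – forging a signature means defeating a hash function – is the same.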

Yet, one of the shortlisted NIST algorithms was broken at a fairly late stage in the process, prompting calls to build in cryptographic agility. “We may need to think about designing systems where cryptography is easier to switch out of so we don’t go through this pain again,” Oates advises. “That raises a whole set of other challenges because we've seen that, in the past, when people have tried to do that, they've accidentally made things vulnerable. Hackers have gone in and essentially forced them into using a weak cryptography algorithm.” He adds, “The general consensus is that we've enjoyed a long period of stability with RSA and elliptic curves, where they were pretty much untouchable, but now we might enter a period where every few years we have to try something else.”
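
One way to build in that agility – sketched below as a hypothetical pattern using today’s algorithms from the Python cryptography package, not any standardized API – is to hide the signature algorithm behind a small registry keyed by a configuration string, so that a PQC backend could later be registered without touching application code.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec, ed25519

# Hypothetical agility layer: each entry bundles keygen/sign/verify behind a name,
# so the algorithm choice lives in configuration rather than application code.
SIGNERS = {
    "ed25519": {
        "keygen": ed25519.Ed25519PrivateKey.generate,
        "sign": lambda key, msg: key.sign(msg),
        "verify": lambda pub, sig, msg: pub.verify(sig, msg),
    },
    "ecdsa-p256": {
        "keygen": lambda: ec.generate_private_key(ec.SECP256R1()),
        "sign": lambda key, msg: key.sign(msg, ec.ECDSA(hashes.SHA256())),
        "verify": lambda pub, sig, msg: pub.verify(sig, msg, ec.ECDSA(hashes.SHA256())),
    },
    # A PQC entry (e.g. a lattice-based signature from a PQC library) could be
    # registered here later without changing any of the calling code.
}

def sign_document(algorithm: str, document: bytes):
    backend = SIGNERS[algorithm]      # selected by configuration, not hard-coded
    key = backend["keygen"]()
    signature = backend["sign"](key, document)
    return key.public_key(), signature

public_key, signature = sign_document("ed25519", b"contract v1")
SIGNERS["ed25519"]["verify"](public_key, signature, b"contract v1")  # raises if invalid
```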

Quantum key distribution shows potential

A further line of defence could come with the development of quantum key distribution (QKD). This method of exchanging encryption keys relies solely on quantum technologies – specialized optical hardware rather than new mathematics – and on that hardware becoming suitably capable and affordable.

QKD encodes the secret key in the quantum properties of individual light particles (photons). As outlined by the IEC, the only way for hackers to read the key is to measure the particles, but the very act of measurement changes their behaviour, causing errors that trigger security alerts.
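
The detection principle can be illustrated with a purely classical toy model of the BB84 protocol (a simulation of the statistics, not of real quantum optics): when an eavesdropper measures photons in randomly guessed bases, roughly a quarter of the key bits that the two parties keep no longer match, and comparing a sample of them exposes the intrusion.

```python
import random

def bb84_error_rate(n_photons: int = 20000, eavesdrop: bool = True) -> float:
    """Toy BB84 statistics: error rate in the sifted key, with or without Eve."""
    errors = kept = 0
    for _ in range(n_photons):
        bit = random.randint(0, 1)            # Alice's raw key bit
        alice_basis = random.choice("+x")     # the basis she encodes it in

        if eavesdrop:
            eve_basis = random.choice("+x")
            # Guessing the wrong basis gives Eve a random result, and the photon
            # she re-sends is now encoded in *her* basis, not Alice's.
            carried_bit = bit if eve_basis == alice_basis else random.randint(0, 1)
            photon_basis = eve_basis
        else:
            carried_bit, photon_basis = bit, alice_basis

        bob_basis = random.choice("+x")
        bob_bit = carried_bit if bob_basis == photon_basis else random.randint(0, 1)

        if bob_basis == alice_basis:          # sifting: keep only matching bases
            kept += 1
            errors += bob_bit != bit
    return errors / kept

print(f"error rate without eavesdropper: {bb84_error_rate(eavesdrop=False):.1%}")  # ~0%
print(f"error rate with eavesdropper:    {bb84_error_rate(eavesdrop=True):.1%}")   # ~25%
```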

“QKD is very good at ensuring that you have encrypted data between two points, but it's not very good at solving the problem of being confident that who you're talking to is the person you're talking to,” explains Oates. “Government agencies believe QKD is not ready for mainstream use, but it is being viewed as another potential layer of security. For example, you might end up with a post-quantum crypto certificate that proves your identity and you communicate using QKD to exchange keys.”

“But there's a big economic barrier to QKD. You would need dedicated quantum infrastructure, although advances are being made in using fibre optics or even over-the-air QKD. The other big barrier is distance. There are serious limits on how long a QKD connection can be without it degrading.”

Van Velzen agrees with this assessment: “It's our view that within the timeline laid down, QKD is not a solution. It may provide additional security and benefits further down the road, but in the next five to ten years, it's not going to make a significant contribution to getting us quantum safe. The complete solution relies on PQC.”

Standards indispensable to underpin security

The global market for quantum computing is expected to hit USD 50 billion by the end of the decade, potentially rising to USD 1.3 trillion by 2035. That huge leap is predicated on breakthroughs that will allow quantum computing systems to scale up. “The main areas of risk are for organizations which opted to manage their own cryptography,” says Flack. “If you're buying a service off a SaaS [software-as-a-service] provider or a big technology provider, then a lot of the necessary upgrades will happen in the background. But if you've got your own crypto which runs on your own hardware security modules, then all of that tech is going to need to be upgraded, and how you communicate with devices will need to change. That’s where setting new technical standards for devices and protocols is all important.”

The IEC and ISO have already published two standards, ISO/IEC 23837-1 and ISO/IEC 23837-2, related to the security of QKD systems. ISO/IEC 23837-1 specifies a general framework for the security evaluation of QKD modules. It includes a baseline set of common security functional requirements for both the conventional network components and the quantum optical components of QKD systems. The international standard also outlines the entire implementation of QKD protocols and analyzes potential security problems that QKD modules might face in their operational environment.

ISO/IEC 23837-2 focuses on test and evaluation methods for the security evaluation of QKD systems. It describes the evaluation activities necessary to test the security functional requirements of QKD protocols, quantum optical components and conventional network components. Both standards are the work of the joint IEC and ISO technical committee JTC 1/SC 27, which is responsible for developing international management and technical standards for information security, privacy protection and related topics.

The European Telecommunications Standards Institute (ETSI) recently launched a new Technical Committee on Quantum Technologies to develop specifications that address the implications of quantum on global communications networks. A key area of activity is “establishing methodologies to assess hardware vulnerabilities and side-channel attack risks”.

The IEC has been tracking the power of quantum computing for years, noting that if the technology falls into the wrong hands, it “could break encryption and wreak havoc on critical infrastructures”. That’s why the joint IEC and ISO technical committee for quantum technologies, ISO/IEC JTC 3, was formed. According to international expert Michael Egan, who is a member of JTC 3, the committee will help prepare for a quantum future by “harnessing the collaborative power that exists in international standards to enable trade and technology adoption”.

JTC 3 has established formal liaisons with relevant regional and international organizations, including the European Commission, the IEEE and ITU-T. Working groups have been set up to examine quantum’s impact on sensors and the supply chain and to benchmark quantum computers. The foundational standard, ISO/IEC 4879, defines the terms and vocabulary commonly used in quantum computing, and work is ongoing on several other standards to support the technology and expertise required to build quantum systems.

