Post-Quantum Cryptography in Quantum Computing

The Threat Quantum Computers Pose to Current Cryptography

Cybersecurity depends on cryptographic algorithms to ensure secure communication, data storage, and online transactions. A large portion of this cryptography, public-key schemes such as RSA and Diffie-Hellman, is based on the computational difficulty of mathematical problems such as integer factorization and the discrete logarithm problem. These problems are considered intractable for classical computers.

The development of quantum computers poses a serious threat to this security model. In 1994, Peter Shor developed Shor’s algorithm, a quantum algorithm capable of efficiently solving both the integer factorization and discrete logarithm problems in polynomial time. If a fault-tolerant quantum computer capable of running Shor’s algorithm at scale is ever built, these widely used public-key cryptosystems could be rendered insecure.

A break in modern public-key cryptography would compromise the security of many essential systems and protocols, including HTTPS, secure email, digital signatures, and financial transactions. Moreover, data encrypted with these vulnerable methods today may be harvested and stored now, then decrypted by attackers equipped with quantum computers in the future, a scenario often referred to as “harvest now, decrypt later”. This looming threat calls for a proactive transition to cryptographic techniques capable of resisting attacks from both classical and quantum computers.
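To make the dependence concrete, here is a toy RSA round trip in Python. The parameters are deliberately tiny and insecure, chosen purely for illustration; the point is that recovering the private key reduces to factoring the modulus, which is exactly the step Shor’s algorithm makes efficient.

```python
# Toy RSA with tiny primes (illustrative only; real RSA uses primes of
# ~1024 bits or more).
p, q = 61, 53                  # secret primes
n = p * q                      # public modulus
e = 17                         # public exponent, coprime with (p-1)*(q-1)
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)            # private exponent: inverse of e mod phi

m = 42                         # plaintext as an integer < n
c = pow(m, e, n)               # encrypt with the public key (e, n)
assert pow(c, d, n) == m       # decrypt with the private key d

# Shor's algorithm threatens this scheme because factoring n recovers p
# and q, from which phi and d follow. A quantum attacker would do in
# polynomial time what this brute-force loop does classically:
f = next(i for i in range(2, n) if n % i == 0)
assert {f, n // f} == {p, q}
```

With a real 2048-bit modulus the brute-force step above is hopeless for classical machines, which is precisely the assumption Shor’s algorithm removes.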

What is Post-Quantum Cryptography?

Post-quantum cryptography (PQC), also known as quantum-safe cryptography, is the branch of cryptography focused on algorithms that are believed to be secure against attacks by both classical computers and future quantum computers. This is in direct contrast to the security basis of most currently deployed public-key cryptography, which depends on mathematical problems that Shor’s algorithm can solve efficiently on a quantum computer.

The goal of PQC research is to standardize cryptographic algorithms based on different mathematical problems that are currently hard for both classical and quantum computers. These algorithms aim to provide long-term security in a world where large-scale quantum computers may become a reality. Unlike quantum key distribution, which requires specialized quantum hardware, PQC algorithms are designed to run on classical computers using existing infrastructure, albeit with potential trade-offs in performance and key sizes.

Post-Quantum Cryptography Algorithms

Several families of cryptographic algorithms are being actively researched as candidates for standardization in the post-quantum era. These families rely on diverse mathematical problems believed to be resistant to known quantum algorithms:

  • Code-based cryptography: These schemes derive their security from the difficulty of decoding general linear codes, a problem that has been studied for decades and is believed to be hard even for quantum computers. The McEliece cryptosystem is a well-known example of a code-based public-key encryption scheme. Code-based cryptography tends to have relatively large key sizes compared to other PQC families.
  • Lattice-based cryptography: The security of these schemes rests on the hardness of problems in lattice theory, such as the Shortest Vector Problem (SVP) and the Learning with Errors (LWE) problem. Lattice-based cryptography is attractive due to its strong security assumptions, well-studied mathematical foundations, and potential efficiency in terms of key sizes and computational speed. Examples of lattice-based schemes include NTRU and SABER, both of which were finalists in the NIST standardization process.
  • Hash-based cryptography: These schemes rely on the security properties of cryptographic hash functions, such as collision resistance and preimage resistance. The security of hash functions is not affected by known quantum algorithms like Shor’s. Hash-based digital signature schemes such as XMSS (stateful) and SPHINCS+ (stateless) can offer strong security based on well-understood cryptographic primitives. These schemes have large signature sizes but are practical for applications such as firmware updates.
  • Multivariate polynomial cryptography: These schemes depend on the difficulty of solving systems of multivariate polynomial equations over finite fields. While long considered a promising direction, the family has suffered setbacks: Rainbow, a multivariate signature scheme that reached the final round of the NIST standardization process, was subsequently broken using classical computation.
  • Isogeny-based cryptography: These schemes exploit the mathematical properties of isogenies between elliptic curves or abelian varieties over finite fields. Supersingular Isogeny Key Encapsulation (SIKE) was another candidate in the NIST standardization process and was considered a strong contender. However, it too was recently broken, relatively quickly, using a classical computer. These recent breaks underscore the inherent risks and the rapid evolution of cryptanalysis in the PQC field.
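As a sketch of the lattice-based approach, the following toy Python implementation encrypts a single bit in the style of Regev’s LWE scheme. All parameters here are our own deliberately small, insecure choices for illustration; the example shows the key mechanic, namely that decryption succeeds as long as the accumulated noise stays below a quarter of the modulus.

```python
import random

# Toy Regev-style LWE encryption of one bit (insecure demo parameters).
q, n, m = 2003, 8, 20        # modulus, secret dimension, sample count
s = [random.randrange(q) for _ in range(n)]            # secret key

# Public key: m noisy inner products b_i = <a_i, s> + e_i (mod q).
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
noise = [random.randrange(-2, 3) for _ in range(m)]    # small errors
b = [(sum(ai * si for ai, si in zip(row, s)) + ei) % q
     for row, ei in zip(A, noise)]

def encrypt(bit):
    # Sum a random subset of samples; embed the bit as 0 or q//2.
    subset = [i for i in range(m) if random.random() < 0.5]
    u = [sum(A[i][j] for i in subset) % q for j in range(n)]
    v = (sum(b[i] for i in subset) + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    # v - <u, s> is (noise) for bit 0, (q//2 + noise) for bit 1.
    d = (v - sum(uj * sj for uj, sj in zip(u, s))) % q
    return int(min(d, q - d) > q // 4)

for bit in (0, 1):
    assert decrypt(*encrypt(bit)) == bit
```

With at most 20 samples of noise bounded by 2, the total error never exceeds 40, comfortably below q//4 = 500, so decryption is always correct here; real schemes pick parameters so this failure probability is negligible while the underlying LWE problem stays hard.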
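Hash-based signatures can likewise be illustrated with a minimal Lamport one-time signature, a simple ancestor of XMSS and SPHINCS+. This is a sketch for intuition only (the message strings are arbitrary): security rests entirely on the hash function, which Shor’s algorithm does not break.

```python
import hashlib, secrets

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [[secrets.token_bytes(32) for _ in range(2)] for _ in range(256)]
    pk = [[hashlib.sha256(x).digest() for x in pair] for pair in sk]
    return sk, pk

def _bits(msg):
    h = hashlib.sha256(msg).digest()
    return [(h[i // 8] >> (i % 8)) & 1 for i in range(256)]

def sign(sk, msg):
    # Reveal one secret of each pair, selected by the message-hash bits.
    return [sk[i][bit] for i, bit in enumerate(_bits(msg))]

def verify(pk, msg, sig):
    return all(hashlib.sha256(x).digest() == pk[i][bit]
               for i, (x, bit) in enumerate(zip(sig, _bits(msg))))

sk, pk = keygen()
sig = sign(sk, b"firmware v1.2")
assert verify(pk, b"firmware v1.2", sig)
assert not verify(pk, b"firmware v1.3", sig)
```

Each key pair may sign only one message (signing twice leaks both halves of some pairs), which is why practical schemes such as XMSS and SPHINCS+ build many one-time keys into a hash tree; it also shows where the large signature sizes come from.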

Post-Quantum Cryptography and NIST

  • The National Institute of Standards and Technology (NIST) in the US started a multi-year standardization process in 2016 to evaluate and select PQC algorithms, recognizing how important it is to have quantum-safe cryptography standards in place early. The process was rigorous, involving multiple rounds of submissions, public review, and cryptanalysis by the international community of cryptographic researchers. The objective was to identify a suitable portfolio of algorithms covering different cryptographic tasks, such as public-key encryption, key establishment, and digital signatures.
  • The NIST process was designed to select algorithms that not only offered good theoretical security against both classical and quantum threats but were also practical in terms of key and signature sizes and computational speed. Completing the process took several “NIST PQC Standardization Conferences” and a great deal of public discussion.
  • While the process has made significant progress in identifying strong candidates, the recent attacks on the finalists Rainbow and SIKE serve as a stark reminder of the challenges in ensuring the long-term security of new cryptographic methods. These events highlight the value of ongoing study, thorough cryptanalysis, and meticulous algorithm selection for standardization. In July 2022, NIST announced its first selections for quantum-resistant cryptography standards, a major step forward in the transition. The selected algorithms included lattice-based schemes for general encryption and key encapsulation, and both lattice-based and hash-based schemes for digital signatures.

Challenges in the Transition to Post-Quantum Cryptography

Migrating from potentially quantum-vulnerable cryptography to quantum-safe alternatives presents several important challenges.

  • Infrastructure Overhaul: Existing cryptographic infrastructure is deeply embedded in current IT systems, protocols, and standards. Migrating to new PQC algorithms will require substantial effort in research, development, standardization, implementation, and widespread deployment across varied applications and platforms. This is a complex and time-consuming undertaking.
  • Performance Trade-offs: PQC algorithms often have different performance characteristics than current algorithms, including larger key sizes, longer signatures, or slower computation, which can impact the efficiency of applications and require careful optimization.
  • Security Assurance: PQC algorithms are relatively young compared to traditional public-key cryptosystems. Extensive cryptanalysis by the research community is critical to build confidence in their long-term security against both known and yet-to-be-discovered attacks, classical and quantum alike. The recent breaking of NIST finalists highlights the ongoing nature of this challenge.
  • Side-Channel and Implementation Security: The resistance of PQC algorithms to side-channel attacks, and the security of their implementations in hardware and software, also require investigation. New algorithms can introduce novel vulnerabilities that need to be understood and mitigated.
  • Uncertainty and Evolution: The field of quantum computing is evolving rapidly, and there is still uncertainty about the exact timeline for cryptographically relevant quantum computers. Furthermore, advances in cryptanalysis could affect the security of even the most promising PQC candidates. This calls for a flexible and adaptive approach to the transition.

Post-Quantum Cryptography vs. Quantum Cryptography (Quantum Key Distribution)

It is important to distinguish between post-quantum cryptography (PQC) and quantum cryptography, often discussed in terms of quantum key distribution (QKD). While both fields address security in the context of quantum computing, they adopt fundamentally different approaches.

Quantum Key Distribution (QKD) uses the principles of quantum mechanics, such as the Heisenberg uncertainty principle and the no-cloning theorem, to establish a secret key between two parties, conventionally referred to as Alice and Bob. The security of QKD is based on the laws of physics, which allow the detection of any eavesdropping attempt by a third party (Eve). Protocols such as BB84 and B92 are well-known examples of QKD protocols. QKD offers the possibility of information-theoretic security, meaning its security is not predicated on unproven computational assumptions.
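The sifting step of BB84 can be sketched with a classical simulation. This only models the measurement statistics (real QKD of course requires quantum hardware, and the 32-bit run length is an arbitrary choice): Alice sends bits in random bases, Bob measures in random bases, and the two keep only the positions where their bases happened to match.

```python
import random

n = 32  # number of transmitted qubits in this toy run
alice_bits  = [random.randrange(2) for _ in range(n)]
alice_bases = [random.randrange(2) for _ in range(n)]  # 0 = rectilinear, 1 = diagonal
bob_bases   = [random.randrange(2) for _ in range(n)]

def measure(bit, send_basis, meas_basis):
    # Matching bases yield the sent bit; mismatched bases yield a random
    # result, modeling a quantum measurement in the wrong basis.
    return bit if send_basis == meas_basis else random.randrange(2)

bob_results = [measure(bit, sa, sb)
               for bit, sa, sb in zip(alice_bits, alice_bases, bob_bases)]

# Public basis comparison: keep only positions where the bases agreed.
sifted_alice = [bit for bit, sa, sb in zip(alice_bits, alice_bases, bob_bases) if sa == sb]
sifted_bob   = [res for res, sa, sb in zip(bob_results, alice_bases, bob_bases) if sa == sb]
assert sifted_alice == sifted_bob   # with no eavesdropper, the sifted keys agree
```

If an eavesdropper measured each qubit in a random basis and resent it, about a quarter of the sifted bits would disagree, which is exactly the disturbance Alice and Bob test for before trusting the key.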

However, QKD has clear limitations. It requires specialized quantum hardware for key generation and transmission. There are practical challenges with data transmission rates, distance restrictions due to signal loss in transmission media such as optical fiber, and the cost of deploying and maintaining the quantum infrastructure. Moreover, even legitimate users need to possess quantum capabilities (albeit potentially simpler than a full-fledged quantum computer).

In contrast, post-quantum cryptography (PQC) develops cryptographic algorithms that run on classical computers using existing infrastructure but are designed to remain secure against attacks from both classical computers and future quantum computers. PQC’s security rests on the expected computational hardness of specific mathematical problems believed to be difficult for both kinds of computer. Hybrid approaches that combine classical and post-quantum algorithms are also being explored as a prudent strategy during the transition period.
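A minimal sketch of such a hybrid construction, assuming the two shared secrets have already been established: the stand-in random values and the simple concatenate-and-hash combiner below are illustrative only, since real deployments derive the secrets from actual key exchanges and use a full KDF such as HKDF with context labels.

```python
import hashlib, secrets

# Stand-ins for shared secrets from two independent key exchanges
# (hypothetical values; e.g. one classical ECDH, one PQC KEM).
classical_secret = secrets.token_bytes(32)
pqc_secret       = secrets.token_bytes(32)

def combine(*parts):
    # Concatenate-and-hash combiner: the session key stays secret as long
    # as EITHER input secret remains unbroken.
    return hashlib.sha256(b"".join(parts)).digest()

session_key = combine(classical_secret, pqc_secret)
assert len(session_key) == 32
```

The appeal of the hybrid design is defense in depth during the transition: a quantum attacker breaking the classical half, or a cryptanalytic break of the young PQC half, is not by itself enough to recover the session key.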

Ultimately, quantum computing poses a significant and evolving threat to the security of currently deployed public-key cryptography. Post-quantum cryptography is a critical field of research that aims to develop and standardize cryptographic algorithms able to withstand this quantum threat while operating on classical computing infrastructure. While the transition to PQC presents important challenges in terms of infrastructure updates, performance considerations, and long-term security assurance, the potential for a quantum computer to break current cryptographic systems demands a proactive and diligent approach. The ongoing standardization efforts, despite recent setbacks, together with continuous research and cryptanalysis, are important steps toward securing the digital future in the face of advancing quantum technologies.