Preparing for the Effects of Quantum-Centric Supercomputing
Public announcement of the 433-qubit IBM Quantum Osprey processor at the IBM Quantum Summit on Nov. 9, 2022, represents another evolutionary milestone in the development of universal quantum computers. Not content with the increase in computational power provided by the Osprey quantum processing unit (QPU), IBM intends to release the 1,121-qubit Condor processor in 2023, followed in 2025 by the Kookaburra platform, which is planned to link multiple chips into a system of more than 4,158 qubits. Kookaburra will exploit quantum parallelization of large processors, and its planned release will coincide with the availability of enhanced error-correction tooling that is essential for the practical application of IBM’s vision of quantum-centric supercomputing.
The near prospect of scalable multi-QPU clustering, underpinned by growth in the size of universal quantum processors, means that increased attention must now be paid to the disruptive effects of quantum computing. The research investment, technical skills, and operational demands associated with universal quantum computers suggest an adoption path analogous to that of digital supercomputers, with centralized computing resources concentrated in the hands of wealthy corporations, international scientific collaborations, and nation-states. Even so, the implications of the IBM development trajectory affect us all.
Back in 1994, two significant events in the history of quantum computing occurred: Peter Shor published his factoring algorithm, and the first NIST Workshop on Quantum Computing and Communication followed in August of that year. Setting aside the complex mathematics involved, Shor’s algorithm established a method for efficiently finding the prime factors of a given number using quantum computation, a result that strikes directly at the cryptographic methods we currently use to protect data. Lov Grover subsequently defined a different quantum mechanical algorithm in 1996 that delivers a quadratic reduction in the time required to search for a discrete value within an unstructured list of possible alternatives.
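To make that concrete, the sketch below classically emulates the number-theoretic skeleton of Shor’s algorithm on a toy modulus. Only the order-finding step, brute-forced here, is the part a quantum computer accelerates; everything else is classical pre- and post-processing. This is an illustrative sketch, not an implementation of the quantum algorithm itself.

```python
from math import gcd
from random import randrange

def find_order(a: int, n: int) -> int:
    """Brute-force the multiplicative order r of a mod n. This is the
    step Shor's algorithm speeds up via the quantum Fourier transform."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_classical_demo(n: int) -> tuple[int, int]:
    """Classically emulate Shor's reduction: a factor of n falls out
    of the period of a**x mod n for a suitable random base a."""
    while True:
        a = randrange(2, n)
        d = gcd(a, n)
        if d > 1:
            return d, n // d      # lucky guess: a shares a factor with n
        r = find_order(a, n)
        if r % 2 == 1:
            continue              # need an even period; try another base
        x = pow(a, r // 2, n)
        if x == n - 1:
            continue              # trivial square root; try another base
        return gcd(x - 1, n), gcd(x + 1, n)

print(shor_classical_demo(15))    # e.g. (3, 5)
```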
Shor’s algorithm is significant because it provides an efficient means to derive the cryptographic keys on which modern public key cryptography depends, rendering established cryptographic methods such as RSA and Diffie-Hellman key exchange obsolete. Grover’s contribution affects existing data security practices because it reduces the effective bit-length security of symmetric AES cryptographic keys by half under a fault-tolerant quantum brute-force attack.
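The arithmetic behind that halving is simple enough to show directly. Grover’s algorithm locates a key among 2^n candidates in roughly 2^(n/2) quantum queries, which is why AES-128 is commonly treated as offering only about 64-bit security against a fault-tolerant quantum adversary, and why moving to AES-256 is the usual mitigation:

```python
# Effective symmetric key strength under a Grover-style brute-force
# attack: a classical search of 2**n keys collapses to roughly
# 2**(n/2) quantum queries, halving the effective bit length.
for key_bits in (128, 192, 256):
    effective_bits = key_bits // 2
    print(f"AES-{key_bits}: ~2^{key_bits} classical guesses, "
          f"~2^{effective_bits} Grover queries "
          f"(effective strength ~{effective_bits} bits)")
```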
It’s Time To Get Ready for Quantum Computing
The implications of universal quantum computers for our data security are well documented. Indeed, IBM reiterated the issues for data encryption in its announcement of the IBM Osprey QPU, commenting on the need for the industry to prepare for the inevitable impact now. It’s worth noting, however, that defeating RSA 2048-bit cryptography would require approximately 6,190 fault-tolerant logical qubits, which presupposes the parallel configuration of large QPUs. Even with computational resources at the necessary scale, the effective cost of breaking RSA 2048-bit encryption remains in excess of 1.17 megaqubitdays (the equivalent of 1.17 million qubits running a prime-factorization attack for a full 24-hour period).
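The megaqubitday is a space-time budget rather than a machine size, so the same attack cost can be spent as fewer qubits running longer or more qubits running briefly. A minimal sketch of that trade-off, using the 1.17-megaqubitday figure quoted above:

```python
# A megaqubitday is (millions of qubits) x (days of runtime), so a
# fixed attack cost trades machine size against time on target.
ATTACK_COST_MEGAQUBITDAYS = 1.17  # figure quoted above for RSA-2048

def runtime_days(machine_size_millions: float) -> float:
    """Days of continuous runtime implied by a given machine size."""
    return ATTACK_COST_MEGAQUBITDAYS / machine_size_millions

for qubits_millions in (0.585, 1.17, 2.34):
    print(f"{qubits_millions:5.3f}M qubits -> "
          f"{runtime_days(qubits_millions):.1f} days to break RSA-2048")
```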
While it has been a perennial forecast that efficient universal quantum computers are “a decade away,” that prospect now seems a legitimate possibility. Neven’s Law of double exponential improvement in quantum computational power may soon be observed in the pace of technical development. The 1994 NIST Workshop on Quantum Computing gave rise to the NIST Post-Quantum Cryptography Standardization project, which seeks to derive new cryptographic protocols that are resistant to quantum attacks and which assumes increased importance in the wake of the IBM Osprey announcement. The search for effective post-quantum cryptography (PQC) advanced in July 2022 when, following the third round of algorithm evaluations, NIST announced four algorithms for standardization: CRYSTALS-Kyber for key establishment, and CRYSTALS-Dilithium, FALCON, and SPHINCS+ for digital signatures. The search continues, with a fourth round of evaluation now underway.
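For teams that want to experiment ahead of final standards, open-source implementations of the selected algorithms already exist. The sketch below shows a key-encapsulation round trip with CRYSTALS-Kyber, assuming the liboqs-python bindings from the Open Quantum Safe project are installed (pip install liboqs-python); the mechanism name and its availability depend on how liboqs was built, so treat this as a sketch rather than deployment guidance.

```python
import oqs

ALGORITHM = "Kyber512"  # CRYSTALS-Kyber at NIST security level 1

with oqs.KeyEncapsulation(ALGORITHM) as receiver:
    public_key = receiver.generate_keypair()

    # The sender encapsulates a shared secret under the public key...
    with oqs.KeyEncapsulation(ALGORITHM) as sender:
        ciphertext, secret_sender = sender.encap_secret(public_key)

    # ...and the receiver recovers the same secret from the ciphertext.
    secret_receiver = receiver.decap_secret(ciphertext)

assert secret_sender == secret_receiver
print(f"{ALGORITHM}: shared {len(secret_receiver)}-byte secret established")
```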
So, how should organizations heed the advice of IBM and data security providers in order to adapt to the imminence of quantum-centric supercomputing? The first practical step is to recognize the implications of the new quantum paradigm and the importance of maintaining a flexible approach to developments in PQC. Universal quantum computers of the scale necessary to attack today’s encrypted data will likely be concentrated in the hands of sufficiently capable service providers and nation-states, but companies must still consider that encrypted data appropriated today could be successfully decrypted in years to come, the so-called “harvest now, decrypt later” threat.
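A common way to reason about that exposure (not from the announcement itself, but a widely used rule of thumb credited to Michele Mosca) is to compare three durations: how long the data must stay secret, how long a PQC migration will take, and how long until a cryptographically relevant quantum computer exists. A minimal sketch with hypothetical figures:

```python
def quantum_exposure(secrecy_years: float,
                     migration_years: float,
                     years_to_quantum_threat: float) -> bool:
    """Mosca's inequality: data is at risk if x + y > z, where
    x = how long the data must remain confidential,
    y = how long migrating to post-quantum cryptography will take,
    z = years until a cryptographically relevant quantum computer."""
    return secrecy_years + migration_years > years_to_quantum_threat

# Illustrative (hypothetical) figures: records that must stay private
# for 15 years, a 5-year migration, and a quantum threat in 12 years.
print(quantum_exposure(15, 5, 12))  # True: such data is already exposed
```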
The lifespan of data and its sensitivity must, therefore, be considered within contemporary risk assessments. Where exposure of data in the quantum era must be prevented, organizations should consider adopting the early candidate algorithms for NIST standardization. Doing so requires key management systems and cryptographic interfaces that retain the necessary agility should new PQC algorithms supersede the methods selected today. If standards change, or candidate protocols are found to be exploitable, it is vital that data can be re-encrypted to new levels of security without constraint.
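That agility is ultimately an engineering property: if every stored record carries an identifier for the algorithm that sealed it, and callers bind to an interface rather than a specific cipher, records can be re-encrypted under a successor scheme without touching application code. A minimal sketch of the shape, with hypothetical names and a toy XOR placeholder standing in for real (eventually PQC) ciphers:

```python
from dataclasses import dataclass
from typing import Callable, Dict, Tuple

# Registry mapping algorithm identifiers to (encrypt, decrypt) pairs.
# Real deployments would register vetted implementations here.
CIPHERS: Dict[str, Tuple[Callable, Callable]] = {}

def register(alg_id: str, encrypt: Callable, decrypt: Callable) -> None:
    CIPHERS[alg_id] = (encrypt, decrypt)

@dataclass
class Envelope:
    alg_id: str        # stored with every record: which algorithm sealed it
    ciphertext: bytes

def seal(data: bytes, alg_id: str, key: bytes) -> Envelope:
    encrypt, _ = CIPHERS[alg_id]
    return Envelope(alg_id, encrypt(data, key))

def reseal(env: Envelope, old_key: bytes,
           new_alg_id: str, new_key: bytes) -> Envelope:
    """Re-encrypt an existing record under a successor algorithm,
    e.g. when a standard changes or a scheme is found exploitable."""
    _, decrypt = CIPHERS[env.alg_id]
    return seal(decrypt(env.ciphertext, old_key), new_alg_id, new_key)

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy placeholder (XOR); NOT real cryptography, just a stand-in."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

register("toy-v1", xor_cipher, xor_cipher)
register("pqc-v2", xor_cipher, xor_cipher)  # imagine a PQC scheme here

env = seal(b"customer record", "toy-v1", b"k1")
env = reseal(env, b"k1", "pqc-v2", b"k2")   # the agile migration path
print(env.alg_id)                           # pqc-v2
```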
Handle With Care
A second, more cerebral, step in preparing for the impact of quantum computing at scale is to reflect on the way that quantum-centric supercomputing, as envisioned by IBM and others, will affect how sensitive data must be handled during processing. Migration to public cloud services has yielded new operational models and economies of scale for customer organizations, but it has also brought challenges around data sovereignty and the data owner’s retention of control over data security. Understanding the additional implications for data security, and the competitive advantages to be gained from adopting quantum computing methods, should now be a focus for senior data and information security professionals and business leaders.
The NIST PQC Standardization project points to potential solutions to the rapidly approaching quantum challenge to privacy and confidentiality. Niels Bohr, one of the preeminent figures in the history of atomic physics, once said that if you are not shocked by the implications of quantum theory, then you have not understood it. The same could be said of the implications of universal quantum computers for global data security. Although the future remains opaque, it’s certain that today’s organizational leaders and technology providers must start down the path to PQC adoption. Their data depends on it.