## Abstract

We are approaching the physical limits on how small a transistor can be while still behaving deterministically rather than being governed by quantum phenomena, the endpoint of the scaling trajectory Moore observed all the way back in 1965. We took MOSFET scaling to 800 nm in 1987 and are likely to be shipping standard devices at 2 nm by 2024. Better techniques to miniaturize, cool, design, and fabricate have allowed us to maintain a brisk pace, but as we approach 1 nm we arrive at the atomic scale, although not yet at the scale of subatomic particles.

Will new quantum computing discoveries pose a threat to current cryptography algorithms in the near future? Many encryption techniques rely on the notion that some mathematical problems, such as factorization of a huge integer, are infeasible to solve with a traditional computer. Google presented experimental evidence of so-called quantum supremacy in Nature in October 2019, demonstrating that their quantum computer can perform a calculation that a traditional system cannot complete in any practical amount of time (Arute et al., 2019).

Over the past few decades, and increasingly in the past few years, there has been a lot of discussion about what quantum computing means for society and how it will shape the future of computer science and information security. The same diligence was done with the first generations of computing as we transitioned from tubes to transistors to integrated circuits and then microchips, and it remains necessary with future generations of computing, especially when there are greater-than-incremental jumps in performance between them.

## Aim

This article aims to provide an overview of how quantum computing works from both a hardware and software perspective, as well as to aggregate recent findings on how to provide quantum-safe cryptography.

## Objectives

Explain the hardware and software strengths and weaknesses of quantum computing by introducing quantum physics concepts.

Investigate quantum computing's huge implications for cryptography.

Critically discuss and assess the existing proposals for quantum-safe cryptography algorithms.

## Scope and Delimits

As physicist Richard P. Feynman famously remarked, "I think I can safely say that nobody understands quantum mechanics" (Feynman, 1965: 161); the topic is both challenging (when studied in depth) and enormously vast. This abridged study covers some general quantum physics concepts that are useful in understanding quantum computing and discusses the major implications for cryptography; however, it will not include a detailed scientific explanation of how quantum mechanics works in the real world, because the scientific state of the art on this topic is incomplete.

Also, this study focuses on a few applications of quantum computing, specifically cryptography; many other important applications, such as drug development, financial modelling, and traffic optimisation, are not described here, even though they have larger bodies of work available.

## 1. Introduction

### 1.1 Quantum Mechanics

Quantum mechanics is a branch of mechanics (and so of physics) that studies how nature works at the atomic level. Classical mechanics can explain how everyday things, such as a car or a ball, behave, as well as their velocity and spatial placement. But subatomic particles, and objects with velocities close to the speed of light, do not obey the equations of classical mechanics. Enter the uncertainty principle.

Quantum mechanics goes deeper than classical and easily observable mechanics and describes the properties of atomic particles, how they act, and how they react to certain conditions.

#### 1.1.1 History

For much of the history of physics, classical Newtonian mechanics was the commonly accepted way to describe how the world operated. Aristotle founded Aristotelian physics around 2,400 years ago; he believed that all substances could be analyzed into matter and form. The next two thousand years showed slow but steady progress in understanding classical mechanics, such as motion, and over time physics grew into so large a body of knowledge that it branched beyond classical mechanics to include thermodynamics, electromagnetism and photonics, relativistic mechanics, optics, acoustics, matter physics, and much more. One of those branches became quantum mechanics, after J. J. Thomson discovered the electron in 1897.

German physicist Max Planck introduced the quantum hypothesis in 1900, work for which he later won the Nobel Prize in Physics. Here we see everything in nature broken down into atoms and discrete quanta of energy. He built on work done by John Dalton on the law of multiple proportions and by Robert Brown, who noticed that grains of pollen in water moved erratically, a phenomenon we now call Brownian motion.

Albert Einstein explained Brownian motion mathematically in 1905, and Jean Perrin built on that work to measure the mass of those molecules, providing experimental proof of the existence of atoms and molecules. They were also able to build on the work of British chemist John Newlands, who had begun arranging the elements periodically back in the mid-1800s. Our very idea of how atoms were composed was emerging as well, as fellow Briton J. J. Thomson realized cathode rays were not electromagnetic waves but streams of particles far, far lighter than the lightest atom, hydrogen. Thus the electron became the first subatomic particle to be discovered, and these were in fact the particles that carry electrical current over wire. Einstein introduced the special theory of relativity in 1905, and over time we came to one definition of a particle: a quantum excitation of a field. Another is that a particle is something that can be measured in a detector.

Many of these discoveries were made possible by the fact that our tools for measuring objects were getting smaller and more precise. After Planck, physicists including Schrödinger, Bohr, and Heisenberg continued the development of quantum mechanics, giving us matrix mechanics and wave mechanics (Griffiths, 2018). John von Neumann, David Hilbert, and others then formalized and unified the theories, and in some ways anticipated the use of electron flows in semiconductors.

Later research, ranging from Einstein's general theory of relativity (Einstein, 1907) to electromagnetism, was built on this set of laws (Carroll, 2019). According to Rutherford's classical atom model (Rutherford and Royds, 1909), electrons orbit the nucleus as illustrated in figure 1.1, but this movement causes electromagnetic radiation, which causes the electron to lose energy.

Figure 1.1: Rutherford’s atomic model (Durini and Arutinov, 2014)

Such an energy-draining electron should not remain in orbit but should spiral into the nucleus's centre in less than 10⁻¹¹ seconds, implying that matter made up of atoms could not exist at all. Many experiments, such as Young's double-slit experiment (figure 1.2) and Maxwell's electromagnetism, showed that electrons should instead be thought of as waves with orbitals of various shapes.

Figure 1.2: Young’s double-slit experiment (Joaddar, 2018)

Moreover, in 1926 Schrödinger described the behaviour of the wave function Ψ with his famous equation:

Figure 1.3: Schrödinger’s wave equation
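For reference, the equation in figure 1.3 is the time-dependent Schrödinger equation; in one spatial dimension its standard textbook form (Griffiths, 2018) reads:

```latex
i\hbar \frac{\partial \Psi(x,t)}{\partial t}
  = -\frac{\hbar^{2}}{2m}\frac{\partial^{2} \Psi(x,t)}{\partial x^{2}}
  + V(x,t)\,\Psi(x,t)
```

where ħ is the reduced Planck constant, m the particle's mass, and V the potential energy.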

Electrons, on the other hand, behaved like particles in experiments such as Planck's black-body radiation and Einstein's photoelectric effect. Is light a wave or a particle? It is both, and quantum mechanics explains how matter behaves. The widely accepted Copenhagen interpretation, developed by Niels Bohr and Werner Heisenberg in the mid-1920s, considers light to be in its quantum superposition state until it hits a detector, which localises the particle in a specific position and causes a collapse of the wave, forcing the particle to choose a state and thus lose its quantum correlations (Kupervasser, 2017).

This work went from theory to practical use following the Manhattan Project, Oppenheimer’s Project Y, and the Trinity Test, which ushered in the nuclear age. And yet few saw the connection between Cold War nuclear buildups and the emerging revolution in transistorized computing. That work began with first understanding quantum entanglement, initially written about by Einstein, Boris Podolsky, and Nathan Rosen (EPR) in 1935 and then explored much more thoroughly by Erwin Schrödinger. Here we see a group of particles generated in such proximity that their states are correlated and can be described, and potentially measured, only as a whole; much as the state of electrons flowing through a gate printed in silicon can be measured.

Others worked on measuring the state of electrons, such as David Bohm, who argued that the EPR argument could not yet be settled because we had not worked out how to test the theories with physical precision. The issue wasn’t that we couldn’t measure entanglement, but that there were what John Stewart Bell described as loopholes: certain experimental designs whose findings could still be explained without entanglement. Later work developed the idea of entanglement witnesses, functionals that can distinguish entangled states from separable ones. This was still the early 1960s, and so experiments were more exploratory than practical.

Stephen Wiesner wrote a paper around 1968, eventually published by the ACM, introducing the world to conjugate coding, the first quantum coding application for cryptography. In the paper, Wiesner introduced quantum multiplexing, in which polarized photons carry two messages in such a way that reading one destroys the other. And so quantum cryptography was born, although still only in theory.

David Deutsch, working at the University of Oxford, laid out theories in 1985 on how to engineer quantum logic gates for a universal Turing machine realized on a quantum computer. That became practical in 1995, when NIST’s Chris Monroe and David Wineland demonstrated the first quantum logic gate. Then in 1998, Isaac Chuang of Los Alamos National Laboratory, Neil Gershenfeld of the Massachusetts Institute of Technology (MIT), and Mark Kubinec of the University of California at Berkeley created the first quantum computer (2 qubits) that could properly process data to a known, solvable solution. Now quantum cryptography was more than a theory and ready to become a reality.

#### 1.1.2 Quantum Properties

Now that we understand the historical significance and velocity of quantum mechanics as applied to computing, let’s introduce a few basic principles to help explain the properties of a particle we can use to measure for the purpose of computing. The three major qualities outlined – superposition, entanglement, and interference – are used to control the state of a qubit.

Superposition. A particle can exist in multiple states at the same moment. When the superposition is measured, it collapses into a single outcome, obliterating any state information that existed prior to measurement. Furthermore, despite being specified by particular statistical qualities, the individual outputs of the measurement are unpredictable (Black et al., 2002).

Entanglement. The states of two entangled particles are correlated and cannot be described independently of one another. When one particle is measured, it collapses into one of the measurement bases. Because the particles are entangled, measuring one tells you what to expect when you measure the other (Black et al., 2002).

Interference. Quantum particles carry information about their phase, a characteristic that creates interference when multiple particles interact. If two particles are in phase, their amplitudes add; if they are out of phase, their amplitudes cancel (Black et al., 2002).
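These properties can be illustrated with a small state-vector simulation. The sketch below (a toy example using numpy, assuming the Bell state (|00⟩ + |11⟩)/√2) samples joint measurements according to the Born rule; the two qubits' outcomes always agree, even though each outcome on its own looks random:

```python
import numpy as np

rng = np.random.default_rng(0)

# Bell state (|00> + |11>)/sqrt(2); basis order |00>, |01>, |10>, |11>
state = np.array([1, 0, 0, 1]) / np.sqrt(2)

probs = np.abs(state) ** 2               # Born rule: outcome probabilities
outcomes = rng.choice(4, size=1000, p=probs)

q0, q1 = outcomes // 2, outcomes % 2     # split each joint outcome into the two qubits

assert np.all(q0 == q1)                  # entanglement: the qubits always agree
print(q0.mean())                         # near 0.5: each qubit alone looks random
```

Note that the assertion holds only for this particular state; a product state such as |0⟩ ⊗ |+⟩ would show no correlation at all.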

These become the basis for measuring state. The most promising techniques being researched (and built and sold) include nuclear magnetic resonance (NMR), ion traps, and quantum dots. That measured state then provides the ability to store information and develop logic on top of the known and measurable qualities of particles.

### 1.2 Quantum Computers

Classical computers employ bits to store information in a 0 or 1 state. Quantum computers analyze and alter data using qubits (quantum bits) and quantum mechanical processes (see section 1.1.2). A qubit is characterised as a superposition of the states |0⟩ and |1⟩, each of which has a specified probability of being measured as 0 or 1 when the qubit loses its quantum superposed state. A qubit state |ψ⟩ = α|0⟩ + β|1⟩ satisfies |α|² + |β|² = 1, where |α|² and |β|² represent the probabilities of the qubit being measured in the logic state 0 or 1 respectively (Meng and Pian, 2016).
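As a minimal numerical illustration of the formula above (with the illustrative choice |α|² = 0.2 and |β|² = 0.8), the sketch below checks the normalisation condition and simulates repeated measurements, whose observed frequencies approach those probabilities:

```python
import numpy as np

rng = np.random.default_rng(1)

# |psi> = alpha|0> + beta|1> with |alpha|**2 = 0.2 and |beta|**2 = 0.8
alpha, beta = np.sqrt(0.2), np.sqrt(0.8)
assert np.isclose(abs(alpha) ** 2 + abs(beta) ** 2, 1.0)   # normalisation

# each measurement collapses the qubit: 0 with probability |alpha|**2,
# 1 with probability |beta|**2
samples = rng.choice([0, 1], size=10_000, p=[abs(alpha) ** 2, abs(beta) ** 2])
print(samples.mean())   # close to 0.8
```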

A Bloch sphere can be used to represent a qubit's superposition state:

Figure 1.4: Bloch sphere representation of a qubit (Abramsky, 2014)

When a quantum measurement is performed, these limitless possibilities for a qubit's value collapse into 0 or 1: either the qubit is measured in the desired state or it is not. The result is probabilistic, determined by the distance between the qubit's point on the sphere and the sphere's poles. Qubits whose values lie on the sphere's equator are equally likely to be measured as 0 or 1, yet such states can still be quite different: when two qubits interact, their positions determine changes in phase value, which cause interference effects. This is one reason quantum computers do not have 100% fidelity.

Two qubits can be in a superposition state in which the results of measurements are unknown but will almost surely be the same or opposite for both qubits (entanglement). This means that to describe more than one qubit at a time, we keep track of the probabilities of all conceivable measurement outcomes. As the number of qubits in a system grows, so does the number of possible entangled states. Although quantum computers are capable of encoding and processing certain types of data, quantum decoherence makes the quantum superposition and entanglement states extremely vulnerable to external interference.
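The growth in the number of possible entangled states is exponential: a full classical description of n qubits needs 2ⁿ complex amplitudes. A quick back-of-the-envelope sketch shows why classically tracking all measurement outcomes quickly becomes intractable:

```python
# a full state vector for n qubits holds 2**n complex amplitudes, so the
# memory needed to describe the system doubles with every qubit added
for n in (10, 20, 30, 40):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30   # complex128 = 16 bytes per amplitude
    print(f"{n} qubits -> {amplitudes} amplitudes ({gib:,.0f} GiB)")
```

Already at 40 qubits the state vector would need about 16 TiB of memory, which is why a quantum system that simply *is* the state has an inherent advantage.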

The ideal situation is for the environment around the qubits to be perfectly clean, but without the ability to control and interact with the system, it would be ineffective. As a result, maintaining the superposition state for long enough to process valuable data is a significant challenge in developing good quantum electronics. Companies like Google, IBM, Amazon, and Microsoft are working to scale up their quantum computer hardware, which is still in its early stages when compared to the amount of memory they need to tackle larger and more practical problems.

Figure 1.5: IBM’s System Q quantum computer (Toon, 2020)

Today's quantum-computer processors must operate at temperatures near absolute zero in cryogenic containers (thus the unique appearance), but the electronics required for read and control do not function at these temperatures (IEEE Spectrum, 2020). There is still room for cabling to work outside the cryogenic enclosures in today's quantum computers with less than 100 qubits, but in the future, ultralow power control chips will have to operate inside the enclosures (Fairley, 2020).

## 2. Applications of Quantum Computing

Quantum information processing makes use of quantum phenomena to do tasks that would take a traditional computer a long time to complete or in some cases would be impossible. Sections 2.1 and 2.2 provide an overview of two ways quantum computing might be used to more efficiently tackle specific sorts of issues (Kaye et al., 2007).

### 2.1 Business and Engineering Challenges

JP Morgan has published work on using quantum computers to price option contracts and do financial modeling. Daimler AG is looking to harness better battery simulation, and Volkswagen worked with D-Wave on the traveling salesman problem. Goldman Sachs and Honeywell are writing quantum algorithms of their own.

### 2.2 Simulation of Molecules

Richard Feynman's study "Simulating physics with computers," released in 1982, laid the groundwork for exploiting quantum systems to solve complex problems. Feynman highlighted how computers can assist physicists in modelling physical processes in this paper. When replicating the movements of an atomic model, for example, it would be necessary to keep track of all possible electron locations (Feynman, 1982). A conventional computer could help with these simulations, but it would quickly run out of memory; however, a quantum system with qubits can easily track these probabilities because that is how it naturally behaves (Fernandez, 2018).

### 2.3 Cryptography

Cryptography is now an important part of how computers transmit information since it ensures that only authorized users have exchanged keys that allow them to access encrypted secrets and read encrypted messages (Campagna et al., 2015). Cryptography is divided into two categories:

Symmetric Cryptography. The sender encrypts the communication with a secret key that the receiver uses to decrypt it. Both the sender and the receiver must keep this key secret for the duration of the connection, but this is difficult to do, which is why asymmetric encryption was invented (Mavroeidis et al., 2018).

Asymmetric Cryptography (PKC or PKI). Both the transmitter and the receiver use a technique such as RSA to generate a pair of linked public and private keys on their computers. The public key is used to encrypt data that can only be decrypted with the matching private key. This method is utilised in HTTPS websites, the PGP email protocol, digital signatures, and a variety of other secure communication protocols (Mavroeidis et al., 2018).

PKC’s security lies in the difficulty of factoring integers with large prime factors and in the discrete logarithm problem. A traditional computer finds these tasks computationally infeasible (Buchanan and Woodward, 2016).
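To make that dependence concrete, here is textbook RSA with deliberately tiny primes (a common worked example; real deployments use keys of 2048 bits or more). Anyone who can factor n immediately recovers the private exponent:

```python
# textbook RSA with tiny primes -- a worked illustration, not secure
p, q = 61, 53
n = p * q                   # 3233; security rests on the hardness of factoring n
phi = (p - 1) * (q - 1)     # 3120
e = 17                      # public exponent, coprime with phi
d = pow(e, -1, phi)         # private exponent (modular inverse, Python 3.8+)

msg = 65
cipher = pow(msg, e, n)     # encrypt with the public key (e, n)
plain = pow(cipher, d, n)   # decrypt with the private key d
assert plain == msg

# an attacker who factors n = 61 * 53 recovers phi, and hence d, at once
d_attacker = pow(e, -1, (61 - 1) * (53 - 1))
assert pow(cipher, d_attacker, n) == msg
```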

In 1997, Peter Shor published an algorithm that solves both the discrete logarithm problem and the large-integer factorization problem on a theoretical quantum computer (Shor, 1997). Asymmetric cryptographic approaches can be rendered obsolete by moving these problems from being classically intractable into the BQP (bounded-error quantum polynomial time) complexity class. A more extensive account of the impact of quantum computing on encryption systems may be found in Table 2.1, adapted from NIST.
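Shor's algorithm works by reducing factoring to order finding. The sketch below (an illustrative toy; the function name and brute-force loop are ours) mimics that reduction classically for tiny numbers; the order-finding loop is precisely the step a quantum computer performs in polynomial time:

```python
from math import gcd

def shor_classical(N, a):
    """Mimic the core of Shor's algorithm classically: find the
    multiplicative order r of a mod N, then derive nontrivial
    factors of N from gcd(a**(r/2) +/- 1, N)."""
    r, x = 1, a % N
    while x != 1:            # brute-force order finding -- the step a
        x = (x * a) % N      # quantum computer performs efficiently
        r += 1
    if r % 2:                # odd order: this base a fails, pick another
        return None
    y = pow(a, r // 2, N)
    f1, f2 = gcd(y - 1, N), gcd(y + 1, N)
    return tuple(sorted((f1, f2))) if 1 < f1 < N else None

print(shor_classical(15, 7))   # (3, 5): order of 7 mod 15 is 4
print(shor_classical(21, 2))   # (3, 7): order of 2 mod 21 is 6
```

Classically, the order-finding loop takes time exponential in the bit length of N, which is why this reduction only becomes a practical attack on quantum hardware.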

Table 2.1: Impact analysis of quantum computing on encryption schemes (Chen et al., 2016)

The Penetrating Hard Targets project was leaked in 2014, describing ways the US National Security Agency was working on quantum computing to reverse cryptography. The processing capacity of today's quantum computers is insufficient to break some of these encryption algorithms; for example, a computer would need 20 million noisy qubits to break a 2048-bit RSA system (Gidney and Ekerå, 2019). This means that quantum computers are unlikely to break traditional encryption methods in the very near future using existing systems, although once such machines exist, Gidney and Ekerå estimate the attack itself would take only hours.

Funding streams are larger today than ever, and we continue to develop a more mature understanding of the physics underlying quantum computing. These help explain the rapid rise from 2-qubit to 127-qubit machines (Gent, 2021), new languages to handle a different type of logic, new design patterns emerging for quantum processing, and an increasing rate of improvement. IBM predicts nearly ten times the 127-qubit capability in just two more years. The amount of time we can keep particles in the required state for processing continues to increase as well.

Furthermore, the amount of data to re-encrypt is so large that making it safe in the event of a quantum break of cryptographic systems could take years, and it likewise takes years to replace cryptographic systems on enterprise-grade or large-scale compute systems. As a result, researchers and governments are interested in developing solutions for post-quantum cryptography (Giles, 2020).

## 3. Discussion

Many post-quantum cryptography techniques developed by researchers are already being evaluated by the US National Institute of Standards and Technology (NIST, 2016). The National Security Agency (NSA) has announced its intention to switch to quantum-resistant algorithms (Gao et al., 2017). Researchers from all over the world have proposed algorithms based on three types of mathematical problems: lattice-based, code-based, and multivariate. Multivariate cryptography is one of the main candidates to replace current methods for digital signatures, since it is computationally efficient and uses few resources (Lin et al., 2012; Kundu et al., 2020).

M.-S. Chen et al. (2016) presented the MQDSS framework, which is based on the intractability of the MQ problem, described as a system of multivariate quadratic polynomials over a finite field. The authors demonstrate that their protocol is based on an NP-hard problem, though this does not by itself guarantee that the cryptosystem is secure. The study still needs a thorough security analysis of the parameters, and there is room for improvement.

Kundu et al. (2020) propose a new framework based on the MC (multivariate cubic) problem rather than the MQ problem used by M.-S. Chen et al. (2016), for which more solution methods are known and which is therefore less safe. With a protocol based on a more intractable problem, the security component improves. The study claims to provide a more secure technique than the MQ approach, but due to a huge public key size, the proposal is inefficient.

Bos et al. (2016) proposed Frodo, which is based on algebraically unstructured lattices. FrodoKEM, a key-encapsulation variant, was submitted to NIST's call for proposals (Gao et al., 2017). Standard lattice-based methods are usually not practicable for embedded systems due to huge parameters, according to Howe et al. (2018), but FrodoKEM offers a practical instantiation of the plain learning with errors (LWE) problem without relying on a ring structure. Although the authors go into great detail about security concerns, the scheme features bigger key sizes than ring-based lattice systems.
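To give a flavour of the LWE problem underpinning Frodo, the sketch below implements a toy Regev-style encryption of a single bit with deliberately tiny, insecure parameters (all names and parameter choices here are illustrative, not FrodoKEM's actual parameters). Security rests on the difficulty of separating the noisy products b = As + e from uniformly random values:

```python
import numpy as np

rng = np.random.default_rng(42)

# toy Regev-style LWE encryption of one bit (tiny, insecure parameters)
n, m, q = 8, 20, 97
s = rng.integers(0, q, n)          # secret key
A = rng.integers(0, q, (m, n))     # public random matrix
e = rng.integers(-1, 2, m)         # small error terms in {-1, 0, 1}
b = (A @ s + e) % q                # public key: noisy inner products

def encrypt(bit):
    r = rng.integers(0, 2, m)      # random subset of the m public samples
    u = (r @ A) % q
    v = (r @ b + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    d = (v - u @ s) % q            # leaves bit*(q//2) plus a small error
    return int(q // 4 < d < 3 * q // 4)   # near q/2 means the bit was 1

for bit in (0, 1):
    u, v = encrypt(bit)
    assert decrypt(u, v) == bit
```

With these parameters the accumulated error is at most m = 20 < q/4, so decryption is always correct; real schemes choose parameters so the decryption failure rate is merely negligible.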

Bhattacharya et al. (2018) propose a lattice-based method with more competitive key sizes. Round5, the proposed cryptosystem, combines non-ring and ring variants of the learning with rounding (LWR) problem to drastically reduce the bandwidth necessary to exchange secure keys. Although the structured and unstructured lattice variants are effective, the decapsulation failure rate remains high, and parameter adjustment may be able to fix this problem.

The security of the code-based solution of Aragon et al. (2017) rests on the fact that a general error-correcting code is difficult to decode. The suggested cryptographic technique, known as BIKE, is built on the quasi-cyclic syndrome decoding (QCSD) and quasi-cyclic codeword finding (QCCF) problems and uses a bit-flipping decoder. BIKE offers a more efficient key-generation process since it avoids the polynomial inversion used by many other code-based techniques. One disadvantage of the method is its larger key sizes. The work provides a good proof of security based on a well-known approach first proposed by McEliece (1978); however, the decoding failure rate requires more investigation.

## Conclusion

Quantum computing is not without challenges. It is currently expensive and prone to fidelity issues, qubits have limited lifetimes, and competing hardware branches result in a lack of standardization. Additionally, quantum computers work differently from the Boolean logic employed in traditional computers; therefore code, and in fact entire branches of how we develop new languages and design patterns for programming, will need to be reassessed.

Still, quantum computers present problems that scientists all over the world are working to solve before large-scale machines appear. The need for more research into the security and feasibility of the quantum-safe cryptographic algorithms proposed in the last decade was highlighted in this study. NIST is currently reviewing the solutions mentioned in the discussion section in order to find an efficient and secure replacement for public key cryptographic algorithms; it is reasonable to assume that the best characteristics from several frameworks will be combined to create an acceptable solution.

However, most proposals focus on new purely key-oriented techniques that require greater amounts of processing to attack. This means we will always be in a race with the quantum equivalent of Moore’s Law, developing ever more difficult ciphers and algorithms rather than rethinking how we secure our secrets both at rest and in transit.

Most of the actual scientific outcomes still need a sufficient qubit count, longer qubit lifetimes, or further stabilization against noise, so large societal impact derived from quantum computing is still likely a decade away. Yet in 2019 a quantum computer at Google performed, in about 200 seconds, a calculation that a classical transistorized supercomputer was estimated to need 10,000 years to complete.

Colossus, one of the first computers put into production (although arguably not yet a Turing machine), was built at Bletchley Park in 1944 to help break German encryption. It then remained in service with the UK secret services until newer computers came along, and its existence stayed secret until decades after the war had ended.

Early mechanical, tube, and then transistorized computers were likewise built by scientists to solve scientific problems (as was the Internet). Many computing companies such as Control Data Corporation, Cray, and arguably Oracle (given that the name was derived from a CIA project) were initially funded by governments seeking to better protect their own secrets or gain access to those their adversaries had encrypted. But when those companies finally turned their attention to business, profound changes swept through the world almost overnight.

Those waves came about in a less globally connected world, with physical and artificial barriers that no longer exist. Now those barriers come in the form of cryptographic hashes and keys based on standards that take years to work through standards bodies, rise to the top of backlogs for security products, and replace infrastructure, potentially introducing breaking changes to networks and applications downstream. The next evolution in public key encryption needs to come soon if it is to protect people before quantum techniques catch up to the existing options available.

Traditional sponsors of such research include governments. While research and development is certainly being conducted, more is needed in the private sector if new techniques are to be used to protect classical computing systems from the advancements in quantum computing that have been slowly coming to fruition over the past four decades.

## References

Abramsky, S. (2014), ‘Contextual semantics: From quantum mechanics to logic, databases, constraints, and complexity’, Contextuality from Quantum Physics to Psychology.

Aragon, N., Barreto, P., Bettaieb, S., Bidoux, L., Blazy, O., Deneuville, J.-C., Gaborit, P., Gueron, S., Guneysu, T., Melchor, C. A. et al. (2017), ‘Bike: bit flipping key encapsulation’. URL: https://hal.archives-ouvertes.fr/hal-01671903/document

Arute, F., Arya, K., Babbush, R., Bacon, D., Bardin, J. C., Barends, R., Biswas, R., Boixo, S., Brandao, F. G., Buell, D. A. et al. (2019), ‘Quantum supremacy using a programmable superconducting processor’, Nature 574(7779), 505–510.

Bhattacharya, S., Garcia-Morchon, O., Laarhoven, T., Rietman, R., Saarinen, M.-J. O., Tolhuizen, L. and Zhang, Z. (2018), ‘Round5: Compact and fast post-quantum public-key encryption.’, IACR Cryptology ePrint Archive 2018, 725. URL: https://round5.org/doc/round5paper.pdf

Black, P. E., Kuhn, D. R. and Williams, C. J. (2002), ‘Quantum Computing and Communication’, Advances in Computers 56, 189–244.

Bos, J., Costello, C., Ducas, L., Mironov, I., Naehrig, M., Nikolaenko, V., Raghunathan, A. and Stebila, D. (2016), Frodo: Take off the ring! practical, quantum-secure key exchange from lwe, in ‘Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security’, pp. 1006–1018.

Buchanan, W. and Woodward, A. (2016), ‘Will quantum computers be the end of public key encryption?’, ResearchGate pp. 1–22.

Campagna, M., Chen, L., Dagdelen, O., Ding, J., Fernick, J., Gisin, N., Hayford, D., Jennewein, T., Lütkenhaus, N., Mosca, M. et al. (2015), ‘Quantum safe cryptography and security: An introduction, benefits, enablers and challenges’, European Telecommunications Standards Institute pp. 1–64.

Carroll, S. (2019), Something Deeply Hidden: Quantum Worlds and the Emergence of Spacetime, Dutton.

Chen, M.-S., Hülsing, A., Rijneveld, J., Samardjiska, S. and Schwabe, P. (2016), From 5-pass MQ-based identification to MQ-based signatures, in ‘International Conference on the Theory and Application of Cryptology and Information Security’, Springer, pp. 135–165.

Chen, L., Jordan, S., Liu, Y.-K., Moody, D., Peralta, R., Perlner, R. and Smith-Tone, D. (2016), ‘Report on Post-Quantum Cryptography’, CSRC | NIST.

Durini, D. and Arutinov, D. (2014), ‘1 - Fundamental principles of photosensing’, High Performance Silicon Imaging pp. 3–24.

Fairley, P. (2020), ‘India grapples with vast solar park’, IEEE Spectrum 57(2), 8 – 9. URL: https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=8976889

Fernandez, J. J. (2018), ‘Richard feynman and the birth of quantum computing’, Medium . URL: https://medium.com/quantum1net/richard-feynman-and-the-birth-of-quantumcomputing-6fe4a0f5fcc7

Feynman, R. P. (1965), The Character of Physical Law, Penguin Press Science, Penguin Books, Limited (UK).

Feynman, R. P. (1982), ‘Simulating physics with computers’, International journal of theoretical physics 21, 467–488. URL: https://catonmat.net/ftp/simulating-physics-with-computers-richard-feynman.pdf

Gao, X., Ding, J., Liu, J. and Li, L. (2017), Post-quantum secure remote password protocol from rlwe problem, in ‘International Conference on Information Security and Cryptology’, Springer, pp. 99–116.

Gent, E. (2021), ‘IBM’s 127-Qubit Eagle Is the Biggest Quantum Computer Yet’, SingularityHub. URL: https://singularityhub.com/2021/11/22/ibms-127-qubit-eagle-is-the-biggest-quantum-computer-yet/

Gidney, C. and Ekerå, M. (2019), ‘How to factor 2048 bit RSA integers in 8 hours using 20 million noisy qubits’, arXiv. URL: https://arxiv.org/abs/1905.09749

Giles, M. (2020), ‘Explainer: What is post-quantum cryptography?’, MIT Technology Review . URL: https://www.technologyreview.com/2019/07/12/134211/explainer-what-is-postquantum-cryptography

Griffiths, D. J. (2018), Introduction to Quantum Mechanics, 3rd edn, Cambridge University Press.

Howe, J., Oder, T., Krausz, M. and Güneysu, T. (2018), ‘Standard Lattice-Based Key Encapsulation on Embedded Devices’, 1. pp. 372–393.

IEEE Spectrum (2020), ‘4 Ways to Make Bigger Quantum Computers - IEEE Spectrum’. URL: https://spectrum.ieee.org/computing/hardware/4-ways-to-make-bigger-quantumcomputers

Joaddar, S. A. (2018), ‘Reasons why the Copenhagen Interpretation of Quantum Mechanics shouldn’t be considered complete’, Medium .

Kaye, P. R., Laflamme, R. and Mosca, M. (2007), An introduction to quantum computing, 1 edn, Oxford University Press, USA.

Kundu, N., Debnath, S. K., Mishra, D. and Choudhury, T. (2020), ‘Post-quantum digital signature scheme based on multivariate cubic problem’, Journal of Information Security and Applications 53, 102512. URL: https://www.sciencedirect.com/science/article/pii/S221421261931107X

Kupervasser, O. (2017), ‘Chapter 3 - Principal Paradoxes of Quantum Mechanics’, Application of New Cybernetics in Physics pp. 73–120.

Lin, D., Faugère, J.-C., Perret, L. and Wang, T. (2012), ‘On enumeration of polynomial equivalence classes and their application to MPKC’, Finite Fields Appl. 18(2), 283–302. URL: https://www.sciencedirect.com/science/article/pii/S1071579715000593

Mavroeidis, V., Vishi, K., Zych, M. D. and Jøsang, A. (2018), ‘The Impact of Quantum Computing on Present Cryptography’, arXiv .

McEliece, R. (1978), ‘A public key cryptosystem based on algebraic coding theory’, Deep Space Network Progress Report 42-44, Jet Propulsion Laboratory.

Meng, X. and Pian, Z. (2016), ‘Chapter 5 - Vulnerability Assessment of the Distribution Network Based on Quantum Multiagent’, Intelligent Coordinated Control of Complex Uncertain Systems for Power Distribution Network Reliability pp. 91–126.

NIST (2016), ‘Announcing Request for Nominations for Public-Key Post-Quantum Cryptographic Algorithms’. [Online; accessed 4. May 2020]. URL: https://federalregister.gov/a/2016-18150

Rutherford, E. and Royds, T. (1909), ‘Xxi. the nature of the α particle from radioactive substances’, The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science 17(98), 281–286.

Shor, P. W. (1997), ‘Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer’, SIAM J. Comput. 26(5), 1484–1509.

Toon, J. (2020), ‘University to collaborate on developing software stacks for quantum computers’, Control Engineering.
