Nomenclature and notation

controlled-NOT
    1 0 0 0
    0 1 0 0
    0 0 0 1
    0 0 1 0

swap
    1 0 0 0
    0 0 1 0
    0 1 0 0
    0 0 0 1

controlled-Z
    1 0 0  0
    0 1 0  0
    0 0 1  0
    0 0 0 -1

controlled-phase
    1 0 0 0
    0 1 0 0
    0 0 1 0
    0 0 0 i

Toffoli
    1 0 0 0 0 0 0 0
    0 1 0 0 0 0 0 0
    0 0 1 0 0 0 0 0
    0 0 0 1 0 0 0 0
    0 0 0 0 1 0 0 0
    0 0 0 0 0 1 0 0
    0 0 0 0 0 0 0 1
    0 0 0 0 0 0 1 0

Fredkin (controlled-swap)
    1 0 0 0 0 0 0 0
    0 1 0 0 0 0 0 0
    0 0 1 0 0 0 0 0
    0 0 0 1 0 0 0 0
    0 0 0 0 1 0 0 0
    0 0 0 0 0 0 1 0
    0 0 0 0 0 1 0 0
    0 0 0 0 0 0 0 1

measurement       projection onto |0⟩ and |1⟩
qubit             wire carrying a single qubit (time goes left to right)
classical bit     double wire carrying a single classical bit
n qubits          wire carrying n qubits
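As an illustrative aside (not part of the original notation page), the smaller gate matrices above can be written down and sanity-checked in a few lines of NumPy; the basis ordering |00⟩, |01⟩, |10⟩, |11⟩ with the leftmost qubit as the control follows the table:

```python
import numpy as np

# Two-qubit gate matrices from the table above, in the computational
# basis |00>, |01>, |10>, |11> (leftmost qubit is the control).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)

CZ = np.diag([1, 1, 1, -1]).astype(complex)
CPHASE = np.diag([1, 1, 1, 1j])

# Basis vectors: ket(0) = |00>, ket(1) = |01>, ket(2) = |10>, ket(3) = |11>.
def ket(i):
    return np.eye(4, dtype=complex)[i]

# CNOT flips the target qubit exactly when the control qubit is |1>:
assert np.array_equal(CNOT @ ket(2), ket(3))   # |10> -> |11>
assert np.array_equal(CNOT @ ket(0), ket(0))   # |00> -> |00>

# SWAP exchanges the two qubits: |01> <-> |10>.
assert np.array_equal(SWAP @ ket(1), ket(2))

# Every quantum gate is unitary: U U-dagger = I.
for U in (CNOT, SWAP, CZ, CPHASE):
    assert np.allclose(U @ U.conj().T, np.eye(4))
```

The Toffoli and Fredkin matrices follow the same pattern at 8×8: an identity with the last two rows (Toffoli) or rows six and seven (Fredkin) exchanged.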
I  Fundamental concepts

1  Introduction and overview

    Science offers the boldest metaphysics of the age. It is a thoroughly human construct, driven by the faith that if we dream, press to discover, explain, and dream again, thereby plunging repeatedly into new terrain, the world will somehow come clearer and we will grasp the true strangeness of the universe. And the strangeness will all prove to be connected, and make sense.
    – Edward O. Wilson

    Information is physical.
    – Rolf Landauer

What are the fundamental concepts of quantum computation and quantum information? How did these concepts develop? To what uses may they be put? How will they be presented in this book? The purpose of this introductory chapter is to answer these questions by developing in broad brushstrokes a picture of the field of quantum computation and quantum information. The intent is to communicate a basic understanding of the central concepts of the field, perspective on how they have been developed, and to help you decide how to approach the rest of the book.

Our story begins in Section 1.1 with an account of the historical context in which quantum computation and quantum information has developed. Each remaining section in the chapter gives a brief introduction to one or more fundamental concepts from the field: quantum bits (Section 1.2), quantum computers, quantum gates and quantum circuits (Section 1.3), quantum algorithms (Section 1.4), experimental quantum information processing (Section 1.5), and quantum information and communication (Section 1.6). Along the way, illustrative and easily accessible developments such as quantum teleportation and some simple quantum algorithms are given, using the basic mathematics taught in this chapter. The presentation is self-contained, and designed to be accessible even without a background in computer science or physics. As we move along, we give pointers to more in-depth discussions in later chapters, where references and suggestions for further reading may also be found.

If as you read you're finding the going rough, skip on to a spot where you feel more comfortable. At points we haven't been able to avoid using a little technical lingo which won't be completely explained until later in the book. Simply accept it for now, and come back later when you understand all the terminology in more detail. The emphasis in this first chapter is on the big picture, with the details to be filled in later.

1.1  Global perspectives

Quantum computation and quantum information is the study of the information processing tasks that can be accomplished using quantum mechanical systems. Sounds pretty
simple and obvious, doesn't it? Like many simple but profound ideas it was a long time before anybody thought of doing information processing using quantum mechanical systems. To see why this is the case, we must go back in time and look in turn at each of the fields which have contributed fundamental ideas to quantum computation and quantum information – quantum mechanics, computer science, information theory, and cryptography. As we take our short historical tour of these fields, think of yourself first as a physicist, then as a computer scientist, then as an information theorist, and finally as a cryptographer, in order to get some feel for the disparate perspectives which have come together in quantum computation and quantum information.

1.1.1  History of quantum computation and quantum information

Our story begins at the turn of the twentieth century when an unheralded revolution was underway in science. A series of crises had arisen in physics. The problem was that the theories of physics at that time (now dubbed classical physics) were predicting absurdities such as the existence of an 'ultraviolet catastrophe' involving infinite energies, or electrons spiraling inexorably into the atomic nucleus. At first such problems were resolved with the addition of ad hoc hypotheses to classical physics, but as a better understanding of atoms and radiation was gained these attempted explanations became more and more convoluted. The crisis came to a head in the early 1920s after a quarter century of turmoil, and resulted in the creation of the modern theory of quantum mechanics. Quantum mechanics has been an indispensable part of science ever since, and has been applied with enormous success to everything under and inside the Sun, including the structure of the atom, nuclear fusion in stars, superconductors, the structure of DNA, and the elementary particles of Nature.

What is quantum mechanics?
Quantum mechanics is a mathematical framework or set of rules for the construction of physical theories. For example, there is a physical theory known as quantum electrodynamics which describes with fantastic accuracy the interaction of atoms and light. Quantum electrodynamics is built up within the framework of quantum mechanics, but it contains specific rules not determined by quantum mechanics. The relationship of quantum mechanics to specific physical theories like quantum electrodynamics is rather like the relationship of a computer's operating system to specific applications software – the operating system sets certain basic parameters and modes of operation, but leaves open how specific tasks are accomplished by the applications.

The rules of quantum mechanics are simple but even experts find them counterintuitive, and the earliest antecedents of quantum computation and quantum information may be found in the long-standing desire of physicists to better understand quantum mechanics. The best known critic of quantum mechanics, Albert Einstein, went to his grave unreconciled with the theory he helped invent. Generations of physicists since have wrestled with quantum mechanics in an effort to make its predictions more palatable. One of the goals of quantum computation and quantum information is to develop tools which sharpen our intuition about quantum mechanics, and make its predictions more transparent to human minds.

For example, in the early 1980s, interest arose in whether it might be possible to use quantum effects to signal faster than light – a big no-no according to Einstein's theory of relativity. The resolution of this problem turns out to hinge on whether it is possible to clone an unknown quantum state, that is, construct a copy of a quantum state. If cloning were possible, then it would be possible to signal faster than light using quantum effects.
However, cloning – so easy to accomplish with classical information (consider the words in front of you, and where they came from!) – turns out not to be possible in general in quantum mechanics. This no-cloning theorem, discovered in the early 1980s, is one of the earliest results of quantum computation and quantum information. Many refinements of the no-cloning theorem have since been developed, and we now have conceptual tools which allow us to understand how well a (necessarily imperfect) quantum cloning device might work. These tools, in turn, have been applied to understand other aspects of quantum mechanics.

A related historical strand contributing to the development of quantum computation and quantum information is the interest, dating to the 1970s, in obtaining complete control over single quantum systems. Applications of quantum mechanics prior to the 1970s typically involved a gross level of control over a bulk sample containing an enormous number of quantum mechanical systems, none of them directly accessible. For example, superconductivity has a superb quantum mechanical explanation. However, because a superconductor involves a huge (compared to the atomic scale) sample of conducting metal, we can only probe a few aspects of its quantum mechanical nature, with the individual quantum systems constituting the superconductor remaining inaccessible. Systems such as particle accelerators do allow limited access to individual quantum systems, but again provide little control over the constituent systems.

Since the 1970s many techniques for controlling single quantum systems have been developed. For example, methods have been developed for trapping a single atom in an 'atom trap', isolating it from the rest of the world and allowing us to probe many different aspects of its behavior with incredible precision. The scanning tunneling microscope has been used to move single atoms around, creating designer arrays of atoms at will.
Electronic devices whose operation involves the transfer of only single electrons have been demonstrated.

Why all this effort to attain complete control over single quantum systems? Setting aside the many technological reasons and concentrating on pure science, the principal answer is that researchers have done this on a hunch. Often the most profound insights in science come when we develop a method for probing a new regime of Nature. For example, the invention of radio astronomy in the 1930s and 1940s led to a spectacular sequence of discoveries, including the galactic core of the Milky Way galaxy, pulsars, and quasars. Low temperature physics has achieved its amazing successes by finding ways to lower the temperatures of different systems. In a similar way, by obtaining complete control over single quantum systems, we are exploring untouched regimes of Nature in the hope of discovering new and unexpected phenomena. We are just now taking our first steps along these lines, and already a few interesting surprises have been discovered in this regime. What else shall we discover as we obtain more complete control over single quantum systems, and extend it to more complex systems?

Quantum computation and quantum information fit naturally into this program. They provide a useful series of challenges at varied levels of difficulty for people devising methods to better manipulate single quantum systems, and stimulate the development of new experimental techniques and provide guidance as to the most interesting directions in which to take experiment. Conversely, the ability to control single quantum systems is essential if we are to harness the power of quantum mechanics for applications to quantum computation and quantum information.

Despite this intense interest, efforts to build quantum information processing systems
have resulted in modest success to date. Small quantum computers, capable of doing dozens of operations on a few quantum bits (or qubits), represent the state of the art in quantum computation. Experimental prototypes for doing quantum cryptography – a way of communicating in secret across long distances – have been demonstrated, and are even at the level where they may be useful for some real-world applications. However, it remains a great challenge to physicists and engineers of the future to develop techniques for making large-scale quantum information processing a reality.

Let us turn our attention from quantum mechanics to another of the great intellectual triumphs of the twentieth century, computer science. The origins of computer science are lost in the depths of history. For example, cuneiform tablets indicate that by the time of Hammurabi (circa 1750 B.C.) the Babylonians had developed some fairly sophisticated algorithmic ideas, and it is likely that many of those ideas date to even earlier times.

The modern incarnation of computer science was announced by the great mathematician Alan Turing in a remarkable 1936 paper. Turing developed in detail an abstract notion of what we would now call a programmable computer, a model for computation now known as the Turing machine, in his honor. Turing showed that there is a Universal Turing Machine that can be used to simulate any other Turing machine. Furthermore, he claimed that the Universal Turing Machine completely captures what it means to perform a task by algorithmic means. That is, if an algorithm can be performed on any piece of hardware (say, a modern personal computer), then there is an equivalent algorithm for a Universal Turing Machine which performs exactly the same task as the algorithm running on the personal computer.
This assertion, known as the Church–Turing thesis in honor of Turing and another pioneer of computer science, Alonzo Church, asserts the equivalence between the physical concept of what class of algorithms can be performed on some physical device with the rigorous mathematical concept of a Universal Turing Machine. The broad acceptance of this thesis laid the foundation for the development of a rich theory of computer science.

Not long after Turing's paper, the first computers constructed from electronic components were developed. John von Neumann developed a simple theoretical model for how to put together in a practical fashion all the components necessary for a computer to be fully as capable as a Universal Turing Machine. Hardware development truly took off, though, in 1947, when John Bardeen, Walter Brattain, and William Shockley developed the transistor. Computer hardware has grown in power at an amazing pace ever since, so much so that the growth was codified by Gordon Moore in 1965 in what has come to be known as Moore's law, which states that computer power will double for constant cost roughly once every two years.

Amazingly enough, Moore's law has approximately held true in the decades since the 1960s. Nevertheless, most observers expect that this dream run will end some time during the first two decades of the twenty-first century. Conventional approaches to the fabrication of computer technology are beginning to run up against fundamental difficulties of size. Quantum effects are beginning to interfere in the functioning of electronic devices as they are made smaller and smaller.

One possible solution to the problem posed by the eventual failure of Moore's law is to move to a different computing paradigm. One such paradigm is provided by the theory of quantum computation, which is based on the idea of using quantum mechanics to perform computations, instead of classical physics.
It turns out that while an ordinary computer can be used to simulate a quantum computer, it appears to be impossible to