perform the simulation in an efficient fashion. Thus quantum computers offer an essential speed advantage over classical computers. This speed advantage is so significant that many researchers believe that no conceivable amount of progress in classical computation would be able to overcome the gap between the power of a classical computer and the power of a quantum computer.

What do we mean by ‘efficient’ versus ‘inefficient’ simulations of a quantum computer? Many of the key notions needed to answer this question were actually invented before the notion of a quantum computer had even arisen. In particular, the idea of efficient and inefficient algorithms was made mathematically precise by the field of computational complexity. Roughly speaking, an efficient algorithm is one which runs in time polynomial in the size of the problem solved. In contrast, an inefficient algorithm requires superpolynomial (typically exponential) time. What was noticed in the late 1960s and early 1970s was that it seemed as though the Turing machine model of computation was at least as powerful as any other model of computation, in the sense that a problem which could be solved efficiently in some model of computation could also be solved efficiently in the Turing machine model, by using the Turing machine to simulate the other model of computation. This observation was codified into a strengthened version of the Church–Turing thesis:

Any algorithmic process can be simulated efficiently using a Turing machine.

The key strengthening in the strong Church–Turing thesis is the word efficiently. If the strong Church–Turing thesis is correct, then it implies that no matter what type of machine we use to perform our algorithms, that machine can be simulated efficiently using a standard Turing machine.
This is an important strengthening, as it implies that for the purposes of analyzing whether a given computational task can be accomplished efficiently, we may restrict ourselves to the analysis of the Turing machine model of computation.

One class of challenges to the strong Church–Turing thesis comes from the field of analog computation. In the years since Turing, many different teams of researchers have noticed that certain types of analog computers can efficiently solve problems believed to have no efficient solution on a Turing machine. At first glance these analog computers appear to violate the strong form of the Church–Turing thesis. Unfortunately for analog computation, it turns out that when realistic assumptions about the presence of noise in analog computers are made, their power disappears in all known instances; they cannot efficiently solve problems which are not efficiently solvable on a Turing machine. This lesson – that the effects of realistic noise must be taken into account in evaluating the efficiency of a computational model – was one of the great early challenges of quantum computation and quantum information, a challenge successfully met by the development of a theory of quantum error-correcting codes and fault-tolerant quantum computation. Thus, unlike analog computation, quantum computation can in principle tolerate a finite amount of noise and still retain its computational advantages.

The first major challenge to the strong Church–Turing thesis arose in the mid 1970s, when Robert Solovay and Volker Strassen showed that it is possible to test whether an integer is prime or composite using a randomized algorithm. That is, the Solovay–Strassen test for primality used randomness as an essential part of the algorithm. The algorithm did not determine whether a given integer was prime or composite with certainty. Instead, the algorithm could determine that a number was probably prime or else composite with
certainty. By repeating the Solovay–Strassen test a few times it is possible to determine with near certainty whether a number is prime or composite. The Solovay–Strassen test was of especial significance at the time it was proposed as no deterministic test for primality was then known, nor is one known at the time of this writing. Thus, it seemed as though computers with access to a random number generator would be able to efficiently perform computational tasks with no efficient solution on a conventional deterministic Turing machine. This discovery inspired a search for other randomized algorithms which has paid off handsomely, with the field blossoming into a thriving area of research.

Randomized algorithms pose a challenge to the strong Church–Turing thesis, suggesting that there are efficiently soluble problems which, nevertheless, cannot be efficiently solved on a deterministic Turing machine. This challenge appears to be easily resolved by a simple modification of the strong Church–Turing thesis:

Any algorithmic process can be simulated efficiently using a probabilistic Turing machine.

This ad hoc modification of the strong Church–Turing thesis should leave you feeling rather queasy. Might it not turn out at some later date that yet another model of computation allows one to efficiently solve problems that are not efficiently soluble within Turing’s model of computation? Is there any way we can find a single model of computation which is guaranteed to be able to efficiently simulate any other model of computation?

Motivated by this question, in 1985 David Deutsch asked whether the laws of physics could be used to derive an even stronger version of the Church–Turing thesis. Instead of adopting ad hoc hypotheses, Deutsch looked to physical theory to provide a foundation for the Church–Turing thesis that would be as secure as the status of that physical theory.
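The Solovay–Strassen test discussed above is short enough to sketch directly. The following is a minimal Python version (function and variable names are our own, not from the original papers). It applies Euler's criterion, comparing a^((n-1)/2) mod n against the Jacobi symbol (a/n), and declares a number composite the moment any randomly chosen base disagrees:

```python
import random

def jacobi(a, n):
    """Jacobi symbol (a/n) for odd n > 0, via quadratic reciprocity."""
    a %= n
    result = 1
    while a != 0:
        while a % 2 == 0:          # pull out factors of two
            a //= 2
            if n % 8 in (3, 5):
                result = -result
        a, n = n, a                # reciprocity step
        if a % 4 == 3 and n % 4 == 3:
            result = -result
        a %= n
    return result if n == 1 else 0

def probably_prime(n, rounds=64):
    """Solovay-Strassen: False means certainly composite; True means prime
    with error probability at most 2**(-rounds)."""
    if n < 2 or n % 2 == 0:
        return n == 2
    if n == 3:
        return True
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = jacobi(a, n)
        # Euler's criterion: for prime n, a**((n-1)//2) == (a/n) (mod n)
        if x == 0 or pow(a, (n - 1) // 2, n) != x % n:
            return False           # this base a witnesses compositeness
    return True
```

Each extra round halves the worst-case error probability, which is the sense in which repetition gives 'near certainty': after k rounds a composite number slips through with probability at most 2^(-k), while a prime is never misclassified.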
In particular, Deutsch attempted to define a computational device that would be capable of efficiently simulating an arbitrary physical system. Because the laws of physics are ultimately quantum mechanical, Deutsch was naturally led to consider computing devices based upon the principles of quantum mechanics. These devices, quantum analogues of the machines defined forty-nine years earlier by Turing, led ultimately to the modern conception of a quantum computer used in this book.

At the time of writing it is not clear whether Deutsch’s notion of a Universal Quantum Computer is sufficient to efficiently simulate an arbitrary physical system. Proving or refuting this conjecture is one of the great open problems of the field of quantum computation and quantum information. It is possible, for example, that some effect of quantum field theory or an even more esoteric effect based in string theory, quantum gravity or some other physical theory may take us beyond Deutsch’s Universal Quantum Computer, giving us a still more powerful model for computation. At this stage, we simply don’t know.

What Deutsch’s model of a quantum computer did enable was a challenge to the strong form of the Church–Turing thesis. Deutsch asked whether it is possible for a quantum computer to efficiently solve computational problems which have no efficient solution on a classical computer, even a probabilistic Turing machine. He then constructed a simple example suggesting that, indeed, quantum computers might have computational powers exceeding those of classical computers.

This remarkable first step taken by Deutsch was improved in the subsequent decade by many people, culminating in Peter Shor’s 1994 demonstration that two enormously important problems – the problem of finding the prime factors of an integer, and the so-
called ‘discrete logarithm’ problem – could be solved efficiently on a quantum computer. This attracted widespread interest because these two problems were and still are widely believed to have no efficient solution on a classical computer. Shor’s results are a powerful indication that quantum computers are more powerful than Turing machines, even probabilistic Turing machines. Further evidence for the power of quantum computers came in 1995 when Lov Grover showed that another important problem – the problem of conducting a search through some unstructured search space – could also be sped up on a quantum computer. While Grover’s algorithm did not provide as spectacular a speedup as Shor’s algorithms, the widespread applicability of search-based methodologies has excited considerable interest in Grover’s algorithm.

At about the same time as Shor’s and Grover’s algorithms were discovered, many people were developing an idea Richard Feynman had suggested in 1982. Feynman had pointed out that there seemed to be essential difficulties in simulating quantum mechanical systems on classical computers, and suggested that building computers based on the principles of quantum mechanics would allow us to avoid those difficulties. In the 1990s several teams of researchers began fleshing this idea out, showing that it is indeed possible to use quantum computers to efficiently simulate systems that have no known efficient simulation on a classical computer. It is likely that one of the major applications of quantum computers in the future will be performing simulations of quantum mechanical systems too difficult to simulate on a classical computer, a problem with profound scientific and technological implications.

What other problems can quantum computers solve more quickly than classical computers? The short answer is that we don’t know. Coming up with good quantum algorithms seems to be hard.
A pessimist might think that’s because there’s nothing quantum computers are good for other than the applications already discovered! We take a different view. Algorithm design for quantum computers is hard because designers face two difficult problems not faced in the construction of algorithms for classical computers. First, our human intuition is rooted in the classical world. If we use that intuition as an aid to the construction of algorithms, then the algorithmic ideas we come up with will be classical ideas. To design good quantum algorithms one must ‘turn off’ one’s classical intuition for at least part of the design process, using truly quantum effects to achieve the desired algorithmic end. Second, to be truly interesting it is not enough to design an algorithm that is merely quantum mechanical. The algorithm must be better than any existing classical algorithm! Thus, it is possible that one may find an algorithm which makes use of truly quantum aspects of quantum mechanics, that is nevertheless not of widespread interest because classical algorithms with comparable performance characteristics exist. The combination of these two problems makes the construction of new quantum algorithms a challenging problem for the future.

Even more broadly, we can ask if there are any generalizations we can make about the power of quantum computers versus classical computers. What is it that makes quantum computers more powerful than classical computers – assuming that this is indeed the case? What class of problems can be solved efficiently on a quantum computer, and how does that class compare to the class of problems that can be solved efficiently on a classical computer? One of the most exciting things about quantum computation and quantum information is how little is known about the answers to these questions! It is a great challenge for the future to understand these questions better.

Having come up to the frontier of quantum computation, let’s switch to the history
of another strand of thought contributing to quantum computation and quantum information: information theory. At the same time computer science was exploding in the 1940s, another revolution was taking place in our understanding of communication. In 1948 Claude Shannon published a remarkable pair of papers laying the foundations for the modern theory of information and communication.

Perhaps the key step taken by Shannon was to mathematically define the concept of information. In many mathematical sciences there is considerable flexibility in the choice of fundamental definitions. Try thinking naively for a few minutes about the following question: how would you go about mathematically defining the notion of an information source? Several different answers to this problem have found widespread use; however, the definition Shannon came up with seems to be far and away the most fruitful in terms of increased understanding, leading to a plethora of deep results and a theory with a rich structure which seems to accurately reflect many (though not all) real-world communications problems.

Shannon was interested in two key questions related to the communication of information over a communications channel. First, what resources are required to send information over a communications channel? For example, telephone companies need to know how much information they can reliably transmit over a given telephone cable. Second, can information be transmitted in such a way that it is protected against noise in the communications channel?

Shannon answered these two questions by proving the two fundamental theorems of information theory. The first, Shannon’s noiseless channel coding theorem, quantifies the physical resources required to store the output from an information source.
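The resource quantified by the noiseless channel coding theorem is the Shannon entropy of the source: n independent uses of a source can, on average, be compressed into roughly nH bits. A small numerical sketch (the function name is ours):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum p * log2(p), in bits per emitted symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries one full bit per toss...
print(shannon_entropy([0.5, 0.5]))   # 1.0
# ...while a heavily biased coin is compressible to about 0.47 bits per toss.
print(shannon_entropy([0.9, 0.1]))
```

The more predictable the source, the smaller H, and the fewer physical resources are needed to store its output faithfully.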
Shannon’s second fundamental theorem, the noisy channel coding theorem, quantifies how much information it is possible to reliably transmit through a noisy communications channel. To achieve reliable transmission in the presence of noise, Shannon showed that error-correcting codes could be used to protect the information being sent. Shannon’s noisy channel coding theorem gives an upper limit on the protection afforded by error-correcting codes. Unfortunately, Shannon’s theorem does not explicitly give a practically useful set of error-correcting codes to achieve that limit. From the time of Shannon’s papers until today, researchers have constructed more and better classes of error-correcting codes in their attempts to come closer to the limit set by Shannon’s theorem. A sophisticated theory of error-correcting codes now exists offering the user a plethora of choices in their quest to design a good error-correcting code. Such codes are used in a multitude of places including, for example, compact disc players, computer modems, and satellite communications systems.

Quantum information theory has followed with similar developments. In 1995, Ben Schumacher provided an analogue to Shannon’s noiseless coding theorem, and in the process defined the ‘quantum bit’ or ‘qubit’ as a tangible physical resource. However, no analogue to Shannon’s noisy channel coding theorem is yet known for quantum information. Nevertheless, in analogy to their classical counterparts, a theory of quantum error-correction has been developed which, as already mentioned, allows quantum computers to compute effectively in the presence of noise, and also allows communication over noisy quantum channels to take place reliably.

Indeed, classical ideas of error-correction have proved to be enormously important in developing and understanding quantum error-correcting codes. In 1996, two groups working independently, Robert Calderbank and Peter Shor, and Andrew Steane, discovered an important class of quantum codes now known as CSS codes after their initials. This work has since been subsumed by the stabilizer codes, independently discovered by Robert Calderbank, Eric Rains, Peter Shor and Neil Sloane, and by Daniel Gottesman. By building upon the basic ideas of classical linear coding theory, these discoveries greatly facilitated a rapid understanding of quantum error-correcting codes and their application to quantum computation and quantum information.

The theory of quantum error-correcting codes was developed to protect quantum states against noise. What about transmitting ordinary classical information using a quantum channel? How efficiently can this be done? A few surprises have been discovered in this arena. In 1992 Charles Bennett and Stephen Wiesner explained how to transmit two classical bits of information, while only transmitting one quantum bit from sender to receiver, a result dubbed superdense coding.

Even more interesting are the results in distributed quantum computation. Imagine you have two computers networked, trying to solve a particular problem. How much communication is required to solve the problem? Recently it has been shown that quantum computers can require exponentially less communication to solve certain problems than would be required if the networked computers were classical! Unfortunately, as yet these problems are not especially important in a practical setting, and suffer from some undesirable technical restrictions. A major challenge for the future of quantum computation and quantum information is to find problems of real-world importance for which distributed quantum computation offers a substantial advantage over distributed classical computation.
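The superdense coding protocol just mentioned is simple enough to verify directly with a four-amplitude state vector. In this sketch (all names are ours), Alice and Bob share the Bell state (|00⟩ + |11⟩)/√2; Alice applies one of I, X, Z, or ZX to her qubit alone, producing one of the four mutually orthogonal Bell states, which Bob can then distinguish by a measurement in the Bell basis:

```python
import math

s = 1 / math.sqrt(2)
# Two-qubit state as four amplitudes; index = 2*a + b for basis state |ab>,
# where a is Alice's qubit and b is Bob's. Amplitudes here stay real.
BELL = (s, 0.0, 0.0, s)          # (|00> + |11>)/sqrt(2)

def x_on_alice(psi):
    """Pauli X on Alice's qubit: flips a, swapping indices 0<->2 and 1<->3."""
    return (psi[2], psi[3], psi[0], psi[1])

def z_on_alice(psi):
    """Pauli Z on Alice's qubit: phase -1 wherever a = 1 (indices 2 and 3)."""
    return (psi[0], psi[1], -psi[2], -psi[3])

def encode(b1, b2):
    """Alice encodes two classical bits by acting only on her half of the pair."""
    psi = BELL
    if b2:
        psi = x_on_alice(psi)
    if b1:
        psi = z_on_alice(psi)
    return psi

def decode(psi):
    """Bob's Bell-basis measurement: the four encoded states are orthogonal,
    so exactly one has overlap of magnitude 1 with psi."""
    for b1 in (0, 1):
        for b2 in (0, 1):
            overlap = sum(x * y for x, y in zip(encode(b1, b2), psi))
            if abs(overlap) > 0.5:
                return (b1, b2)

# One qubit transmitted, two bits recovered, for every possible message:
for bits in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert decode(encode(*bits)) == bits
```

The trick is the shared entanglement: Bob's Bell-basis measurement depends on both halves of the pair, which is how the single transmitted qubit can carry two bits.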
Let's return to information theory proper. The study of information theory begins with the properties of a single communications channel. In applications we often do not deal with a single communications channel, but rather with networks of many channels. The subject of networked information theory deals with the information carrying properties of such networks of communications channels, and has been developed into a rich and intricate subject.

By contrast, the study of networked quantum information theory is very much in its infancy. Even for very basic questions we know little about the information carrying abilities of networks of quantum channels. Several rather striking preliminary results have been found in the past few years; however, no unifying theory of networked information theory exists for quantum channels. One example of networked quantum information theory should suffice to convince you of the value such a general theory would have. Imagine that we are attempting to send quantum information from Alice to Bob through a noisy quantum channel. If that channel has zero capacity for quantum information, then it is impossible to reliably send any information from Alice to Bob. Imagine instead that we consider two copies of the channel, operating in synchrony. Intuitively it is clear (and can be rigorously justified) that such a channel also has zero capacity to send quantum information. However, if we instead reverse the direction of one of the channels, as illustrated in Figure 1.1, it turns out that sometimes we can obtain a non-zero capacity for the transmission of information from Alice to Bob! Counter-intuitive properties like this illustrate the strange nature of quantum information. Better understanding the information carrying properties of networks of quantum channels is a major open problem of quantum computation and quantum information.
Let's switch fields one last time, moving to the venerable old art and science of cryptography. Broadly speaking, cryptography is the problem of doing communication or