Classical and Quantum Computation

American Mathematical Society, 2002 - Computational complexity - 257 pages
This book presents a concise introduction to an emerging and increasingly important topic, the theory of quantum computing. The development of quantum computing exploded in 1994 with the discovery of its use in factoring large numbers, a task that is extremely difficult and time-consuming on a conventional computer. In less than 300 pages, the authors set forth a solid foundation for the theory, including results that have not appeared elsewhere and improvements on existing works. The book starts with the basics of the classical theory of computation, including NP-complete problems and the idea of the complexity of an algorithm. The authors then introduce the general principles of quantum computing and proceed to the main quantum computation algorithms: Grover's algorithm, Shor's factoring algorithm, and the solution of the Abelian hidden subgroup problem. In the concluding sections, several related topics are discussed (parallel quantum computation, a quantum analog of NP-completeness, and quantum error-correcting codes). This is a suitable textbook for a graduate course in quantum computing. Prerequisites are very modest and include linear algebra, elements of group theory and probability, and the notion of an algorithm (on a formal or an intuitive level). The book is complete with problems, solutions, and an appendix summarizing the necessary results from number theory.
