Quantum Computing Methods for Data Mining in Quantum Machine Learning (English Edition) introduces the subject in three parts. The first part gives an overall survey of the fundamental concepts, such as machine learning, quantum mechanics, and quantum computing; the second part covers classical learning algorithms; and the third part covers quantum computing and machine learning. The book is built on an extensive survey of the literature, presented in a concise form and accompanied by applied, practical examples.
Quantum Computing Methods for Data Mining in Quantum Machine Learning (English Edition) is published by Harbin Institute of Technology Press.
Author: P. Wittek (Hungary)
Contents
Preface
Notations
Part One Fundamental Concepts
1 Introduction
1.1 Learning Theory and Data Mining
1.2 Why Quantum Computers?
1.3 A Heterogeneous Model
1.4 An Overview of Quantum Machine Learning Algorithms
1.5 Quantum-Like Learning on Classical Computers
2 Machine Learning
2.1 Data-Driven Models
2.2 Feature Space
2.3 Supervised and Unsupervised Learning
2.4 Generalization Performance
2.5 Model Complexity
2.6 Ensembles
2.7 Data Dependencies and Computational Complexity
3 Quantum Mechanics
3.1 States and Superposition
3.2 Density Matrix Representation and Mixed States
3.3 Composite Systems and Entanglement
3.4 Evolution
3.5 Measurement
3.6 Uncertainty Relations
3.7 Tunneling
3.8 Adiabatic Theorem
3.9 No-Cloning Theorem
4 Quantum Computing
4.1 Qubits and the Bloch Sphere
4.2 Quantum Circuits
4.3 Adiabatic Quantum Computing
4.4 Quantum Parallelism
4.5 Grover's Algorithm
4.6 Complexity Classes
4.7 Quantum Information Theory
Part Two Classical Learning Algorithms
5 Unsupervised Learning
5.1 Principal Component Analysis
5.2 Manifold Embedding
5.3 K-Means and K-Medians Clustering
5.4 Hierarchical Clustering
5.5 Density-Based Clustering
6 Pattern Recognition and Neural Networks
6.1 The Perceptron
6.2 Hopfield Networks
6.3 Feedforward Networks
6.4 Deep Learning
6.5 Computational Complexity
7 Supervised Learning and Support Vector Machines
7.1 K-Nearest Neighbors
7.2 Optimal Margin Classifiers
7.3 Soft Margins
7.4 Nonlinearity and Kernel Functions
7.5 Least-Squares Formulation
7.6 Generalization Performance
7.7 Multiclass Problems
7.8 Loss Functions
7.9 Computational Complexity
8 Regression Analysis
8.1 Linear Least Squares
8.2 Nonlinear Regression
8.3 Nonparametric Regression
8.4 Computational Complexity
9 Boosting
9.1 Weak Classifiers
9.2 AdaBoost
9.3 A Family of Convex Boosters
9.4 Nonconvex Loss Functions
Part Three Quantum Computing and Machine Learning
10 Clustering Structure and Quantum Computing
10.1 Quantum Random Access Memory
10.2 Calculating Dot Products
10.3 Quantum Principal Component Analysis
10.4 Toward Quantum Manifold Embedding
10.5 Quantum K-Means
10.6 Quantum K-Medians
10.7 Quantum Hierarchical Clustering
10.8 Computational Complexity
11 Quantum Pattern Recognition
11.1 Quantum Associative Memory
11.2 The Quantum Perceptron
11.3 Quantum Neural Networks
11.4 Physical Realizations
11.5 Computational Complexity
12 Quantum Classification
12.1 Nearest Neighbors
12.2 Support Vector Machines with Grover's Search
12.3 Support Vector Machines with Exponential Speedup
12.4 Computational Complexity
13 Quantum Process Tomography and Regression
13.1 Channel-State Duality
13.2 Quantum Process Tomography
13.3 Groups, Compact Lie Groups, and the Unitary Group
13.4 Representation Theory
13.5 Parallel Application and Storage of the Unitary
13.6 Optimal State for Learning
13.7 Applying the Unitary and Finding the Parameter for the Input State
14 Boosting and Adiabatic Quantum Computing
14.1 Quantum Annealing
14.2 Quadratic Unconstrained Binary Optimization
14.3 Ising Model
14.4 QBoost
14.5 Nonconvexity
14.6 Sparsity, Bit Depth, and Generalization Performance
14.7 Mapping to Hardware
14.8 Computational Complexity
Bibliography
Recent advances in quantum information theory indicate that machine learning may benefit from various paradigms of the field. For instance, adiabatic quantum computing finds the minimum of a multivariate function by a controlled physical process using the adiabatic theorem (Farhi et al., 2000). The function is translated to a physical description, the Hamiltonian operator of a quantum system. Then, a system with a simple Hamiltonian is prepared and initialized to the ground state, the lowest energy state a quantum system can occupy. Finally, the simple Hamiltonian is evolved to the target Hamiltonian, and, by the adiabatic theorem, the system remains in the ground state. At the end of the process, the solution is read out from the system, and we obtain the global optimum for the function in question.
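The process can be illustrated with a minimal numerical sketch, assuming a toy two-bit cost function and plain NumPy (this is an illustration of the interpolation scheme, not code from the book): the initial transverse-field Hamiltonian has the uniform superposition as its ground state, the interpolated Hamiltonian H(s) = (1 - s) H_init + s H_problem is evolved slowly, and the final measurement concentrates on the bit string that minimizes the cost.

```python
# Minimal sketch of adiabatic optimization on a 2-bit toy cost function
# (assumed example; plain NumPy simulation, not the book's implementation).
import numpy as np

# Cost over bit strings 00, 01, 10, 11; global minimum at 10.
cost = np.array([3.0, 2.0, 0.0, 1.0])
H_problem = np.diag(cost)                      # target Hamiltonian (diagonal)

# Initial Hamiltonian: transverse field; its ground state is the uniform superposition.
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
I2 = np.eye(2)
H_init = -(np.kron(sx, I2) + np.kron(I2, sx))

# Start in the ground state of H_init.
psi = np.full(4, 0.5, dtype=complex)

# Slowly interpolate H(s) = (1 - s) H_init + s H_problem and evolve.
T, steps = 100.0, 5000                         # long total time => adiabatic regime
dt = T / steps
for k in range(steps):
    s = (k + 0.5) / steps
    H = (1 - s) * H_init + s * H_problem
    # One small Schrodinger evolution step via eigendecomposition of H(s).
    evals, evecs = np.linalg.eigh(H)
    psi = evecs @ (np.exp(-1j * evals * dt) * (evecs.conj().T @ psi))

probs = np.abs(psi) ** 2
print("Most probable bit string:", format(int(np.argmax(probs)), "02b"))
print("Probability of the global optimum:", round(float(probs[np.argmin(cost)]), 3))
```

Increasing the total evolution time T keeps the state closer to the instantaneous ground state, as the adiabatic theorem requires; shortening it lets the system leak into excited states and the readout no longer reliably returns the global optimum.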
While more and more articles that explore the intersection of quantum computing and machine learning are being published, the field is fragmented, as was already noted over a decade ago (Bonner and Freivalds, 2002). This should not come as a surprise: machine learning itself is a diverse and fragmented field of inquiry. We attempt to identify common algorithms and trends, and observe the subtle interplay between faster execution and improved performance in machine learning by quantum computing.