
From Quantum Machine Learning to Quantum AI

Vedran Dunjko v.dunjko@liacs.leidenuniv.nl

ML→QIP (quantum-applied ML) [’74]
QIP→ML (quantum-enhanced ML) [‘94]
QIP↭ML (quantum-generalized learning) [‘00]
ML-inspired QM/QIP
Physics-inspired ML/AI

Quantum Information Processing (QIP) + Machine Learning/AI (ML/AI) → Quantum Machine Learning (QML)

Machine learning is not one thing. AI is not even a few things.

[Word cloud of AI/ML subfields: supervised learning, unsupervised learning, online learning, generative models, reinforcement learning, deep learning, statistical learning, non-parametric learning, parametric learning, local search, symbolic AI, computational learning theory, control theory, non-convex optimization, sequential decision theory, big data analysis.]

QeML (quantum-enhanced ML) is even more things

[The same word cloud, now overlaid with the quantum toolbox: quantum linear algebra, shallow quantum circuits, quantum oracle identification, quantum walks & search, adiabatic QC / quantum optimization, quantum COLT.]


Machine Learning: the WHAT

Learning P(labels|data) given samples from P(data, labels). (“Sudo, is this a cat?”)

or

Learning structure in P(data) given samples from P(data): generative models, clustering (discriminative), feature extraction. (“Sudo, make me a cat.” “Sudo, what is a cat!?”)

Machine Learning: the HOW

- output hypothesis h on Data × Labels approximating P(labels|data)
- output hypothesis h on Data “approximating” P(data)

In practice: model parameters θ → estimate error on sample (dataset) → optimizer → update θ.
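To make the loop concrete, here is a minimal classical sketch, assuming a linear model with squared-error loss and plain gradient descent; the names (estimate_error, theta) are illustrative, not from the talk.

```python
# Minimal sketch of the loop above: parameters theta -> estimate error on a
# sample (dataset) -> optimizer updates theta. A linear model with squared
# error and plain gradient descent stand in for "model" and "optimizer".
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                                    # sample (dataset)
y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=100)  # labels

theta = np.zeros(3)                                              # model parameters theta

def estimate_error(theta):
    """Empirical squared error of h_theta(x) = x . theta, and its gradient."""
    residual = X @ theta - y
    return np.mean(residual ** 2), 2 * X.T @ residual / len(y)

for _ in range(200):                                             # optimizer: gradient descent
    loss, grad = estimate_error(theta)
    theta = theta - 0.05 * grad

print("learned theta:", theta.round(2))                          # close to [1.5, -2.0, 0.5]
```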

What about quantum computers?

Quantum computers… …and physics …and computer science …and reality

- manipulate registers of 2-level systems (qubits)
- manipulation: acting locally (gates)
- full description: n qubits → 2^n-dimensional vector
- can compute things likely beyond BPP (e.g. factoring), and can produce distributions which are hard to simulate for classical computers (unless PH collapses)
- even if the QC is “shallow”

Reality today: special-purpose quantum annealers, and ca. 50-qubit all-purpose noisy devices. (Banana for scale.)
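As a small illustration of the “full description” point, the following sketch (not from the slides) builds the 2^n-dimensional state vector of an n-qubit register in NumPy and applies a gate acting locally on one qubit; apply_single_qubit_gate is a made-up helper name.

```python
# Sketch: an n-qubit register as a 2**n-dimensional vector, with a local
# single-qubit gate embedded into the full space via Kronecker products.
import numpy as np

n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                    # |000>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate
I2 = np.eye(2)

def apply_single_qubit_gate(gate, qubit, n):
    """Embed a 2x2 gate on `qubit` into the 2**n-dimensional space."""
    op = np.array([[1.0]])
    for q in range(n):
        op = np.kron(op, gate if q == qubit else I2)
    return op

for q in range(n):                                # Hadamard on every qubit
    state = apply_single_qubit_gate(H, q, n) @ state

print(np.abs(state) ** 2)                         # uniform over all 2**n basis states
```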

Bottlenecks of ML and the quantum pipeline

a) The optimization bottleneck — quantum annealers
b) Big data & comp. complexity — universal QC and quantum databases
c) Machine learning models — restricted (shallow) architectures

Exponential data?

Much of data analysis is linear algebra: regression = the Moore-Penrose pseudoinverse; PCA = SVD; …
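A toy sketch of that point, on arbitrary synthetic data: least-squares regression via the Moore-Penrose pseudoinverse and PCA via the SVD.

```python
# Toy sketch of "data analysis is linear algebra": regression via the
# Moore-Penrose pseudoinverse, PCA via the SVD of the centered data matrix.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
y = X @ np.array([2.0, 0.0, -1.0, 0.5]) + 0.05 * rng.normal(size=200)

w = np.linalg.pinv(X) @ y                          # regression = pseudoinverse
print("regression weights:", w.round(2))

Xc = X - X.mean(axis=0)                            # PCA = SVD
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
print("principal directions:\n", Vt.round(2))
print("explained variance:", (s ** 2 / (len(Xc) - 1)).round(2))
```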

Precursors of Quantum Big Data


Enter quantum linear algebra

amplitude encoding: $\mathbb{R}^N \ni x = (x_i)_i \;\mapsto\; |\psi_x\rangle \propto \sum_{i=1}^{N} x_i |i\rangle$

block encoding: $U|0\rangle|\psi\rangle = \begin{pmatrix} A & B \\ C & D \end{pmatrix}\begin{pmatrix} |\psi\rangle \\ 0 \end{pmatrix} = \begin{pmatrix} A|\psi\rangle \\ C|\psi\rangle \end{pmatrix} = |0\rangle A|\psi\rangle + |1\rangle C|\psi\rangle$

functions of operators: $f(A)|\psi\rangle = \alpha_0|\psi\rangle + \alpha_1 A|\psi\rangle + \alpha_2 A^2|\psi\rangle + \cdots \approx A^{-1}|\psi\rangle$

Phys. Rev. Lett. 103, 150502 (2009); arXiv:1806.01838

- n qubits ↔ 2^n-dimensional vector: exp(n) amplitudes in n qubits
- computing the evolution = linear algebra
- so… the evolution of quantum systems *does* linear algebra, with exponentially large matrices!
- inner products: $P(0) = |\langle 0 | \psi \rangle|^2$
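To illustrate the notation, a purely classical simulation of amplitude encoding and an inner-product readout; reading the overlap with a second encoded vector as an outcome probability is one standard way such inner products are estimated (e.g. via a swap test), and the vectors here are arbitrary.

```python
# Classical simulation of the notation above: a real vector stored as
# amplitudes, and an inner product read off as an outcome probability.
import numpy as np

x = np.array([3.0, 1.0, -2.0, 0.5])               # N = 4 entries -> 2 qubits
psi = x / np.linalg.norm(x)                       # amplitude encoding |psi_x>

phi = np.full(4, 0.5)                             # a second encoded vector |phi>

# inner products as measurement statistics, P(0) = |<phi|psi_x>|^2
print("P(0) =", round(abs(np.vdot(phi, psi)) ** 2, 4))
```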

Prediction: 44 zettabytes of data by 2020. If all of it were floats, that is 5.5×10^21 float values… which could be stored in the amplitudes of the state of 73 qubits (ions, photons, …).

If this worked literally… this would make us INFORMATION GODS.
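A back-of-the-envelope check of those numbers (assuming 1 ZB = 10^21 bytes and 8-byte floats):

```python
# Back-of-the-envelope check of the slide's numbers.
import math

total_bytes = 44e21                     # 44 ZB, taking 1 ZB = 10**21 bytes
floats = total_bytes / 8                # 8-byte floats
qubits = math.ceil(math.log2(floats))   # amplitudes needed to index them
print(f"{floats:.1e} float values, ~{qubits} qubits")   # 5.5e+21, ~73
```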

Timeline

- 2003: Pattern recognition on a QC
- 2008: QRAM (quantum database)
- 2009: HHL (linear system solving)
- 2012–2014: Regression, PCA, SVM; optimal QLS (machine learning applications & improvements)
- 2016: Quantum Recommender Systems (first efficient end-to-end scenario)
- 2018: QLA, smoothed analysis, de-quantization of low-rank systems
- 2019: ?

We made it so efficient… that sometimes we don’t need QCs!! Data-robustness implies q. efficiency.

Summary of quantum (inspired) “big data”

Quantum and classical. Quantum advantages over classical:
- quantum works with full-rank transforms (e.g. Fourier for series)
- polynomial advantage (up to a degree-16 difference at the moment)
- error scaling: exponential precision vs. poly (in)precision
- exponentially efficient processing given suitable databases

The “bad”: not an inexhaustible source of exponential quantum advantage.

Bottlenecks of ML and the quantum pipeline

a) The optimization bottleneck — quantum annealers
b) Big data & comp. complexity — universal QC and quantum databases
c) Machine learning models — restricted (shallow) architectures

(Quantum) Machine learning Models

Improving ML == speeding up algorithms… or is it?

[The “machine learning” loop again: model parameters θ → estimate error on sample (dataset) → optimizer.]

Machine learning Models matter!

Best fit vs. “generalization performance”, i.e. classifying well beyond the training set. Not all models (+ training algorithms) are born equal (for real datasets)…

Challenge: squeak or meow?

Image: 10.1016/j.compstruct.2018.03.007
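A small sketch of best fit vs. generalization, on arbitrary synthetic data: a high-degree polynomial can fit the noisy training sample better than a low-degree one and still do worse beyond it.

```python
# Toy sketch of best fit vs. generalization: the higher-degree polynomial
# fits the noisy training sample (nearly) exactly but typically generalizes
# worse on fresh points from the same underlying curve.
import numpy as np

rng = np.random.default_rng(2)
x_train = rng.uniform(0, 6, size=10)
y_train = np.sin(x_train) + 0.4 * rng.normal(size=10)   # noisy training set
x_test = np.linspace(0, 6, 200)
y_test = np.sin(x_test)                                  # "beyond the training set"

for degree in (3, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
```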


Machine learning Models

The model is a family of functions; if it’s “good”, we can generalize well.

[Loop as before: model parameters θ → estimate error on sample (dataset) → optimizer.]

How about “shallow quantum circuits”? Instead of a neural network, train a QC! Related to ideas from quantum condensed-matter physics (VQE).


Quantum Machine learning Models: “quantum kernel methods”

Phys. Rev. Lett. 122, 040504 (2019); Nature 567, 209–212 (2019) (c.f. Elizabeth Behrman in the ’90s)

The good: near-term architectures; seems to be robust (noise not inherently critical!); possibly very expressive.
The neutral: many parameters; model advantages less clear (contrast to variational methods!).
The bad: barren plateaus (also in DNNs).



[Variational-circuit diagram: a fiducial state is acted on by data-encoding gates θ_in(x) and trainable classifier gates θ_class, producing |φ(θ_in, θ_class)⟩; the error is estimated on the sample dataset {(x, label)_i} and fed to the optimizer.]

Phys. Rev. Lett. 122, 040504 (2019); Nature 567, 209–212 (2019)
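A minimal, self-contained sketch of this “train a shallow quantum circuit” loop, simulated classically on a single qubit; the toy data, the encoding θ_in(x) = RY(x) and the single trainable gate RY(θ_class) are illustrative choices, not the specific models of the cited papers.

```python
# Classically simulated sketch: data encoded as a rotation angle, one
# trainable rotation theta_class, P(|1>) used as the class probability,
# and a plain finite-difference optimizer closing the loop.
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def predict(x, theta_class):
    """Fiducial |0> -> encoding RY(x) -> trainable RY(theta_class) -> P(1)."""
    state = ry(theta_class) @ ry(x) @ np.array([1.0, 0.0])
    return state[1] ** 2

xs = np.array([0.1, 0.3, 0.4, 2.6, 2.9, 3.0])      # toy dataset {(x, label)_i}
labels = np.array([0, 0, 0, 1, 1, 1])

def loss(theta_class):                             # estimate error on the sample
    preds = np.array([predict(x, theta_class) for x in xs])
    return np.mean((preds - labels) ** 2)

theta, eps = 0.5, 1e-4
for _ in range(300):                               # optimizer: finite differences
    grad = (loss(theta + eps) - loss(theta - eps)) / (2 * eps)
    theta -= 0.5 * grad

print("trained theta_class:", round(theta, 3))
print("predictions:", np.round([predict(x, theta) for x in xs], 2))
```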

A hope… killer app for noisy QCs?

ML good for dealing with noise (in *data*)… Can QML deal with its own noise (in *process*)?


Beyond ML?



Quantum-enhanced reinforcement learning

c.f. Briegel



Towards good-old-fashioned AI

- planning
- (symbolic) reasoning
- automated proving
- logic

Find a proof of Riemann’s hypothesis with fewer than a million lines (if it exists)? Optimal packing. Shortest tours. Traffic flow optimization.

Finding *good* (not worst-case!) solutions to such problems is central to AI, RL and ML; a small local-search sketch follows the formulas below.

$f(x_1, \ldots, x_n) = C_1 \wedge C_2 \wedge \cdots \wedge C_k \wedge \cdots \wedge C_L$

$C_k = (u \vee v \vee w), \quad u, v, w \in \{x_1, \ldots, x_n\} \cup \{\bar{x}_1, \ldots, \bar{x}_n\}$

$f(x_1, \ldots, x_n) = (x_1 \vee x_{10} \vee \bar{x}_{51}) \wedge (\bar{x}_3 \vee \bar{x}_{10} \vee \bar{x}_{11}) \wedge (\bar{x}_{11} \vee \bar{x}_{44} \vee \bar{x}_{51}) \cdots$
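As the sketch referenced above, a WalkSAT-style randomized local search over the clauses shown on the slide; the trailing “⋯” of the formula is left out, so this toy instance is trivially easy, and real solvers (and the quantum-enhanced heuristics discussed next) are far more refined.

```python
# WalkSAT-style local search on the clauses shown above (the trailing "..."
# of the formula is omitted, so the toy instance is easy to satisfy).
import random

# positive int i stands for x_i, negative -i for NOT x_i
clauses = [(1, 10, -51), (-3, -10, -11), (-11, -44, -51)]
n = 51

def satisfied(clause, assignment):
    return any((lit > 0) == assignment[abs(lit)] for lit in clause)

random.seed(0)
assignment = {i: random.random() < 0.5 for i in range(1, n + 1)}

for step in range(10_000):
    unsat = [c for c in clauses if not satisfied(c, assignment)]
    if not unsat:
        print(f"satisfying assignment found after {step} flips")
        break
    lit = random.choice(random.choice(unsat))        # flip a variable from an
    assignment[abs(lit)] = not assignment[abs(lit)]  # unsatisfied clause
else:
    print("no satisfying assignment found")
```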



Towards good-old-fashioned AI: quantum solutions for combinatorial optimization

NB: NP is not believed to be contained in BQP.

- annealers
- quantum-enhanced classical algorithms, even on small QCs

NP problems on smaller quantum computers (hybrid classical/quantum scheme)

VD, Ge, Cirac, Phys. Rev. Lett. 121, 250501 (2018)

Works because the structure is loose. For heuristic solutions… noise may not be a terminal problem.

AI as the killer app?


CALL FOR PAPERS

Editor-in-Chief: Giovanni Acampora, University of Naples Federico II, Italy

Field Editors:
1) Quantum Machine Learning: Seth Lloyd (MIT), USA
2) Quantum Computing for Artificial Intelligence: Hans Jürgen Briegel (Innsbruck, Austria)
3) Artificial Intelligence for Quantum Information Processing: Chin-Teng Lin (Sydney, Australia)
4) Quantum- and Bio-inspired Computational Intelligence: Francisco Herrera (Granada, Spain)
5) Quantum Optimization: Davide Venturelli (USRA, USA)
