
Quantum Kernels Accelerate ML Classification

Key Points

  • The speaker explains that linear classification often requires mapping data into a higher‑dimensional feature space using kernel functions to make the classes linearly separable.
  • Classical kernel methods can become computationally expensive or give poor results when dealing with highly correlated, complex, or high‑frequency time‑series data.
  • Quantum computers can encode data into quantum circuits, enabling the construction of quantum kernel functions that explore vastly richer feature spaces than classical kernels can.
  • IBM researchers demonstrated in 2021 that quantum kernels can achieve exponential speed‑ups over classical kernels for certain classification tasks.
  • Current research is focused on improving quantum kernels for structured data and expanding their practical applications in machine learning.
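The feature-map idea behind the first key point can be sketched in a few lines of NumPy: points with "crosses" clustered inside a ring of "dots" are not linearly separable in 2-D, but appending the squared radius as a third coordinate lets a single plane (here just a threshold on that coordinate) split the classes. The data and the threshold value are made up purely for illustration.

```python
import numpy as np

# Toy data: "crosses" cluster near the origin, "dots" sit on a ring around them.
crosses = np.array([[0.1, 0.0], [-0.1, 0.1], [0.0, -0.1]])
dots = np.array([[1.0, 0.0], [-0.7, 0.7], [0.0, -1.0]])

def feature_map(points):
    """Map (x, y) -> (x, y, x^2 + y^2): a higher-dimensional feature space."""
    r2 = (points ** 2).sum(axis=1, keepdims=True)
    return np.hstack([points, r2])

# In the feature space the plane z = 0.5 separates the classes:
# every cross has z < 0.5 and every dot has z > 0.5.
z_crosses = feature_map(crosses)[:, 2]
z_dots = feature_map(dots)[:, 2]
print(z_crosses.max() < 0.5 < z_dots.min())  # True
```

Kernel methods apply the same trick implicitly: instead of building the mapped vectors, a kernel function returns inner products in the feature space directly.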

Full Transcript

**Source:** [https://www.youtube.com/watch?v=NqHKr9CGWJ0](https://www.youtube.com/watch?v=NqHKr9CGWJ0)
**Duration:** 00:05:56

## Sections

- [00:00:00](https://www.youtube.com/watch?v=NqHKr9CGWJ0&t=0s) **Quantum Feature Mapping for Classification** — The speaker introduces quantum computing's relevance to machine learning by illustrating how projecting data into a higher-dimensional feature space can transform a hard linear-classification problem into an easily separable one.
[0:00] Today I'm going to talk to you about quantum computing applications in machine learning. This is a very exciting area of quantum computing research, and lots of classical machine learning developers are understandably excited about the potential applications within their own field.

[0:18] To get started, let's talk about a classical machine learning problem, a very common one: linear classification. If we start with two sets of data that we want to classify into two separate categories, let's draw them here: three dots and three crosses, all on a single plane. If the data is arranged like this, it can be pretty easy to classify it into two discrete groups; we can draw a single line in the middle, and now we've classified them.

[0:56] But this can be a lot harder if our data is more complex. For example, if our data is arranged with the crosses in the middle, there isn't a single line we can draw on this plane to classify the data into two discrete groups. So in order to solve this problem and classify this data, we need to map it into a higher-dimensional space, which we're going to call a feature space. Once we've mapped the data into this higher-dimensional space, there is a much easier way to classify it.

[1:52] So how do we do this step of mapping our data into a higher-dimensional feature space? We can use kernel functions. Kernel functions work by taking some underlying features of the original data set and using them to map those data points into the high-dimensional feature space. Kernel functions are incredibly powerful and incredibly versatile, but they do face problems: sometimes they just give poor results, and the compute runtime can explode as the complexity of the data set increases. If you're an experienced machine learning developer, perhaps you've seen this already, for instance when dealing with data that has very strong correlations, or with time-series forecasting where the data is very complex and at a high frequency.

[2:53] But quantum computers have the potential to provide an advantage in this space. They can be useful because quantum computers can access much more complex and higher-dimensional feature spaces than their classical counterparts. They can do this because we can encode our data into quantum circuits, and the resulting kernel functions can be very difficult or even impossible to replicate on a classical machine. Those kernel functions can also perform better: in 2021, IBM researchers actually proved that quantum kernels can provide an exponential speed-up over their classical counterparts for certain classes of classification problems. As well as this, there is a lot of research going into improving quantum kernels with structured data and kernel alignment. So as you can see, this field is incredibly exciting, and there's a lot of research going on in this space.

[4:13] You can use Qiskit Runtime to easily build quantum machine learning algorithms with built-in tools such as the Sampler primitive. Primitives are unique to IBM's Qiskit Runtime; they are essentially predefined programs that help us optimize workflows and execute them efficiently on quantum systems. Let's take, for example, our linear classification problem. Say we have our data and we've encoded it into a quantum circuit. We can then use the Sampler primitive to obtain quasi-probabilities indicating the relationships between the different data points, and these relationships can constitute our kernel matrix. That kernel matrix can then be evaluated and used even in a classical support vector machine to predict new classification labels.

[5:36] So if you're ready to get started learning more about quantum machine learning, you can check out the links in the description for more information about Qiskit Runtime, as well as a quantum machine learning course that's available in the Qiskit Textbook. I hope you've enjoyed this content. Thank you very much for watching.
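The kernel-matrix workflow the speaker describes can be mimicked classically for a toy single-qubit feature map, to show what a quantum (fidelity) kernel matrix looks like. Here each data point x is encoded as the state RY(x)|0⟩ and the kernel entry is the state fidelity |⟨φ(x)|φ(y)⟩|². The encoding and data are illustrative assumptions, not the circuits from the IBM work, and on real hardware the fidelities would be estimated from Sampler quasi-probabilities rather than computed from exact statevectors.

```python
import numpy as np

def encode(x):
    """Single-qubit feature map: |phi(x)> = RY(x)|0> = [cos(x/2), sin(x/2)]."""
    return np.array([np.cos(x / 2), np.sin(x / 2)])

def quantum_kernel(xs, ys):
    """Fidelity kernel matrix K[i, j] = |<phi(x_i)|phi(y_j)>|^2."""
    states_x = np.array([encode(x) for x in xs])
    states_y = np.array([encode(y) for y in ys])
    return np.abs(states_x @ states_y.T) ** 2

data = np.array([0.0, 0.4, 2.5, 3.0])  # made-up 1-D training points
K = quantum_kernel(data, data)

# A valid kernel matrix is symmetric with ones on the diagonal; for this
# feature map, K[i, j] = cos^2((x_i - x_j) / 2).
print(np.allclose(K, K.T), np.allclose(np.diag(K), 1.0))  # True True
```

As the video notes, a matrix like `K` can then be handed to a classical support vector machine (for example scikit-learn's `SVC(kernel="precomputed")`) to predict new classification labels.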