Quantum Deep Learning: A Quick Guide to Quantum Convolutional Neural Networks | by Holly Emblem | Oct, 2022
Everything you need to know about quantum convolutional neural networks (QCNNs), including the benefits and limitations of these approaches compared to classical computing methods

IQM Quantum Computer in Espoo, Finland, by Ragsxl

In recent years, investment in quantum computing has increased significantly, with quantum approaches to areas such as security and network communication expected to upend existing classical computing techniques.

Researchers such as Garg and Ramakrishnan identify that at its core, quantum computing aims to “solve classically intractable problems through computationally cheaper techniques”. It is perhaps unsurprising that just as research in deep learning and quantum computing have grown in parallel in recent years, many are now examining the possibilities at the intersection of these two fields: Quantum deep learning.

In this article, we’ll discuss, at a high level, existing research and applications of quantum deep learning, focusing on hybrid quantum convolutional neural networks (QCNNs). To begin, a brief definition of quantum computing compared to classical computing is provided. From here, entanglement is defined, alongside entangled states and their applications.

Next, an overview of classical convolutional neural networks (CCNNs, or CNNs) is discussed. Finally, a discussion of QCNNs and their performance is provided, as well as the benefits and limitations of these approaches.

If you’re completely new to quantum computing, an important introductory concept is the difference between classical computing (what we are typically used to for computing tasks) and quantum. On classical computers, when a program is executed, a compiler is used to translate the program’s statements into operations held on binary bits.

Unlike bits on classical computers, which at any point will represent 1 or 0, qubits are able to “hover” between these two states. Only when measured will the qubit “collapse” into one of its states.

This property is known as superposition and is critical for quantum computing tasks (Ganguly and Cambier, 2021). Through superposition, quantum computers can perform tasks in parallel without requiring a completely parallel architecture or GPUs. This is because each superposed state corresponds to a different value, so if a superposition is acted upon, the action is performed across all of the states at the same time.

An example of a superposition quantum state is as follows:

|ψ⟩ = a|0⟩ + b|1⟩

Here, a and b are probability amplitudes, whose squared magnitudes give the probabilities of projecting into the |0⟩ and |1⟩ states once measurement is performed (with |a|² + |b|² = 1). Superposition quantum states are created by using quantum logic gates. If Bra-ket notation is new to you, then Perry’s Temple of Quantum Computing is highly recommended.
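As a concrete sketch, the Born rule turns these amplitudes into measurement probabilities. The amplitude values below are illustrative assumptions, chosen so the state is normalised:

```python
import numpy as np

# A single-qubit state |psi> = a|0> + b|1> as a 2-component complex vector.
# Example amplitudes (hypothetical, chosen so |a|^2 + |b|^2 = 1).
a, b = 1 / np.sqrt(2), 1j / np.sqrt(2)
psi = np.array([a, b])

# Born rule: the probability of each measurement outcome is the squared
# magnitude of the corresponding amplitude.
probs = np.abs(psi) ** 2   # both outcomes equally likely: [0.5, 0.5]
assert np.isclose(probs.sum(), 1.0)  # amplitudes must be normalised
```

Note that the complex phase of b drops out of the probabilities; phases only become observable through interference between amplitudes.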

Just as superposition is an important principle in quantum physics, another key concept is entanglement. Entanglement refers to generating, or causing an interaction between, two or more particles in such a way that the quantum states of these particles can no longer be described independently of each other, even when the particles are separated by a long distance. When entangled particles are measured, the outcomes are correlated: measuring one particle instantaneously determines the measurement statistics of its partner (these particles have no independent local state).

With an understanding of qubits and entanglement developed, it is now possible to discuss Bell states. These are maximally entangled states of qubits, which are:

|β00⟩ = 1/√2 (|00⟩ + |11⟩)

|β01⟩ = 1/√2 (|01⟩ + |10⟩)

|β10⟩ = 1/√2 (|00⟩ − |11⟩)

|β11⟩ = 1/√2 (|01⟩ − |10⟩)

A Bell state is created with the following quantum circuit:

Bell state circuit from Perry’s Temple of Quantum Computing.

Here, a Bell state circuit is shown, which takes qubit inputs and applies a Hadamard and CNOT gate to create an entangled Bell state.
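The Hadamard-plus-CNOT construction can be checked directly with a small NumPy state-vector simulation. This is a sketch using explicit gate matrices, not tied to any particular quantum SDK:

```python
import numpy as np

# Gate matrices
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                 # control on the first qubit

# Start in |00> = (1, 0, 0, 0)
state = np.array([1, 0, 0, 0], dtype=complex)

# Apply H to the first qubit (identity on the second), then CNOT
state = CNOT @ (np.kron(H, I) @ state)
# state is now [1/sqrt(2), 0, 0, 1/sqrt(2)], i.e. (|00> + |11>)/sqrt(2) = |beta00>
```

Feeding the other three computational basis states through the same circuit produces the remaining three Bell states listed above.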

While understanding different quantum gates is out of scope for this article, given that rotation and CNOT gates will be discussed as part of the section on QCNNs, the following guide is recommended.

Bell states have been leveraged to develop a range of quantum computing applications. For example, Hegazy, Bahaa-Eldin and Dakroury have theorised that Bell states and superdense coding can be used to attain “unconditional security”.

With an introduction to quantum computing provided, we will now discuss classical approaches to deep learning, specifically convolutional neural networks (CNNs).

As François Chollet notes in Deep Learning with Python, convolutional neural networks (CNNs) have proven popular for tasks such as image classification because they build hierarchies of patterns, such as first representing small local features like edges, then combining these into larger structures. This allows CNNs to build on information between layers and represent complex visual data.

CNNs have convolutional layers, which consist of filters that “slide” across an input and produce a “feature map”, allowing patterns within inputs to be detected. CNNs also use pooling layers, which reduce the size of the feature maps and, in turn, the resources needed for learning. For more on this, Oh, Choi and Kim’s 2020 guide to CNNs is highly recommended.
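These two operations are easy to sketch in plain NumPy. As in deep-learning practice, the “convolution” below is really cross-correlation; the image and kernel values are toy assumptions:

```python
import numpy as np

def convolve2d(image, kernel):
    """Valid-mode 2D convolution: slide the kernel over the input and
    take a dot product at each position, producing a feature map."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

def max_pool(fmap, size=2):
    """Non-overlapping max pooling: shrinks each spatial dimension by `size`."""
    h, w = fmap.shape[0] // size, fmap.shape[1] // size
    return fmap[:h*size, :w*size].reshape(h, size, w, size).max(axis=(1, 3))

image = np.arange(16, dtype=float).reshape(4, 4)
kernel = np.array([[1., 0.], [0., -1.]])   # a toy diagonal-difference filter
fmap = convolve2d(image, kernel)           # 4x4 input -> 3x3 feature map
pooled = max_pool(fmap)                    # 3x3 feature map -> 1x1 after 2x2 pooling
```

The shrinking shapes (4×4 input, 3×3 feature map, 1×1 pooled output) show how pooling reduces the resources needed downstream, at the cost of spatial resolution.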

A convolutional neural network (CNN) shown by Cecbur

Returning to the topic at hand, with classical CNNs defined, it is now possible to explore how quantum CNNs leverage and extend these traditional approaches. Garg and Ramakrishnan identify that a common approach to developing quantum neural networks is a “hybrid” approach, introducing what is known as a “quanvolutional layer”, “a transformation based on random quantum circuits”, as an additional component in a classical CNN.
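The quanvolutional idea can be sketched classically by standing in a random unitary for the random quantum circuit. The encoding below (amplitude-encoding each normalised 2×2 patch and reading out measurement probabilities as feature channels) is a simplifying assumption for illustration, not the exact scheme from Henderson et al.:

```python
import numpy as np

rng = np.random.default_rng(42)

def random_unitary(dim):
    """Random unitary from the QR decomposition of a complex Gaussian matrix."""
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    d = np.diagonal(r)
    return q @ np.diag(d / np.abs(d))   # fix phases so the result is unitary

def quanvolve(image, unitary):
    """Toy 'quanvolutional' layer: amplitude-encode each 2x2 patch,
    apply a fixed random unitary, and output the measurement
    probabilities as four feature channels (one per basis state)."""
    h, w = image.shape[0] // 2, image.shape[1] // 2
    out = np.zeros((h, w, 4))
    for i in range(h):
        for j in range(w):
            patch = image[2*i:2*i+2, 2*j:2*j+2].reshape(4).astype(complex)
            norm = np.linalg.norm(patch)
            state = patch / norm if norm > 0 else np.array([1, 0, 0, 0], complex)
            out[i, j] = np.abs(unitary @ state) ** 2   # Born-rule probabilities
    return out

U = random_unitary(4)
features = quanvolve(np.arange(16.0).reshape(4, 4), U)  # shape (2, 2, 4)
```

In a hybrid model, a feature tensor like this would then be passed to ordinary classical convolutional and dense layers for training.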

In this section, we will discuss the hybrid QCNN developed by Lü et al. (2021) and tested on the MNIST handwritten digit dataset. In their paper, Lü et al. use quantum circuits and entanglement as part of a classical model that takes an input image and generates predictions as an output.

In this approach, a quantum convolutional neural network (QCNN) takes image data as input and encodes it into a quantum state |x⟩; this state is then transformed, and features are extracted, using quantum convolutional and pooling layers (Lü et al., 2021).

Finally, a fully connected layer of strongly entangling circuits performs classification, and a prediction is obtained by measurement (Lü et al., 2021).

Optimisation, which reduces the differences between the training data labels and the labels predicted by the QCNN, is handled by stochastic gradient descent (SGD). Focusing on the quantum circuits, the quantum convolutional layer uses a combination of rotation and CNOT gate operators.

In the pooling layer, a subset of qubits is measured, and the outcomes determine whether single-qubit gates are applied to their neighbours.
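As a toy illustration of measurement-conditioned pooling, the NumPy sketch below measures one qubit of a two-qubit state and, depending on the outcome, applies a Pauli-X gate to its neighbour. The input state and the pooling rule are illustrative assumptions, not taken from Lü et al.:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 1], [1, 0]])   # Pauli-X, the conditionally applied gate
I = np.eye(2)

def measure_qubit1(state):
    """Measure the second qubit of a 2-qubit state (basis order |q0 q1>).
    Returns the outcome and the collapsed, renormalised state."""
    # Outcome-1 amplitudes sit at the odd indices (|01>, |11>).
    p1 = np.abs(state[1]) ** 2 + np.abs(state[3]) ** 2
    outcome = int(rng.random() < p1)
    mask = np.array([i % 2 == outcome for i in range(4)])
    collapsed = np.where(mask, state, 0)
    return outcome, collapsed / np.linalg.norm(collapsed)

# Hypothetical input: the Bell state (|00> + |11>)/sqrt(2)
state = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
outcome, state = measure_qubit1(state)
if outcome == 1:
    # Pooling rule sketch: apply X to the neighbouring (first) qubit
    state = np.kron(X, I) @ state
```

After the measurement, the two-qubit register collapses to a single basis state, mirroring how pooling discards a qubit’s worth of information while letting its measurement outcome steer the remaining qubits.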

The fully connected layer consists of “universal single-qubit quantum gates” and CNOT gates, which produce entangled states. To benchmark the QCNN against other approaches, Lü et al. utilised the MNIST dataset with a simulated QCNN. As is typical, a training/test split was created, and a QCNN consisting of the following layers was developed:

  • 2 quantum convolutional layers
  • 2 quantum pooling layers
  • 1 quantum fully connected layer

This QCNN reached a test set accuracy of 96.65% for the dataset. In comparison, according to Papers with Code, the highest accuracy achieved on this dataset with a classical CNN is 99.91%. However, it is important to note that for this experiment, only two classes of the MNIST dataset were classified, meaning full comparison to other MNIST model performance is somewhat limited.

While researchers such as Lü et al. have developed approaches for quantum CNNs, one of the key challenges in the field is that the hardware required to implement theoretical models is simply not available yet. Alongside this, there are also challenges associated specifically with hybrid approaches, which introduce quanvolutional layers alongside classical computing methods for CNNs.

One of the key benefits of quantum computing is that it promises to solve “classically intractable problems through computationally cheaper techniques”, and an important facet of such solutions is the “quantum speedup”. Exploring the benefits of quantum machine learning, Phillipson (2020) proposes that quantum algorithms are expected to offer a polynomial or even exponential speedup compared with classical implementations. However, a limitation of the Lü et al. approach is that the “quantum speedup” gain is limited for algorithms, such as the QCNN, which require repeated encoding/decoding of classical data and measurement. This is discussed by both Aaronson and Henderson et al. in their respective papers. Currently, there is limited guidance on how best to design protocols which encode/decode with minimal measurement so as to benefit from the “quantum speedup”.

More generally, entanglement has been shown as an important property for quantum machine learning. The QCNN proposed by Lü et al. makes use of strongly entangled circuits, which can generate entangled states as its fully connected layer, allowing the model to make predictions. Entanglement has been used elsewhere to aid deep learning models, such as Liu et al.’s usage of entanglement to extract important features from images. Furthermore, Sharma et al. have found that the use of entanglement in datasets may mean that models are able to learn from smaller training datasets than previously expected, refining what is known as the No-Free-Lunch theorem.

In this article, a comparison of classical and quantum deep learning methods has been provided, alongside an overview of a QCNN which utilises quantum layers, including strongly entangled circuits, to generate predictions. The benefits and limitations of quantum deep learning have been discussed, including applications of entanglement in machine learning more generally.

With this in mind, it’s now possible to consider what is next for quantum deep learning and specifically QCNNs. Garg and Ramakrishnan identify that alongside image recognition, quantum approaches have begun to be developed for areas such as natural language processing (NLP), including Galofaro et al.’s work to detect hate speech.

Alongside this, we have also seen advancements in quantum hardware, with companies such as PsiQuantum aiming to develop million-qubit quantum processors. Therefore, while we have seen that there are challenges associated with applying quantum neural networks, as research continues at the “juncture” of deep learning and quantum computing, we can expect to see further advancements in quantum deep learning.

For those interested, a small bibliography on relevant quantum computing and deep learning resources is provided, alongside the links in the article.

Aaronson, S. (2015) “Read the fine print”, Nature Physics, 11(4), pp. 291–293. doi: 10.1038/nphys3272.

Biamonte, J. et al. (2017) “Quantum machine learning”, Nature, 549(7671), pp. 195–202. doi: 10.1038/nature23474.

Chollet, F. (2021) Deep Learning with Python. Second Edition. Shelter Island, NY: Manning.

Ganguly, S. and Cambier, T., 2021. Quantum Computing with Silq Programming. Packt.

Garg, S. and Ramakrishnan, G. (2020) Advances in Quantum Deep Learning: An Overview, arXiv.org. Available at: https://arxiv.org/abs/2005.

Hegazy, O., Bahaa-Eldin, A. and Dakroury, Y. (2014) Quantum Secure Direct Communication using Entanglement and Super Dense Coding, arXiv.org. Available at: https://arxiv.org/abs/1402.6219

Henderson, M. et al., (2019) Quanvolutional Neural Networks: Powering Image Recognition with Quantum Circuits, arXiv.org. Available at: https://arxiv.org/abs/1904.04767

Karn, U. (2016) An Intuitive Explanation of Convolutional Neural Networks — KDnuggets. Available at: https://www.kdnuggets.com/2016/11/intuitive-explanation-convolutional-neural-networks.html

Liu, Y. et al., (2021) “Entanglement-Based Feature Extraction by Tensor Network Machine Learning”, Frontiers in Applied Mathematics and Statistics, 7. doi: 10.3389/fams.2021.716044.

Lü, Y. et al., (2021) A Quantum Convolutional Neural Network for Image Classification, arXiv.org. Available at: https://arxiv.org/abs/2107.03630.

Mehta, N., 2020. Quantum Computing. [S.l.]: Pragmatic Bookshelf.

Ofcom, (2021) Quantum Communications: new potential for the future of communications Available at: https://www.ofcom.org.uk/__data/assets/pdf_file/0013/222601/Executive-Summary.pdf

Oh, S., Choi, J. and Kim, J. (2020) A Tutorial on Quantum Convolutional Neural Networks (QCNN), arXiv.org. Available at: https://arxiv.org/abs/2009.09423

Pattanayak, S., (2021). Quantum Machine Learning with Python: Using Cirq From Google Research and IBM Qiskit. Apress.

Phillipson, F. (2020) Quantum Machine Learning: Benefits and Practical Examples. Available at: http://ceur-ws.org/Vol-2561/paper5.pdf

Perry, T. R. (2004). The Temple of Quantum Computing: Version 1.1 — April 29, 2006. Riley T. Perry.

Sewak, M., Karim, M. R. and Pujari, P. (2018) Practical Convolutional Neural Networks. Birmingham: Packt.

Sharma, K. et al. (2022) “Reformulation of the No-Free-Lunch Theorem for Entangled Datasets”, Physical Review Letters, 128(7). doi: 10.1103/physrevlett.128.070501.

Voorhoede, D. (2022) Superposition and entanglement, Quantum Inspire. Available at: https://www.quantum-inspire.com/kbase/superposition-and-entanglement/


