Neural Networks for Programming Quantum Annealers
We explore a setup for performing classification on labeled classical datasets, consisting of a classical neural network connected to a quantum annealer.
Tags: Paper and LLMs, Quantum Machine Learning
- Pricing Type: Free
- Price Range Start($):
GitHub Link
The GitHub link is https://github.com/boschsamuel/nnforprogrammingquantumannealers
Introduction
This GitHub repository, “NNforProgrammingQuantumAnnealers,” focuses on using neural networks to program quantum annealers for tasks like classification. Quantum machine learning aims to solve complex problems in AI using quantum computing. The study explores connecting classical neural networks with quantum annealers to enhance performance on classification tasks. However, the findings suggest that adding a small quantum annealer does not significantly improve outcomes over using a regular classical neural network alone. The code is available in a Jupyter Notebook within the repository. The research paper is accessible at https://arxiv.org/abs/2308.06807.
Content
Quantum machine learning has the potential to enable advances in artificial intelligence, such as solving problems intractable on classical computers. Some fundamental ideas behind quantum machine learning are similar to kernel methods in classical machine learning: both process information by mapping it into high-dimensional vector spaces without explicitly calculating their numerical values.

We explore a setup for performing classification on labeled classical datasets, consisting of a classical neural network connected to a quantum annealer. The neural network programs the quantum annealer’s controls and thereby maps the annealer’s initial states into new states in the Hilbert space. The neural network’s parameters are optimized to maximize the distance between quantum states corresponding to inputs from different classes and to minimize the distance between states corresponding to the same class.

Recent literature showed that at least some of the “learning” is due to the quantum annealer, by connecting a small linear network to a quantum annealer and using it to learn small, linearly inseparable datasets. In this study, we consider a similar but distinct case, in which a fully fledged classical neural network is connected to a small quantum annealer. In this setting, the classical network already has built-in nonlinearity and learning power and can handle the classification problem on its own; we want to see whether an additional quantum layer can boost its performance. We simulate this system on several common datasets, including those for image and sound recognition. We conclude that adding a small quantum annealer does not provide a significant benefit over just using a regular (nonlinear) classical neural network.

Usage is quite simple: the Jupyter Notebook file contains all the code and just needs to be placed in the same folder as the function.py file (which contains all the helper functions). Depending on the dataset, you might need to adjust the parameters inside the main Jupyter Notebook.
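The training objective described above can be illustrated with a small sketch. This is not the repository’s code: `annealer_state` below is a hypothetical single-qubit stand-in for the (much larger) simulated annealer, and the loss simply uses state fidelity as a similarity measure between pure states, pushing cross-class pairs apart and same-class pairs together.

```python
import numpy as np

def annealer_state(controls):
    """Toy stand-in for the programmed annealer (hypothetical): map two
    control parameters (theta, phi) to a normalized single-qubit state."""
    theta, phi = controls
    return np.array([np.cos(theta / 2),
                     np.exp(1j * phi) * np.sin(theta / 2)])

def fidelity(psi, chi):
    """Overlap |<psi|chi>|^2 between two pure states (1 = identical)."""
    return np.abs(np.vdot(psi, chi)) ** 2

def pairwise_loss(states, labels):
    """Contrastive-style objective: penalize low fidelity for same-class
    pairs and high fidelity for cross-class pairs."""
    loss = 0.0
    n = len(states)
    for i in range(n):
        for j in range(i + 1, n):
            f = fidelity(states[i], states[j])
            loss += (1.0 - f) if labels[i] == labels[j] else f
    return loss

# A neural network would output the controls; here we pick them by hand.
s0 = annealer_state((0.0, 0.0))      # class 0
s1 = annealer_state((np.pi, 0.0))    # class 1, orthogonal to s0
print(pairwise_loss([s0, s0, s1], [0, 0, 1]))  # well-separated: loss near 0
```

In the paper’s setup, the controls would come from the classical network’s output layer and this loss would be backpropagated through the simulated annealer dynamics; the toy version only shows the shape of the objective.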
Related
LongLLaMA is a large language model designed to handle very long text contexts, up to 256,000 tokens. It's based on OpenLLaMA and uses a technique called Focused Transformer (FoT) for training. The repository provides a smaller 3B version of LongLLaMA for free use. It can also be used as a replacement for LLaMA models with shorter contexts.