Is there GPU support for qiskit-machine-learning?
See original GitHub issue

What is the expected enhancement?
Hi there,
I see that qiskit-aer provides GPU support via qiskit-aer-gpu. Can this repo do the same? I'm not familiar with the details, but I suspect there would be some advantage?
Cheers!
Issue Analytics
- State: Closed
- Created 2 years ago
- Comments: 7 (4 by maintainers)
Top Results From Across the Web

Qiskit Machine Learning - GitHub
Qiskit Machine Learning defines a generic interface for neural networks that is implemented by different quantum neural networks.

Release Notes - Qiskit
Added GPU support to TorchConnector. Now, if a hybrid PyTorch model is being trained on GPU, TorchConnector correctly detaches tensors, moves them...

Will scikit-learn utilize GPU? - python - Stack Overflow
No, or at least not in the near future. The main reason is that GPU support will introduce many software dependencies and introduce...

Recommended GPU Instances - Deep Learning AMI
We recommend a GPU instance for most deep learning purposes. Training new models is faster on a GPU instance than a CPU instance....

how can we save a model using qiskit_machine_learning?
There is a save_model method in Qiskit Aqua ( from qiskit.aqua.algorithms import VQC ) but for Qiskit Machine Learning ( from ...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I had a look at the GPU support in Qiskit Aer. If you have GPU support installed for Aer, then you can create a simulator that runs on a GPU like this:
Once you have created the simulator, you can create a QuantumInstance for it:

And basically you are done. Every algorithm you run on this quantum instance will run its circuits on the GPU, so right now there is no need to do anything special to add GPU support in QML. Does this answer your question?
Thanks! Closing the issue now.