Mac M1 GPU support
See original GitHub issue
Hi,
I am wondering if there is a way to send the model to the MPS GPU (Apple M1).
Something like:
import torch
from setfit import SetFitModel

device = torch.device("mps")
model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")
model.to(device)
....
in order to exploit the GPU, since so far only the CPU is used.
Many thanks in advance for your help
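
For reference, a minimal sketch of what this could look like, assuming a PyTorch build with MPS support (1.12 or later) on Apple Silicon; the model_body attribute used here to reach the underlying sentence-transformers model is an assumption about setfit internals and may differ between versions:

import torch
from setfit import SetFitModel

# Use the MPS backend when available, otherwise fall back to CPU
# (requires a PyTorch build with MPS support, i.e. 1.12+).
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")

# Assumption: the sentence-transformers body is exposed as model_body;
# moving it explicitly avoids relying on a SetFitModel.to() method.
model.model_body.to(device)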
Issue Analytics
- State:
- Created a year ago
- Comments: 9 (3 by maintainers)
Top Results From Across the Web
Use an external graphics processor with your Mac
eGPUs are supported by any Mac with an Intel processor and Thunderbolt 3 ports running macOS High Sierra 10.13.4 or later. Learn how...

Running PyTorch on the M1 GPU - Sebastian Raschka
Today, PyTorch officially introduced GPU support for Apple's ARM M1 chips. This is an exciting day for Mac users out there, so I...

GPU-Acceleration Comes to PyTorch on M1 Macs
PyTorch have released support for GPU-acceleration on M1 Macs. Here we will explain how to get started with the new MPS layer for...

Install TensorFlow on Mac M1/M2 with GPU support
Install TensorFlow in a few steps on Mac M1/M2 with GPU support and benefit from the native performance of the new Mac ARM64...

Will M1 Macs Support eGPUs? (And Why They Probably ...
M1 Macs have Thunderbolt 3 connections which can be connected to an eGPU PCIe (Peripheral Component Interconnect Express) that are used on all...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@nishanthcmesh I think you can try the approach @dpicca took here by editing the source of sentence-transformers to put the model on mps if it's also available. If that works, my suggestion would be to open a PR on that repo once the next version of torch is released.

cumsum:out now works with the nightly version of PyTorch for MPS. Could you give me a bit of guidance or a reference to the .to() function that I can use to submit a pull request for this?
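
For illustration, a rough sketch of the kind of device-selection change being suggested, assuming SentenceTransformer accepts a device argument (it does in recent releases); the exact spot to patch inside sentence-transformers may differ:

import torch
from sentence_transformers import SentenceTransformer

# Prefer MPS when available, then CUDA, then CPU.
if torch.backends.mps.is_available():
    device = "mps"
elif torch.cuda.is_available():
    device = "cuda"
else:
    device = "cpu"

# Passing the device explicitly sidesteps the library's default
# "cuda if available, else cpu" choice.
body = SentenceTransformer("sentence-transformers/paraphrase-mpnet-base-v2", device=device)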