
How to use SD1.5 on M1 Mac? (CUDA not available Error)

See original GitHub issue

At the end of the demo video you can see the usage of SD1.5.

When I try to start Lama Cleaner with --model=SD1.5 I get the following output: warnings about missing CUDA, followed by an error about the --model argument.

/Users/USER/miniconda3/lib/python3.9/site-packages/torch/amp/autocast_mode.py:198: UserWarning: User provided device_type of 'cuda', but CUDA is not available. Disabling
  warnings.warn('User provided device_type of \'cuda\', but CUDA is not available. Disabling')
/Users/USER/miniconda3/lib/python3.9/site-packages/torch/amp/autocast_mode.py:198: UserWarning: User provided device_type of 'cuda', but CUDA is not available. Disabling
  warnings.warn('User provided device_type of \'cuda\', but CUDA is not available. Disabling')
usage: lama-cleaner [-h] [--host HOST] [--port PORT] [--model {lama,ldm,zits,mat,fcf,sd1.5,cv2}] [--hf_access_token HF_ACCESS_TOKEN] [--sd-disable-nsfw] [--sd-cpu-textencoder] [--sd-run-local] [--device {cuda,cpu}] [--gui] [--gui-size GUI_SIZE GUI_SIZE] [--input INPUT] [--debug]
lama-cleaner: error: argument --model: invalid choice: 'SD1.5' (choose from 'lama', 'ldm', 'zits', 'mat', 'fcf', 'sd1.5', 'cv2')

What can I do to get it working? I have already downloaded both checkpoint versions, 1.5-pruned and 1.5-pruned-emaonly, from Hugging Face. Can someone explain the next steps to me?

Issue Analytics

  • State: open
  • Created a year ago
  • Comments: 8 (4 by maintainers)

Top GitHub Comments

1 reaction
Sanster commented on Oct 24, 2022

Install the virtual environment according to the guidelines in the blog; then, after activating the virtualenv, you can install Lama Cleaner (pip3 install lama-cleaner).
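If you are not following the conda setup from the blog, a minimal sketch using Python's built-in venv (the environment name is just an example) would be:

python3 -m venv lama-env        # create an isolated environment
source lama-env/bin/activate    # activate it
pip3 install lama-cleaner       # install Lama Cleaner inside the activated environment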

0 reactions
Sanster commented on Oct 25, 2022

Sorry for the noisy log; you can actually safely ignore that CUDA warning.

After the first time you run lama-cleaner --model=sd1.5 --device=cpu --hf_access_token=your_token, you can remove --hf_access_token and add --run-sd-local to start the server:

lama-cleaner --model=sd1.5 --device=cpu --run-sd-local

I made a typo: it should be --sd-run-local. This parameter simply lets you skip passing the Hugging Face token; it has nothing to do with running speed.
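Putting the correction together, the two runs would look like this (the token value is a placeholder):

lama-cleaner --model=sd1.5 --device=cpu --hf_access_token=your_token    # first run: pass your Hugging Face token
lama-cleaner --model=sd1.5 --device=cpu --sd-run-local                  # later runs: no token needed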

(screenshot)

You can ignore this CUDA warning; it is normal for it to appear on machines without an NVIDIA GPU.

The real reason the model runs slowly is that it should use a PyTorch build compiled for the M1 chip. Lama Cleaner does not currently support that; I will add M1 support if I can get an M1 Mac.
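Until then, a quick way to check what the installed PyTorch build supports on an M1 Mac is a probe like the following; it assumes PyTorch 1.12 or newer, where the Apple MPS backend query exists:

python3 -c "import torch; print(torch.cuda.is_available())"           # False on a Mac without an NVIDIA GPU
python3 -c "import torch; print(torch.backends.mps.is_available())"   # True only if the build ships Apple's MPS backend

Even if MPS reports as available, Lama Cleaner would still need to expose it as a --device option, which, per the comment above, it does not yet do.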

Read more comments on GitHub >

Top Results From Across the Web

NVIDIA CUDA Getting Started Guide for Mac OS X
To verify that your system is CUDA-capable, under the Apple menu select About This Mac, click the More Info… button, and then...
Read more >
Issues · Sanster/lama-cleaner - GitHub
Issues while installing using pip. #80 opened on Oct 7 by walidamrouche. Open 7 ... How to use SD1.5 on M1 Mac? (CUDA...
Read more >
CUDA for M1 MacBook Pro - MATLAB Answers
I'm working on a project on deep learning using Matlab, I've downloaded the MatConvNet library, and I have Xcode version 13.4.1, but I couldn't...
Read more >
Surprising HPC results with M1 Max… - Apple Developer
This code supports CUDA, OpenCL, Metal, and OpenMP backends. We have done a lot of fine-tuning for each backend to get the best...
Read more >
Using pytorch Cuda on MacBook Pro - Stack Overflow
At the moment, you cannot use GPU acceleration with PyTorch on an AMD GPU, i.e. without an nVidia GPU. The O.S. is not the...
Read more >
