How can "running locally" require token-gated access to the ckpt file? Is that really local?
Describe the bug
Requiring a login with an access token, as described in https://discuss.huggingface.co/t/how-to-login-to-huggingface-hub-with-access-token/22498/5, defeats the purpose of running locally.
Please just let us point to our own folders and ckpt files, i.e. a truly local workflow.
Reproduction
Try running it in a miniconda environment.
Logs
none
System Info
win10 miniconda ldm env
Issue Analytics
- State:
- Created a year ago
- Reactions: 1
- Comments: 12 (5 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Hey @ExponentialML,
Thanks a lot for the feedback here - I think we haven’t done a great job at showing how to easily download this model and run it locally. It’s literally as easy as doing:
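The code snippet did not survive the scrape. A minimal sketch of what it likely showed, assuming the diffusers `StableDiffusionPipeline` API of that era (the model id, the `use_auth_token=True` flag, and the folder name here are assumptions, not the maintainer's exact code): download the weights once, then write them to a plain local folder.

```python
def download_and_save(model_id="CompVis/stable-diffusion-v1-4",
                      local_dir="./stable-diffusion-v1-4"):
    # Imported inside the function so the sketch can be read without
    # diffusers installed; install it with `pip install diffusers`.
    from diffusers import StableDiffusionPipeline

    # use_auth_token=True reads the token stored by `huggingface-cli login`;
    # it is only needed for this one-time download.
    pipe = StableDiffusionPipeline.from_pretrained(model_id, use_auth_token=True)

    # Write all weights and configs to a plain folder on disk.
    pipe.save_pretrained(local_dir)
    return local_dir

if __name__ == "__main__":
    download_and_save()
```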
Followed by:
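This second snippet is also missing; presumably it showed loading from the saved folder. A sketch under the same assumptions (the folder name is hypothetical):

```python
def load_local(local_dir="./stable-diffusion-v1-4"):
    # Imported lazily so the sketch can be read without diffusers installed.
    from diffusers import StableDiffusionPipeline

    # Pointing from_pretrained at a folder on disk skips the Hub entirely:
    # no token, no network access, no cache lookup.
    return StableDiffusionPipeline.from_pretrained(local_dir)
```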
No authentication token or cache is needed at that point. It's also explained here: https://huggingface.co/docs/diffusers/quicktour
Given that you’re not the first to mention this problem, I’ll open an issue now about providing better documentation.
You can run it locally.