
HuggingFace Model Hub (summarisation) - models not working locally (404 not found)

See original GitHub issue

Environment info

  • transformers version: 4.10.0
  • Platform: Linux-5.11.0-36-generic-x86_64-with-glibc2.29
  • Python version: 3.8.10
  • PyTorch version (GPU?): not installed (NA)
  • Tensorflow version (GPU?): 2.6.0 (False)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using GPU in script?: No
  • Using distributed or parallel set-up in script?: No


Information

I am using a text summarisation model from HuggingFace Model Hub. However, this issue occurs regardless of what model I use.

The problem arises when using any text summarisation model from HuggingFace Model Hub locally.

The task I am working on is dialogue summarisation.

To reproduce

Steps to reproduce the behavior:

  1. Run this code locally in the environment specified above:
from transformers import pipeline
summarizer = pipeline("summarization", model="lidiya/bart-large-xsum-samsum")
conversation = '''Jeff: Can I train a 🤗 Transformers model on Amazon SageMaker?
Philipp: Sure you can use the new Hugging Face Deep Learning Container.
Jeff: ok.
Jeff: and how can I get started?
Jeff: where can I find documentation?
Philipp: ok, ok you can find everything here. https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face'''
print(summarizer(conversation))
  2. The output is:
2021-09-28 14:20:06.034022: W tensorflow/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libcudart.so.11.0'; dlerror: libcudart.so.11.0: cannot open shared object file: No such file or directory
2021-09-28 14:20:06.034044: I tensorflow/stream_executor/cuda/cudart_stub.cc:29] Ignore above cudart dlerror if you do not have a GPU set up on your machine.
404 Client Error: Not Found for url: https://huggingface.co/lidiya/bart-large-xsum-samsum/resolve/main/tf_model.h5
404 Client Error: Not Found for url: https://huggingface.co/lidiya/bart-large-xsum-samsum/resolve/main/tf_model.h5
Traceback (most recent call last):
  File "test.py", line 2, in <module>
    summarizer = pipeline("summarization", model="lidiya/bart-large-xsum-samsum")
  File "/home/teodor/Desktop/test/env/lib/python3.8/site-packages/transformers/pipelines/__init__.py", line 429, in pipeline
    framework, model = infer_framework_load_model(
  File "/home/teodor/Desktop/test/env/lib/python3.8/site-packages/transformers/pipelines/base.py", line 145, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model lidiya/bart-large-xsum-samsum with any of the following classes: (<class 'transformers.models.auto.modeling_tf_auto.TFAutoModelForSeq2SeqLM'>, <class 'transformers.models.bart.modeling_tf_bart.TFBartForConditionalGeneration'>).
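
For context, the 404s and the final ValueError mean the pipeline exhausted every framework it could use: the repo has no tf_model.h5, and PyTorch is not installed, so neither weight format is loadable. The sketch below is illustrative only (the helper function is hypothetical, not the transformers source; the file names are the standard Hub weight-file conventions):

```python
# Illustrative sketch of the framework-selection behaviour observed above.
# pick_framework is a hypothetical helper that mirrors what the pipeline does:
# match an installed framework against the weight files the repo actually has.

def pick_framework(repo_files, torch_installed, tf_installed):
    """Return 'pt' or 'tf' if an installed framework has matching weights,
    otherwise None (which the real pipeline surfaces as a ValueError)."""
    if torch_installed and "pytorch_model.bin" in repo_files:
        return "pt"
    if tf_installed and "tf_model.h5" in repo_files:
        return "tf"
    return None

# lidiya/bart-large-xsum-samsum ships only PyTorch weights, so with
# TensorFlow installed but PyTorch missing, nothing matches:
files = {"config.json", "pytorch_model.bin", "tokenizer.json"}
print(pick_framework(files, torch_installed=False, tf_installed=True))  # None
print(pick_framework(files, torch_installed=True, tf_installed=True))   # pt
```

This is why the same code works on Colab (PyTorch preinstalled) but not in a TensorFlow-only local environment.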

Expected behavior

On Google Colab and on the HuggingFace website, a string is output containing a summary of the input text: “Jeff wants to train a Transformers model on Amazon SageMaker. He can use the new Hugging Face Deep Learning Container. The documentation is available on HuggingFace.co and on the blog, Jeff can find it here. . . Jeff can train a model on Huging Face.co.”

Why is it not working locally? Any help would be much appreciated. I’ve been trying to solve this problem for the past few days but haven’t found a working solution so far. Thank you!

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Comments: 7 (3 by maintainers)

Top GitHub Comments

1 reaction
patil-suraj commented, Sep 28, 2021

It seems that PyTorch is not installed; you should install PyTorch to be able to use this model.
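
A quick way to confirm this diagnosis before reinstalling anything is to check which frameworks Python can actually see (standard-library only; package names are the import names transformers probes for):

```python
# Check whether the deep-learning frameworks the pipeline can fall back on
# are importable in the current environment, without importing them fully.
import importlib.util

for pkg in ("torch", "tensorflow"):
    found = importlib.util.find_spec(pkg) is not None
    print(f"{pkg}: {'installed' if found else 'missing'}")
```

If `torch` reports missing, installing it (e.g. `pip install torch`) and re-running the original script should let the pipeline load the model's PyTorch weights.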

0 reactions
teodortita2 commented, Sep 28, 2021

@patil-suraj you helped me find the solution. The only thing that worked for me is:

from transformers import pipeline, AutoTokenizer, AutoModelForSeq2SeqLM

conversation = '''Jeff: Can I train a 🤗 Transformers model on Amazon SageMaker?
Philipp: Sure you can use the new Hugging Face Deep Learning Container.
Jeff: ok.
Jeff: and how can I get started?
Jeff: where can I find documentation?
Philipp: ok, ok you can find everything here. https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face
'''

tokenizer = AutoTokenizer.from_pretrained("lidiya/bart-large-xsum-samsum")
model = AutoModelForSeq2SeqLM.from_pretrained("lidiya/bart-large-xsum-samsum")
summarizer = pipeline("summarization", model=model, tokenizer=tokenizer)
print(summarizer(conversation))

Thank you so much for your patience. I hope many people see this.


