
Huggingface transformers inference: ModuleNotFoundError: No module named 'generate'

See original GitHub issue

When running the imports listed in transformers.md:

from PIL import Image
from torchvision import transforms
from transformers import OFATokenizer, OFAModel
from generate import sequence_generator

I get ModuleNotFoundError: No module named 'generate'

Where is generate supposed to come from? The implementations of sequence_generator.SequenceGenerator that I see elsewhere, e.g. in fairseq, don't have the same signature, so it's not clear how to proceed.
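
As a quick sanity check, importlib can confirm that no top-level module named generate is visible to Python; find_spec returns None for a name that cannot be imported:

import importlib.util

# None here means Python cannot find a top-level 'generate' module anywhere
# on sys.path, which is exactly why the import shown in transformers.md fails.
print(importlib.util.find_spec("generate"))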

Issue Analytics

  • State: closed
  • Created: a year ago
  • Comments: 10 (5 by maintainers)

Top GitHub Comments

1 reaction
JustinLin610 commented on Sep 14, 2022

Hi @Sultanax, sure. I did two things:

  1. Added an empty __init__.py file at transformers/src/transformers/models/ofa/generate/__init__.py
  2. Changed the import to from transformers.models.ofa.generate import sequence_generator

Note: you will need to run pip install OFA/transformers/ again for the changes to take effect (both steps are sketched in code below).

I have added the empty __init__.py to the generate directory and it all works now. If you guys are interested, you can try this notebook (link) and see whether any problems remain.
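
Putting the maintainer's two steps together, here is a minimal sketch of the fix, assuming the OFA repository is checked out at ./OFA (the pathlib touch is only illustrative; creating the empty file by any other means works just as well):

from pathlib import Path

# Step 1: mark the generate directory inside the bundled transformers fork
# as a package by creating an empty __init__.py.
Path("OFA/transformers/src/transformers/models/ofa/generate/__init__.py").touch()

# Reinstall the bundled fork so the new file ships with the installed package
# (run this before executing the imports below):
#   pip install OFA/transformers/

# Step 2: import sequence_generator via its package path rather than the
# top-level 'generate' module shown in transformers.md.
from PIL import Image
from torchvision import transforms
from transformers import OFATokenizer, OFAModel
from transformers.models.ofa.generate import sequence_generator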

1 reaction
JustinLin610 commented on Aug 12, 2022

Good suggestion! Thanks for helping us improve our repo. I’ll update the information to make things clear!

