Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Error when trying to use GPTNeo model

See original GitHub issue

Environment info

  • transformers version: 4.6.0.dev0
  • Platform: Windows-10-10.0.19041-SP0
  • Python version: 3.8.7
  • PyTorch version (GPU?): not installed (NA)
  • Tensorflow version (GPU?): 2.4.0 (False)
  • Using GPU in script?: Not yet
  • Using distributed or parallel set-up in script?: yes

Who can help

Models:

Information

The problem arises when using:

  • the official example scripts: (give details below)
  • my own modified scripts: using GPT Neo

The task I am working on is:

  • an official GLUE/SQuAD task: (give the name)
  • my own task or dataset: I want to try to convert GPT-Neo to a TF.js model for a class

To reproduce

Steps to reproduce the behavior:

  1. pip install transformers
  2. import the necessary parts
  3. run the file (a minimal sketch of the script is shown below)
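
For reference, a minimal sketch of the kind of script that triggers this (the exact import and output path are assumptions, not taken from the reporter's file):

from transformers import GPTNeoForCausalLM

# In a TensorFlow-only environment this fails: GPT Neo has no TF implementation
# in transformers 4.6, so the class name resolves to a dummy placeholder object.
model = GPTNeoForCausalLM.from_pretrained("EleutherAI/gpt-neo-1.3B")
model.save_pretrained("./gpt-neo-1.3B")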

Here’s the error I get after running:

model = GPTNeoForCausalLM.from_pretrained("EleutherAI/gpt-neo-1.3B")
AttributeError: type object 'GPTNeoForCausalLM' has no attribute 'from_pretrained'

Expected behavior

The model should be saved in a new folder.

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Comments: 7 (5 by maintainers)

Top GitHub Comments

2 reactions
patil-suraj commented, May 3, 2021

All Transformers models have a save_pretrained method; save is the wrong method name.
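
A quick sketch of the corrected call (the output directory name is an arbitrary choice, not from the original thread):

from transformers import GPTNeoForCausalLM

model = GPTNeoForCausalLM.from_pretrained("EleutherAI/gpt-neo-1.3B")

# save_pretrained (not save) is the supported way to export a checkpoint;
# it writes the config and model weights into the target folder.
model.save_pretrained("./gpt-neo-1.3B")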

2 reactions
patil-suraj commented, May 3, 2021

Hi there, this is because PyTorch is not installed. Currently, GPT Neo is only implemented in PyTorch (pt), and PyTorch should be installed if you want to use a model implemented in it.

When it’s not installed, a dummy object is returned instead, which is the reason for the above error.
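
A minimal sketch of a guard for this situation (the error message wording is mine; the check uses only the standard library):

import importlib.util

# GPT Neo is implemented only in PyTorch in transformers 4.6, so torch must be
# importable; otherwise transformers exposes a dummy placeholder class instead.
if importlib.util.find_spec("torch") is None:
    raise RuntimeError("PyTorch is not installed; run `pip install torch` first.")

from transformers import GPTNeoForCausalLM

model = GPTNeoForCausalLM.from_pretrained("EleutherAI/gpt-neo-1.3B")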

Read more comments on GitHub >

Top Results From Across the Web

gpt-neo-2.7B isn't working with pipleline - Hugging Face Forums
I'm getting a basic error when I try to access GTP-NEO-2.7B via a pipeline. ... pipeline('text-generation', model='EleutherAI/gpt-neo-2.7B').
Read more >
How do you install a library from HuggingFace? E.g. GPT Neo ...
No. If you are using offline, i.e. host the model on your own computer - there are no costs. Also if you are...
Read more >
AWS Marketplace: GPT Neo 2.7B | Text Generation
GPT-Neo 2.7B is a transformer model designed using EleutherAI's replication of the GPT-3 architecture. GPT-Neo refers to the class of models, while 2.7B ...
Read more >
Fine-Tune AI Text Generation GPT-Neo Model with ... - YouTube
In this Applied NLP Tutorial, We are going to build our Custom Stable Diffusion Prompt Generator Model by Fine-Tuning Krea AI's Stable ...
Read more >
AI Content Generator Open Source GPT-Neo Transformers ...
#EleutherAI #gpt #pytorch. AI Content Generator Open Source GPT-Neo Transformers EleutherAI. Video timestamps let you jump ahead in the video ...
Read more >
