No module named 'magma.transformers'
Issue Description
Hi,
I just downloaded the “magma-master” archive and followed the instructions (I think), but trying to run `test.py` I get errors. It seems some parts are missing?
First I get:
```
(magma) c:\Python\magma-master>python test.py
Traceback (most recent call last):
  File "c:\Python\magma-master\test.py", line 4, in <module>
    from magma.language_model import get_language_model
ImportError: cannot import name 'get_language_model' from 'magma.language_model' (c:\Python\magma-master\magma\language_model.py)
```
Looking at the code, `get_language_model` doesn't seem to be used anywhere, so I commented line 4 out. But after that there is a similar miss:
```
(magma) c:\Python\magma-master>python test.py
Traceback (most recent call last):
  File "c:\Python\magma-master\test.py", line 25, in <module>
    from magma.transformers import GPTJForCausalLM
ModuleNotFoundError: No module named 'magma.transformers'
```
And here `GPTJForCausalLM` is used right on the next line. Looking at `transformers.py`, there is nothing like `GPTJForCausalLM` in there at all. It seems something is missing here completely?
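For reference, the upstream Hugging Face library does ship a `GPTJForCausalLM` (GPT-J landed upstream around `transformers` 4.11), so the name itself imports fine outside of magma. A minimal sanity check, assuming the missing `magma/transformers.py` is meant to provide a modified variant of that class:

```python
# Sanity check only, not a drop-in replacement: magma's missing
# magma/transformers.py presumably contains a *modified* GPT-J, but this
# at least confirms the upstream class exists in the installed
# transformers package (GPT-J support was added around v4.11).
from transformers import GPTJForCausalLM

print(GPTJForCausalLM.__module__)  # e.g. transformers.models.gptj.modeling_gptj
```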
Best, Tuxius
Hi,
thank you for the fix. Trying to run `example_inference.py` on my PC with an 8 GB RTX 2070 GPU, I get a `RuntimeError: CUDA out of memory`. Since I have 64 GB of RAM on my AMD Ryzen 9 5900X, I tried to switch from GPU to CPU. To do that, I changed line 7 of `example_inference.py` from `device = 'cuda:0'` to `device = 'cpu'`. However, this had no effect. Looking at `magma.py`, the reason is line 40:

`"cuda" if torch.cuda.is_available() else "cpu"`

so it does not use the setting from `example_inference.py`. Changing that line to `"cpu"` is a dirty quick fix that works for me; I am getting `magma successfully loaded` … until I get the next error:

Seems like CPU is not yet implemented?
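For anyone hitting the same thing, here is a sketch of what I mean (my own illustration, not the actual magma code): instead of hard-coding the device in `magma.py`, it could honour an explicit setting and only auto-detect as a fallback:

```python
import torch

def resolve_device(requested=None):
    # Honour an explicit device string such as 'cpu' or 'cuda:0';
    # auto-detect only when nothing was requested.
    if requested is not None:
        return torch.device(requested)
    return torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Hypothetical call site in magma.py — the loading code would then use
# the device passed down from example_inference.py instead of line 40:
# device = resolve_device(device_from_caller)
```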
Best
@Tuxius I can personally confirm that this works on an RTX 3090, as shown in `example_inference.py` 🙂