
No module named 'magma.transformers'

See original GitHub issue

Hi,

I just downloaded the “magma-master” archive and followed the instructions (I think), but when I try to run test.py I get errors. It seems some parts are missing?

First I get:

(magma) c:\Python\magma-master>python test.py
Traceback (most recent call last):
  File "c:\Python\magma-master\test.py", line 4, in <module>
    from magma.language_model import get_language_model
ImportError: cannot import name 'get_language_model' from 'magma.language_model' (c:\Python\magma-master\magma\language_model.py)

Looking at the code, get_language_model does not seem to be used anywhere, so I commented line 4 out. But after that there is a similar failure:

(magma) c:\Python\magma-master>python test.py
Traceback (most recent call last):
  File "c:\Python\magma-master\test.py", line 25, in <module>
    from magma.transformers import GPTJForCausalLM
ModuleNotFoundError: No module named 'magma.transformers'

And here GPTJForCausalLM is used right on the next line. Looking at transformers.py, there is nothing like GPTJForCausalLM in there at all. It seems something is missing here completely?
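As a side note, a generic way to keep a script running when a module like this is absent is an import fallback; `first_available` below is a hypothetical helper, not part of magma. Recent versions of HuggingFace transformers also ship a GPTJForCausalLM, though magma's copy may be a modified variant, so treat the fallback as an assumption to verify:

```python
import importlib

def first_available(*module_paths):
    """Return the first importable module from the given dotted paths."""
    for path in module_paths:
        try:
            return importlib.import_module(path)
        except ModuleNotFoundError:
            continue
    raise ModuleNotFoundError(f"none of {module_paths} could be imported")

# e.g. prefer the repo's module, fall back to the upstream package:
# mod = first_available("magma.transformers", "transformers")
# GPTJForCausalLM = mod.GPTJForCausalLM
```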

Best Tuxius

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 6 (2 by maintainers)

Top GitHub Comments

1 reaction
Tuxius commented, Mar 22, 2022

Hi,

thank you for the fix. Trying to run example_inference.py on my PC with an 8 GB RTX 2070 GPU, I get a RuntimeError: CUDA out of memory. Since I have 64 GB of RAM on my AMD Ryzen 9 5900X, I tried to switch from GPU to CPU: in example_inference.py I changed line 7 from device = 'cuda:0' to device = 'cpu'. However, this had no effect. Looking at magma.py, the reason is line 40:

"cuda" if torch.cuda.is_available() else "cpu"

so it does not use the setting from example_inference.py. Changing that line to "cpu" is a quick-and-dirty fix that works for me; magma loads successfully

… until I get the next error:

Traceback (most recent call last):
  File "C:\Python\magma-master\example_inference.py", line 18, in <module>
    embeddings = model.preprocess_inputs(inputs)
  File "C:\Python\magma-master\magma\magma.py", line 192, in preprocess_inputs
    return self.embed(input_list)
  File "C:\Python\magma-master\magma\magma.py", line 209, in embed
    image_embeddings = self.image_prefix(x)
  File "C:\Users\frank\anaconda3\envs\magma\lib\site-packages\torch\nn\modules\module.py", line 1110, in _call_impl
    return forward_call(*input, **kwargs)
  File "C:\Python\magma-master\magma\image_prefix.py", line 83, in forward
    logits = self.enc(x)
  File "C:\Users\frank\anaconda3\envs\magma\lib\site-packages\torch\nn\modules\module.py", line 1110, in _call_impl
    return forward_call(*input, **kwargs)
  File "C:\Users\frank\anaconda3\envs\magma\lib\site-packages\clip\model.py", line 143, in forward
    x = stem(x)
  File "C:\Users\frank\anaconda3\envs\magma\lib\site-packages\clip\model.py", line 138, in stem
    x = self.relu(bn(conv(x)))
  File "C:\Users\frank\anaconda3\envs\magma\lib\site-packages\torch\nn\modules\module.py", line 1110, in _call_impl
    return forward_call(*input, **kwargs)
  File "C:\Users\frank\anaconda3\envs\magma\lib\site-packages\torch\nn\modules\conv.py", line 447, in forward
    return self._conv_forward(input, self.weight, self.bias)
  File "C:\Users\frank\anaconda3\envs\magma\lib\site-packages\torch\nn\modules\conv.py", line 443, in _conv_forward
    return F.conv2d(input, weight, bias, self.stride,
RuntimeError: "slow_conv2d_cpu" not implemented for 'Half'

Seems like CPU is not yet implemented?
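The hardcoded device and the half-precision weights interact here: fp16 conv kernels only exist on CUDA, so a CPU fallback also needs a dtype switch, which is why the checkpoint fails with "slow_conv2d_cpu" not implemented for 'Half'. A minimal sketch of the idea (`pick_device_and_dtype` is a hypothetical helper, not magma's API; with a real torch model the dtype choice corresponds to calling model.half() on CUDA and model.float() on CPU):

```python
def pick_device_and_dtype(requested=None, cuda_available=False):
    """Choose a (device, dtype) pair for inference.

    Respect an explicit device request (e.g. from example_inference.py)
    instead of hardcoding "cuda"; fall back to auto-detection otherwise.
    CPU gets float32 because half-precision conv kernels are CUDA-only.
    """
    device = requested or ("cuda" if cuda_available else "cpu")
    dtype = "float16" if device.startswith("cuda") else "float32"
    return device, dtype
```

In magma.py, line 40 would then respect the caller's device choice instead of overriding it, and the model would be cast to float32 whenever the chosen device is the CPU.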

Best

0 reactions
Mayukhdeb commented, Mar 24, 2022

@Tuxius I can personally confirm that this works on an RTX 3090, as shown in example_inference.py 🙂

Read more comments on GitHub >
