
Execution stops with message "killed"

See original GitHub issue

I have a ThinkPad X220 (no GPU 😄), i5-2520M, 8 GB RAM, running Arch Linux.

setup.sh completed without any issues.

My first two attempts with python image_from_text.py --text='alien life' --seed=7 fail with a dtype error:

Namespace(mega=False, torch=False, text='alien life', seed=7, image_path='generated', image_token_count=256)
parsing metadata from ./pretrained/dalle_bart_mini
tokenizing text
['Ġalien']
['Ġlife']
text tokens [0, 8925, 742, 2]
loading flax encoder
encoding text tokens
WARNING:absl:No GPU/TPU found, falling back to CPU. (Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)
loading flax decoder
sampling image tokens
Traceback (most recent call last):
  File "/home/peter/minidalle/min-dalle/image_from_text.py", line 44, in <module>
    image = generate_image_from_text(
  File "/home/peter/minidalle/min-dalle/min_dalle/generate_image.py", line 66, in generate_image_from_text
    image_tokens[...] = generate_image_tokens_flax(
  File "/home/peter/minidalle/min-dalle/min_dalle/min_dalle_flax.py", line 70, in generate_image_tokens_flax
    image_tokens = decode_flax(
  File "/home/peter/minidalle/min-dalle/min_dalle/min_dalle_flax.py", line 49, in decode_flax
    image_tokens = decoder.sample_image_tokens(
  File "/home/peter/minidalle/min-dalle/min_dalle/models/dalle_bart_decoder_flax.py", line 255, in sample_image_tokens
    _, image_tokens = lax.scan(

[…]

raise TypeError(msg.format(name, ", ".join(map(str, types))))
jax._src.traceback_util.UnfilteredStackTrace: TypeError: lax.dynamic_update_slice requires arguments to have the same dtypes, got float16, float32.

The stack trace below excludes JAX-internal frames.
The preceding is the original exception that occurred, unmodified.

--------------------

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/peter/minidalle/min-dalle/image_from_text.py", line 44, in <module>
    image = generate_image_from_text(
  File "/home/peter/minidalle/min-dalle/min_dalle/generate_image.py", line 66, in generate_image_from_text
    image_tokens[...] = generate_image_tokens_flax(
  File "/home/peter/minidalle/min-dalle/min_dalle/min_dalle_flax.py", line 70, in generate_image_tokens_flax
    image_tokens = decode_flax(
  File "/home/peter/minidalle/min-dalle/min_dalle/min_dalle_flax.py", line 49, in decode_flax
    image_tokens = decoder.sample_image_tokens(
  File "/home/peter/.local/lib/python3.10/site-packages/flax/linen/transforms.py", line 1246, in wrapped_fn
    return prewrapped_fn(self, *args, **kwargs)
  File "/home/peter/.local/lib/python3.10/site-packages/flax/linen/module.py", line 352, in wrapped_module_method
    return self._call_wrapped_method(fun, args, kwargs)
  File "/home/peter/.local/lib/python3.10/site-packages/flax/linen/module.py", line 651, in _call_wrapped_method
    y = fun(self, *args, **kwargs)
  File "/home/peter/minidalle/min-dalle/min_dalle/models/dalle_bart_decoder_flax.py", line 255, in sample_image_tokens
    _, image_tokens = lax.scan(
  File "/home/peter/minidalle/min-dalle/min_dalle/models/dalle_bart_decoder_flax.py", line 214, in sample_next_image_token
    logits, keys_state, values_state = self.apply(
  File "/home/peter/minidalle/min-dalle/min_dalle/models/dalle_bart_decoder_flax.py", line 189, in __call__
    decoder_state, (keys_state, values_state) = self.layers(
  File "/home/peter/.local/lib/python3.10/site-packages/flax/core/axes_scan.py", line 138, in scan_fn
    _, out_pvals, _ = pe.trace_to_jaxpr_nounits(f_flat, in_pvals)
  File "/home/peter/.local/lib/python3.10/site-packages/flax/core/axes_scan.py", line 114, in body_fn
    broadcast_out, c, ys = fn(broadcast_in, c, *xs)
  File "/home/peter/minidalle/min-dalle/min_dalle/models/dalle_bart_decoder_flax.py", line 95, in __call__
    decoder_state, keys_values_state = self.self_attn(
  File "/home/peter/minidalle/min-dalle/min_dalle/models/dalle_bart_decoder_flax.py", line 37, in __call__
    keys_state = lax.dynamic_update_slice(
TypeError: lax.dynamic_update_slice requires arguments to have the same dtypes, got float16, float32.
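
The TypeError comes from jax.lax.dynamic_update_slice being handed a cache and an update with different dtypes (float16 vs. float32). Below is a minimal, self-contained reproduction of that failure, plus the kind of cast that avoids it. This is not min-dalle’s actual code; the shapes are made up and the cast is only a sketch of a workaround, not the project’s fix.

import jax.numpy as jnp
from jax import lax

# float16 stands in for the decoder's attention cache and float32 for the
# freshly computed keys, the two dtypes named in the error message above.
cache = jnp.zeros((4, 8), dtype=jnp.float16)
new_keys = jnp.ones((1, 8), dtype=jnp.float32)

try:
    lax.dynamic_update_slice(cache, new_keys, (0, 0))
except TypeError as err:
    print(err)  # same class of error as in the traceback above

# Casting one side so the dtypes agree lets the update go through:
cache = lax.dynamic_update_slice(cache, new_keys.astype(cache.dtype), (0, 0))
print(cache.dtype, cache.shape)  # float16 (4, 8)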

Other inputs cause execution to end with a “Killed” message instead:

[peter@peter-arcox220 min-dalle]$ python image_from_text.py --text='a comfy chair that looks like an avocado' --mega --seed=4
Namespace(mega=True, torch=False, text='a comfy chair that looks like an avocado', seed=4, image_path='generated', image_token_count=256)
parsing metadata from ./pretrained/dalle_bart_mega
tokenizing text
['Ġa']
['Ġcomfy']
['Ġchair']
['Ġthat']
['Ġlooks']
['Ġlike']
['Ġan']
['Ġavocado']
text tokens [0, 58, 29872, 2408, 766, 4126, 1572, 101, 16632, 2]
Killed
[peter@peter-arcox220 min-dalle]$ python image_from_text.py --text='a comfy chair that looks like an avocado' --mega --seed=4
Namespace(mega=True, torch=False, text='a comfy chair that looks like an avocado', seed=4, image_path='generated', image_token_count=256)
parsing metadata from ./pretrained/dalle_bart_mega
tokenizing text
['Ġa']
['Ġcomfy']
['Ġchair']
['Ġthat']
['Ġlooks']
['Ġlike']
['Ġan']
['Ġavocado']
text tokens [0, 58, 29872, 2408, 766, 4126, 1572, 101, 16632, 2]
Killed
[peter@peter-arcox220 min-dalle]$ python image_from_text.py --text='a comfy chair that looks like an avocado' --torch --seed=4
Namespace(mega=False, torch=True, text='a comfy chair that looks like an avocado', seed=4, image_path='generated', image_token_count=256)
parsing metadata from ./pretrained/dalle_bart_mini
tokenizing text
['Ġa']
['Ġcomfy']
['Ġchair']
['Ġthat']
['Ġlooks']
['Ġlike']
['Ġan']
['Ġavocado']
text tokens [0, 58, 29872, 2408, 766, 4126, 1572, 101, 16632, 2]
loading torch encoder
encoding text tokens
loading torch decoder
sampling image tokens
image token 0 is 23
image token 1 is 8867
image token 2 is 15149
image token 3 is 10225
image token 4 is 6271
[...]
image token 74 is 4319
image token 75 is 14420
image token 76 is 9720
image token 77 is 7781
image token 78 is 8583
image token 79 is 5401
Killed
[peter@peter-arcox220 min-dalle]$ python image_from_text.py --text='a comfy chair that looks like an avocado' --mega --seed=4
Namespace(mega=True, torch=False, text='a comfy chair that looks like an avocado', seed=4, image_path='generated', image_token_count=256)
parsing metadata from ./pretrained/dalle_bart_mega
tokenizing text
['Ġa']
['Ġcomfy']
['Ġchair']
['Ġthat']
['Ġlooks']
['Ġlike']
['Ġan']
['Ġavocado']
text tokens [0, 58, 29872, 2408, 766, 4126, 1572, 101, 16632, 2]
Killed

Issue Analytics

  • State: closed
  • Created: a year ago
  • Comments: 6 (5 by maintainers)

Top GitHub Comments

1 reaction
philpax commented, Jun 28, 2022

I suspect you’re running out of memory, and the OOM killer is stepping in - your system is unlikely to have the resources required to run inference. Just a guess, though.
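
If that guess is right, the kernel log will contain an oom-kill entry for the python process; one of the results linked further down reaches the same conclusion with dmesg. A quick way to see how much headroom the machine has before loading the weights, using only the standard library (Linux-only sketch; the helper name is illustrative):

def meminfo_gib(field):
    # Read one field from /proc/meminfo and convert kB to GiB.
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith(field + ":"):
                return int(line.split()[1]) / (1024 * 1024)
    return None

print("MemTotal:     %.1f GiB" % meminfo_gib("MemTotal"))
print("MemAvailable: %.1f GiB" % meminfo_gib("MemAvailable"))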

0 reactions
w4ffl35 commented, Jul 4, 2022

It is also important to mention that the above screenshot does NOT represent resources being spent on torch. Those are other applications running alongside a lightweight Flask server that fires off a command to generate images. Attempting to run the model crashes it. This isn’t unexpected behavior, but it seems inefficient.

Perhaps this is a limitation of ML and torch, or perhaps torch is configured incorrectly; unfortunately, I do not have enough knowledge to know which is more likely.

Read more comments on GitHub >

Top Results From Across the Web

What does 'killed' mean when processing a huge CSV with ...
"killed" generally means that the process received some signal that caused it to exit. In this case since it is happening at the...
Read more >
Python programs suddenly get killed - Unix Stack Exchange
I have a python script which runs for a few minutes creating matplotlib plot files. If i run the script from the commandline...
Read more >
Long running PHP process randomly stops with message 'Killed'
I have a PHP script I wrote that I am running, and it has ran all the way through before, but for some...
Read more >
process stops, see "killed" in temrinal window - Ask Ubuntu
I diagnosed the issue - out of memory ( 8GB memory ). I used : dmesg -T| grep -E -i -B100 'killed process'....
Read more >
Check What Killed a Linux Process - Baeldung
A common practice when trying to terminate a process is to try with a SIGTERM or SIGQUIT first, and if it doesn't stop...
Read more >
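
Taken together, the results say the same thing: “Killed” is the shell reporting that the process was terminated by a signal it could not handle, usually SIGKILL sent by the kernel’s OOM killer. When the generator is launched from another script, the negative return code makes that visible. A sketch only; the command line mirrors the logs above, but the wrapper itself is hypothetical:

import signal
import subprocess

result = subprocess.run(
    ["python", "image_from_text.py", "--text=alien life", "--seed=7"]
)
if result.returncode < 0:
    # A negative return code means the child died on a signal; an OOM kill
    # shows up here as SIGKILL.
    print("terminated by signal", signal.Signals(-result.returncode).name)
else:
    print("exited with status", result.returncode)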
