
size error while running predict script

See original GitHub issue

When I run the prediction script on a custom-trained model (unlabeled paired data), I get the following error:

Traceback (most recent call last):
  File "demos/paired_MMIV/demo_predict.py", line 39, in <module>
    save_png=False,
  File "/home/charlie/3DREG-tests/DeepReg/deepreg/predict.py", line 334, in predict
    save_png=save_png,
  File "/home/charlie/3DREG-tests/DeepReg/deepreg/predict.py", line 81, in predict_on_dataset
    outputs_dict = model.predict(x=inputs_dict)
  File "/home/charlie/anaconda3/envs/deepreg/lib/python3.7/site-packages/tensorflow/python/keras/engine/training.py", line 130, in _method_wrapper
    return method(self, *args, **kwargs)
  File "/home/charlie/anaconda3/envs/deepreg/lib/python3.7/site-packages/tensorflow/python/keras/engine/training.py", line 1599, in predict
    tmp_batch_outputs = predict_function(iterator)
  File "/home/charlie/anaconda3/envs/deepreg/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py", line 780, in __call__
    result = self._call(*args, **kwds)
  File "/home/charlie/anaconda3/envs/deepreg/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py", line 814, in _call
    results = self._stateful_fn(*args, **kwds)
  File "/home/charlie/anaconda3/envs/deepreg/lib/python3.7/site-packages/tensorflow/python/eager/function.py", line 2829, in __call__
    return graph_function._filtered_call(args, kwargs)  # pylint: disable=protected-access
  File "/home/charlie/anaconda3/envs/deepreg/lib/python3.7/site-packages/tensorflow/python/eager/function.py", line 1848, in _filtered_call
    cancellation_manager=cancellation_manager)
  File "/home/charlie/anaconda3/envs/deepreg/lib/python3.7/site-packages/tensorflow/python/eager/function.py", line 1924, in _call_flat
    ctx, args, cancellation_manager=cancellation_manager))
  File "/home/charlie/anaconda3/envs/deepreg/lib/python3.7/site-packages/tensorflow/python/eager/function.py", line 550, in call
    ctx=ctx)
  File "/home/charlie/anaconda3/envs/deepreg/lib/python3.7/site-packages/tensorflow/python/eager/execute.py", line 60, in quick_execute
    inputs, attrs, num_outputs)
tensorflow.python.framework.errors_impl.InvalidArgumentError:  Input to reshape is a tensor with 1769472 values, but the requested shape has 3538944
         [[node DDFRegistrationModelWithoutLabel/tf_op_layer_Reshape/Reshape (defined at /home/charlie/3DREG-tests/DeepReg/deepreg/predict.py:81) ]] [Op:__inference_predict_function_6158]

Function call stack:
predict_function

The requested shape of the tensor seems to be double the one passed as input at some point, and I don’t understand why. I have checked that the shapes and datatypes of the NIfTI files in both the training and test datasets are consistent…
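A quick arithmetic check on the two numbers in the error message confirms the doubling, and (assuming, purely for illustration, a 96×96×96 image grid, which the traceback does not actually state) both counts divide evenly by the voxel count:

```python
# Numbers taken verbatim from the traceback above.
actual_values = 1_769_472     # values in the tensor handed to Reshape
requested_values = 3_538_944  # values the requested shape expects

# The requested shape asks for exactly twice as many values:
assert requested_values == 2 * actual_values

# Both counts are whole multiples of 96**3 = 884_736, consistent with
# (for example) a 96x96x96 volume grid -- an assumption on my part,
# not something the traceback states.
voxels = 96 ** 3
print(actual_values // voxels, requested_values // voxels)  # 2 4
```

So whatever the true grid size is, the graph appears to expect a tensor with exactly twice the values it receives.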

Any idea why this could happen?

The traceback points at the save_png=False line of the predict script, but the exact same error happens with save_png=True.

In that script, the relevant part is:

from deepreg.predict import predict

# config_path, ckpt_path and log_dir are defined earlier in the script
gpu = "2,3"
gpu_allow_growth = True
predict(
    gpu=gpu,
    gpu_allow_growth=gpu_allow_growth,
    config_path=config_path,
    ckpt_path=ckpt_path,
    mode="test",
    batch_size=1,
    sample_label="all",
    log_dir=log_dir,
    save_png=False,  # True gives the same error
)
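One way to test whether multi-GPU batch splitting is involved is to expose a single device before TensorFlow loads. A minimal sketch, assuming CUDA device indices as in the script above (DeepReg's own gpu argument may achieve the same thing with gpu="2"):

```python
import os

# Expose only one physical GPU to TensorFlow. This must run before
# TensorFlow is imported, because device discovery happens at import time.
os.environ["CUDA_VISIBLE_DEVICES"] = "2"

# From here on, TensorFlow sees a single device and cannot split
# the batch across replicas.
```

If the error disappears with one visible GPU, the reshape mismatch is tied to how the batch is distributed across devices rather than to the data itself.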

Finally, this error does not prevent the actual predictions from being output, and they look reasonable too, so I am not sure which part of the code it interferes with.

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 5 (3 by maintainers)

Top GitHub Comments

1 reaction
mathpluscode commented, Dec 12, 2020

Hi again @ciphercharly. Actually, I might know the reason: you are using two GPUs, so batch_size = 1 means 1 per GPU. Can you try using only one GPU for inference?
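This hypothesis can be sketched numerically: under data-parallel execution (e.g. TensorFlow's MirroredStrategy), the batch the graph is traced with is the per-replica batch times the number of replicas, so a nominal batch_size=1 on two GPUs can correspond to a global batch of 2, doubling the number of values the reshape expects. A minimal illustration (the variable names are mine, not DeepReg's):

```python
# Hypothetical data-parallel batch accounting, not DeepReg internals.
per_replica_batch = 1  # batch_size=1 passed to predict()
num_gpus = 2           # gpu = "2,3" in the script above

# The graph may be built for the combined (global) batch:
global_batch = per_replica_batch * num_gpus
print(global_batch)  # 2
```

A graph built for a global batch of 2 would expect exactly twice the values of a batch-1 input, matching the 1769472 vs 3538944 mismatch in the traceback.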

0 reactions
ciphercharly commented, Dec 13, 2020

thanks


Top Results From Across the Web

  • out of memory when using model.predict() #5337 - GitHub
  • Getting dimension mismatch error when i try to predict with ...
  • What to do when you get an error - Hugging Face Course
  • Embedded R Script error: cannot allocate vector of size 838.6 ...
  • Error while using `predict` on a tidymodels workflow
