
“device-side assert triggered” error while doing inference with DistilBERT and BERT

See original GitHub issue

Environment info

  • transformers version: 3.4.0
  • Platform: Colab
  • Python version: 3.8
  • PyTorch version (GPU?):
  • Tensorflow version (GPU?):
  • Using GPU in script?:
  • Using distributed or parallel set-up in script?:

Who can help

Information

Model I am using: DistilBERT

The problem arises when using:

  • the official example scripts: (give details below)
  • my own modified scripts: a Colab notebook

The task I am working on is:

  • an official GLUE/SQuAD task: SQuAD v2
  • my own task or dataset:

To reproduce

Steps to reproduce the behavior:

  1. Get the model and tokenizer
  2. Get the SQuAD v2 dataset
  3. Perform inference on the validation dataset with a GPU
  4. Compute results with the SQuAD v2 metric

Run the colab to reproduce.

Expected behavior

It should produce results in the format below, without error, from the SQuAD v2 metric:

{'exact': 79.4660153288975, 'f1': 82.91266052065696, 'total': 11873, 'HasAns_exact': 77.64844804318489, 'HasAns_f1': 84.55162253066118, 'HasAns_total': 5928, 'NoAns_exact': 81.27838519764508, 'NoAns_f1': 81.27838519764508, 'NoAns_total': 5945, 'best_exact': 79.4660153288975, 'best_exact_thresh': 1.0, 'best_f1': 82.91266052065693, 'best_f1_thresh': 1.0}

Note: this code works fine for the Longformer model. The issue appears only with the DistilBERT and BERT models when doing inference on GPU.
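A plausible reason the same code works for Longformer but not BERT/DistilBERT is the size of the position-embedding table: BERT-style models accept at most 512 positions while Longformer accepts 4096, so an untruncated question+context pair that fits Longformer can index past BERT's table, and on CUDA that out-of-bounds lookup surfaces as the opaque device-side assert. The sketch below is a plain-Python toy stand-in for that lookup (the table sizes are the standard defaults for these architectures; the function names are illustrative, not transformers APIs):

```python
BERT_MAX_POSITIONS = 512         # bert-base / distilbert-base default
LONGFORMER_MAX_POSITIONS = 4096  # longformer-base default

def position_lookup(seq_len, max_positions):
    # Toy stand-in for the position-embedding gather: any position index
    # past the end of the table is out of bounds. On CUDA the real
    # out-of-bounds gather is what fires "device-side assert triggered".
    if seq_len > max_positions:
        raise IndexError(
            f"position {seq_len - 1} >= embedding table size {max_positions}"
        )
    return list(range(seq_len))

def tokenize_length(n_tokens, truncation=False, max_length=BERT_MAX_POSITIONS):
    # Mimic tokenizer(..., truncation=True): clip the sequence to max_length.
    return min(n_tokens, max_length) if truncation else n_tokens

long_pair = 700  # a question+context pair of 700 tokens

# Longformer: 700 <= 4096, so the untruncated input is fine.
assert len(position_lookup(tokenize_length(long_pair),
                           LONGFORMER_MAX_POSITIONS)) == 700

# BERT/DistilBERT: 700 > 512 — without truncation this is the crash path;
# with truncation=True the input is clipped to 512 and succeeds.
assert len(position_lookup(tokenize_length(long_pair, truncation=True),
                           BERT_MAX_POSITIONS)) == 512
```

The same mismatch explains why the error message is so unhelpful: the assert fires inside an asynchronous CUDA kernel, far from the Python line that caused it.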

Tagging SMEs: @LysandreJik

Issue Analytics

  • State:closed
  • Created 3 years ago
  • Comments:5 (5 by maintainers)

Top GitHub Comments

1 reaction
LysandreJik commented on Feb 24, 2021

Glad we could help, closing!

1 reaction
LysandreJik commented on Feb 24, 2021

Hi @bhadreshpsavani, this wouldn’t work on CPU either: you’re sending the model a sequence that is too long for it to handle.

Please replace

inputs = tokenizer(example['question'], example['context'], return_tensors="pt")

by

inputs = tokenizer(example['question'], example['context'], return_tensors="pt", truncation=True)

This truncates the sequences that are too long.

Your Colab should work then.
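If you would rather fail fast with a readable message than hit the opaque CUDA assert, you can guard the forward pass yourself. This is a sketch, not a transformers API: the helper name is invented and 512 is assumed as the BERT/DistilBERT default (the real limit is exposed by Hugging Face tokenizers as `tokenizer.model_max_length`):

```python
def check_input_length(input_ids, model_max_length=512):
    # Fail fast with a clear Python error instead of letting the GPU kernel
    # hit an out-of-bounds position embedding and raise a device-side assert.
    n = len(input_ids)
    if n > model_max_length:
        raise ValueError(
            f"Sequence of {n} tokens exceeds the model's maximum of "
            f"{model_max_length}; pass truncation=True to the tokenizer."
        )
    return input_ids

check_input_length(list(range(300)))  # OK: 300 <= 512
```

Calling it with a 700-token sequence raises a `ValueError` naming the fix, which is much easier to act on than a CUDA assert reported asynchronously at some later API call.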

Read more comments on GitHub >

Top Results From Across the Web

  • RuntimeError: CUDA error: device-side assert triggered (GitHub issue)
  • [HELP] RuntimeError: CUDA error: device-side assert triggered
  • CUDA error: device-side assert triggered - BART model
  • CUDA Error: Device-Side Assert Triggered: Solved | Built In
  • How to fix “CUDA error: device-side assert triggered” error?
