Unsupervised learning - TSDAE
Hi, I used the TSDAE method to pretrain a BERT model on a corpus of sentences and I got this error:
RuntimeError: CUDA error: CUBLAS_STATUS_ALLOC_FAILED when calling cublasCreate(handle)
I then ran CUDA_LAUNCH_BLOCKING=1 python [YOUR_PROGRAM] to trace the error and got this:
RuntimeError: CUDA error: device-side assert triggered
Any help?
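The training code that produced the error is not shown in the issue. For reference, a minimal TSDAE setup following the sentence-transformers documentation looks roughly like the sketch below; the checkpoint name, batch size, and toy corpus are placeholders, not the reporter's actual configuration.

```python
# Minimal TSDAE pretraining sketch, adapted from the sentence-transformers docs.
# Checkpoint name, batch size, and the toy corpus are placeholders.
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, models, datasets, losses

train_sentences = ["First sentence of the corpus.", "Second sentence of the corpus."]

word_embedding_model = models.Transformer("bert-base-uncased")
pooling_model = models.Pooling(word_embedding_model.get_word_embedding_dimension(), "cls")
model = SentenceTransformer(modules=[word_embedding_model, pooling_model])

# The dataset adds noise (token deletion) to each sentence; the loss trains a
# decoder to reconstruct the original sentence from the noisy encoding.
train_dataset = datasets.DenoisingAutoEncoderDataset(train_sentences)
train_dataloader = DataLoader(train_dataset, batch_size=8, shuffle=True)
train_loss = losses.DenoisingAutoEncoderLoss(
    model, decoder_name_or_path="bert-base-uncased", tie_encoder_decoder=True
)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    epochs=1,
    weight_decay=0,
    scheduler="constantlr",
    optimizer_params={"lr": 3e-5},
    show_progress_bar=True,
)
```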
Issue Analytics
- Created: 2 years ago
- Comments: 21 (10 by maintainers)
I tried this and it was OK, but I actually think the problem was caused by some tokens that weren't UTF-8 encoded; once I removed them, the problem was solved.
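The exact cleanup step isn't shown in the thread. One way to drop tokens that don't survive a UTF-8 round trip before adding them to the tokenizer or feeding them to TSDAE is a filter like the following; the token list is a made-up placeholder.

```python
# Hypothetical cleanup step (the filtering the author actually used is not shown):
# keep only tokens that survive a UTF-8 encode/decode round trip.
def is_clean_utf8(token: str) -> bool:
    try:
        return token == token.encode("utf-8").decode("utf-8")
    except UnicodeError:
        return False

new_tokens = ["valid_token", "caf\u00e9", "bad\udcfftoken"]  # placeholder examples
clean_tokens = [t for t in new_tokens if is_clean_utf8(t)]
print(clean_tokens)  # the token containing a lone surrogate is dropped
```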
Hi @ReySadeghi, I cannot reproduce it: I found the SBERT checkpoint can be loaded successfully with the added tokens. Before a more detailed conversation, could you please run this check (to see whether the assertion error still appears without TSDAE):
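The snippet the maintainer posted here is not reproduced in this excerpt. A check along these lines illustrates the idea, with the checkpoint name and the vocab list as placeholder assumptions: load an SBERT model, add the new tokens, resize the embeddings, and encode a few sentences without any TSDAE training.

```python
# Illustrative check only; the checkpoint name and vocab list are placeholders,
# not the maintainer's original snippet.
from sentence_transformers import SentenceTransformer

vocab = ["newtoken1", "newtoken2"]  # replace with the tokens you added

model = SentenceTransformer("bert-base-uncased")
transformer = model._first_module()  # the underlying models.Transformer module
transformer.tokenizer.add_tokens(vocab)  # extend the tokenizer vocabulary
transformer.auto_model.resize_token_embeddings(len(transformer.tokenizer))

# If this encode call already triggers the device-side assert, the problem is
# unrelated to TSDAE (e.g. a tokenizer/transformers version mismatch).
embeddings = model.encode(["A plain test sentence.", "A sentence with newtoken1."])
print(embeddings.shape)
```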
If running this new snippet also reports the error, I think it might be related to your transformers version. And if this works well, you can change the vocab variable above into your new token list and try again.