Trying to load BERT vocab results in an unpickling error
See original GitHub issue

Hey,
I was trying to serve a fine-tuned BERT model. In the TorchServe logs I noticed that the workers were not being loaded.
The exception I got is:
2020-04-22 14:06:23,581 [INFO ] W-9004-sentiment-analysis_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - self.source_vocab = torch.load(self.manifest['model']['sourceVocab'])
2020-04-22 14:06:23,581 [INFO ] W-9007-sentiment-analysis_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - _pickle.UnpicklingError: invalid load key, '['.
The vocab file I used is this one
Also, I noticed that even trying torch.load('vocab.txt')
would throw the same error. Am I missing something about how to load the vocab?
Thank you
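The specific load key `'['` in the traceback is a clue: a BERT vocab.txt is a plain-text file with one token per line, typically starting with `[PAD]`, so a pickle-based loader such as torch.load hits the byte `[` and fails. Below is a minimal sketch (the `load_vocab` helper name and the temp-file contents are illustrative, not from the original issue) that reproduces the error with the standard-library unpickler and loads the vocab as text instead:

```python
import pickle
import tempfile

# A BERT vocab.txt is plain text, one token per line -- not a pickled
# object -- so pickle-based loaders reject its first byte as an opcode.
def load_vocab(path):
    """Map each token to its line index, mirroring the WordPiece vocab layout."""
    with open(path, encoding="utf-8") as f:
        return {line.rstrip("\n"): idx for idx, line in enumerate(f)}

# Write a tiny stand-in vocab file whose first byte is '['.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("[PAD]\n[UNK]\n[CLS]\n[SEP]\nhello\n")
    path = f.name

# Reproduce the reported error: '[' is not a valid pickle opcode.
try:
    with open(path, "rb") as f:
        pickle.load(f)
except pickle.UnpicklingError as e:
    print(e)  # invalid load key, '['.

# Loading the file as text works as expected.
vocab = load_vocab(path)
print(vocab["[PAD]"], vocab["hello"])  # 0 4
```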
Issue Analytics
- State:
- Created 3 years ago
- Comments: 11 (3 by maintainers)
Top Results From Across the Web
- What to do when you get an error - Hugging Face Course
  In this section we'll look at some common errors that can occur when you're trying to generate predictions from your freshly tuned Transformer...
- What causes the error "_pickle.UnpicklingError: invalid load ..."
  The problem was that the pickle was created via sklearn.externals.joblib and I was trying to load it via the standard pickle library.
- How to Build a WordPiece Tokenizer For BERT
  We will learn how to build a WordPiece tokenizer for BERT from scratch. ... During tokenization vocab.txt is used to map text to...
- Loading big Doc2Vec model with error UnpicklingError
  If you get a "pickle data was truncated" error when trying to load the model, then the portion of the save-data that is...
- BERT - Tokenization and Encoding - Albert Au Yeung
  Let's first try to understand how an input sentence should be represented ... This is commonly known as the out-of-vocabulary (OOV) problem.
Read more >Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@wassimseif For `transformers` models, you will need to provide custom handler code; the default `text_handler` does not support `transformers` models as of now. For more details on how this would work, see: https://medium.com/@freidankm_39840/deploy-huggingface-s-bert-to-production-with-pytorch-serve-27b068026d18

Hey @vgoklani Thanks!
Oh-My-Zsh + A LOT of configurations & long restless nights