XLM-RoBERTa is not supported
See original GitHub issue

Hi,
Thanks a lot for your hard work. Can you add support for XLM-RoBERTa too? Thanks!
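
For reference, the public XLM-RoBERTa checkpoints are distributed through the Hugging Face transformers library, so any integration ultimately wraps something like the sketch below. This is an illustration only, not the API of the library this issue was filed against (which is not named here); the checkpoint name and example sentence are simply the public base model and a placeholder input. Unlike the original XLM checkpoints, no language-id tensors are needed.

# Hedged sketch: loading XLM-RoBERTa via Hugging Face transformers.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModel.from_pretrained("xlm-roberta-base")

# No "lang" tensors are passed, unlike the original XLM checkpoints.
inputs = tokenizer("Bonjour, comment allez-vous ?", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
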
Issue Analytics
- State:
- Created 4 years ago
- Comments: 12 (7 by maintainers)
Top Results From Across the Web
XLM-RoBERTa - Hugging Face
XLM-RoBERTa is a multilingual model trained on 100 different languages. Unlike some XLM multilingual models, it does not require lang tensors to...
XLM-RoBERTa model for QA seems not properly work #7774
The problem arises when using: the official example scripts: run_squad.py; my own modified scripts: (give details below). The tasks I am working...
nlp - What is the difference between XLM-roberta-base and ...
I heard xlm-roberta-large works better in a few shot setting, have no idea why or if that's true. Is xlm-roberta-base zero-shot?
XLM-RoBERTa | Lecture 56 (Part 2) - YouTube
Unsupervised Cross-lingual Representation Learning at Scale. Course Materials: https://github.com/maziarraissi/Applied-Deep-Learning.
Zero Shot Multilingual Learning with XLM-Roberta ... - YouTube
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Added in 0.17.0. Let me know if you run into any issues! 😉
Is there a reason why XLM-RoBERTa is not supported for language modeling fine-tuning?
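
On the language-modeling question above, one common route (independent of whichever library this issue concerns) is masked-LM fine-tuning directly with Hugging Face transformers. The sketch below assumes that setup; the corpus file "corpus.txt", the output directory, and the hyperparameters are illustrative placeholders.

# Hedged sketch: masked-language-model fine-tuning of XLM-RoBERTa with the
# Hugging Face Trainer. Paths and hyperparameters are placeholders.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base")

# Placeholder corpus: a plain-text file with one passage per line.
dataset = load_dataset("text", data_files={"train": "corpus.txt"})
tokenized = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=256),
    batched=True,
    remove_columns=["text"],
)

# Dynamic masking at 15% probability, the standard RoBERTa-style MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="xlmr-mlm-finetuned",
    per_device_train_batch_size=8,
    num_train_epochs=1,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()
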