Can't import ClassificationModel in Kaggle
I can't import ClassificationModel in a private Kaggle notebook (Python 3.7). I start with:
!pip install simpletransformers
import torch
from simpletransformers.classification import ClassificationModel
and get this error on the last import:
ImportError: cannot import name 'MobileBertConfig' from 'transformers' (/opt/conda/lib/python3.7/site-packages/transformers/__init__.py)
I tried changing the notebook's "Environment" setting to both "original environment (2020-07-06)" and "latest environment" - no difference.
How can I fix this?
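One possible fix, anticipating the maintainer's advice in the comments below (upgrade transformers): a minimal sketch, assuming the error comes from an older transformers build preinstalled in the Kaggle image that predates MobileBertConfig. Restart the kernel after the upgrade so the new version is actually loaded.
# upgrade transformers together with simpletransformers, then restart the kernel
!pip install --upgrade transformers simpletransformers
from simpletransformers.classification import ClassificationModel  # should now import cleanly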
Issue Analytics
- State:
- Created 3 years ago
- Comments:7 (3 by maintainers)
Top Results From Across the Web
I can't import a dataset I have downloaded from kaggle in a ...
I can't import a dataset I have downloaded from kaggle in a Jupyter Notebook. arrow_drop_up 0. Hey Data scientists!! Here is where I...
Read more >SVM Classifier Tutorial | Kaggle
I will start off by importing the required Python libraries. ... Based on the above analysis we can conclude that our classification model...
Read more >Logistic Regression Classifier Tutorial - Kaggle
I train a binary classification model using Logistic Regression. ... pd.read_csv) import matplotlib.pyplot as plt # data visualization import seaborn as sns ...
Read more >Student's Performance and Classification Model - Kaggle
Explore and run machine learning code with Kaggle Notebooks | Using data from Students Performance in Exams.
Read more >Bias and Variance in Machine Learning | Kaggle
The irreducible error cannot be reduced regardless of what algorithm is used. ... Importing required Libraries import numpy as np # linear algebra...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I solved the lack-of-space problem by switching from the "BERT" model to "DistilBERT", which gave very good results even after just 1 or 2 epochs. I have made my notebook public: https://www.kaggle.com/vbmokin/nlp-with-dt-simple-transformers-research
This is not a prize contest - it is a contest for trying out new NLP technologies: https://www.kaggle.com/c/nlp-getting-started
Thank you for your amazing library and for your help in using it! Your advice to upgrade transformers helped me.
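For reference, a minimal sketch of what switching from BERT to DistilBERT looks like in simpletransformers; the checkpoint name "distilbert-base-uncased", the tiny example DataFrame, and the 2-epoch setting are placeholders for illustration, not values taken from the notebook linked above.
import pandas as pd
from simpletransformers.classification import ClassificationModel
# toy training data: simpletransformers expects "text" and "labels" columns
train_df = pd.DataFrame([["great movie", 1], ["terrible movie", 0]], columns=["text", "labels"])
# "distilbert" selects the smaller DistilBERT architecture instead of "bert"
model = ClassificationModel("distilbert", "distilbert-base-uncased", num_labels=2, args={"num_train_epochs": 2}, use_cuda=True)  # use_cuda=True assumes a GPU runtime
model.train_model(train_df)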
It shouldn't be necessary to remove them if your transformers version is up to date.
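A quick sketch for checking which transformers build the notebook actually loaded; if MobileBertConfig imports without error, the installed version is recent enough.
import transformers
print(transformers.__version__)
from transformers import MobileBertConfig  # fails on builds that predate MobileBERT support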