Unable to load the repository in Google Colab
I have already cloned the repository using: !git clone https://github.com/qubvel/segmentation_models
Now when I try to import it, I get the following error:
from segmentation_models import Unet
--------------------------------------------------------------------------
ModuleNotFoundError Traceback (most recent call last)
<ipython-input-21-95926c7db055> in <module>()
----> 1 from segmentation_models import Unet
/content/segmentation_models/__init__.py in <module>()
----> 1 from .segmentation_models import *
/content/segmentation_models/segmentation_models/__init__.py in <module>()
3 from .__version__ import __version__
4
----> 5 from .unet import Unet
6 from .fpn import FPN
7 from .linknet import Linknet
/content/segmentation_models/segmentation_models/unet/__init__.py in <module>()
----> 1 from .model import Unet
/content/segmentation_models/segmentation_models/unet/model.py in <module>()
2 from ..utils import freeze_model
3 from ..utils import legacy_support
----> 4 from ..backbones import get_backbone, get_feature_layers
5
6 old_args_map = {
/content/segmentation_models/segmentation_models/backbones/__init__.py in <module>()
----> 1 from classification_models import Classifiers
2 from classification_models import resnext
3
4 from . import inception_resnet_v2 as irv2
5 from . import inception_v3 as iv3
/content/classification_models/__init__.py in <module>()
----> 1 from .classification_models import *
/content/classification_models/classification_models/__init__.py in <module>()
3 from . import resnet as rn
4 from . import senet as sn
----> 5 from . import keras_applications as ka
6
7
/content/classification_models/classification_models/keras_applications/__init__.py in <module>()
1 import keras
----> 2 from .keras_applications.keras_applications import *
3
4 set_keras_submodules(
5 backend=keras.backend,
ModuleNotFoundError: No module named 'classification_models.classification_models.keras_applications.keras_applications.keras_applications'
Can anyone help regarding this? Thanks.
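The final ModuleNotFoundError points at a deeply nested `keras_applications` package that a plain `git clone` does not fetch, which suggests the repository (or its `classification_models` dependency) pulls code in via git submodules. As a hedged sketch (the `--recursive` flag behavior and the PyPI package name `segmentation-models` are assumptions, not confirmed in this thread), two common workarounds in Colab are:

```shell
# Option 1: clone with submodules so nested packages are present
git clone --recursive https://github.com/qubvel/segmentation_models

# Option 2 (usually simpler): install from PyPI and let pip resolve dependencies
pip install segmentation-models
```

With the pip install, `from segmentation_models import Unet` works without any manual `sys.path` manipulation.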
Issue Analytics
- State:
- Created: 5 years ago
- Comments: 17 (7 by maintainers)
One more note: choose another network for multiclass segmentation. Unet has only 16 filters in its final layer, so it would struggle to separate 255 classes. PSP or FPN are better choices.

I have never trained with that many classes; I think it depends heavily on the data. Try PSP with downsampling_factor=16 and heavy encoders such as InceptionResNetV2/Senet154 (these require a lot of GPU memory), and do not downsample the image much. You may need several GPUs to train anything useful.

P.S. Add an auxiliary output to help training (read the PSPNet paper).
P.P.S. Use use_batchnorm=False to reduce required memory.
P.P.P.S. (😄) Use a weighted loss function.