Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking at while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

TypeError: 'NoneType' object is not subscriptable

See original GitHub issue

I am having this error while trying to load the model.

from detoxify import Detoxify

model = Detoxify('original', device="cuda")


---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In [15], line 3
      1 from detoxify import Detoxify
----> 3 results = Detoxify('original').predict('some text')

File ~/.conda/envs/py/lib/python3.9/site-packages/detoxify/detoxify.py:103, in Detoxify.__init__(self, model_type, checkpoint, device, huggingface_config_path)
    101 def __init__(self, model_type="original", checkpoint=PRETRAINED_MODEL, device="cpu", huggingface_config_path=None):
    102     super().__init__()
--> 103     self.model, self.tokenizer, self.class_names = load_checkpoint(
    104         model_type=model_type,
    105         checkpoint=checkpoint,
    106         device=device,
    107         huggingface_config_path=huggingface_config_path,
    108     )
    109     self.device = device
    110     self.model.to(self.device)

File ~/.conda/envs/py/lib/python3.9/site-packages/detoxify/detoxify.py:56, in load_checkpoint(model_type, checkpoint, device, huggingface_config_path)
     50 change_names = {
     51     "toxic": "toxicity",
     52     "identity_hate": "identity_attack",
     53     "severe_toxic": "severe_toxicity",
     54 }
     55 class_names = [change_names.get(cl, cl) for cl in class_names]
---> 56 model, tokenizer = get_model_and_tokenizer(
     57     **loaded["config"]["arch"]["args"],
     58     state_dict=loaded["state_dict"],
     59     huggingface_config_path=huggingface_config_path,
     60 )
     62 return model, tokenizer, class_names

File ~/.conda/envs/py/lib/python3.9/site-packages/detoxify/detoxify.py:20, in get_model_and_tokenizer(model_type, model_name, tokenizer_name, num_classes, state_dict, huggingface_config_path)
     16 def get_model_and_tokenizer(
     17     model_type, model_name, tokenizer_name, num_classes, state_dict, huggingface_config_path=None
     18 ):
     19     model_class = getattr(transformers, model_name)
---> 20     model = model_class.from_pretrained(
     21         pretrained_model_name_or_path=None,
     22         config=huggingface_config_path or model_type,
     23         num_labels=num_classes,
     24         state_dict=state_dict,
     25         local_files_only=huggingface_config_path is not None,
     26     )
     27     tokenizer = getattr(transformers, tokenizer_name).from_pretrained(
     28         huggingface_config_path or model_type,
     29         local_files_only=huggingface_config_path is not None,
     30         # TODO: may be needed to let it work with Kaggle competition
     31         # model_max_length=512,
     32     )
     34     return model, tokenizer

File ~/.conda/envs/py/lib/python3.9/site-packages/transformers/modeling_utils.py:2379, in PreTrainedModel.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
   2369     if dtype_orig is not None:
   2370         torch.set_default_dtype(dtype_orig)
   2372     (
   2373         model,
   2374         missing_keys,
   2375         unexpected_keys,
   2376         mismatched_keys,
   2377         offload_index,
   2378         error_msgs,
-> 2379     ) = cls._load_pretrained_model(
   2380         model,
   2381         state_dict,
   2382         loaded_state_dict_keys,  # XXX: rename?
   2383         resolved_archive_file,
   2384         pretrained_model_name_or_path,
   2385         ignore_mismatched_sizes=ignore_mismatched_sizes,
   2386         sharded_metadata=sharded_metadata,
   2387         _fast_init=_fast_init,
   2388         low_cpu_mem_usage=low_cpu_mem_usage,
   2389         device_map=device_map,
   2390         offload_folder=offload_folder,
   2391         offload_state_dict=offload_state_dict,
   2392         dtype=torch_dtype,
   2393         load_in_8bit=load_in_8bit,
   2394     )
   2396 model.is_loaded_in_8bit = load_in_8bit
   2398 # make sure token embedding weights are still tied if needed

File ~/.conda/envs/py/lib/python3.9/site-packages/transformers/modeling_utils.py:2572, in PreTrainedModel._load_pretrained_model(cls, model, state_dict, loaded_keys, resolved_archive_file, pretrained_model_name_or_path, ignore_mismatched_sizes, sharded_metadata, _fast_init, low_cpu_mem_usage, device_map, offload_folder, offload_state_dict, dtype, load_in_8bit)
   2569                 del state_dict[checkpoint_key]
   2570     return mismatched_keys
-> 2572 folder = os.path.sep.join(resolved_archive_file[0].split(os.path.sep)[:-1])
   2573 if device_map is not None and is_safetensors:
   2574     param_device_map = expand_device_map(device_map, original_loaded_keys)

TypeError: 'NoneType' object is not subscriptable
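
For context, the traceback shows that detoxify's get_model_and_tokenizer() calls from_pretrained() with pretrained_model_name_or_path=None and a caller-supplied state_dict, so transformers never resolves a weights archive; on 4.25.1, line 2572 then indexes resolved_archive_file, which is still None. The sketch below only mirrors that call shape; the model class, config name and label count are illustrative assumptions, not values read from the detoxify checkpoint.

import transformers

# Illustrative stand-ins; detoxify reads the real values from its downloaded checkpoint config.
model_class = transformers.BertForSequenceClassification
config = transformers.AutoConfig.from_pretrained("bert-base-uncased", num_labels=6)
state_dict = model_class(config).state_dict()

# Mirrors detoxify's call: no model name/path is given, only a config and a state_dict.
# On transformers 4.25.1 this leaves resolved_archive_file as None, and
# modeling_utils.py line 2572 raises TypeError: 'NoneType' object is not subscriptable.
model = model_class.from_pretrained(
    pretrained_model_name_or_path=None,
    config=config,
    state_dict=state_dict,
)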

pip install information:

Collecting detoxify
  Downloading detoxify-0.5.0-py3-none-any.whl (12 kB)
Collecting transformers!=4.18.0
  Downloading transformers-4.25.1-py3-none-any.whl (5.8 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.8/5.8 MB 75.2 MB/s eta 0:00:00
Collecting torch>=1.7.0
  Downloading torch-1.13.0-cp39-cp39-manylinux1_x86_64.whl (890.2 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 890.2/890.2 MB 3.6 MB/s eta 0:00:00
Collecting sentencepiece>=0.1.94
  Downloading sentencepiece-0.1.97-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 107.1 MB/s eta 0:00:00
Collecting typing-extensions
  Downloading typing_extensions-4.4.0-py3-none-any.whl (26 kB)
Collecting nvidia-cuda-nvrtc-cu11==11.7.99
  Downloading nvidia_cuda_nvrtc_cu11-11.7.99-2-py3-none-manylinux1_x86_64.whl (21.0 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 21.0/21.0 MB 77.9 MB/s eta 0:00:00
Collecting nvidia-cublas-cu11==11.10.3.66
  Downloading nvidia_cublas_cu11-11.10.3.66-py3-none-manylinux1_x86_64.whl (317.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 317.1/317.1 MB 8.9 MB/s eta 0:00:00
Collecting nvidia-cuda-runtime-cu11==11.7.99
  Downloading nvidia_cuda_runtime_cu11-11.7.99-py3-none-manylinux1_x86_64.whl (849 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 849.3/849.3 kB 112.2 MB/s eta 0:00:00
Collecting nvidia-cudnn-cu11==8.5.0.96
  Downloading nvidia_cudnn_cu11-8.5.0.96-2-py3-none-manylinux1_x86_64.whl (557.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 557.1/557.1 MB 6.0 MB/s eta 0:00:00
Requirement already satisfied: wheel in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from nvidia-cublas-cu11==11.10.3.66->torch>=1.7.0->detoxify) (0.37.1)
Requirement already satisfied: setuptools in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from nvidia-cublas-cu11==11.10.3.66->torch>=1.7.0->detoxify) (63.4.1)
Collecting regex!=2019.12.17
  Downloading regex-2022.10.31-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (769 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 770.0/770.0 kB 116.5 MB/s eta 0:00:00
Requirement already satisfied: numpy>=1.17 in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from transformers!=4.18.0->detoxify) (1.23.4)
Requirement already satisfied: tqdm>=4.27 in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from transformers!=4.18.0->detoxify) (4.64.1)
Requirement already satisfied: pyyaml>=5.1 in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from transformers!=4.18.0->detoxify) (6.0)
Collecting filelock
  Downloading filelock-3.8.2-py3-none-any.whl (10 kB)
Requirement already satisfied: packaging>=20.0 in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from transformers!=4.18.0->detoxify) (21.3)
Requirement already satisfied: requests in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from transformers!=4.18.0->detoxify) (2.28.1)
Collecting huggingface-hub<1.0,>=0.10.0
  Downloading huggingface_hub-0.11.1-py3-none-any.whl (182 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 182.4/182.4 kB 103.0 MB/s eta 0:00:00
Collecting tokenizers!=0.11.3,<0.14,>=0.11.1
  Downloading tokenizers-0.13.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (7.6 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.6/7.6 MB 33.4 MB/s eta 0:00:00
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from packaging>=20.0->transformers!=4.18.0->detoxify) (3.0.9)
Requirement already satisfied: certifi>=2017.4.17 in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from requests->transformers!=4.18.0->detoxify) (2022.9.24)
Requirement already satisfied: charset-normalizer<3,>=2 in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from requests->transformers!=4.18.0->detoxify) (2.1.1)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from requests->transformers!=4.18.0->detoxify) (1.26.12)
Requirement already satisfied: idna<4,>=2.5 in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from requests->transformers!=4.18.0->detoxify) (3.4)
Installing collected packages: tokenizers, sentencepiece, typing-extensions, regex, nvidia-cuda-runtime-cu11, nvidia-cuda-nvrtc-cu11, nvidia-cublas-cu11, filelock, nvidia-cudnn-cu11, huggingface-hub, transformers, torch, detoxify
Successfully installed detoxify-0.5.0 filelock-3.8.2 huggingface-hub-0.11.1 nvidia-cublas-cu11-11.10.3.66 nvidia-cuda-nvrtc-cu11-11.7.99 nvidia-cuda-runtime-cu11-11.7.99 nvidia-cudnn-cu11-8.5.0.96 regex-2022.10.31 sentencepiece-0.1.97 tokenizers-0.13.2 torch-1.13.0 transformers-4.25.1 typing-extensions-4.4.0

Additional information: Python 3.9.13 (haa1d7c7_2), on Linux.

Issue Analytics

  • State: open
  • Created 9 months ago
  • Reactions: 3
  • Comments: 6

Top GitHub Comments

2 reactions
user1342 commented, Dec 11, 2022

As a quick fix for this on version 0.5.0, I was able to get around the error by commenting out line 2572 in modeling_utils.py (see this fork of transformers):

Replace:
folder = os.path.sep.join(resolved_archive_file[0].split(os.path.sep)[:-1])
with:
#folder = os.path.sep.join(resolved_archive_file[0].split(os.path.sep)[:-1])

I'm unsure if this fix will have repercussions later down the line; however, for the time being it seems to be working.
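
If you'd rather not delete the assignment outright, a slightly more defensive variant of the same workaround is to guard it, so the name folder still exists for any later code path that might use it. This is a sketch based only on the lines visible in the traceback above, not on the full modeling_utils.py source:

# modeling_utils.py, around line 2572 in transformers 4.25.1 (hedged sketch of the workaround)
folder = None
if resolved_archive_file is not None:
    folder = os.path.sep.join(resolved_archive_file[0].split(os.path.sep)[:-1])
if device_map is not None and is_safetensors:
    param_device_map = expand_device_map(device_map, original_loaded_keys)

Since the failing line comes from the transformers 4.25.1 install shown in the pip output, pinning transformers to an earlier release (while keeping detoxify's transformers!=4.18.0 constraint) might also avoid this code path, although the thread does not confirm a specific working version.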

1 reaction
laurahanu commented, Dec 13, 2022

Hey all, thanks for raising this and good to see there’s a workaround!

We will look into this asap!

Read more comments on GitHub >

Top Results From Across the Web

TypeError: 'NoneType' object is not subscriptable
NoneType is the type of the None object which represents a lack of value, for example, a function that does not explicitly return...
Read more >
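
That first cause is easy to reproduce: a function that falls off the end without returning anything hands back None, and indexing that None raises exactly this error. A minimal, illustrative example (unrelated to detoxify):

def find_index(items, target):
    for i, value in enumerate(items):
        if value == target:
            return i
    # no explicit return here, so the function returns None when target is missing

result = find_index([1, 2, 3], 9)
print(result[0])  # TypeError: 'NoneType' object is not subscriptable
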
Python Math - 'NoneType' object is not subscriptable
The reason that lista gets set to None is because the return value of list.sort() is None ... it does not return a...
Read more >
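
The list.sort() case from the snippet above looks like this: sort() sorts the list in place and returns None, so assigning its result back to the name replaces the list with None. Another minimal, illustrative example:

lista = [3, 1, 2]
lista = lista.sort()   # sorts in place and returns None
print(lista[0])        # TypeError: 'NoneType' object is not subscriptable
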
[Solved] TypeError: 'NoneType' Object is Not Subscriptable
The error, NoneType object is not subscriptable, means that you were trying to subscript a NoneType object. This resulted in a type error....
Read more >
TypeError: 'NoneType' object is not subscriptable in Python
The Python "TypeError: 'NoneType' object is not subscriptable" occurs when we try to access a None value at a specific index. To solve...
Read more >
Python TypeError: 'NoneType' object is not subscriptable
The β€œTypeError: 'NoneType' object is not subscriptable” error is common if you assign the result of a built-in list method like sort() ,...
Read more >
