TypeError: can't pickle Environment objects on Windows/macOS
See original GitHub issue

I'm running under Windows 10, following the instructions in the readme. When trying to retrain the model with this command
python nerTagger.py --dataset-type conll2003 train_eval
I ran into the following exception (right after compiling embeddings) - any tips?
Thank you for the wonderful work!
Compiling embeddings... (this is done only one time per embeddings at first launch)
path: d:\Projects\embeddings\glove.840B.300d.txt
100%|████████████████████████████████████████████████████████████████████| 2196017/2196017 [08:06<00:00, 4517.80it/s]
embeddings loaded for 2196006 words and 300 dimensions
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to
==================================================================================================
char_input (InputLayer)         (None, None, 30)     0
__________________________________________________________________________________________________
time_distributed_1 (TimeDistrib (None, None, 30, 25) 2150        char_input[0][0]
__________________________________________________________________________________________________
word_input (InputLayer)         (None, None, 300)    0
__________________________________________________________________________________________________
time_distributed_2 (TimeDistrib (None, None, 50)     10200       time_distributed_1[0][0]
__________________________________________________________________________________________________
concatenate_1 (Concatenate)     (None, None, 350)    0           word_input[0][0]
                                                                 time_distributed_2[0][0]
__________________________________________________________________________________________________
dropout_1 (Dropout)             (None, None, 350)    0           concatenate_1[0][0]
__________________________________________________________________________________________________
bidirectional_2 (Bidirectional) (None, None, 200)    360800      dropout_1[0][0]
__________________________________________________________________________________________________
dropout_2 (Dropout)             (None, None, 200)    0           bidirectional_2[0][0]
__________________________________________________________________________________________________
dense_1 (Dense)                 (None, None, 100)    20100       dropout_2[0][0]
__________________________________________________________________________________________________
dense_2 (Dense)                 (None, None, 10)     1010        dense_1[0][0]
__________________________________________________________________________________________________
chain_crf_1 (ChainCRF)          (None, None, 10)     120         dense_2[0][0]
==================================================================================================
Total params: 394,380
Trainable params: 394,380
Non-trainable params: 0
__________________________________________________________________________________________________
Epoch 1/60
Exception in thread Thread-2:
Traceback (most recent call last):
  File "d:\Anaconda3\Lib\threading.py", line 916, in _bootstrap_inner
    self.run()
  File "d:\Anaconda3\Lib\threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "d:\Projects\delft\env\lib\site-packages\keras\utils\data_utils.py", line 548, in _run
    with closing(self.executor_fn(_SHARED_SEQUENCES)) as executor:
  File "d:\Projects\delft\env\lib\site-packages\keras\utils\data_utils.py", line 522, in <lambda>
    initargs=(seqs,))
  File "d:\Anaconda3\Lib\multiprocessing\context.py", line 119, in Pool
    context=self.get_context())
  File "d:\Anaconda3\Lib\multiprocessing\pool.py", line 174, in __init__
    self._repopulate_pool()
  File "d:\Anaconda3\Lib\multiprocessing\pool.py", line 239, in _repopulate_pool
    w.start()
  File "d:\Anaconda3\Lib\multiprocessing\process.py", line 105, in start
    self._popen = self._Popen(self)
  File "d:\Anaconda3\Lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)
  File "d:\Anaconda3\Lib\multiprocessing\popen_spawn_win32.py", line 65, in __init__
    reduction.dump(process_obj, to_child)
  File "d:\Anaconda3\Lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
TypeError: can't pickle Environment objects
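For context on the error above: on Windows (and on macOS with recent Python versions) multiprocessing starts worker processes with the spawn method, so every object handed to a worker, including the data generator and any handle it keeps open (here, apparently the embeddings Environment), must be pickled. A minimal, purely illustrative reproduction of the same failure mode, using a lock as a stand-in for the unpicklable handle (not DeLFT code):

import pickle
import threading

class Holder:
    """Stand-in for a generator holding a handle-backed resource
    (an LMDB environment, a socket, a lock, ...)."""
    def __init__(self):
        self.lock = threading.Lock()

# The spawn start method pickles everything sent to a worker process;
# this line reproduces the same class of error directly:
pickle.dumps(Holder())  # raises TypeError: can't pickle _thread.lock objects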
Top Results From Across the Web

TypeError: can't pickle Environment objects · Issue #526
Hello, I'm trying to run the dcgan/main.py file to train a GAN. I'm using a Windows 7 system with Python 3.7 (Anaconda).

python 3.x - TypeError: can't pickle _thread.lock objects
The problem here is that self in the function run_parallel() can't be pickled because it is a class instance.

Python "multiprocessing" "Can't pickle…" - TedChen
The pickle issue in multiprocessing comes from passing objects as arguments between processes. There are three methods to start a process in multiprocessing...

Multiprocessing and Pickle, How to Easily fix that?
Pickling or serialization transforms an object's state into a series of bits - the object could be methods, data, a class, API end-points, ...

Torch - [TypeError: can't pickle Environment objects] solution
Because of the Windows operating system, multiprocessing on Windows uses pickle serialization to transfer data between processes, and socket objects cannot be ...
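As the multiprocessing results above note, Python offers several process start methods, and which one is in use determines whether objects get pickled at all. A quick, purely illustrative way to check what your platform uses:

import multiprocessing as mp

if __name__ == "__main__":
    # "fork" (Linux default) clones the parent process, so nothing needs pickling;
    # "spawn" (the only method on Windows, default on macOS since Python 3.8)
    # starts a fresh interpreter and pickles everything sent to the workers.
    print(mp.get_start_method())        # e.g. "spawn" on Windows
    print(mp.get_all_start_methods())   # e.g. ['spawn'] on Windows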
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@davidlenz Sadly, I had to boot my laptop into Linux (Ubuntu) and run the tool there; on Linux I didn't face that issue. It may well be a Windows-specific problem, and I'm also looking forward to hearing of any update on this.
I have the same problem on macOS.

The solution is to disable the multi-threading by setting nb_workers = 0. Depending on the task to be performed, this should be modified in both sequenceLabelling/wrapper.py and trainer.py:172.
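For readers looking for the concrete shape of that change, here is a minimal, hypothetical sketch assuming the trainer ultimately calls the standard keras fit_generator API; the function and variable names are placeholders, not the literal DeLFT source.

from keras.models import Model  # the compiled sequence-labelling model

def train_without_workers(model: Model, training_generator, validation_generator,
                          max_epoch=60, callbacks=None):
    """Hypothetical sketch of the nb_workers = 0 workaround.

    With workers=0 Keras runs the generator in the calling process, so
    nothing is handed to a multiprocessing pool and nothing needs pickling.
    """
    nb_workers = 0
    model.fit_generator(
        generator=training_generator,
        validation_data=validation_generator,
        epochs=max_epoch,
        workers=nb_workers,                    # 0 = no worker threads/processes
        use_multiprocessing=(nb_workers > 0),  # stays False on Windows/macOS
        callbacks=callbacks)

The trade-off is that data preparation then runs serially in the main process, so epochs can be slower, but the Environment pickling error cannot occur.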