Pickle error in Parallel
Hey,
I get an error with joblib 0.12 that didn't occur in 0.11.
Snippet of code:
import multiprocessing
from joblib import Parallel, delayed

# ids, logger, _get_chunks and _clean_text are defined elsewhere in the project
num_cores = multiprocessing.cpu_count()
chunks = _get_chunks(ids, 100)
logger.info('cores: {0}, chunks: {1}'.format(num_cores, len(chunks)))
Parallel(n_jobs=num_cores)(delayed(_clean_text)(c) for c in chunks)
The resulting error is:
Cannot pickle files that are not opened for reading: a
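This message comes from cloudpickle, which the loky backend (joblib's default since 0.12) uses to serialize the function and its arguments; cloudpickle serializes more state by value than the standard pickler used by the 0.11 multiprocessing backend, and it refuses to pickle file objects that are open for writing. Below is a minimal sketch of one way the error can surface, using a hypothetical helper and file handle rather than the code from this issue:

from joblib import Parallel, delayed

# Hypothetical helper: writes each cleaned chunk to an already-open file.
def _write_chunk(out_file, chunk):
    out_file.write(' '.join(chunk) + '\n')

out_file = open('cleaned.txt', 'w')  # opened for writing, not for reading

# The write-mode file handle is part of the task arguments, so cloudpickle
# has to serialize it and raises:
#   Cannot pickle files that are not opened for reading
Parallel(n_jobs=2)(delayed(_write_chunk)(out_file, c) for c in [['a'], ['b']])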
Issue Analytics
- State:
- Created 5 years ago
- Comments: 13 (9 by maintainers)
Top Results From Across the Web

Pickle error when making parallel version of map function in ...
I'm trying to use it the same exact way as I would use the regular map function. Why does this happen? How can...

Serialization of un-picklable objects - Joblib
First, define functions which cannot be pickled with the standard pickle protocol. They cannot be serialized with pickle because they are defined in...

Parallel sampling without pickle in version 4.0.0b3 - v4
I'm using a black-box likelihood and run into an error when parallel sampling. version 4.0.0b3. Traceback (most recent call last): File ...

parallel test running throwing pickleError - Google Groups
Setting a breakpoint inside that method around the error message should give you the required pointers to fix the import and thus the...

Multiprocessing and Pickle, How to Easily fix that?
However, the multiprocess tasks can't be pickled; it would raise an error failing to pickle. That's because when dividing a single task over ...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@EloyRoura the point of using the loky backend in joblib is to replace the multiprocessing.Pool backend. With the loky backend you no longer need to protect your code with the if __name__ == '__main__' condition. Here is an example:
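The original example is not reproduced on this page; below is a minimal sketch of the pattern described, assuming joblib >= 0.12 so that loky is the default backend, with a simple module-level worker function:

from joblib import Parallel, delayed

def square(x):
    return x ** 2

# With the loky backend (the default), the workers receive `square` via
# cloudpickle instead of re-importing this module, so the usual
# `if __name__ == '__main__':` guard around the Parallel call is not needed.
results = Parallel(n_jobs=2)(delayed(square)(i) for i in range(10))
print(results)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]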
For completeness, here is a stand-alone snippet for the pymc3-related issue:
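That snippet is likewise not reproduced here; the sketch below shows the standard PyMC3 black-box-likelihood setup that the pymc3 discussions above revolve around, with illustrative names and data rather than the original code:

import numpy as np
import pymc3 as pm
import theano.tensor as tt

def my_loglike(theta, data):
    # "Black-box" likelihood: an arbitrary Python function that PyMC3 cannot
    # differentiate or inspect.
    return -0.5 * np.sum((data - theta[0]) ** 2)

class LogLike(tt.Op):
    itypes = [tt.dvector]  # expects a parameter vector
    otypes = [tt.dscalar]  # returns a scalar log-likelihood

    def __init__(self, data):
        self.data = data

    def perform(self, node, inputs, outputs):
        (theta,) = inputs
        outputs[0][0] = np.array(my_loglike(theta, self.data))

data = np.random.randn(100)
loglike = LogLike(data)

with pm.Model():
    mu = pm.Normal('mu', 0, 10)
    theta = tt.as_tensor_variable([mu])
    pm.Potential('likelihood', loglike(theta))
    # With cores > 1 the step method (which references the model and the
    # custom Op) is pickled and sent to the chain worker processes; this is
    # the point where the pickling errors discussed above show up.
    trace = pm.sample(500, tune=500, cores=2)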