Cluster fails to load my library
See original GitHub issue
Describe the issue:
I tried processing my data with LocalCluster, using the code below. With an empty Client() everything works, but as soon as I pass the cluster to the Client I get errors.
The task for do_prediction is:
- use a URL to read a file from S3 and do feature engineering
- use a pretrained model, loaded from a pickle file, to predict on the data
- upload the prediction result to S3
The feature-engineering part of do_prediction uses some functions from my utl.py.
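For orientation, a hypothetical sketch of do_prediction following the three steps above (the real feature-engineering code is not shown in the issue; utl.build_features, the model file name, and the S3 paths below are placeholders):

import pickle
import pandas as pd
import utl  # the user's helper module, the one the workers later fail to import

# Placeholder: load the pretrained model once from a pickle file.
with open('model.pkl', 'rb') as f:
    MODEL = pickle.load(f)

def do_prediction(meter_url):
    # 1. Read the raw file from S3 (pandas reads s3:// URLs when s3fs is installed).
    df = pd.read_csv(meter_url)
    # 2. Feature engineering with helpers from utl.py (placeholder function name).
    features = utl.build_features(df)
    # 3. Predict with the pretrained model.
    preds = MODEL.predict(features)
    # 4. Upload the prediction result back to S3 (placeholder path).
    pd.DataFrame({'prediction': preds}).to_csv('s3://my-bucket/predictions.csv', index=False)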
import dask
from dask.distributed import Client, LocalCluster

# load() and chunks() are helpers from my own code; do_prediction is the
# function described above (S3 read, feature engineering via utl.py, predict, upload).
all_meter_lst = load('all_meter_lst.pkl')

for cks in chunks(all_meter_lst, 999):
    lazy_results = []
    for meter_url in cks:
        lazy_result = dask.delayed(do_prediction)(meter_url)
        lazy_results.append(lazy_result)
    dask.compute(*lazy_results)

# Does not work!!!!!
# cluster = LocalCluster(n_workers=8, processes=True)
# client = Client(cluster)

# Works!!!!!
client = Client()
client
The error I got:
2022-11-25 03:09:13,886 - distributed.worker - ERROR - Could not deserialize task do_prediction-3a99f190-0611-4bc9-bc7f-1764cb6fe83b
Traceback (most recent call last):
File "/opt/homebrew/Caskroom/miniforge/base/envs/lake/lib/python3.10/site-packages/distributed/worker.py", line 2219, in execute
function, args, kwargs = await self._maybe_deserialize_task(ts)
File "/opt/homebrew/Caskroom/miniforge/base/envs/lake/lib/python3.10/site-packages/distributed/worker.py", line 2192, in _maybe_deserialize_task
function, args, kwargs = _deserialize(*ts.run_spec)
File "/opt/homebrew/Caskroom/miniforge/base/envs/lake/lib/python3.10/site-packages/distributed/worker.py", line 2863, in _deserialize
function = loads_function(function)
File "/opt/homebrew/Caskroom/miniforge/base/envs/lake/lib/python3.10/site-packages/distributed/worker.py", line 2857, in loads_function
return pickle.loads(bytes_object)
File "/opt/homebrew/Caskroom/miniforge/base/envs/lake/lib/python3.10/site-packages/distributed/protocol/pickle.py", line 73, in loads
return pickle.loads(x)
ModuleNotFoundError: No module named 'utl'
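The ModuleNotFoundError suggests that the process-based workers cannot import utl when they unpickle do_prediction. One documented way to ship a single local module to all workers is Client.upload_file; a minimal sketch, assuming utl.py sits in the working directory next to the driver script:

from dask.distributed import Client, LocalCluster

cluster = LocalCluster(n_workers=8, processes=True)
client = Client(cluster)

# Copy utl.py to every worker so the pickled do_prediction task can import it.
client.upload_file('utl.py')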
- Dask version: latest
- Python version: 3.10
- Operating System: macOS
- Install method (conda, pip, source):
Issue Analytics
- State:
- Created 10 months ago
- Comments: 5 (2 by maintainers)
Top GitHub Comments
I’m going to close this for now. @b-y-f, ping me on the Discourse channel when you have something going on there.
@b-y-f I’m trying to figure out what’s going on here. I think this would be better discussed in our Discourse Group. But I am curious whether you get better results by defining your cluster and client before asking dask to do a computation. By that I mean:
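A minimal sketch of that ordering, reusing the do_prediction and all_meter_lst from the issue above (both are assumptions carried over, not part of this comment):

import dask
from dask.distributed import Client, LocalCluster

# Create the cluster and client first, before any tasks are built or computed.
cluster = LocalCluster(n_workers=8, processes=True)
client = Client(cluster)

# do_prediction and all_meter_lst are the user's objects from the issue above.
lazy_results = [dask.delayed(do_prediction)(url) for url in all_meter_lst]
dask.compute(*lazy_results)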
You’d probably be better off with a dask.bag instead of using chunks, e.g.:
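A minimal sketch of the dask.bag approach, again assuming the same do_prediction and all_meter_lst as in the issue:

import dask.bag as db

# Partition the meter URLs into bags of roughly 999 items each,
# then map do_prediction over every element in parallel.
bag = db.from_sequence(all_meter_lst, partition_size=999)
results = bag.map(do_prediction).compute()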