
_pickle.PicklingError: Can't pickle <class 'boto3.resources.factory.s3.ServiceResource'>: attribute lookup s3.ServiceResource on boto3.resources.factory failed

See original GitHub issue

I want to try out my model. The data is stored in AWS S3, and I use boto3 like this:

        self.s3_img = S3Images(boto3.resource('s3'))
        self.s3_obj = S3GetObjects()

I hit this error when I feed the data and model into the PyTorch Lightning training pipeline.

The code looks like:

import pytorch_lightning as pl
from pytorch_lightning import Trainer

trainer = Trainer(
    checkpoint_callback=checkpoint_callback,
    callbacks=get_callbacks(chkpt_path),
    fast_dev_run=False,
    max_epochs=100,
    resume_from_checkpoint=checkpoint_path,
)
trainer.fit(model)

The error is:

  File "main.py", line 191, in <module>
    train()
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/site-packages/hydra/main.py", line 20, in decorated_main
    run_hydra(
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/site-packages/hydra/_internal/utils.py", line 171, in run_hydra
    hydra.run(
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/site-packages/hydra/_internal/hydra.py", line 82, in run
    return run_job(
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/site-packages/hydra/plugins/common/utils.py", line 109, in run_job
    ret.return_value = task_function(task_cfg)
  File "main.py", line 176, in train
    trainer.fit(model)
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/site-packages/pytorch_lightning/trainer/states.py", line 48, in wrapped_fn
    result = fn(self, *args, **kwargs)
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 1084, in fit
    results = self.accelerator_backend.train(model)
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/site-packages/pytorch_lightning/accelerators/cpu_backend.py", line 39, in train
    results = self.trainer.run_pretrain_routine(model)
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 1224, in run_pretrain_routine
    self._run_sanity_check(ref_model, model)
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 1257, in _run_sanity_check
    eval_results = self._evaluate(model, self.val_dataloaders, max_batches, False)
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/site-packages/pytorch_lightning/trainer/evaluation_loop.py", line 305, in _evaluate
    for batch_idx, batch in enumerate(dataloader):
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/site-packages/torch/utils/data/dataloader.py", line 352, in __iter__
    return self._get_iterator()
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/site-packages/torch/utils/data/dataloader.py", line 294, in _get_iterator
    return _MultiProcessingDataLoaderIter(self)
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/site-packages/torch/utils/data/dataloader.py", line 801, in __init__
    w.start()
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/multiprocessing/process.py", line 121, in start
    self._popen = self._Popen(self)
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/multiprocessing/context.py", line 224, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/multiprocessing/context.py", line 284, in _Popen
    return Popen(process_obj)
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/multiprocessing/popen_spawn_posix.py", line 32, in __init__
    super().__init__(process_obj)
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/multiprocessing/popen_fork.py", line 19, in __init__
    self._launch(process_obj)
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/multiprocessing/popen_spawn_posix.py", line 47, in _launch
    reduction.dump(process_obj, fp)
  File "/Users/admin/opt/anaconda3/envs/kk/lib/python3.8/multiprocessing/reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
_pickle.PicklingError: Can't pickle <class 'boto3.resources.factory.s3.ServiceResource'>: attribute lookup s3.ServiceResource on boto3.resources.factory failed

Can anyone tell me what this error means and how to solve it? Thanks for any suggestions and help!
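For context (not from the thread itself): pickle serializes a class "by reference" — it records the module and qualified name, then looks the class up again by that name when loading. boto3 generates its `ServiceResource` classes at runtime inside a factory, so that attribute lookup fails, which is exactly the "attribute lookup s3.ServiceResource on boto3.resources.factory failed" message. A minimal stdlib-only sketch of the same failure, using a runtime-generated class as a stand-in for boto3's:

```python
import pickle

# A class created at runtime whose name is not findable at module level,
# mimicking boto3's factory-generated s3.ServiceResource class.
Hidden = type("NotFindable", (), {})

try:
    pickle.dumps(Hidden())
    failed = False
except pickle.PicklingError as err:
    # e.g. "Can't pickle <class '...NotFindable'>: attribute lookup ... failed"
    failed = True
    print(err)
```

The DataLoader triggers this because worker processes started with the `spawn` method must pickle the dataset, and the dataset here holds the boto3 resource as an attribute.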

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Comments: 6 (3 by maintainers)

Top GitHub Comments

3 reactions
kdaily commented, Sep 1, 2021

Hi @isvogor-foi,

No updates to provide. However, as previously mentioned, we should track this under #678. Closing this out now!

3 reactions
karliesama commented, Jan 29, 2021

Hi! I found that I don’t get this error with Python 3.7 but do with 3.8; 3.8 introduced a new way of pickling objects.
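A likely reason for the 3.7-vs-3.8 difference (my reading, not stated in the thread): on macOS, Python 3.8 switched multiprocessing's default start method from `fork` to `spawn`, and spawned DataLoader workers must pickle the dataset — including any boto3 resource stored on it. A common workaround is to keep the handle out of the pickled state and recreate it lazily in each process. A sketch with a hypothetical `S3Dataset` (a `threading.Lock` stands in for the unpicklable boto3 resource, to keep the example stdlib-only):

```python
import pickle
import threading


class S3Dataset:
    """Sketch: create the client lazily so the dataset stays picklable."""

    def __init__(self):
        self._client = None  # created on first use, once per process

    @property
    def client(self):
        # In real code this would be boto3.resource("s3"); a Lock stands in
        # here as another object that cannot be pickled.
        if self._client is None:
            self._client = threading.Lock()
        return self._client

    def __getstate__(self):
        # Drop the live handle before pickling (e.g. for a spawned worker);
        # the property recreates it on first access in the new process.
        state = self.__dict__.copy()
        state["_client"] = None
        return state


ds = S3Dataset()
_ = ds.client                           # handle now exists and is unpicklable
clone = pickle.loads(pickle.dumps(ds))  # still round-trips fine
print(clone._client)                    # None until the worker touches .client
```

Alternatives with the same effect: create the boto3 resource inside `__getitem__`/`worker_init_fn`, or set `num_workers=0` so no worker processes (and no pickling) are involved.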
