[BUG] pd.read_csv(s3_path) fails with "TypeError: 'coroutine' object is not subscriptable"
Code:
import pandas as pd
df = pd.read_csv(s3_path)
Error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/pandas/io/parsers.py", line 676, in parser_f
return _read(filepath_or_buffer, kwds)
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/pandas/io/parsers.py", line 431, in _read
filepath_or_buffer, encoding, compression
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/pandas/io/common.py", line 185, in get_filepath_or_buffer
filepath_or_buffer, encoding=encoding, compression=compression, mode=mode
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/pandas/io/s3.py", line 48, in get_filepath_or_buffer
file, _fs = get_file_and_filesystem(filepath_or_buffer, mode=mode)
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/pandas/io/s3.py", line 29, in get_file_and_filesystem
file = fs.open(_strip_schema(filepath_or_buffer), mode)
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/fsspec/spec.py", line 844, in open
**kwargs
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/s3fs/core.py", line 394, in _open
autocommit=autocommit, requester_pays=requester_pays)
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/s3fs/core.py", line 1276, in __init__
cache_type=cache_type)
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/fsspec/spec.py", line 1134, in __init__
self.details = fs.info(path)
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/s3fs/core.py", line 719, in info
return sync(self.loop, self._info, path, bucket, key, kwargs, version_id)
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/fsspec/asyn.py", line 51, in sync
raise exc.with_traceback(tb)
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/fsspec/asyn.py", line 35, in f
result[0] = await future
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/s3fs/core.py", line 660, in _info
Key=key, **version_id_kw(version_id), **self.req_kw)
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/s3fs/core.py", line 214, in _call_s3
raise translate_boto_error(err)
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/s3fs/core.py", line 207, in _call_s3
return await method(**additional_kwargs)
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/aiobotocore/client.py", line 121, in _make_api_call
operation_model, request_dict, request_context)
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/aiobotocore/client.py", line 140, in _make_request
return await self._endpoint.make_request(operation_model, request_dict)
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/aiobotocore/endpoint.py", line 90, in _send_request
exception):
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/aiobotocore/endpoint.py", line 199, in _needs_retry
caught_exception=caught_exception, request_dict=request_dict)
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/aiobotocore/hooks.py", line 29, in _emit
response = handler(**kwargs)
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/botocore/utils.py", line 1214, in redirect_from_error
new_region = self.get_bucket_region(bucket, response)
File "/opt/conda/envs/rapids/lib/python3.7/site-packages/botocore/utils.py", line 1272, in get_bucket_region
headers = response['ResponseMetadata']['HTTPHeaders']
TypeError: 'coroutine' object is not subscriptable
I wasn't seeing this error until a couple of days ago, and I'm not sure what changed. I installed s3fs via conda install -c conda-forge s3fs. When I pin s3fs to 0.4.2, everything works just fine.
Issue Analytics
- State:
- Created 3 years ago
- Comments: 28 (13 by maintainers)
Right, this is actually an s3fs issue, probably while doing a HeadObject call (but it can stay here).
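The bottom of the traceback suggests a mechanism: botocore's synchronous S3 region-redirect handler subscripts the return value of an event handler, but under aiobotocore that handler is async, so it returns an un-awaited coroutine. A minimal, self-contained sketch of that failure mode (the function names are stand-ins, not the real library code):

```python
import asyncio

# Stand-in for aiobotocore: the API call is a coroutine function.
async def get_bucket_location():
    return {"ResponseMetadata": {"HTTPHeaders": {"x-amz-bucket-region": "us-east-1"}}}

# Stand-in for botocore's synchronous redirect handler: it calls the
# method and subscripts the result without awaiting it.
def redirect_from_error(api_call):
    response = api_call()  # a coroutine object, not a dict
    return response["ResponseMetadata"]["HTTPHeaders"]  # TypeError here

try:
    redirect_from_error(get_bucket_location)
except TypeError as exc:
    print(exc)  # 'coroutine' object is not subscriptable

# Awaiting the coroutine (e.g. via asyncio.run) yields the actual dict:
headers = asyncio.run(get_bucket_location())["ResponseMetadata"]["HTTPHeaders"]
print(headers["x-amz-bucket-region"])  # us-east-1
```

If a reproduction like this can be built against aiobotocore directly, it would make a clean upstream report.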
OK, so NoCredentials will take some more experimenting with whether the client is initialised in the parent process or not, and whether the processes are forked or spawned. It may no longer be relevant in the latest release.
So you can also pass this as S3FileSystem(client_kwargs={"region_name": ...}), or storage_options={'client_kwargs': {'region_name': ...}} when going through read_csv. Obviously this is only a workaround, not a solution: you shouldn't have to know the region beforehand. It would be best, then, if we can recreate the bug using aiobotocore alone and post the issue there. If you look through their GitHub and Stack Overflow, similar messages have appeared from time to time. It may depend on the exact versions of aiobotocore and botocore installed.
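For reference, a sketch of the workaround described above; the bucket, key, and region are placeholders, and forwarding storage_options through read_csv assumes a pandas version that supports that parameter (added in pandas 1.2):

```python
import pandas as pd

# Pinning the region up front avoids the buggy region-redirect path:
# region_name is forwarded through fsspec/s3fs to the aiobotocore client.
storage_options = {"client_kwargs": {"region_name": "us-east-1"}}

# Placeholder path; needs real credentials and an actual bucket to run.
# df = pd.read_csv("s3://my-bucket/data.csv", storage_options=storage_options)

# Equivalently, when constructing the filesystem directly:
# import s3fs
# fs = s3fs.S3FileSystem(client_kwargs={"region_name": "us-east-1"})
```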