Can't access public bucket without credentials
On master (ccb779d3e2054310dcff0e4308fe38acfe5784f6) in an environment with no Google credentials (python:3.6.4 Docker image):
# this works
In [2]: gcsfs.GCSFileSystem(token='anon').open('gs://gcp-public-data-landsat/index.csv.gz').read(1)
Out[2]: b'\x1f'
# but this doesn't
In [3]: gcsfs.GCSFileSystem().open('gs://gcp-public-data-landsat/index.csv.gz').read(1)
_call exception: HTTPConnectionPool(host='metadata.google.internal', port=80): Max retries exceeded with url: /computeMetadata/v1/instance/service-accounts/default/?recursive=true (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbc9bdd5b00>: Failed to establish a new connection: [Errno -2] Name or service not known',))
Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/urllib3/connection.py", line 141, in _new_conn
(self.host, self.port), self.timeout, **extra_kw)
File "/usr/local/lib/python3.6/site-packages/urllib3/util/connection.py", line 60, in create_connection
for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
File "/usr/local/lib/python3.6/socket.py", line 745, in getaddrinfo
for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno -2] Name or service not known
# ... several pages of these logs
# ... then several pages of tracebacks; finally:
RefreshError: HTTPConnectionPool(host='metadata.google.internal', port=80): Max retries exceeded with url: /computeMetadata/v1/instance/service-accounts/default/?recursive=true (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbc9b785eb8>: Failed to establish a new connection: [Errno -2] Name or service not known',))
It seems that the default auth should be falling back on anon; I guess this type of exception is not being caught properly? Additionally, if the anon method did eventually succeed, it might be preferable to suppress the logs during the initial failures and retries.
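The fallback being asked for can be sketched as follows. This is an illustrative stand-in, not gcsfs code: the `RefreshError` class here mimics `google.auth.exceptions.RefreshError`, and `open_public` with injectable callables is a hypothetical helper so the logic can be shown without network access; in gcsfs the two callables would correspond to `GCSFileSystem().open` and `GCSFileSystem(token='anon').open`.

```python
class RefreshError(Exception):
    """Stand-in for google.auth.exceptions.RefreshError."""


def open_public(path, open_default, open_anon):
    """Try the default credential chain; fall back to anonymous access
    if credential refresh fails (e.g. no metadata server reachable)."""
    try:
        return open_default(path)
    except RefreshError:
        return open_anon(path)


# Simulated environment with no Google credentials and no metadata server:
def open_default(path):
    raise RefreshError("metadata.google.internal unreachable")


def open_anon(path):
    return b'\x1f'  # first byte of the gzip file, as in the session above


print(open_public('gs://gcp-public-data-landsat/index.csv.gz',
                  open_default, open_anon))  # b'\x1f'
```

The point of the sketch is only that the refresh failure needs to be caught at the right layer so the anonymous retry can happen silently, rather than surfacing pages of connection tracebacks first.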
Issue Analytics
- Created: 5 years ago
- Comments: 5 (5 by maintainers)
Top GitHub Comments
Makes sense; #100 with one modification (see comments) now works for me:
I don’t quite see how that happens, but I have added a line to reset the session if the connection apparently fails. Checking for credentials.valid won’t work because, until the first use of the credentials, they are likely to be expired and not (yet) valid. Please correct me if I am wrong.
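The point about credentials.valid can be illustrated with a minimal stand-in that mimics the semantics of google-auth's `Credentials.valid` property (this `Credentials` class is a sketch written for this example, not the real google-auth class): validity requires a token, and no token exists until the first refresh, so `valid` is `False` on fresh credentials regardless of whether the environment could actually authenticate.

```python
import datetime

class Credentials:
    """Sketch mimicking google.auth.credentials.Credentials.valid:
    valid means a token is present and not expired."""

    def __init__(self):
        self.token = None
        self.expiry = None

    @property
    def expired(self):
        return self.expiry is not None and self.expiry <= datetime.datetime.utcnow()

    @property
    def valid(self):
        # No token yet -> not valid, even if a refresh would succeed.
        return self.token is not None and not self.expired

    def refresh(self):
        # Simulate a successful token fetch.
        self.token = "ya29.example"
        self.expiry = datetime.datetime.utcnow() + datetime.timedelta(hours=1)


creds = Credentials()
print(creds.valid)   # False before the first refresh
creds.refresh()
print(creds.valid)   # True only after a refresh has populated a token
```

This is why the comment above proposes resetting the session on connection failure instead: a `valid` check before first use cannot distinguish "credentials that will work once refreshed" from "no usable credentials at all".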