
session.client('s3') - AttributeError: 'NoneType' object has no attribute 'utf_8_decode'


I am running Python code on a Windows server. I intermittently get a NoneType error when trying to initialize a session.client('s3'). When I run locally everything is fine, but I have deployed this code under a REST service created by ArcGIS Server. I am perplexed at what is causing this and have tried many different things.

It randomly fails on self.client = self.session.client('s3').

I have pasted the full error text below.

Any help would be great; I am spinning my wheels.

Here is my code.

from distutils.spawn import find_executable
from operator import attrgetter
from os.path import basename

# external packages
from boto3 import session

class AmazonApi(object):
    def __init__(self, bucket_name, folder_name="", region_name='us-east-1'):

        # this is the line that intermittently fails under ArcGIS Server
        self.session = session.Session(region_name=region_name)
        self.client = self.session.client('s3')
        self.s3 = self.session.resource('s3')

        self.bucket_name = bucket_name
        self.folder_name = folder_name
        self.bucket = self.s3.Bucket(self.bucket_name)

    bucket_name = property(attrgetter('_bucket_name'))

    @bucket_name.setter
    def bucket_name(self, d):
        """ Validate the incoming bucket name
        """
        if not find_executable("aws"):
            raise Exception("missing aws cli (command line tools); ensure the aws cli is installed and configured")
        if not self.bucket_exists(d):
            raise Exception("bucket does not exist '{0}'".format(d))
        # noinspection PyAttributeOutsideInit
        self._bucket_name = d

    def bucket_exists(self, bucket_name):
        # creation_date is None when the bucket does not exist
        return self.s3.Bucket(bucket_name).creation_date

    def push_file(self, file, permission='public-read'):
        key_text = "{0}/{1}".format(self.folder_name, basename(file))

        # push file to bucket
        with open(file, 'rb') as data:
            self.bucket.put_object(Key=key_text, Body=data, ACL=permission)

        # check_if_s3_file_exists is a helper defined elsewhere in the project
        if not check_if_s3_file_exists(key_text, self.bucket):
            raise Exception("file upload unsuccessful '{0}'".format(key_text))

        return self.get_url(key_text)

    def get_url(self, key_name):
        url = self.client.generate_presigned_url('get_object', Params={'Bucket': self.bucket_name,
                                                                       'Key': key_name}, ExpiresIn=604800)
        return url

    def download_file(self, key, output_path):
        self.s3.Bucket(self.bucket_name).download_file(key, output_path)
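For illustration only (not from the original post), a minimal way this class might be driven looks like the following; the bucket, folder, and file paths are placeholders:

# Hypothetical usage of AmazonApi; bucket, folder, and file paths are placeholders.
api = AmazonApi("my-example-bucket", folder_name="reports")
presigned_url = api.push_file(r"C:\temp\report.pdf")  # uploads, then returns a 7-day presigned URL
api.download_file("reports/report.pdf", r"C:\temp\copy_of_report.pdf")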

  line 50, in __init__
    self.client = self.session.client('s3')
  File "C:\Python27\ArcGISx6410.4\lib\site-packages\boto3\session.py", line 263, in client
    aws_session_token=aws_session_token, config=config)
  File "C:\Python27\ArcGISx6410.4\lib\site-packages\botocore\session.py", line 851, in create_client
    endpoint_resolver = self.get_component('endpoint_resolver')
  File "C:\Python27\ArcGISx6410.4\lib\site-packages\botocore\session.py", line 726, in get_component
    return self._components.get_component(name)
  File "C:\Python27\ArcGISx6410.4\lib\site-packages\botocore\session.py", line 922, in get_component
    self._components[name] = factory()
  File "C:\Python27\ArcGISx6410.4\lib\site-packages\botocore\session.py", line 189, in create_default_resolver
    endpoints = loader.load_data('endpoints')
  File "C:\Python27\ArcGISx6410.4\lib\site-packages\botocore\loaders.py", line 132, in _wrapper
    data = func(self, *args, **kwargs)
  File "C:\Python27\ArcGISx6410.4\lib\site-packages\botocore\loaders.py", line 420, in load_data
    found = self.file_loader.load_file(possible_path)
  File "C:\Python27\ArcGISx6410.4\lib\site-packages\botocore\loaders.py", line 173, in load_file
    payload = fp.read().decode('utf-8')
  File "C:\Python27\ArcGISx6410.4\Lib\encodings\utf_8.py", line 16, in decode
    return codecs.utf_8_decode(input, errors, True)
AttributeError: 'NoneType' object has no attribute 'utf_8_decode'
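The last frame shows codecs.utf_8_decode coming back as None inside the standard utf_8 codec, so the embedded interpreter's codecs machinery is what has actually broken, not boto3. A quick sanity check (not part of the original report) that could be run inside the ArcGIS Server Python interpreter:

# Sanity check for the codecs module inside the ArcGISx6410.4 interpreter.
# In a healthy interpreter both lines print a value; if utf_8_decode is None,
# any UTF-8 decode (including botocore loading endpoints.json) fails as above.
import codecs

print(codecs.utf_8_decode)            # expected: <built-in function utf_8_decode>
print(codecs.utf_8_decode(b"abc"))    # expected: (u'abc', 3)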

Issue Analytics

  • State: closed
  • Created 5 years ago
  • Comments: 5

Top GitHub Comments

2 reactions
lelabo-m commented, Nov 29, 2018

It seems that you did not set up AWS credentials on your localhost. Try using: self.session.client('s3', aws_access_key_id="something", aws_secret_access_key="something")
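Written out as a full snippet (a sketch of the suggestion above; the key values and region are placeholders, not anything taken from this issue):

from boto3 import session

# Placeholders only; substitute your own credentials and region.
s = session.Session(region_name='us-east-1')
client = s.client(
    's3',
    aws_access_key_id='YOUR_ACCESS_KEY_ID',
    aws_secret_access_key='YOUR_SECRET_ACCESS_KEY',
)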

0 reactions
seahawks8 commented, Dec 7, 2018

So it turns out this was my fault, in how I was uploading files. This is what I originally had:

from os.path import basename
from boto3 import resource

# PUBLIC_KEY, ACCESS_KEY, region_name, bucket_name, folder_name, file and
# permission are defined elsewhere in the script
s3 = resource('s3', aws_access_key_id=PUBLIC_KEY, aws_secret_access_key=ACCESS_KEY,
              region_name=region_name)

bucket = s3.Bucket(bucket_name)

data = open(file, 'rb')
key_text = "{0}/{1}".format(folder_name, basename(file))

# push file to bucket
return_obj = bucket.put_object(Key=key_text, Body=data, ACL=permission)

Reading the data file would somehow screw up the encoding on my system. I did a bunch of debugging and never caught it, but I do know it was the cause. I tried setting the system encoding and a bunch of other things. Anyway, this is what I did instead:

from boto3 import client

key_text = "{0}/{1}".format(folder_name, basename(file))
s3 = client('s3', aws_access_key_id=PUBLIC_KEY, aws_secret_access_key=ACCESS_KEY)
# upload_file takes a file path, so boto3 opens and reads the file itself
s3.upload_file(file, bucket_name, key_text, ExtraArgs={'ACL': 'public-read'})

Top Results From Across the Web

'NoneType' object has no attribute 'boto_region_name'
The error is caused by your sagemaker.tensorflow.serving.Model not having a sagemaker.session.Session associated with it.

'NoneType' object has no attribute 'utf_8_decode'
I have a published geoprocessing service that is being used by web app builder. This geoprocessing service makes a call to an ArcGIS...

boto/boto3 - Gitter
So I gotta ask: is the guide out of date, or have there been no breaking changes ... I got - 'NoneType' object...

AttributeError: 'NoneType' object has no attribute 'sc' - Support
I am trying to execute pyspark script via emr. Script will process files from S3 bucket and put into another folder.

AttributeError: 'NoneType' object has no attribute 'call_soon'
async def get_object(self, file_path: str) -> Optional[bytes]: async with self.session.client(self.SERVICE_NAME) as s3: try: s3_object ...
