Issues with Expires arg on put_object() with S3
Here's my setup:
AwsSession = Session(aws_access_key_id = 'ACCESS_ID', aws_secret_access_key = 'ACCESS_SECRET')
S3 = AwsSession.resource("s3")
And here's where I'm trying to put an object and give it an expiration (as a datetime) of 5 minutes from now:
now = datetime.datetime.now()
expires = now + datetime.timedelta(minutes=5)
S3.Bucket('BUCKET_NAME').put_object(Key=image_name, Body=image_data, Expires=expires, ACL='public-read', ContentType="image", ContentDisposition="inline")
Running that code throws an error:
Traceback (most recent call last):
File "aws_test.py", line 43, in <module>
Main()
File "aws_test.py", line 11, in __init__
self.main()
File "aws_test.py", line 34, in main
img = self.upload_image_datetime(f, "test_image_{}".format(t))
File "aws_test.py", line 19, in upload_image_datetime
self.S3.Bucket('BUCKET_NAME').put_object(Key=image_name, Body=image_data, Expires=expires, ACL='public-read', ContentType="image", ContentDisposition="inline")
File "/usr/local/lib/python2.7/site-packages/boto3/resources/factory.py", line 344, in do_action
response = action(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/boto3/resources/action.py", line 77, in __call__
response = getattr(parent.meta.client, operation_name)(**params)
File "/usr/local/lib/python2.7/site-packages/botocore/client.py", line 269, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python2.7/site-packages/botocore/client.py", line 310, in _make_api_call
api_params, operation_model)
File "/usr/local/lib/python2.7/site-packages/botocore/client.py", line 351, in _convert_to_request_dict
api_params, operation_model)
File "/usr/local/lib/python2.7/site-packages/botocore/validate.py", line 275, in serialize_to_request
operation_model)
File "/usr/local/lib/python2.7/site-packages/botocore/serialize.py", line 400, in serialize_to_request
shape_members)
File "/usr/local/lib/python2.7/site-packages/botocore/serialize.py", line 478, in _partition_parameters
value = self._convert_header_value(shape, param_value)
File "/usr/local/lib/python2.7/site-packages/botocore/serialize.py", line 504, in _convert_header_value
datetime_obj = parse_timestamp(value)
File "/usr/local/lib/python2.7/site-packages/botocore/utils.py", line 317, in parse_timestamp
return dateutil.parser.parse(value)
File "/usr/local/lib/python2.7/site-packages/dateutil/parser.py", line 1008, in parse
return DEFAULTPARSER.parse(timestr, **kwargs)
File "/usr/local/lib/python2.7/site-packages/dateutil/parser.py", line 392, in parse
res = self._parse(timestr, **kwargs)
File "/usr/local/lib/python2.7/site-packages/dateutil/parser.py", line 492, in _parse
l = _timelex.split(timestr) # Splits the timestr into tokens
File "/usr/local/lib/python2.7/site-packages/dateutil/parser.py", line 174, in split
return list(cls(s))
File "/usr/local/lib/python2.7/site-packages/dateutil/parser.py", line 171, in next
return self.__next__() # Python 2.x support
File "/usr/local/lib/python2.7/site-packages/dateutil/parser.py", line 164, in __next__
token = self.get_token()
File "/usr/local/lib/python2.7/site-packages/dateutil/parser.py", line 82, in get_token
nextchar = self.instream.read(1)
AttributeError: 'datetime.datetime' object has no attribute 'read'
It looks like we can't pass a datetime.datetime object for Expires. However, the documentation (http://boto3.readthedocs.org/en/latest/reference/services/s3.html#S3.Client.put_object) says to give it a datetime object (only with a date, though):
Expires (datetime) -- The date and time at which the object is no longer cacheable.
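For what it's worth, Expires here corresponds to the HTTP Expires header (RFC 7231), which only tells downstream caches when a response goes stale; S3 stores it as object metadata, not as a deletion deadline. A stdlib-only sketch of the header-style value such a date serializes to (Python 3 shown here, though the traceback above is from Python 2):

```python
import datetime
from email.utils import formatdate

# HTTP "Expires" headers use RFC 1123 dates in GMT, e.g.
# "Sun, 06 Nov 1994 08:49:37 GMT". This only advises caches;
# it is not a delete-after instruction.
expires = datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(minutes=5)
header_value = formatdate(expires.timestamp(), usegmt=True)
print(header_value)
```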
Next, I tried passing Expires a string containing the ISO format of the datetime object:
now = datetime.datetime.now()
expires = now + datetime.timedelta(minutes=5)
expires = expires.isoformat()
S3.Bucket('BUCKET_NAME').put_object(Key=image_name, Body=image_data, Expires=expires, ACL='public-read', ContentType="image", ContentDisposition="inline")
So this sort of works: it doesn't error out, but the object also never expires on the Amazon side. That makes me think there's something I need to do on my end to make it actually expire (and be removed).
I know Amazon doesn't immediately remove expired files like that, but even objects I uploaded days ago (also given a 5-minute expiration) still exist and have not expired.
So I'm not really sure whether this is a documentation problem or a legitimate code problem. I'd really like to be able to upload a file that lasts only ~5 minutes. Any ideas?
Thanks for your time
EDIT:
Giving it a timestamp in the past (ISO format) doesn't do anything either:
now = datetime.datetime.now()
expires = now - datetime.timedelta(minutes=30)
expires = expires.isoformat()
S3.Bucket('BUCKET_NAME').put_object(Key=image_name, Body=image_data, Expires=expires, ACL='public-read', ContentType="image", ContentDisposition="inline")
That still goes through and the object shows up fine. Curious.
I also tried passing it time.time(); it didn't error, but it didn't expire the files either.
Looking through the code, everything I'm doing appears to be supported, which leads me to think it's a problem with either timezones or bucket settings. Or maybe I'm just misunderstanding how Expires actually works.
Issue Analytics
- Created 8 years ago
- Comments: 6 (1 by maintainers)
Top GitHub Comments
It seems that S3 does not support per-object expiration. The Expires header is not meant to set the TTL of the object. The S3 documentation says "Expires = The date and time at which the object is no longer cacheable" (http://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectPUT.html). To get your files automatically deleted by S3, you need to create Lifecycle rules on your bucket.
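A minimal sketch of such a lifecycle rule (the bucket name and the "temp/" prefix are hypothetical; note that lifecycle expiration is specified in whole days, so a 5-minute TTL cannot be expressed this way):

```python
# Lifecycle rule: delete objects under the (hypothetical) "temp/" prefix
# one day after creation.
lifecycle_config = {
    "Rules": [
        {
            "ID": "expire-temp-uploads",
            "Filter": {"Prefix": "temp/"},
            "Status": "Enabled",
            "Expiration": {"Days": 1},
        }
    ]
}

# Applying it with boto3 (requires s3:PutLifecycleConfiguration on the bucket):
#   s3 = boto3.client("s3")
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="BUCKET_NAME",
#       LifecycleConfiguration=lifecycle_config,
#   )
```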
I have the exact same issue as @kwinkunks and @kyleknap. Using a datetime object or a formatted string doesn't help. The Expires Date field in S3 shows None, whereas a new header shows up with the correct expiry date. If anyone has found a solution, please update this issue.