SQS: JSONDecodeError when reading message from Amazon SQS predefined_queues
When reading messages from SQS using `predefined_queues` (the messages are published via SNS), I see the following JSON decode error in:
```python
# kombu/transport/SQS.py
def _message_to_python(self, message, queue_name, queue):
    try:
        body = base64.b64decode(message['Body'].encode())
    except TypeError:
        body = message['Body'].encode()
    payload = loads(bytes_to_str(body))  # <<< HERE, line 290
```
After dropping `rdb.set_trace()` around those lines, I can see that the message looks, of course, just like a regular AWS SNS/SQS message:
```python
# simplified version for clarity
{
    'MessageId': 'abcd-efgh',
    'ReceiptHandle': 'abcdefgh==',
    'MD5OfBody': 'abc123',
    'Body': '''{\n
        "Type" : "Notification",\n
        "MessageId" : "abcd-efgh",\n
        "TopicArn" : "arn:aws:sns:xxx",\n
        "Subject" : "Amazon S3 Notification",\n
        "Message" : "{\\"Records\\":[{\\"eventVersion\\":\\"2.1\\",\\"eventSource\\":\\"aws:s3\\",...}"
    }'''
}
```
The `try: body = base64.b64decode(...)` block works without raising; it produces a bytestring:

```
(Pdb) base64.b64decode(message['Body'].encode())
b'O*^6\x8ab~\'\x1c\xd6*\'1
```

but then it errors on the line `payload = loads(bytes_to_str(body))`:

```
(Pdb) payload = loads(bytes_to_str(body))
*** json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
(Pdb) bytes_to_str(body)
'O*^6�c~\'\x1b�*\
```
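The reason the `except TypeError` fallback never fires is a stdlib subtlety: by default, `base64.b64decode` silently discards characters outside the base64 alphabet instead of raising, so a plain JSON body "decodes" into meaningless bytes. A minimal standalone demonstration (not kombu code):

```python
import base64
import binascii

# A plain JSON body, like the SNS notification above -- not base64 at all.
body = '{"Type": "Notification", "Message": "hello"}'

# By default, b64decode skips characters outside the base64 alphabet
# ('{', '"', ':', spaces) and decodes whatever remains, returning
# meaningless bytes instead of raising an exception.
garbage = base64.b64decode(body.encode())
print(garbage)  # arbitrary binary, not valid JSON

# Only validate=True makes non-base64 input raise (binascii.Error),
# which is why the `except TypeError` branch never triggers here.
try:
    base64.b64decode(body.encode(), validate=True)
    strict_raised = False
except binascii.Error:
    strict_raised = True
print(strict_raised)  # True
```

This matches the pdb session above: the decode "succeeds" and the failure only surfaces later, inside `json.loads`.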
It looks like for messages coming from `predefined_queues` (i.e. published by other AWS services rather than by kombu itself), we should simply do:

```python
loads(bytes_to_str(message['Body'].encode()))
# or even better
loads(message['Body'])
```
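One defensive way to support both body styles is to try plain JSON first and only fall back to strictly-validated base64. This is a sketch of the idea, not the actual kombu fix; `message_body_to_python` is a hypothetical helper name:

```python
import base64
import json


def message_body_to_python(raw_body: str):
    """Hypothetical helper: decode an SQS message body that may be either
    plain JSON (e.g. an SNS notification) or a base64-encoded payload
    produced by a kombu/Celery publisher."""
    try:
        # Bodies from other AWS publishers arrive as plain JSON.
        # Caveat: a base64 body that also happens to parse as JSON
        # (e.g. all digits) would be misread by this ordering.
        return json.loads(raw_body)
    except json.JSONDecodeError:
        pass
    # Fall back to base64; validate=True makes b64decode raise on
    # non-base64 input instead of returning garbage bytes.
    decoded = base64.b64decode(raw_body.encode(), validate=True)
    return json.loads(decoded.decode())
```

With this ordering, both `message_body_to_python('{"Type": "Notification"}')` and a base64-encoded JSON payload decode to the same kind of Python dict.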
Issue Analytics
- State:
- Created: 3 years ago
- Reactions: 2
- Comments: 9 (2 by maintainers)
Top GitHub Comments
Hi there!
I’d like to reopen this discussion since I think it pertains to Kombu as a messaging library and not just how it integrates with Celery.
The way Kombu is described in the README is that it is “an idiomatic high-level interface for the AMQ protocol” and that it “Allows application authors to support several message server solutions by using pluggable transports.” This seems like false advertising if there’s the caveat of having to follow the Celery message format, which would be a pretty major caveat. My team was excited at the prospect of easily swapping some RMQ workers over to SQS without having to change the worker code drastically, but it seems like that won’t be possible with Kombu.
@thedrow am I understanding this correctly? Apologies if I’m missing something here, I’m relatively new to Kombu 😃
Yes. You can use Kombu in isolation and that’s a very valid use case.