Add support for arbitrary SQS messages
Hi. I came across this project because I am looking for a solution that will allow me to run an execution pool (workers) that can process tasks from any AWS SQS queue with little effort. I mean queues that were not created by the application but already exist in AWS, carrying e.g. the various notifications AWS services publish, such as the creation of a file in S3.
I tried Celery, but Celery's SQS support comes down to publishing and processing only its own messages.
I was hoping that pyqs would be able to process any message, but the situation here looks similar to Celery's; in the code we have:
```python
import json

def decode_message(message):
    message_body = message['Body']
    json_body = json.loads(message_body)
    if 'task' in message_body:
        # A pyqs-published message is identified by its 'task' key
        return json_body
    # elif ... <<< here, or extract this part into its own method,
    # like 'detect_message_type(message)'
    else:
        # Fallback to processing celery messages
        return decode_celery_message(json_body['body'])
```
Have you considered this possibility? It could work via a configuration option, or by keying off a part of the message that uniquely identifies the producer, Celery or pyqs (as, I think, `task` identifies pyqs in the code above).
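For illustration, here is a minimal sketch of what the extracted dispatcher could look like. The `detect_message_type` helper and the `MessageType` enum are hypothetical names from the comment above, not part of pyqs, and the Celery check is only a heuristic:

```python
import json
from enum import Enum

class MessageType(Enum):
    PYQS = "pyqs"
    CELERY = "celery"
    RAW = "raw"  # anything else, e.g. an S3 event notification

def decode_celery_message(body):
    # Stand-in for pyqs's existing Celery fallback decoder
    ...

def detect_message_type(message_body):
    """Classify a raw SQS body by markers unique to each producer."""
    if 'task' in message_body:
        return MessageType.PYQS
    if '"body"' in message_body:  # heuristic: Celery wraps its payload in a 'body' field
        return MessageType.CELERY
    return MessageType.RAW

def decode_message(message):
    message_body = message['Body']
    json_body = json.loads(message_body)
    message_type = detect_message_type(message_body)
    if message_type is MessageType.PYQS:
        return json_body
    if message_type is MessageType.CELERY:
        return decode_celery_message(json_body['body'])
    # Hand any other payload to a user-registered handler untouched
    return json_body
```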
Yeah, that makes sense. I could imagine some type of more generic registry where we hook up task processors with queue names.
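As a rough sketch of such a registry, under the assumption that processors are plain callables keyed by queue name; the names here (`PROCESSOR_REGISTRY`, `register_processor`, `dispatch`) are hypothetical, not part of pyqs:

```python
# Hypothetical registry mapping queue names to processor callables.
PROCESSOR_REGISTRY = {}

def register_processor(queue_name):
    """Decorator that binds a processor function to a queue name."""
    def wrapper(func):
        PROCESSOR_REGISTRY[queue_name] = func
        return func
    return wrapper

@register_processor("s3-notifications")
def handle_s3_event(message_body):
    # message_body is the already-parsed SQS payload
    for record in message_body.get("Records", []):
        print(record["s3"]["object"]["key"])

def default_pyqs_processor(message_body):
    # Stand-in for pyqs's existing task-based processing
    raise NotImplementedError

def dispatch(queue_name, message_body):
    """Look up the processor for a queue, falling back to the default."""
    processor = PROCESSOR_REGISTRY.get(queue_name, default_pyqs_processor)
    processor(message_body)
```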
I think it would also help to slightly change the current definition of the Manager/Worker classes to avoid hardcoding the processor class name, which would make overriding it easier, or possible at all; see:
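A minimal sketch of that change, assuming a structure where a manager spawns one worker per queue; the class and parameter names here (`ManagerWorker`, `ProcessWorker`, `worker_class`) are illustrative and are not taken from the pyqs source referenced above:

```python
class ProcessWorker:
    """Default processor; stands in for pyqs's built-in worker."""
    def __init__(self, queue_name):
        self.queue_name = queue_name

    def process_message(self, message):
        raise NotImplementedError

class ManagerWorker:
    # Accept the worker class as a parameter instead of hardcoding it,
    # so callers can swap in a custom processor without patching pyqs.
    def __init__(self, queue_names, worker_class=ProcessWorker):
        self.workers = [worker_class(name) for name in queue_names]

# A caller could then override the processor like this:
class S3EventWorker(ProcessWorker):
    def process_message(self, message):
        print("got S3 event:", message)

manager = ManagerWorker(["s3-notifications"], worker_class=S3EventWorker)
```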