Docker worker won't pick up messages from SQS
Apologies if this is a spam issue, but it's driving me nuts.
TL;DR: I'm enqueuing jobs to AWS SQS from a local environment and can consume them successfully from that same environment, but when I package the worker into a Docker container and run it, it will not pick up the jobs. As far as I can tell it is hitting the right queues and there are no errors; it just won't consume the jobs.
Python env
python --version
Python 3.9.6
pip list
amqp 5.0.7
billiard 3.6.4.0
boto3 1.20.24
botocore 1.23.24
celery 5.2.1
click 8.0.3
click-didyoumean 0.3.0
click-plugins 1.1.1
click-repl 0.2.0
jmespath 0.10.0
kombu 5.2.2
pip 21.1.3
prompt-toolkit 3.0.24
pycurl 7.44.1
python-dateutil 2.8.2
pytz 2021.3
s3transfer 0.5.0
setuptools 56.0.0
six 1.16.0
urllib3 1.26.7
vine 5.0.0
wcwidth 0.2.5
requirements.txt
celery[sqs]
local environment
export AWS_ACCESS_KEY_ID="AKIA*********"
export AWS_SECRET_ACCESS_KEY="**********"
my task code
from celery import Celery
import pprint

app = Celery('tasks')
app.conf.broker_transport = 'sqs'
app.conf.broker_transport_options = {
    'region': 'ap-southeast-2'
}
app.conf.result_backend = 'rpc://'

print("----------------------------------------------")
pprint.pprint(app.conf)
print("----------------------------------------------")

@app.task
def add(x, y):
    return x + y
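For reference (the producer side isn't shown in the issue), enqueuing a job from the local environment might look something like the sketch below, assuming the module above is saved as tasks.py and the AWS variables above are exported; the file name enqueue.py is made up for illustration.
# enqueue.py -- hypothetical producer script, not part of the original issue
from tasks import add

# Publish a task message to the SQS broker configured on the Celery app;
# a worker (local or Docker) is then expected to pick it up and run add(2, 3).
result = add.delay(2, 3)
print("queued task id:", result.id)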
consuming messages from local
celery -A tasks worker --loglevel=info
Dockerfile
FROM python:3.6.15-alpine3.13
COPY ./app /app
WORKDIR /app
CMD ash -c "apk add --no-cache build-base libcurl curl-dev; pip3 install -r requirements.txt; celery -A tasks worker --loglevel=debug"
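As a sanity check inside the container, a quick boto3 probe can confirm that the credentials and region the worker sees actually reach the queue. This is a sketch, not part of the original issue; the queue name 'celery' is Celery's default and may differ here.
# check_sqs.py -- hypothetical connectivity probe, not part of the original issue
import boto3

sqs = boto3.client('sqs', region_name='ap-southeast-2')

# List the queues visible with whatever credentials are available inside the container
print(sqs.list_queues().get('QueueUrls', []))

# Inspect the backlog of a specific queue ('celery' is assumed; replace with the real queue name)
url = sqs.get_queue_url(QueueName='celery')['QueueUrl']
attrs = sqs.get_queue_attributes(QueueUrl=url, AttributeNames=['ApproximateNumberOfMessages'])
print(attrs['Attributes'])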
Log of consuming messages when running locally (Mac)
Log of (not) consuming messages when running from Docker
AWS console showing one message successfully consumed from local and a second message sitting there doing nothing
Issue Analytics: created 2 years ago · 7 comments (2 by maintainers)
Top GitHub Comments
Still stuck on this one. I have more or less followed a great tutorial (https://medium.com/swlh/low-cost-workers-python-celery-aws-sqs-aws-ec2-spot-fb4f446f83fa) by @geekrohit (https://github.com/geekrohit)
Since posting, I have tried adding this setting, but it doesn't help.
Hi guys,
I lost patience with this and rolled my own job queue implementation using SQS. It's a bit cheap and cheerful, but it works just fine with Docker, so I'm not going to waste any more time with Celery/Kombu. Good luck, guys. If you are interested, I put the key files into a gist: https://gist.github.com/JavascriptMick/56141fd06286b0aea7616836724d4a7c
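The gist isn't reproduced here, but a hand-rolled SQS job queue along those lines typically boils down to a receive/process/delete loop. The sketch below is my assumption of the general approach, not the contents of the linked gist, and the queue name 'jobs' is made up.
# worker.py -- hypothetical stand-alone SQS consumer, not taken from the linked gist
import json
import boto3

sqs = boto3.client('sqs', region_name='ap-southeast-2')
queue_url = sqs.get_queue_url(QueueName='jobs')['QueueUrl']  # 'jobs' is a made-up queue name

while True:
    # Long-poll for up to one message at a time
    resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1, WaitTimeSeconds=20)
    for msg in resp.get('Messages', []):
        job = json.loads(msg['Body'])
        print("processing job:", job)
        # ... do the actual work here ...
        # Delete the message only after it has been handled successfully
        sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg['ReceiptHandle'])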