
Unable to use AzureLogHandler into a GUnicorn/Flask app

See original GitHub issue

Describe your environment.

We want to export Python logs to Azure Monitor. We have a simple Python gunicorn/Flask web app, but calling logger.info() (or the other log methods) does not successfully send the message to Application Insights. It seems that when AzureLogHandler runs inside Flask/gunicorn, only the main process can use the handler; child threads/processes are unable to send log lines to Azure Application Insights. For example, when an HTTP request is received by the Flask/gunicorn app, it should be logged through the handler, but nothing happens. Having HTTP requests handled in child processes (threads or forks) appears to be related to this behaviour.
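The behaviour described is consistent with how AzureLogHandler is built: it queues records and exports them from a background worker thread, and fork() copies only the calling thread into the child. A stdlib-only sketch (assuming a Unix host, as in the Docker image used here) shows that a worker thread does not survive a fork, which is why records enqueued in a gunicorn worker are never exported:

```python
import os
import threading

def child_sees_worker_thread_alive() -> bool:
    """Fork a child and report whether a pre-existing worker thread
    is still alive there. Stands in for AzureLogHandler's export thread."""
    stop = threading.Event()
    worker = threading.Thread(target=stop.wait, daemon=True)
    worker.start()

    pid = os.fork()  # Unix-only, like gunicorn's worker model
    if pid == 0:
        # Child process: only the forking thread survives fork(); the
        # worker thread exists as an object but is no longer running.
        os._exit(0 if worker.is_alive() else 1)

    _, status = os.waitpid(pid, 0)
    stop.set()
    worker.join()
    return os.WEXITSTATUS(status) == 0
```

Calling `child_sees_worker_thread_alive()` returns False: the child can still enqueue log records, but there is no thread left to drain the queue and ship them to Application Insights.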

The current environment runs in a Docker container with:

  • ubuntu:18.04
  • Python 3.6.9
  • Flask==1.1.2
  • gunicorn==20.0.4
  • opencensus-ext-azure==1.0.4
  • opencensus-ext-flask==0.7.3

Steps to reproduce. I have created a minimal project on GitHub:

git clone https://github.com/freedev/flask-azure-application-insights
cd flask-azure-application-insights
./run-docker.sh <InstrumentationKey>

Replace <InstrumentationKey> with your own Instrumentation Key (00000000-0000-0000-0000-000000000000). The run-docker.sh script starts a simple web app running in a Docker instance and listening on port 5000.

Once the Docker instance has started, a first log message is written successfully by the main process.

After that, we expect each HTTP request that reaches the Flask application to be logged through the AzureLogHandler, but none arrives in Application Insights.

Try with:

curl http://localhost:5000/logging

What is the expected behavior? We expect to be able to use AzureLogHandler and write to Application Insights for each HTTP request received.

Each HTTP request that reaches the Flask application should be logged through the AzureLogHandler.

For example: once the application is started with:

  ./run-docker.sh <InstrumentationKey>

Then open Azure Portal -> Application Insights -> Monitoring -> Logs

Then execute the query:

 traces
 | sort by timestamp desc 

You should see a line with the message:

main process started (pid)

as well as a message for each request handled:

handling request /logging (pid)

Please note that messages for request handling are missing.

What is the actual behavior? Only the log lines written by the main process appear in Application Insights. Child processes/threads are unable to send log lines to Application Insights. To double-check, try starting Flask without gunicorn:

export APPLICATIONINSIGHTS_CONNECTION_STRING="InstrumentationKey=00000000-0000-0000-0000-000000000000"
python application.py

In this way I see each HTTP request being logged successfully. In our context, running without gunicorn is not an acceptable solution; we only started Flask without gunicorn to prove that AzureLogHandler does not work in a multithreaded/multiprocess environment.


Issue Analytics

  • State:open
  • Created 3 years ago
  • Comments:13 (5 by maintainers)

Top GitHub Comments

1 reaction
omsharma2364 commented, Nov 24, 2020

@freedev I was facing a similar issue. I got it working by using multiprocessing-logging with minimal changes. Usage is as mentioned here. Just make sure you call install_mp_handler() after you have added all the desired handlers.
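multiprocessing-logging works by routing records from worker processes to a single consumer in the parent; the stdlib QueueHandler/QueueListener pair implements the same idea. A runnable sketch of that pattern, with a list-appending handler standing in for AzureLogHandler (the real listener would forward records to the Azure exporter instead):

```python
import logging
import logging.handlers
import multiprocessing

def _child(queue):
    # Worker process: log through the shared queue instead of a handler
    # whose export thread did not survive the fork.
    logger = logging.getLogger("app")
    logger.setLevel(logging.INFO)
    logger.addHandler(logging.handlers.QueueHandler(queue))
    logger.info("handling request /logging")

def run_demo():
    queue = multiprocessing.Queue()
    received = []

    class Capture(logging.Handler):
        # Stand-in for AzureLogHandler: records from all workers arrive
        # here, in the parent, where the export thread actually runs.
        def emit(self, record):
            received.append(record.getMessage())

    listener = logging.handlers.QueueListener(queue, Capture())
    listener.start()

    proc = multiprocessing.Process(target=_child, args=(queue,))
    proc.start()
    proc.join()

    listener.stop()  # drains queued records before returning
    return received
```

`run_demo()` returns `["handling request /logging"]`: the record crosses the process boundary via the queue and is emitted by the parent-side handler, which is exactly the hand-off that fails when each worker keeps its own forked-but-dead AzureLogHandler.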

1 reaction
freedev commented, Oct 6, 2020

I believe GU_WORK_NUM represents the number of workers that are spun up (each which are a new process).

yep

In run-docker.sh, what values are you using for GU_THREADS_NUM and GU_WORK_NUM?

I’ve tried with GU_THREADS_NUM=3 and GU_WORK_NUM=1, and with GU_WORK_NUM=3 and GU_THREADS_NUM=1. As said, having a new logger instance (and a new AzureLogHandler) for each pid has solved the problem.

getLogger() works well with more workers (GU_WORK_NUM > 1); I see a new line in Log Analytics for each process id. On the other hand, when I try with multiple threads, the load I’m generating with my browser isn’t enough to spawn many threads. I think I’ll have to write a unit test to double-check this.
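One way to apply the per-pid fix with gunicorn is its post_fork server hook, which runs inside each worker just after the fork. A sketch of a gunicorn.conf.py, with a StreamHandler standing in for AzureLogHandler so it runs without an instrumentation key (the swap is an assumption about the real setup, not the project's actual config):

```python
# gunicorn.conf.py -- rebuild the logging handler inside every worker.
import logging

def _make_handler():
    # Hypothetical stand-in: in the real app this would be
    # AzureLogHandler(), created fresh so its export thread
    # belongs to the worker process.
    return logging.StreamHandler()

def post_fork(server, worker):
    # gunicorn calls this hook in each worker after fork(), so the
    # handler (and its background thread) exists per pid.
    logger = logging.getLogger("app")
    # Drop handlers inherited from the master process: their export
    # threads did not survive the fork.
    for handler in list(logger.handlers):
        logger.removeHandler(handler)
    logger.addHandler(_make_handler())
```

Started as `gunicorn -c gunicorn.conf.py application:app`, each worker then ends up with exactly one live handler of its own, matching the "new AzureLogHandler for each pid" workaround described above.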


