S3 Remote Logging not working
See original GitHub issue.

Apache Airflow version: v2.0.0b3
Kubernetes version (if you are using kubernetes) (use kubectl version): 1.16.15
Environment:
- Cloud provider or hardware configuration: AWS
- OS (e.g. from /etc/os-release):
- Kernel (e.g. uname -a):
- Install tools: Custom Helm Chart
- Others:

What happened:
S3 Remote Logging is not working. Below is the stack trace:
Running <TaskInstance: canary_dag.print_date 2020-12-09T19:46:17.200838+00:00 [queued]> on host canarydagprintdate-9fafada4409d4eafb5e6e9c7187810ae
[2020-12-09 19:54:09,825] {s3_task_handler.py:183} ERROR - Could not verify previous log to append: 'NoneType' object is not callable
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 179, in s3_write
    if append and self.s3_log_exists(remote_log_location):
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 141, in s3_log_exists
    return self.hook.check_for_key(remote_log_location)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 57, in wrapper
    connection = self.get_connection(self.aws_conn_id)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/hooks/base.py", line 63, in get_connection
    conn = Connection.get_connection_from_secrets(conn_id)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/connection.py", line 351, in get_connection_from_secrets
    conn = secrets_backend.get_connection(conn_id=conn_id)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 64, in wrapper
    with create_session() as session:
  File "/usr/local/lib/python3.7/contextlib.py", line 112, in __enter__
    return next(self.gen)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 29, in create_session
    session = settings.Session()
TypeError: 'NoneType' object is not callable
[2020-12-09 19:54:09,826] {s3_task_handler.py:193} ERROR - Could not write logs to s3://my-favorite-airflow-logs/canary_dag/print_date/2020-12-09T19:46:17.200838+00:00/2.log
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 190, in s3_write
    encrypt=conf.getboolean('logging', 'ENCRYPT_S3_LOGS'),
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 57, in wrapper
    connection = self.get_connection(self.aws_conn_id)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/hooks/base.py", line 63, in get_connection
    conn = Connection.get_connection_from_secrets(conn_id)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/connection.py", line 351, in get_connection_from_secrets
    conn = secrets_backend.get_connection(conn_id=conn_id)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 64, in wrapper
    with create_session() as session:
  File "/usr/local/lib/python3.7/contextlib.py", line 112, in __enter__
    return next(self.gen)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 29, in create_session
    session = settings.Session()
TypeError: 'NoneType' object is not callable
stream closed
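The trace bottoms out at `session = settings.Session()` raising `TypeError: 'NoneType' object is not callable`, which suggests the session factory was never configured in the process where the log handler runs. A minimal sketch of that failure pattern (all names below are stand-ins for illustration, not Airflow's actual code):

```python
from contextlib import contextmanager

# Stand-in for a settings module: the session factory is configured
# lazily and stays None until setup runs in this process.
Session = None


def configure_orm():
    """Pretend setup step; real code would install a sessionmaker here."""
    global Session
    Session = lambda: object()


@contextmanager
def create_session():
    # If configure_orm() never ran in this process (e.g. the logging
    # handler fires before initialization), Session is still None and
    # calling it raises TypeError: 'NoneType' object is not callable.
    session = Session()
    try:
        yield session
    finally:
        pass


message = None
try:
    with create_session():
        pass
except TypeError as exc:
    message = str(exc)
print(message)  # 'NoneType' object is not callable
```

Calling `configure_orm()` before entering `create_session()` makes the same code succeed, which is why the bug only shows up in processes whose initialization order skips that step.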
What you expected to happen:
Task instance logs visible in the Airflow UI, read from the S3 remote location.

How to reproduce it:
Pull the latest master and build an Airflow image from the Dockerfile in the repo.
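For context, S3 remote logging in Airflow 2.0 is typically enabled through the `[logging]` section, e.g. via environment variables like the following (the bucket path and connection id here are placeholders, not taken from this report's deployment):

```shell
# Enable S3 remote logging (Airflow 2.0 [logging] config section).
# Bucket path and connection id are placeholders.
export AIRFLOW__LOGGING__REMOTE_LOGGING=True
export AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER=s3://my-favorite-airflow-logs
export AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID=aws_default
export AIRFLOW__LOGGING__ENCRYPT_S3_LOGS=False
```

With these set, the task handler writes each attempt's log under the remote base folder and the webserver reads it back from S3.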
Issue Analytics
- Created 3 years ago
- Comments: 54 (51 by maintainers)
A fix for the local executor is coming.
I see!