-
The user-defined logging config hasn't been effective since 3.0, and I forgot to update the docs. Given our docs show this example, I'll look at backporting #55850 into 3.1.2 (or 3.1.1, but I think I'll miss the cut-off for that). Once that lands you would be able to set this with the following config option:
AIRFLOW__LOGGING__NAMESPACE_LEVELS='botocore.credentials=warning airflow.models.dagbag.DagBag=warning py.warnings=error'
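For an airflow.cfg-based deployment, the equivalent entry (assuming the usual AIRFLOW__SECTION__KEY-to-config-file mapping) would be:

```ini
[logging]
namespace_levels = botocore.credentials=warning airflow.models.dagbag.DagBag=warning py.warnings=error
```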
-
I have configured the Airflow backend DB to a PostgreSQL database, connected my secrets backend, and enabled remote logging to CloudWatch on AWS. In addition, I have monitoring enabled on the Airflow logs and get an email via SNS any time an error or warning is detected in the logs.
The Issue
In my Airflow logs, there are warnings from loggers I don't care about that trigger an email (with a cost implication) that I am trying to suppress. There are also info-level logs that appear repeatedly, which I want to silence in production environments to reduce noise and make problem resolution easier.
However, my implementation of the recommended approach is not suppressing these logs: the examples below still appear in CloudWatch, and I'm not sure why. I can see the user-defined logging config is imported successfully for the scheduler and triggerer.
INFO - Detected user-defined logging config. Attempting to load log_config.LOGGING_CONFIG
INFO - Successfully imported user-defined logging config. FastAPI App will serve log from /home/airflow
Example logs to be suppressed
"logger": "py.warnings",
"event": "The
airflow.hooks.base.BaseHook
attribute is deprecated. Please use'airflow.sdk.bases.hook.BaseHook'
.","level": "warning"
"logger": "botocore.credentials",
"event": "Found credentials from IAM Role",
"level": "info"
"logger": "airflow.models.dagbag.DagBag",
"event": "Filling up the DagBag from..."
"level": "info"
Current solution that isn't working
Based on the Airflow advanced logging documentation: https://airflow.apache.org/docs/apache-airflow/stable/administration-and-deployment/logging-monitoring/advanced-logging-configuration.html
I created a config directory under the Airflow home, so the project structure is /home/airflow/config/log_config.py.
In my .env file, which overrides the default Airflow variables, I have set:
PYTHONPATH=/home/airflow/config
AIRFLOW__LOGGING__LOGGING_CONFIG_CLASS=log_config.LOGGING_CONFIG
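To double-check that these two settings resolve to an importable module, a quick sanity check (a hypothetical sanity_check.py of my own; nothing here is Airflow-specific beyond the import path):

```python
# sanity_check.py -- run with: PYTHONPATH=/home/airflow/config python sanity_check.py
# Performs the same import Airflow does for AIRFLOW__LOGGING__LOGGING_CONFIG_CLASS.
import log_config

assert isinstance(log_config.LOGGING_CONFIG, dict)
print(sorted(log_config.LOGGING_CONFIG.get("loggers", {})))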
My log_config.py file is as follows:
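The full file isn't reproduced here, but it follows the deepcopy pattern from the linked docs page. A minimal sketch of that approach, with the loggers from the example logs above as the assumed overrides:

```python
# log_config.py -- minimal sketch of the documented deepcopy approach.
from copy import deepcopy

from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG

LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)

# Raise each noisy logger's threshold so lower-severity records are dropped.
for name, level in {
    "botocore.credentials": "WARNING",          # drops "Found credentials from IAM Role"
    "airflow.models.dagbag.DagBag": "WARNING",  # drops "Filling up the DagBag from..."
    "py.warnings": "ERROR",                     # drops deprecation warnings
}.items():
    LOGGING_CONFIG.setdefault("loggers", {})[name] = {
        "handlers": ["console"],
        "level": level,
        "propagate": False,
    }
```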
Environment Setup
I am running Airflow 3.1.0 on Python 3.12 with the following providers and packages:
"apache-airflow==3.1.0"
"apache-airflow-providers-microsoft-mssql"
"apache-airflow-providers-amazon"
"flask_appbuilder"
"psycopg2-binary"
"asyncpg"
--constraint "https://raw.githubusercontent.com/apache/airflow/constraints-3.1.0/constraints-3.12.txt"