Config value interpolation not working when set as environment variable
Apache Airflow version: 1.10.10
Kubernetes version (if you are using kubernetes) (use kubectl version):
Environment:
- Cloud provider or hardware configuration:
- OS (e.g. from /etc/os-release):
- Kernel (e.g. uname -a):
- Install tools:
- Others:
What happened:
I set store_serialized_dags as an environment variable: export AIRFLOW__CORE__STORE_SERIALIZED_DAGS=True.
The default value of store_dag_code is %(store_serialized_dags)s, so I expect it to now also be True. However, it’s not (it’s False).
What you expected to happen:
I expect store_dag_code to have the same value as store_serialized_dags.
It’s because the interpolation is applied to the value set in the .cfg file, but not when the value is set via an environment variable.
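For illustration, here is a minimal standalone sketch of the mechanism. It uses the plain configparser module rather than Airflow's AirflowConfigParser, and get_with_env_override is a hypothetical stand-in for the env-var lookup that Airflow performs before falling back to the config file. The point: %(...)s interpolation only sees values stored inside the parser, so an override read straight from os.environ never feeds into the substitution.

import configparser
import os

defaults = '''[test]
key1 = testme
key2 = %(key1)s
'''

parser = configparser.ConfigParser()
parser.read_string(defaults)

# Interpolation works when the substituted value lives inside the parser:
print(parser.get("test", "key2"))  # -> "testme"

os.environ["AIRFLOW__TEST__KEY1"] = "something_else"

def get_with_env_override(section, key):
    # Hypothetical stand-in for the env-var-first lookup: return the env var
    # if set, otherwise fall back to the parser (where interpolation happens).
    env_var = "AIRFLOW__{}__{}".format(section.upper(), key.upper())
    if env_var in os.environ:
        return os.environ[env_var]
    return parser.get(section, key)

print(get_with_env_override("test", "key1"))  # -> "something_else" (env var wins)
print(get_with_env_override("test", "key2"))  # -> "testme": the %(key1)s interpolation never saw the env var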
How to reproduce it:
I was able to reproduce it in a test (add this to test_configuration.py):
def test_interpolation_from_env_var(self):
    """Test if interpolation works when the substituted value is set by an environment variable."""
    test_config = '''[test]
key1 = testme
key2 = %(key1)s
'''
    with mock.patch.dict('os.environ', AIRFLOW__TEST__KEY1="something_else"):
        test_conf = AirflowConfigParser(default_config=test_config)
        key2 = test_conf.get("test", "key2")
        self.assertEqual(key2, "something_else")
So far I haven’t been able to make enough sense of airflow/configuration.py to fix it.
Anything else we need to know:
Issue Analytics
- Created: 3 years ago
- Reactions: 1
- Comments: 7 (7 by maintainers)
Top GitHub Comments
@ashb WDYT about rewriting the entire module? To me the whole module reads like hack on hack on hack. Putting some thought into it and rewriting it completely seems like a better alternative than adding yet another hack to make this interpolation work.
My coming month is completely full, though, so I won’t be able to spend any time on it soon.
+1 to rewriting for 2.0. However, for 1.10.11, I think I will try to fix it or add another hack for store_dag_code - let’s see.
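For what it’s worth, one possible direction (a rough sketch only, not what Airflow actually ships for 1.10.11; EnvAwareInterpolation is a hypothetical name) would be a custom configparser interpolation whose before_get hook lets AIRFLOW__<SECTION>__<KEY> environment variables shadow the in-file values during %(...)s substitution:

import configparser
import os

class EnvAwareInterpolation(configparser.BasicInterpolation):
    # Hypothetical: let env vars shadow parser values during %(...)s substitution.
    def before_get(self, parser, section, option, value, defaults):
        augmented = dict(defaults)
        prefix = "AIRFLOW__{}__".format(section.upper())
        for env_key, env_val in os.environ.items():
            if env_key.startswith(prefix):
                augmented[env_key[len(prefix):].lower()] = env_val
        return super().before_get(parser, section, option, value, augmented)

parser = configparser.ConfigParser(interpolation=EnvAwareInterpolation())
parser.read_string("[test]\nkey1 = testme\nkey2 = %(key1)s\n")

os.environ["AIRFLOW__TEST__KEY1"] = "something_else"
print(parser.get("test", "key2"))  # -> "something_else"

Note that this sketch only touches the interpolation path; Airflow’s own get() already consults the env var directly for the option being read, so the two mechanisms would need to be reconciled carefully.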