Command airflow scheduler -p fails unexpectedly
Apache Airflow version: apache-airflow==1.10.14
Kubernetes version (if you are using kubernetes) (use kubectl version): None
Environment:
- Cloud provider or hardware configuration: x86
- OS (e.g. from /etc/os-release): Ubuntu 16.04.4 LTS
- Kernel (e.g. uname -a): Linux numba-linux64-gpu 4.4.0-87-generic #110-Ubuntu SMP Tue Jul 18 12:55:35 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
- Install tools: pip
- Others: docker-compose
What happened:
The command airflow scheduler -p fails with an error message:
(base) root@d10c31496430:/# airflow scheduler -p
/ci_repo/airflow/airflow-ci/airflow_ci/ui.py:12: DeprecationWarning: get: Accessing configuration method 'get' directly from the configuration module is deprecated. Please access the configuration from the 'configuration.conf' object via 'conf.get'
DASK_DASHBOARD = configuration.get('dask', 'dashboard')
/ci_repo/airflow/airflow-ci/airflow_ci/airflow_plugins.py:7: FutureWarning: Registering operators or sensors in plugins is deprecated -- these should be treated like 'plain' python modules, and imported normally in DAGs.
Airflow 2.0 has removed the ability to register these types in plugins. See <http://airflow.apache.org/docs/stable/howto/custom-operator.html>.
class CIPlugin(AirflowPlugin):
Traceback (most recent call last):
  File "/opt/conda/bin/airflow", line 37, in <module>
    args.func(args)
  File "/opt/conda/lib/python3.7/site-packages/airflow/utils/cli.py", line 78, in wrapper
    metrics = _build_metrics(f.__name__, args[0])
  File "/opt/conda/lib/python3.7/site-packages/airflow/utils/cli.py", line 108, in _build_metrics
    full_command[idx + 1] = "*" * 8
IndexError: list assignment index out of range
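The failing assignment full_command[idx + 1] = "*" * 8 comes from the CLI's argument-masking logic in _build_metrics. The following is a minimal standalone sketch of that pattern (simplified, with hypothetical names; not the actual Airflow source) which reproduces the same IndexError whenever -p is the last token on the command line:

# Simplified sketch of the masking pattern shown in the traceback; the flag set
# and function name are hypothetical, not copied from airflow/utils/cli.py.
SENSITIVE_FLAGS = {"-p", "--password"}

def mask_sensitive_args(argv):
    full_command = list(argv)
    for idx, token in enumerate(full_command):
        if token in SENSITIVE_FLAGS:
            # Assumes a value always follows the flag and overwrites it with
            # asterisks. For "airflow scheduler -p" nothing follows -p, so
            # idx + 1 points past the end of the list.
            full_command[idx + 1] = "*" * 8
    return full_command

mask_sensitive_args(["airflow", "some_command", "-p", "secret"])  # masks "secret"
mask_sensitive_args(["airflow", "scheduler", "-p"])               # IndexError: list assignment index out of range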
What you expected to happen:
Expecting the flag -p to mean --do_pickle, as per the command synopsis:
usage: airflow scheduler [-h] [-d DAG_ID] [-sd SUBDIR] [-r RUN_DURATION]
[-n NUM_RUNS] [-p] [--pid [PID]] [-D]
[--stdout STDOUT] [--stderr STDERR] [-l LOG_FILE]
optional arguments:
-h, --help show this help message and exit
-d DAG_ID, --dag_id DAG_ID
The id of the dag to run
-sd SUBDIR, --subdir SUBDIR
File location or directory from which to look for the
dag. Defaults to '[AIRFLOW_HOME]/dags' where
[AIRFLOW_HOME] is the value you set for 'AIRFLOW_HOME'
config you set in 'airflow.cfg'
-r RUN_DURATION, --run-duration RUN_DURATION
Set number of seconds to execute before exiting
-n NUM_RUNS, --num_runs NUM_RUNS
Set the number of runs to execute before exiting
-p, --do_pickle Attempt to pickle the DAG object to send over to the
workers, instead of letting workers run their version
of the code.
--pid [PID] PID file location
-D, --daemon Daemonize instead of running in the foreground
--stdout STDOUT Redirect stdout to this file
--stderr STDERR Redirect stderr to this file
-l LOG_FILE, --log-file LOG_FILE
Location of the log file
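For reference, the help output above is consistent with -p being a plain boolean switch. A hypothetical argparse definition (not the actual Airflow parser code) that produces this kind of entry looks like:

import argparse

# Hypothetical reconstruction: a store_true flag consumes no value, which is
# why "-p, --do_pickle" appears in the synopsis without a metavar.
parser = argparse.ArgumentParser(prog="airflow scheduler")
parser.add_argument(
    "-p", "--do_pickle",
    action="store_true",
    help="Attempt to pickle the DAG object to send over to the workers",
)
args = parser.parse_args(["-p"])
print(args.do_pickle)  # True; no value is consumed after -p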
What do you think went wrong?
This PR broke it: https://github.com/apache/airflow/pull/11468
After this PR, the option -p is misinterpreted as --password, and the CLI tries to mask the value that should follow it; since scheduler's -p is a boolean switch with no value, the masking code indexes past the end of the argument list.
The workaround is to use the long form --do_pickle.
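For illustration only (this is not the fix that was merged upstream), a bounds check, or masking only flags that actually take a value, would avoid the crash:

def mask_sensitive_args_safely(argv, sensitive_flags=("-p", "--password")):
    # Hypothetical variant of the masking loop: only overwrite the next token
    # when one actually exists, so a bare boolean switch such as
    # "scheduler -p" (--do_pickle) no longer raises IndexError.
    full_command = list(argv)
    for idx, token in enumerate(full_command):
        if token in sensitive_flags and idx + 1 < len(full_command):
            full_command[idx + 1] = "*" * 8
    return full_command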
How to reproduce it:
Run airflow scheduler -p on the affected Airflow version (1.10.14).
Anything else we need to know:
I reported this via Slack, and Marcos Marx asked me to open an issue about it.
Top GitHub Comments
Also happens in Airflow 2.0.
Closing this then. Thanks @jens-scheffler-bosch for looking at it. We can always reopen or create a new issue if needed.