
task_concurrency=1 not able to limit the active tasks to 1

See original GitHub issue

Apache Airflow version: 1.10.10

Environment:

  • Cloud provider or hardware configuration:
  • OS (e.g. from /etc/os-release):
  • Kernel (e.g. uname -a):
  • Install tools:
  • Others:

What happened:

I have set task_concurrency=1 for a BashOperator, like below:

```python
sleep_task = BashOperator(
    task_id="task1",
    bash_command="sleep 2",
    execution_timeout=None,
    timeout=20,
    retries=3,
    retry_delay=timedelta(seconds=30),
    task_concurrency=1,
    dag=dag,
)
```

According to its definition, this should limit the active instances of sleep_task to one across DAG runs.

What you expected to happen:

Setting task_concurrency has no impact

How to reproduce it:

Add task_concurrency to any operator and trigger many DAG runs at the same time, so that multiple instances of that task end up running concurrently.
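For reference, a minimal reproduction DAG along these lines might look like the sketch below; the dag_id, schedule, and sleep duration are illustrative rather than taken from the issue, and the point is simply to get several DAG runs active at once so that more than one instance of the task is eligible to run:

```python
# Hypothetical reproduction sketch for Airflow 1.10.x; identifiers are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator  # 1.10.x import path

with DAG(
    dag_id="task_concurrency_repro",
    start_date=datetime(2020, 1, 1),
    schedule_interval="* * * * *",  # one run per minute
    catchup=True,                   # backfilling creates many DAG runs quickly
    max_active_runs=5,              # several DAG runs may be active at once
) as dag:
    sleep_task = BashOperator(
        task_id="task1",
        bash_command="sleep 60",
        task_concurrency=1,  # expected: at most one running instance of task1 across all runs
    )
```

If the limit works, the scheduler should queue the overlapping task1 instances and run them one at a time; seeing two or more of them in the running state at once reproduces the report.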

Anything else we need to know:

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 5 (2 by maintainers)

Top GitHub Comments

1 reaction
ashb commented, Dec 15, 2020

Just tested this on 2.0.0rc3 and it’s behaving correctly – only one task runs at a time…
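For anyone verifying the same thing on Airflow 2.x, a minimal sketch follows (assuming Airflow 2.0+, hence the 2.x import path; in later 2.x releases the argument was renamed to max_active_tis_per_dag, with task_concurrency kept as a deprecated alias):

```python
# Minimal Airflow 2.x sketch; dag_id and command are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator  # 2.x import path

with DAG(
    dag_id="task_concurrency_check",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@hourly",
    catchup=True,
    max_active_runs=5,
) as dag:
    BashOperator(
        task_id="task1",
        bash_command="sleep 60",
        task_concurrency=1,  # at most one running instance of task1, across DAG runs
    )
```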

0 reactions
goakgun commented, Dec 13, 2020

any update on this?

Read more comments on GitHub >

Top Results From Across the Web

How to control the parallelism or concurrency of an Airflow ...
The scheduler will not create new active DAG runs once this limit is hit. ... max_active_runs=1) # Allow a maximum of 10 tasks...

Airflow Task Parallelism. How to control concurrency
This defines the maximum number of task instances allowed to run across all active DAG runs for the specific DAG. If not set...

T300870 Airflow concurrency limits - Wikimedia Phabricator
concurrency: This is the maximum number of task instances allowed to run concurrently across all active DAG runs for a given DAG....

7 Common Errors to Check When Debugging Airflow DAGs
Tasks not running? DAG stuck? Logs nowhere to be found? We've been there.

Amazon MWAA automatic scaling
The default setting of 10 Workers in Maximum worker count. An Apache Airflow configuration option for celery.worker_autoscale of 5,5 tasks per worker. This ......
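The first three results above describe DAG-level knobs rather than the per-task limit from this issue; as a rough sketch of how those DAG-level limits are expressed (argument names as in Airflow 1.10/2.0, values purely illustrative):

```python
# Illustrative DAG-level limits; dag_id and values are arbitrary.
from datetime import datetime

from airflow import DAG

dag = DAG(
    dag_id="limits_example",
    start_date=datetime(2020, 1, 1),
    schedule_interval="@daily",
    max_active_runs=1,  # at most one active DAG run at a time for this DAG
    concurrency=10,     # at most 10 task instances running across this DAG's active runs
)
```

The task_concurrency setting from this issue sits below both of these: it caps the running instances of a single task, regardless of how many DAG runs are active.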
