Apache Airflow version

2.10.2

If "Other Airflow 2 version" selected, which one?

No response

What happened?

I'm currently using an Airflow instance where I have a task assigned to a "heavy" queue:

```python
from airflow.decorators import task
from airflow.utils.trigger_rule import TriggerRule


@task.virtualenv(
    trigger_rule=TriggerRule.NONE_FAILED_MIN_ONE_SUCCESS,
    requirements=[
        "my_package==1.1.1"
    ],
    index_urls=["my_index_url"],
    venv_cache_path="/tmp",
    queue="heavy",
)
def my_task(arg1, arg2, params=None):
    ...
```

but for some reason the task is running on a worker configured to only handle the default queue?

-> IMPORTANT: The DAG Run existed prior to the addition of the `queue="heavy"` assignment.

What you think should happen instead?

Tasks assigned to a specific queue should only run on workers assigned to that specific queue.

How to reproduce

Not sure how.

Operating System

Ubuntu 22

Versions of Apache Airflow Providers

No response

Deployment

Docker-Compose

Deployment details

Anything else?

-> IMPORTANT: The DAG Run existed prior to the addition of the `queue="heavy"` assignment.

Are you willing to submit PR?
Replies: 1 comment 1 reply
That's the reason. Past dag-run queues cannot be changed after the dag_run, and specifically the "task instance" entry, has been created. See https://airflow.apache.org/docs/apache-airflow/stable/database-erd-ref.html - the task_instance model contains a "queue" column, so once a task instance has its queue set there, re-running the same task instance will use the stored value. You could likely modify it manually, but currently there is no way - I think - to modify it from the UI, other than running a backfill (@dstandish ?), which I think should do what you want.

This could possibly be a good idea to change in Airflow 3. There are similar discussions happening on the devlist about other backfill and pool behaviours that might get improved in Airflow 3 - https://lists.apache.org/thread/zbm6tvlcz62nc9hl1mzrzz9t4bcrjngc , https://lists.apache.org/thread/jmj842wsw78clk9twdrz1t71ogsbk10s and others - so if you think it's a good idea to introduce such a feature, feel free to start a new thread on the devlist.

Converting this into a discussion in case more conversation here is needed, but I encourage you to continue on the devlist.
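As a rough illustration of the "modify it manually" route, the sketch below shows how the queue stored on an existing task_instance row could be overwritten through Airflow's ORM. This is a minimal sketch, assuming direct access to the Airflow metadata database; the dag_id, task_id, and run_id values are placeholders for the task instance you want to re-run.

```python
# Minimal sketch: overwrite the queue stored on an existing task_instance row.
# Assumes direct access to the Airflow metadata database; dag_id, task_id and
# run_id below are placeholders, not values from the original report.
from airflow.models import TaskInstance
from airflow.utils.session import create_session

with create_session() as session:
    ti = (
        session.query(TaskInstance)
        .filter(
            TaskInstance.dag_id == "my_dag",
            TaskInstance.task_id == "my_task",
            TaskInstance.run_id == "manual__2024-01-01T00:00:00+00:00",
        )
        .one()
    )
    ti.queue = "heavy"  # value the executor will read the next time this task instance runs
    # create_session() commits on exit, so the change is persisted here.
```

An equivalent UPDATE against the task_instance table directly in the database would achieve the same thing; as noted above, the backfill path avoids touching the database by hand.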