Tasks stuck in "Queued" (LocalExecutor) #14236
Unanswered
FredericoCoelhoNunes asked this question in Q&A
Replies: 1 comment
-
I wasn't able to run any of my DAGs, so I decided to test with the Airflow "tutorial" DAG (https://airflow.apache.org/docs/apache-airflow/stable/tutorial.html) using the SequentialExecutor. I don't know why this is happening, because this is a locally hosted Postgres database - if anyone has any insights, I'd love to hear them!
-
Hi everyone,
I would love to hear your advice regarding the best way to set up the LocalExecutor.
Up to this point I have been using the SequentialExecutor, and the DAG in question runs just fine. However, I have recently started needing to run tasks in parallel, so I have launched a local PostgreSQL database and switched to the LocalExecutor.
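For context, the switch amounts to roughly this in airflow.cfg (the connection details below are placeholders, not my real credentials; in newer Airflow releases the connection option lives under `[database]` rather than `[core]`):

```ini
[core]
# Switch from SequentialExecutor to LocalExecutor to allow parallel tasks
executor = LocalExecutor
# Placeholder user/password/database - replace with your own values
sql_alchemy_conn = postgresql+psycopg2://airflow_user:airflow_pass@localhost:5432/airflow
```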
However, the same DAG now doesn't even start running - the first tasks are stuck in the "Queued" state for a minute or so, and then are set to "Failed". Here is a log file (dag/task names were modified):
At first I thought it might be a database connection issue, but `airflow db check` only takes ~10 seconds and returns a success message. And since the database is running locally, I don't see why it would be significantly slower than the default SQLite database. I have no clue how to go about debugging this, so any tips would be much appreciated :)
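To rule out raw connectivity problems separately from Airflow itself, here is the kind of quick stdlib check I mean (the host and port are just the Postgres defaults, an assumption - adjust for your setup):

```python
import socket
import time

def can_connect(host, port, timeout=5.0):
    """Try a raw TCP connection; return elapsed seconds, or None on failure."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        return None

# Assumed defaults for a locally hosted Postgres instance.
elapsed = can_connect("localhost", 5432)
if elapsed is None:
    print("could not open a TCP connection to Postgres")
else:
    print(f"TCP connect took {elapsed:.3f}s")
```

If this connects in a few milliseconds but Airflow still stalls, the problem is more likely in the scheduler or executor configuration than in the database itself.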
Cheers