ImportError in TriggerDagRunOperator makes every Dag useless when paused by default #49499


Closed
1 of 2 tasks
krisgeus opened this issue Apr 21, 2025 · 4 comments
Labels: area:core, kind:bug (this is clearly a bug), needs-triage (new issues that we didn't triage yet)

Comments

@krisgeus
Contributor

Apache Airflow version

3.0.0

If "Other Airflow 2 version" selected, which one?

No response

What happened?

Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/serialization/serialized_objects.py", line 1066, in detect_task_dependencies
    from airflow.providers.standard.operators.trigger_dagrun import TriggerDagRunOperator
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/standard/operators/trigger_dagrun.py", line 63, in <module>
    from airflow.exceptions import DagIsPaused
ImportError: cannot import name 'DagIsPaused' from 'airflow.exceptions' (/home/airflow/.local/lib/python3.12/site-packages/airflow/exceptions.py)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/serialization/serialized_objects.py", line 1787, in to_dict
    json_dict = {"__version": cls.SERIALIZER_VERSION, "dag": cls.serialize_dag(var)}
                                                             ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/airflow/.local/lib/python3.12/site-packages/airflow/serialization/serialized_objects.py", line 1696, in serialize_dag
    raise SerializationError(f"Failed to serialize DAG {dag.dag_id!r}: {e}")
airflow.exceptions.SerializationError: Failed to serialize DAG 'whirl-local-api-to-s3-example': cannot import name 'DagIsPaused' from 'airflow.exceptions' (/home/airflow/.local/lib/python3.12/site-packages/airflow/exceptions.py)

What you think should happen instead?

The DagIsPaused exception should be available in airflow.exceptions.
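For context on why this breaks every DAG: the traceback shows that serialization (detect_task_dependencies) imports TriggerDagRunOperator, so the provider's failing import poisons serialization of unrelated DAGs too. A minimal sketch of the kind of compatibility guard that would avoid this, assuming the goal is to tolerate cores with and without the exception (the fallback class here is hypothetical, not the provider's actual code):

```python
# Hypothetical compatibility shim, not the provider's actual code: try the
# real exception first, and fall back to a local stand-in when the installed
# Airflow core does not define DagIsPaused (or Airflow is absent entirely).
try:
    from airflow.exceptions import DagIsPaused
except ImportError:  # ModuleNotFoundError is a subclass, so this also covers it
    class DagIsPaused(Exception):
        """Local fallback for Airflow cores that lack DagIsPaused."""

# Code that raises or catches DagIsPaused now works against either core.
```

With a guard like this, an out-of-sync core and provider degrade gracefully instead of failing at import time.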

How to reproduce

Start up 3.0.0rc4 with any DAG.

Operating System

OSX (Docker)

Versions of Apache Airflow Providers

pip freeze | grep apache-airflow-providers
apache-airflow-providers-amazon==9.6.0
apache-airflow-providers-celery==3.10.6
apache-airflow-providers-cncf-kubernetes==10.4.2
apache-airflow-providers-common-compat==1.6.0
apache-airflow-providers-common-io==1.5.3
apache-airflow-providers-common-messaging==1.0.0rc3
apache-airflow-providers-common-sql==1.25.0
apache-airflow-providers-docker==4.3.1
apache-airflow-providers-elasticsearch==6.2.1
apache-airflow-providers-fab @ file:///docker-context-files/apache_airflow_providers_fab-2.0.1rc0-py3-none-any.whl#sha256=aa23f05bffb1497cd7778569e9130466a06884335834654024499504a6024ae5
apache-airflow-providers-ftp==3.12.3
apache-airflow-providers-git @ file:///docker-context-files/apache_airflow_providers_git-0.0.2rc0-py3-none-any.whl#sha256=700eb56025686e47dba7be910b1e752fdcdd419dfd54e0e77a98bded39a3a2f6
apache-airflow-providers-google==15.0.1
apache-airflow-providers-grpc==3.7.3
apache-airflow-providers-hashicorp==4.1.0
apache-airflow-providers-http==5.2.1
apache-airflow-providers-microsoft-azure==12.3.0
apache-airflow-providers-mysql==6.2.1
apache-airflow-providers-odbc==4.9.1
apache-airflow-providers-openlineage @ file:///docker-context-files/apache_airflow_providers_openlineage-2.2.0rc0-py3-none-any.whl#sha256=9d15493974ae0a7a4c3828970dbf549a003311bf454c57998e72bf0a9f1261a7
apache-airflow-providers-postgres==6.1.2
apache-airflow-providers-redis==4.0.2
apache-airflow-providers-sendgrid==4.0.1
apache-airflow-providers-sftp==5.2.0
apache-airflow-providers-slack==9.0.4
apache-airflow-providers-smtp==2.0.2
apache-airflow-providers-snowflake==6.2.1
apache-airflow-providers-ssh==4.0.1
apache-airflow-providers-standard @ file:///docker-context-files/apache_airflow_providers_standard-1.0.0rc0-py3-none-any.whl#sha256=2f477f01152930eba95305f199df355cb279c23d190b3160a42e28a68fd72bdb

Deployment

Other Docker-based deployment

Deployment details

No response

Anything else?

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

@krisgeus added the area:core, kind:bug, and needs-triage labels on Apr 21, 2025

boring-cyborg bot commented Apr 21, 2025

Thanks for opening your first issue here! Be sure to follow the issue template! If you are willing to raise PR to address this issue please do so, no need to wait for approval.

@tirkarthi
Contributor

@kaxil
Member

kaxil commented Apr 21, 2025

Correct, it has been moved -> #49500

This was an error in the Docker image but does not affect Airflow 3.0.0 or Standard provider 1.0

@kaxil kaxil closed this as completed Apr 21, 2025
@kaxil
Member

kaxil commented Apr 21, 2025

Uninstall apache-airflow-providers-standard and force install apache-airflow-providers-standard==1.0
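Spelled out as commands (using the exact pin mentioned above; the -y and --force-reinstall flags are my additions):

```shell
pip uninstall -y apache-airflow-providers-standard
pip install --force-reinstall "apache-airflow-providers-standard==1.0"
```

This replaces the 1.0.0rc0 wheel baked into the Docker image (visible in the pip freeze output) with the released provider.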
