[PE-3860]: A blog about our Airflow upgrade journey #131

Open · wants to merge 4 commits into base `main`
Changes from 1 commit
Update _posts/2023-06-07-airflow-upgrade-to-2-5-3.md
Co-authored-by: Maksym Dovhal <[email protected]>
Dmytro Suvorov and Maks-D authored Jun 15, 2023
commit 6b4985e14765fdac1cdd2a80241e0a3fd0bb0490
2 changes: 1 addition & 1 deletion _posts/2023-06-07-airflow-upgrade-to-2-5-3.md
@@ -64,7 +64,7 @@ Issues we knew about and/or caught during the testing were:
- some of the `timetable` functions also changed, and we had to find new ways to do what we did before the upgrade (for example, `dag.timetable.infer_data_interval(your_execution_date).end` became `dag.timetable.infer_manual_data_interval(run_after=exec_dt).start`)
- the `node:12.22.6` Docker image we used for building npm assets (`airflow/www/static/dist`) was too old for the new and shiny Airflow (we switched to `node:16.0.0`)
- `TriggerRuleDep` changed, and our custom rules that used or overrode `_get_dep_statuses` and `_evaluate_trigger_rule` also started to fail
- Okta integration started to fail because of new Flask AppBuilder (simply adding `server_metadata_url` to existing configuration solved the issue)
- [Okta integration](https://tech.scribd.com/blog/2021/integrating-airflow-and-okta.html) started to fail because of new Flask AppBuilder (simply adding `server_metadata_url` to existing configuration solved the issue)
- some Airflow configuration params were renamed (like `AIRFLOW__CORE__SQL_ALCHEMY_CONN` -> `AIRFLOW__DATABASE__SQL_ALCHEMY_CONN`)
- `AwsLambdaHook` was renamed to `LambdaHook` in the `AWS` provider, and the `function_name` parameter moved from the hook's `__init__()` to the `invoke_lambda` function (putting it closer to the actual invocation)
- BaseOperator's `task_concurrency` parameter was renamed to `max_active_tis_per_dag`
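Several of the items above are plain renames, which can be handled mechanically. A minimal sketch (our own illustration, not code from the upgrade itself; the rename tables below cover only the examples mentioned in this post, not the full Airflow 2.x rename list):

```python
# Illustrative only: remap deprecated names to their Airflow 2.x equivalents.
ENV_RENAMES = {
    "AIRFLOW__CORE__SQL_ALCHEMY_CONN": "AIRFLOW__DATABASE__SQL_ALCHEMY_CONN",
}
OPERATOR_KWARG_RENAMES = {
    "task_concurrency": "max_active_tis_per_dag",
}


def migrate_keys(mapping: dict, renames: dict) -> dict:
    """Return a copy of `mapping` with deprecated keys renamed."""
    return {renames.get(key, key): value for key, value in mapping.items()}
```

For example, `migrate_keys({"task_concurrency": 3}, OPERATOR_KWARG_RENAMES)` yields `{"max_active_tis_per_dag": 3}`, which can then be passed to an operator unchanged.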