Building complex data workflows using Azure Data Factory can get a little clunky - as your orchestration needs grow you hit limitations like not being able to nest loops or conditionals, running simple Python, Bash or PowerShell scripts is difficult, and costs can grow quickly because you are charged per task execution. Recently another option became available: Managed Airflow in ADF.
Apache Airflow is a code-centric, open-source platform for developing, scheduling and monitoring batch-based data workflows, built in Python, the language Data Engineers know and love. But until Managed Airflow, getting it running in Azure was a complex task for customers more used to PaaS services such as ADF, Databricks and Fabric. Airflow is also an important ETL orchestrator on AWS and GCP, so cross-cloud compatibility becomes simpler to achieve.
In this session we’ll look at what Airflow is, how it differs from ADF, and what advantages Managed Airflow in ADF gives us. We’ll talk about the idea of a DAG for building the workflow, and then work through some demos to show just how easy it is to use Python to write Airflow DAGs and import them into the Managed Airflow environment as pipelines. We’ll then dive into the excellent monitoring UI and find out just how easy it is to trigger a pipeline, view the dependencies between tasks, and monitor runs.
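To give a flavour of what a DAG looks like before the session, here is a minimal sketch assuming Airflow 2.4+ with the TaskFlow API; the DAG name, schedule and task bodies are illustrative placeholders, not the demos from the session.

```python
# Minimal Airflow DAG sketch (assumes Airflow 2.4+, TaskFlow API).
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="hello_managed_airflow",   # illustrative name
    schedule="@daily",                # run once per day
    start_date=datetime(2024, 1, 1),
    catchup=False,                    # don't backfill past runs
)
def hello_managed_airflow():
    @task
    def extract() -> list[int]:
        # Stand-in for pulling data from a source system.
        return [1, 2, 3]

    @task
    def transform(values: list[int]) -> int:
        # Stand-in for a transformation step.
        return sum(values)

    @task
    def load(total: int) -> None:
        # Stand-in for writing the result to a sink.
        print(f"Loaded total: {total}")

    # Task dependencies are inferred from the data flow below:
    # extract -> transform -> load.
    load(transform(extract()))


hello_managed_airflow()
```

Once a file like this is imported into the Managed Airflow environment, the DAG appears in the Airflow UI ready to trigger and monitor, with the extract/transform/load dependencies shown in the graph view.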
By the end of the session attendees will have a good understanding of what Airflow is, when to use it, and how it fits into the Azure Data Platform.