22-25 April 2026

Airflow in Fabric - an introduction informed by experience

Proposed session for SQLBits 2026

TL;DR

Are you looking for an introduction to Airflow? Do you want to use Airflow in Fabric? This is the session for you. We will run through what Airflow is, how it works, and how it works in Fabric.

Session Details

Apache Airflow is the dominant OSS data workflow orchestrator, and it is now available within Fabric as a first-class workload. You can use Airflow DAGs with Fabric to orchestrate your data processes and even to integrate with non-Fabric systems.

In this session we will:
- Outline the basics of Apache Airflow, including the methods by which it interacts with other services.
- Explain how Airflow fits into the Fabric platform, specifically the settings Fabric exposes and how it works from a capacity perspective.
- Discuss and demo integration patterns between Airflow and Fabric services.
- Look at ways to implement robust scheduling, dependency management, and error handling.
- Look at a practical example of how Airflow can simplify complex Spark transformation processes.
- Discuss where Airflow makes sense, where it doesn't, and why Airflow and Pipelines work extremely well together.
- Critically examine how Airflow DAGs let developers leverage their Python experience when creating workflows.

Whether you’re struggling with Fabric workflow limitations, unsure what Airflow can do for you, or looking to bring existing DAGs to Fabric, this session provides actionable guidance.

This session assumes a reasonable degree of familiarity with Fabric; no familiarity with Airflow is required.

3 things you'll get out of this session

1 - An understanding of what Airflow is and what its core concepts are
2 - An understanding of how Airflow works in Fabric and the capabilities it exposes
3 - The basics of DAG authoring, enough for an attendee to get started