SQLBits 2022
Automate the deployment of Databricks components using Terraform
An introduction to Terraform and the Databricks provider, and the steps required to build an automated solution to provision a Databricks workspace and resources into the Azure cloud platform using Terraform.
Databricks is a great data analytics tool for data science and data engineering, but provisioning Databricks resources (workspace, clusters, secrets, mount storage etc.) can be complex and time consuming.
Automating the deployment of Databricks resources has been tricky in the past using Terraform, an Infrastructure as Code tool. It has required a mix of Terraform Azure providers and/or ARM templates, PowerShell, the Databricks CLI, or REST APIs. This made deployments harder to repeat and led to inconsistent environments.
Databricks introduced its own Terraform provider to assist with deploying and managing Databricks resources on the Azure, Google Cloud (GCP) and Amazon Web Services (AWS) platforms. It enables Databricks resources to be deployed automatically at the time the infrastructure is provisioned, making them easier to manage and maintain.
This session will introduce you to Terraform and the Databricks provider, and take you through the steps required to build an automated solution to provision a Databricks workspace and resources into the Azure cloud platform using Terraform.
By the end of this session, you will have everything you need to automate your Databricks environment deployments and ensure consistency.
Feedback link: https://sqlb.it/?7101
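As a taster of what the session covers, the flow above can be sketched as a minimal Terraform configuration: the Azure provider creates the workspace, then the Databricks provider manages resources inside it. This is a hedged sketch, not the session's actual demo code; the resource group, names, region, node type and Spark version below are illustrative assumptions.

```hcl
terraform {
  required_providers {
    azurerm    = { source = "hashicorp/azurerm" }
    databricks = { source = "databricks/databricks" }
  }
}

provider "azurerm" {
  features {}
}

# Provision the workspace infrastructure with the Azure provider...
resource "azurerm_resource_group" "rg" {
  name     = "rg-databricks-demo" # illustrative name
  location = "uksouth"            # illustrative region
}

resource "azurerm_databricks_workspace" "ws" {
  name                = "dbw-demo"
  resource_group_name = azurerm_resource_group.rg.name
  location            = azurerm_resource_group.rg.location
  sku                 = "standard"
}

# ...then manage resources inside it with the Databricks provider,
# pointed at the workspace URL Terraform just created.
provider "databricks" {
  host = azurerm_databricks_workspace.ws.workspace_url
}

resource "databricks_cluster" "small" {
  cluster_name            = "demo-cluster"
  spark_version           = "11.3.x-scala2.12" # assumed runtime
  node_type_id            = "Standard_DS3_v2"  # assumed VM size
  autotermination_minutes = 20
  num_workers             = 1
}
```

A single `terraform apply` then provisions both the workspace and the cluster in one pass, which is the repeatability win over mixing ARM, PowerShell and the Databricks CLI.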
Speakers
Anna-Maria Wykes's other proposed sessions for 2026
MCP Unleashed: From “Huh?” to “Heck Yeah!” Building Smarter AI Knowledge Bases - 2026
Migrating the Mammoth - 2026
What we Learned Migrating a Financial Giant from Hudi to Delta (and Why Iceberg was in the Mix) - 2026
From “Who Wrote This ETL?” to Databricks, Claude Saves the Day via Microsoft Foundry - 2026
Getting Started with Claude in Microsoft Foundry - 2026
Anna-Maria Wykes's previous sessions
How to Run Code Clubs for Neurodiverse Children
Code Clubs offer an amazing opportunity to introduce our next generation to coding. With simple, brightly coloured drag-and-drop tooling to get them started, we are successfully inspiring many to join the tech industry.
In this session I want to talk you through my journey setting up a Code Club for neurodiverse children: what I found worked, and what didn’t. I hope that from this session you will be inspired to follow the same path I have, using your amazing tech experience to empower some of the most vulnerable children, enabling them to become inspired not just by coding, but by the tech industry itself.
Introduction to the wonders of Azure DevOps
Azure DevOps is a leading tool for end-to-end build and release solutions. It helps you plan your Agile project, manage Git code, and deploy solutions using Continuous Integration (CI) and Continuous Deployment (CD) pipelines.
In this session we will cover some of the core components of Azure DevOps and show you how to implement a secure deployment pipeline, using unit tests and gating with your CI builds and CD releases.
Automate the deployment of Databricks components using Terraform
An introduction to Terraform and the Databricks provider, and the steps required to build an automated solution to provision a Databricks workspace and resources into the Azure cloud platform using Terraform.
So you want to be a Data Engineer?
An introduction to becoming a Data Engineer, Anna, Mikey and Ust will introduce the technology stack, tools and development skills needed for data engineering and show you how and where to go to learn them. We'll also show you how the skills you already have can kickstart your journey to becoming a Data Engineer.
Scala for Big Data: the Big Picture
An opportunity to explore Scala, and why it is truly a “Data Engineer’s language”. Using Azure Functions, Data Factory, Azure Data Lake Gen2 and Databricks, the basics will be explored, followed by real-world examples.
Falek Miah
falekmiah.com
Falek Miah's previous sessions
Value of DevOps Release Process in Data Teams
Have you ever wondered why release plans, approaches, and environments are important in the world of data operations? Many data professionals come from various backgrounds without prior software development experience, leading to questions about the necessity of these concepts.
In this session, we will discuss the significance of DevOps Release Processes for data teams. We will explore how insufficient processes can lead to delays in deployment, introduce breaking changes, hinder team collaboration and result in multiple releases.
Introduction to the wonders of Azure DevOps
Azure DevOps is a leading tool for end-to-end build and release solutions. It helps you plan your Agile project, manage Git code, and deploy solutions using Continuous Integration (CI) and Continuous Deployment (CD) pipelines.
In this session we will cover some of the core components of Azure DevOps and show you how to implement a secure deployment pipeline, using unit tests and gating with your CI builds and CD releases.
Spark Execution Plans for Databricks
An introduction to Spark execution plans in Databricks, and how to use them to optimize code and execution.
Building Your Data Analytics Team - live RunAs Radio episode
Panel Discussion moderated by Richard Campbell of RunAs Radio!
Automate the deployment of Databricks components using Terraform
An introduction to Terraform and the Databricks provider, and the steps required to build an automated solution to provision a Databricks workspace and resources into the Azure cloud platform using Terraform.