In this talk, we will present a client case study on building a metadata-driven data hub with Databricks Delta Lake. We will start with the business case and the full architecture of our solution, which spans multiple Azure resources (ADF, Azure SQL, MDF, Databricks, Power BI, ADLS); for this session, we will focus only on Databricks (Delta Lake). From there, we will take a closer look at our core metadata model, which captures the metadata of the different source systems along with technical and functional validation rules, and at how we ingest, process, and load the source data into Databricks Delta Lake. Finally, we will zoom in on our Delta Lake solution: what Delta Lake is, why we chose it, and how to implement it, diving deep into the code, running demos, and walking through Databricks notebooks.
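To make the metadata-driven pattern concrete before the session, here is a minimal sketch of the kind of ingest loop the talk describes: a metadata table drives which sources get read, which technical validation rule applies, and where the data lands in Delta Lake. The table location, column names (source_format, source_path, validation_rule, target_path, quarantine_path), and the explicit SparkSession setup are illustrative assumptions, not the client's actual model; on Databricks the spark session is already provided.

```python
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

# Hypothetical setup for running outside Databricks; requires delta-spark.
spark = (
    SparkSession.builder.appName("metadata-driven-ingest")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Core metadata model (assumed layout): one row per source entity to ingest.
sources = spark.read.format("delta").load("/mnt/metadata/source_systems")

for src in sources.collect():
    raw = (
        spark.read.format(src["source_format"])   # e.g. "csv", "parquet"
        .option("header", "true")
        .load(src["source_path"])
    )

    # Technical validation rule stored in metadata as a SQL expression,
    # e.g. "customer_id IS NOT NULL AND length(country_code) = 2".
    rule = F.expr(src["validation_rule"])
    valid = raw.filter(rule)
    rejected = raw.filter(~rule)

    # Valid rows land in the Delta Lake hub; rejects go to quarantine.
    # Onboarding a new source then means adding metadata, not writing code.
    valid.write.format("delta").mode("append").save(src["target_path"])
    rejected.write.format("delta").mode("append").save(src["quarantine_path"])
```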


In this session, we strive to share our knowledge of using Databricks Delta Lake through a presentation, code examples, and notebooks. We will explain our challenges, our lessons learned, and how we used Databricks Delta Lake to address them. You will walk away with an understanding of how to apply these techniques to a similar data architecture and how to benefit from them.


What you'll learn:

1. Client use case

2. What is Databricks Delta Lake

3. Minimizing code development by architecting a metadata-driven solution using Databricks (see the sketch after this list)
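As a flavor of the third point, below is a hedged sketch of how a single generic Delta Lake MERGE (upsert) can serve every source once the business keys are declared in metadata. The upsert function name and key_columns parameter are illustrative assumptions, not the session's actual code; the DeltaTable merge API itself is standard delta-spark.

```python
from delta.tables import DeltaTable

# Illustrative only: one reusable upsert for all sources, with the business
# keys supplied from metadata rather than hard-coded per source.
def upsert(spark, updates_df, target_path, key_columns):
    condition = " AND ".join(f"t.{k} = s.{k}" for k in key_columns)
    (
        DeltaTable.forPath(spark, target_path).alias("t")
        .merge(updates_df.alias("s"), condition)
        .whenMatchedUpdateAll()      # update existing rows (ACID merge)
        .whenNotMatchedInsertAll()   # insert new rows in the same transaction
        .execute()
    )

# Example call with hypothetical names:
# upsert(spark, valid, src["target_path"], ["customer_id"])
```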
