So we need to load data at scale into Synapse SQL pools. What should we be thinking about?
There are a few options for loading, COPY and PolyBase; does it matter which one we use? What difference does the file format make: Parquet, Delta, or CSV? What about performance? How quickly can we load a TB of data? 2 TB? What about 10 TB? Or more?
So let's see how much data we can load in 20 minutes...
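As a minimal sketch of the simpler of the two loading paths mentioned above, a COPY statement into a dedicated SQL pool might look like this (the table name, storage account URL, and credential are placeholders, not from the session):

```sql
-- Load Parquet files from blob storage into a dedicated SQL pool table.
-- COPY needs no external objects, unlike the PolyBase path
-- (CREATE EXTERNAL DATA SOURCE / FILE FORMAT / TABLE, then CTAS or INSERT...SELECT).
COPY INTO dbo.Trips
FROM 'https://<account>.blob.core.windows.net/data/trips/*.parquet'
WITH (
    FILE_TYPE = 'PARQUET',
    CREDENTIAL = (IDENTITY = 'Managed Identity')
);
```

The file format choice matters here too: Parquet is columnar and compressed, so the same logical data is usually far smaller and faster to ingest than the equivalent CSV.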

Feedback link:
Presented by Mark Pryce-Maher at SQLBits 2022