Some transformations that can save your ETL in Power BI
To build analytical models, we need to start by extracting, transforming, cleaning, preparing, and loading the data. This session analyzes a set of scenarios that may arise during the ETL step using Power Query in Power BI. The ETL process is the first layer of a Business Intelligence system, and we could call it a "hidden layer": the final user, who looks at the analytical reports and dashboards, never sees the work required to cleanse, filter, and wrangle the data before it is loaded.
First, we will briefly review the definition of the ETL process in general and of the ETL process in Power BI in particular.
Then, we will walk through the three steps of ETL: extraction, transformation (which covers data quality and data transformation), and loading the data into the data model.
The Power Query Editor in Power BI is a powerful tool for ETL purposes: data can be extracted from many different sources and transformed from its original format into a model, which means cleansed and improved according to the business requirements.
During extraction, Power BI connects to a large number of data sources, grouped into categories such as File, Database, Azure, Online Services, and Other. Power BI provides specific connection and navigation wizards that make it easy to get data from each source. In this session, we will connect to different data sources, including files, folders, SQL Server tables, R scripts, and blank queries. We will also combine queries and files; several transformations must be applied before combining files, because they have to be defined for the Sample File first. Besides that, we will create and use functions and parameters.
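As an illustration, the following M sketch combines the CSV files found in a folder. The folder path and the filter to .csv files are assumptions for the example, not part of a specific dataset.

    // Minimal sketch, assuming a hypothetical folder "C:\Data\Sales" of CSV files
    let
        FolderPath = "C:\Data\Sales",                         // could also be defined as a parameter
        SourceFolder = Folder.Files(FolderPath),              // list every file in the folder
        CsvOnly = Table.SelectRows(SourceFolder, each [Extension] = ".csv"),
        Parsed = Table.AddColumn(CsvOnly, "Data", each Csv.Document([Content])),  // parse each file's content into a table
        Combined = Table.Combine(Parsed[Data])                // append all files into one table
    in
        Combined

Getting data from SQL Server follows the same pattern with Sql.Database("ServerName", "DatabaseName"), and a blank query lets us type M expressions like this one directly.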
There are transformations that apply to a whole table, transformations that apply to any column, and specific transformations for text, number, and date columns. Transformations such as pivoting, unpivoting, and transposing tables, filtering rows and columns, and filling columns will be used to solve problems related to the structure or quality of the data. We will also analyze situations in which the data must be cleaned before grouping, filtering, or using columns in calculations.
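A minimal sketch of such cleaning steps, assuming a hypothetical RawBudget query whose month values are spread across columns and whose Department column contains blanks left by merged cells:

    let
        Source = RawBudget,
        Filled = Table.FillDown(Source, {"Department"}),      // fill the blanks under each department
        Unpivoted = Table.UnpivotOtherColumns(Filled, {"Department"}, "Month", "Amount"),  // month columns become rows
        NonEmpty = Table.SelectRows(Unpivoted, each [Amount] <> null),  // drop empty rows before aggregating
        Grouped = Table.Group(NonEmpty, {"Department", "Month"},
            {{"TotalAmount", each List.Sum([Amount]), type number}})    // total per department and month
    in
        Grouped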
We will also cover some aspects of the M programming language, since every transformation applied in the Power Query Editor is expressed in M.
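As a small example of M beyond what the wizards generate, the sketch below defines a reusable function that trims and proper-cases a text column; the function and column names are illustrative only.

    // Hypothetical helper: CleanTextColumn(Customers, "City") returns Customers with a tidied City column
    let
        CleanTextColumn = (inputTable as table, columnName as text) as table =>
            Table.TransformColumns(
                inputTable,
                {{columnName, each Text.Proper(Text.Trim(_)), type text}}
            )
    in
        CleanTextColumn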
We will execute many transformations that can truly save our ETL, allowing us to create the data model, additional queries, parameters, and functions. Finally, we will obtain a beautiful star schema model.
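One of the steps toward that star schema can be sketched as a query that derives a dimension table from a hypothetical SalesFact query; the table and column names here are assumptions for the example.

    let
        Source = SalesFact,
        ProductColumns = Table.SelectColumns(Source, {"ProductID", "ProductName", "Category"}),  // keep descriptive columns only
        DimProduct = Table.Distinct(ProductColumns)           // one row per product, ready to relate back to the fact table
    in
        DimProduct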