Christina Leo is a Cloud Solution Architect at Microsoft specialising in the Data and AI Platform. Prior to joining Microsoft in 2016, she worked as an independent consultant focusing on T-SQL and BI development, as well as database performance tuning in the finance and investment banking sector. When she isn't sharing her 18 years of technical experience with the data community, you can find her racing around the Solent on a 40' sailboat or often in the Thames on a little Laser.
Christina E. Leo
Christina E. Leo's Sessions
Cognitive Services, Bots, and Search: Dev Tools for AI (SQLBits 2019)
In this session, we’ll look at the different options within the Cognitive Services suite, show you how to connect to the APIs using Python code, walk through a live bot demo, and build an Azure Cognitive Search index. You should leave this session feeling like you’ve had a jump start to further your AI developer skill set.
Analytics on Azure: What to Use When (SQLBits 2019)
Azure offers a comprehensive set of big-data solutions that help you gather, store, process, analyse and visualise data of any variety, volume or velocity, so you can discover new opportunities and take quick action. In this overview session, we’ll look at the various components within Azure that make up the Modern Data Warehouse, enable Real-Time Analytics, and support Advanced Analytics scenarios. You should leave with a high-level understanding of the capabilities and limitations of each of the products within the Azure Analytics portfolio.
Building Your T-SQL Tool Kit: Window Function Fundamentals (SQLBits 2014)
Sets are king when it comes to SQL Server, but sometimes you need to see data row by row. Window Functions help you get the best of both worlds. Learn when and where these functions can help you get what you need without compromising performance.
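By way of illustration, the pattern the session covers looks roughly like the sketch below: a window function keeps every row visible while still computing a set-based aggregate over a partition. The table and column names here are hypothetical and not taken from the session demos.

```sql
-- Hypothetical dbo.Orders table, used purely for illustration.
SELECT
    OrderID,
    CustomerID,
    OrderTotal,
    -- Running total per customer without collapsing rows into a GROUP BY
    SUM(OrderTotal) OVER (PARTITION BY CustomerID
                          ORDER BY OrderDate
                          ROWS UNBOUNDED PRECEDING) AS RunningTotal,
    -- Rank each order within its customer by value
    ROW_NUMBER() OVER (PARTITION BY CustomerID
                       ORDER BY OrderTotal DESC)    AS OrderRank
FROM dbo.Orders;
```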
Why APPLY? (SQLBits 2013)
Most T-SQL developers know that the APPLY operator can be used to invoke a table-valued function. But do you know the other ways APPLY can be used? Come learn five additional use cases and leave with a few new tricks up your T-SQL sleeves.
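For context, here is a minimal sketch of two common APPLY patterns: the well-known table-valued-function call and a correlated "top N per group" query. The schema and the dbo.GetRecentOrders function are assumptions for illustration only, not the session's actual examples.

```sql
-- Classic use: invoke a table-valued function once per outer row.
-- dbo.GetRecentOrders is a hypothetical TVF.
SELECT c.CustomerID, r.OrderID, r.OrderTotal
FROM dbo.Customers AS c
CROSS APPLY dbo.GetRecentOrders(c.CustomerID, 5) AS r;

-- Another common use: correlated "top N per group" without a TVF.
SELECT c.CustomerID, o.OrderID, o.OrderTotal
FROM dbo.Customers AS c
OUTER APPLY (
    SELECT TOP (3) OrderID, OrderTotal
    FROM dbo.Orders
    WHERE CustomerID = c.CustomerID
    ORDER BY OrderTotal DESC
) AS o;
```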
What's Buried in the Plan Cache? (SQLBits 2012)
In this session, we'll examine the query plan cache to see which plans are saved, which plans are reused, and when plans are recreated, then look at methods for observing the contents of the plan cache and, finally, methods for influencing plan reuse and recreation.
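One common way to observe the plan cache is through the standard SQL Server DMVs, roughly as sketched below; the TOP value and ordering are arbitrary choices for illustration.

```sql
-- Peek at cached plans, the text that produced them, and their XML plans.
SELECT TOP (20)
    cp.usecounts,        -- how many times the cached plan has been reused
    cp.cacheobjtype,
    cp.objtype,
    st.text        AS sql_text,
    qp.query_plan
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle)   AS st
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
ORDER BY cp.usecounts DESC;
```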
Working with Server Side Traces (SQLBits 2011)
Learn to create and customise T-SQL scripts for capturing SQL Profiler data in a server-side trace. Examine methods for stopping, starting and storing these traces, and finally look at free tools available for analysing the captured data.
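A minimal server-side trace script follows the shape sketched below, using the built-in sp_trace_* procedures; the file path, file size and the particular event/column choices here are placeholder assumptions, not the session's actual script.

```sql
DECLARE @TraceID int, @MaxFileSize bigint = 50;  -- max trace file size in MB

-- Create the trace definition; option 2 = TRACE_FILE_ROLLOVER
EXEC sp_trace_create @TraceID OUTPUT, 2, N'C:\Traces\MyTrace', @MaxFileSize, NULL;

-- Capture SQL:BatchCompleted (event 12) with TextData (1), SPID (12), Duration (13)
DECLARE @On bit = 1;
EXEC sp_trace_setevent @TraceID, 12, 1,  @On;
EXEC sp_trace_setevent @TraceID, 12, 12, @On;
EXEC sp_trace_setevent @TraceID, 12, 13, @On;

-- Start the trace (status 1 = start, 0 = stop, 2 = close and delete the definition)
EXEC sp_trace_setstatus @TraceID, 1;
```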