Tips for handling large data volumes

Ever tried deleting 100 million records? If you have, you will know that your transaction log will likely blow up, you will block all access to the table, and it will generally be painful. In this session we will look at some of the techniques you can use to make this easier. We will look at:
  • how deletes and updates work and how indexes affect them
  • how to break the job into smaller chunks
  • how to run those chunks in parallel

After this session you will understand more about why these operations are painful, what you can do to make them easier, and how to reduce the impact on your end users.
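The chunking technique the session covers is often sketched in T-SQL roughly like this (the table name, column name, batch size, and cutoff date below are hypothetical, not from the session):

```sql
-- Sketch of a chunked delete: remove rows in small batches so each
-- transaction stays short, blocking is limited, and log space can be
-- reused between batches instead of one huge log-filling transaction.
DECLARE @rows int = 1;

WHILE @rows > 0
BEGIN
    DELETE TOP (10000)          -- small batch per transaction
    FROM dbo.BigTable           -- hypothetical table
    WHERE CreatedDate < '20100101';

    SET @rows = @@ROWCOUNT;     -- 0 once nothing is left to delete
END
```

Under the SIMPLE recovery model the log can be truncated between batches; under FULL, frequent log backups serve the same purpose. An index on the filtering column (here `CreatedDate`) helps each batch avoid rescanning the table.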

Presented by Simon Sabin at SQLBits VIII
  • Downloads
    Sorry, there are no downloads available for this session.
  • Speaker Bio

    Simon runs a Data Consultancy enabling companies to make the most of the data they have.

    He has worked with data throughout his career, with companies across many industry sectors including online retail, insurance, finance, and motor sport.

    He works with companies to help them

    1. Improve their data development practices including implementation of devops, agile methodologies and continuous integration.

    2. Understand and define a cloud data platform strategy.

    3. Optimise their data platform, including performance, scalability, security and certification.

    Educating people is at the heart of what Simon and his company stand for. It is epitomised by SQLBits, which Simon founded in 2007. It is the largest SQL Server conference in Europe and many view it as the best in the world, always maintaining a free element and ensuring education for everyone.

    He is also a Microsoft Certified Master for SQL Server and has been a Microsoft MVP since 2005.

    You can follow him @simon_sabin or read his blog.
  • Video
    The video is not available to view online.