By Michael O’Donnell at Quest.
A perfect storm is underway
Organizations today are navigating a rapidly evolving technological landscape that challenges their competitiveness and pace of innovation. This reality is compounded by a significant talent shortage and a complex array of tools and vendors, as noted by Deloitte1 and The Boston Consulting Group2. The crux of these challenges is the ‘Data Dilemma’ – managing the overwhelming growth in data volume and complexity. Addressing it is crucial for businesses to thrive in an environment where data is the key to innovation and progress.
The Data Dilemma
One might think I’m oversimplifying, but in the data management cockpit the operational view is this: databases and diverse data sources underpin the performance of numerous applications and business processes within an organization, with the primary focus being data availability and system performance. I consider data a byproduct of the line-of-business applications and other software systems that support those processes, and it is amassing at an unprecedented rate. Between 2018 and 2021, data volumes surged to approximately 84 zettabytes (ZB), with projections indicating continued expansion at a compound annual growth rate (CAGR) of 21% through 2024 – at that rate, 84 ZB compounds to roughly 84 × 1.21³ ≈ 149 ZB, a staggering figure.2
This burgeoning digital universe is predominantly unstructured – 95% of it, to be exact – including streams of video, voice, and text. Despite this, the accumulation of structured data is accelerating more rapidly, driven by the expanding domain of business intelligence applications that demand more orderly data. Concurrently, over half of the data hoarded by some enterprises is classified as dark data, unutilized in decision-making or insight generation – a fact highlighted in industry interviews. It is widely recognized that managing such data represents a colossal challenge, yet within it lies massive opportunity.
The crux of this data deluge is not solely its storage and backup/recovery but the pressing need for insight and intelligence: the distillation of information to enhance utility and inform decisions. The data dilemma is perpetuated by this cycle – the quest for ever more refined information to underpin better decision-making. End users’ appetite for better systems, software, apps, and experiences also contributes, though that is a separate argument.
In the intricate world of data management, we often see a distinct dichotomy. On one side is Operations, the realm of databases and diverse data sources underpinning numerous applications and business processes, primarily focused on data availability and system performance – a domain I like to think of as ‘Enginerium.’ On the other side lies Intelligence, driven by the pursuit of insights from this amassed data, a territory I refer to as ‘Bizlandia.’ This burgeoning realm of Intelligence is accelerating, particularly around the structured data needed for business intelligence applications. However, a key challenge exists: a significant gulf of miscommunication often separates the two sides. Common to both groups, and almost bridging both worlds, are Data Warehouses, Data Lakehouses, and Data Pipelines, serving as crucial intersections where the operational robustness of Enginerium meets the analytical acumen and insight hunger of Bizlandia. These platforms not only facilitate the storage and management of vast volumes of data but also enable its distillation into actionable insights, thereby enhancing decision-making processes and business strategies.
Microsoft Data Platforms: Bridging Roles and Platforms
The roles of Database Administrators (DBAs), Data Engineers, Data Analysts, and Data Leaders often seem at odds. Yet these seemingly disparate, sometimes conflicting roles are intimately linked, each playing a vital part in the data lifecycle. This is why I love SQLBits so much: few ecosystems bring these roles together as consistently as the Microsoft data stack. For good reason – Microsoft’s data technology ecosystem, encompassing tools like SQL Server, Azure SQL Database, Power BI, Azure Service Fabric, and Microsoft Fabric, plays a crucial role in unifying the functions of DBAs, Data Engineers, Analysts, and Leaders. These tools blend transactional and analytical functionalities, enabling real-time data analysis and modernizing database management, especially in cloud environments, while platforms like Azure Service Fabric further support scalable, distributed application development. This comprehensive suite fosters robust data management, crucial for the success of AI and ML initiatives and for driving the future of integrated data management across operational and analytical needs.
On the database front, technology is also progressing – with NoSQL, DBaaS, graph databases, and even the fundamental concept of the traditional database itself evolving. SQL Server’s in-memory OLTP and in-memory columnstore features illustrate this evolution, blending OLTP and OLAP functionalities. This convergence enables real-time data analysis, providing businesses with timely insights for quick decision-making. It can’t be overlooked: since Codd, a foundational question of database design has been whether a database serves transactional or analytical purposes, on the assumption that it can’t do both. That’s no longer the case.
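To make that convergence concrete, here is a minimal T-SQL sketch (the table name and columns are hypothetical, and it assumes SQL Server 2016 or later): a memory-optimized table serving OLTP inserts that also carries a clustered columnstore index, so analytical queries run against the same live operational data.

```sql
-- Hypothetical HTAP table: memory-optimized for OLTP throughput,
-- with a clustered columnstore index for analytical scans.
CREATE TABLE dbo.SalesOrders (
    OrderID     INT IDENTITY(1,1) NOT NULL PRIMARY KEY NONCLUSTERED,
    CustomerID  INT       NOT NULL,
    OrderTotal  MONEY     NOT NULL,
    OrderDate   DATETIME2 NOT NULL,
    -- The columnstore index lets analytics and OLTP share one table.
    INDEX ccsi CLUSTERED COLUMNSTORE
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

-- Analytical query served by the columnstore while inserts continue.
SELECT CustomerID, SUM(OrderTotal) AS Revenue
FROM dbo.SalesOrders
GROUP BY CustomerID;
```

The point of the design is that no nightly ETL is needed to move transactional rows into a separate analytical store; the same table answers both workloads.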
Outside of direct technologies, Data Analysts, Data Engineers, and Data Leaders require a robust understanding of data maturity, including effective data-driven enablement and quality management. They need skills in data modeling and metadata management, coupled with the ability to curate and contextualize data for business relevance. Proficiency in data-driven enablement workflows and understanding data’s business impact are crucial for maximizing data’s value within their organizations.
To enhance their capabilities, Database Administrators (DBAs), Database Engineers, and Database Reliability Engineers require tools and strategies that streamline database management and ensure high performance and reliability. They need solutions for performance monitoring, backup, and disaster recovery to manage SQL Server environments effectively. Additionally, having tools for database replication and migration, as well as capabilities for SQL query tuning and optimization, is essential. Implementing robust security measures and maintaining compliance with regulatory standards are also crucial aspects of their roles.
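As an illustration of the query-tuning side of this work, the sketch below uses SQL Server’s dynamic management views to surface the most CPU-hungry statements – a common first step in performance triage (the TOP count is arbitrary):

```sql
-- Find the top statements by cumulative CPU time since the plan cache
-- was last cleared, using documented SQL Server DMVs.
SELECT TOP (5)
    qs.total_worker_time / 1000 AS total_cpu_ms,
    qs.execution_count,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;
```

Monitoring tools largely automate this kind of DMV analysis, but knowing what lies underneath helps DBAs and Database Reliability Engineers interpret what those tools report.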
Try these additional resources for more information: quest.com/solutions/sql-server/ and erwin.com/whitepaper/7-steps-to-maximizing-the-value-of-your-data-ebook/
About the author:
As an analyst soaring high in the world of data, Mike is your Icarus in navigating the vast skies of information. His mission: to uncover hidden insights, much like a Top Gun of data, ensuring your IT operations are as reliable as a Boeing 747. In this world of data, he’s the Sully of analysis, expertly landing complex datasets safely. Join forces with him, like the crew of the Millennium Falcon, and turn data into powerful action. Ready for takeoff?