Sep 2024

Data Observability Takes Centre Stage at Big Data LDN

Written by Positive Team

In a data-centric world, it’s no surprise that observability took centre stage at Big Data London 2024.

The discussions revolved around how organisations can guarantee the trustworthiness, timeliness, and relevance of their data, which is precisely where observability proves invaluable. This process extends beyond merely monitoring data flows; it fosters a culture of reliability and integrity in data practices.

Attending the conference provided a unique opportunity to delve deeper into this evolving field. Talks by industry leaders like Ramon Chen and Mahesh Kumar from Acceldata and Pete Williams from Penguin Random House UK stressed the growing importance of data observability in ensuring data quality, operational efficiency, and cost savings. It was refreshing to see industry experts align on the idea that observability is not just an add-on but a necessity in today’s data-driven landscape.

The Essential Role of Active Metadata

Ramon Chen’s session explained how data observability is more than just a trend; it’s essential for maintaining trust in data across an organisation. With data pipelines becoming increasingly complex, organisations often face issues like data drift (where the meaning or behaviour of data changes over time), schema drift (unexpected changes in the structure of the data), and pipeline inefficiencies (bottlenecks that slow down data processing or lead to errors). Chen emphasised how traditional data tools often fail to catch these issues early, causing problems at the consumption layer, where they’re much harder and more costly to fix. By adopting observability practices, businesses can track data in real time, spot anomalies earlier, and optimise their infrastructure, which leads to significant cost savings and increased reliability.
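To make schema drift concrete, here is a minimal sketch of the kind of early check Chen argues traditional tools miss: comparing an incoming batch's columns against an expected schema before the data reaches the consumption layer. The schemas and column names below are purely illustrative, not from any real pipeline or product.

```python
# Minimal sketch of a schema-drift check. A batch is represented as a
# dict of column name -> type name; the expected schema and incoming
# batch below are hypothetical examples.

def detect_schema_drift(expected: dict, observed: dict) -> list:
    """Return a list of human-readable drift findings."""
    findings = []
    for col, dtype in expected.items():
        if col not in observed:
            findings.append(f"missing column: {col}")
        elif observed[col] != dtype:
            findings.append(f"type change in {col}: {dtype} -> {observed[col]}")
    for col in observed:
        if col not in expected:
            findings.append(f"unexpected column: {col}")
    return findings

expected_schema = {"order_id": "int", "amount": "float", "region": "str"}
incoming_batch = {"order_id": "int", "amount": "str", "channel": "str"}

for finding in detect_schema_drift(expected_schema, incoming_batch):
    print(finding)
```

Running a check like this at ingestion time, rather than after reports break, is the shift from reactive to proactive quality that the session described.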

A key takeaway from Chen’s talk was how active metadata plays a crucial role in observability. Instead of merely cataloguing data passively, organisations can use active metadata to generate proactive alerts and insights, which can prevent issues before they escalate. This shift toward proactive data quality highlights the future of data management, where observability goes beyond quality checks to ensure continuous pipeline health and operational efficiency.
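As a hedged illustration of the passive-to-active shift, the sketch below turns catalogue metadata (when a table was last loaded) into a proactive freshness alert by comparing it against an expected refresh interval. The table names, timestamps, and intervals are invented for the example.

```python
# Sketch of "active metadata": instead of only recording when a table
# was last loaded, compare that timestamp against an expected refresh
# interval and alert before downstream consumers hit stale data.
# All table names and intervals here are illustrative.

from datetime import datetime, timedelta

def freshness_alerts(metadata: dict, now: datetime) -> list:
    """metadata maps table -> (last_loaded, max_staleness)."""
    alerts = []
    for table, (last_loaded, max_staleness) in metadata.items():
        if now - last_loaded > max_staleness:
            alerts.append(f"{table} is stale: last loaded {last_loaded:%Y-%m-%d %H:%M}")
    return alerts

now = datetime(2024, 9, 20, 12, 0)
catalog = {
    "sales_daily": (datetime(2024, 9, 19, 2, 0), timedelta(hours=26)),
    "web_events": (datetime(2024, 9, 20, 11, 30), timedelta(hours=1)),
}
for alert in freshness_alerts(catalog, now):
    print(alert)
```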

The Three E Framework for Building Trust in Data

Pete Williams’ session built on these ideas, introducing the Three E Framework for building trust through observability: Establish, Enable, and Exploit. He explained that trust in data starts with building a robust platform that ensures pipeline transparency and data governance from day one (Establish). To fully harness the power of observability, organisations must also Enable their teams by fostering data literacy—ensuring that everyone within the organisation understands how to use and benefit from the data platform. This involves getting teams aligned on the value of data observability and making sure they’re equipped to leverage the insights it provides. Finally, in the Exploit phase, organisations combine their technical foundation and knowledgeable workforce to fully use observability for better decision-making and operational efficiency. This perspective resonated deeply with me; true transformation requires buy-in from all levels of an organisation.

Managing Data Entropy with Observability

Mahesh Kumar from Acceldata emphasised the importance of observability in managing data entropy. He explained how the increasing volume and variety of data types, coupled with higher business demands, create chaos within data pipelines. Observability is the key to managing this entropy. He shared real-world examples of companies that saved millions by reducing data drift, pipeline failures, and cloud costs. From improving sales forecasts to enhancing data engineer productivity, Mahesh demonstrated that observability isn’t just about data quality—it directly impacts the bottom line. This realisation is vital; organisations that fail to embrace observability may find themselves lagging behind their competitors.

Embracing observability is not just a technical enhancement, but a fundamental shift in how organisations operate. The insights gained from Big Data London reaffirmed my belief that investing in observability can lead to more informed decision-making, greater efficiency, and a stronger competitive edge. 

Observability is a foundational pillar for organisations aiming to navigate the complexities of modern data environments. By incorporating active metadata and promoting a culture of transparency, businesses can not only measure but also enhance their observability practices. This holistic approach empowers organisations to build trust in their data, adapt swiftly to changes, and unlock new levels of efficiency and innovation.
