Data professionals live in a complex, chaotic world. The modern data stack is sprawling. New cloud data toolchains are fragmented. Data architecture patterns are diverse and complicated. And data itself, of course, is diverse, always multiplying and forever changing.
On top of all this, the process of ingesting, storing, transforming, predicting, visualizing and governing data is highly distributed across various people and departments in your organization.
It’s no surprise, then, that data professionals’ jobs are chaotic and stressful. There’s a constant fear that somewhere along the journey data takes from source to value, everything will suddenly break, and you will be left holding the bag.
Something is missing from our data systems: the ability to identify, measure and quantify expectations versus reality in production data systems. What is happening now, and what should be happening? What is the variance between the two? Is data on time, or late? Is it trustworthy? Will our customers find a problem before we do?
The missing piece that connects data system expectations and reality is the ‘data journey.’ In this session, Christopher Bergh explains what the data journey is, where it breaks down and how organizations can use DevOps principles (DataOps) and observability to make sure data is truly driving an organization’s success.