
The Data-Dollar Disconnect

What do we mean by “data-dollar disconnect”, and why does it matter?

Our assertion is that a great deal of data is generated, yet it is often handled in a way that fails to maximise its potential value. The data-dollar disconnect is our term for this relationship: the disparity between the potential value of the data and the revenue it actually generates.

Bridging this gap is a challenge with both technical and cultural aspects. Closing it effectively requires a strategic approach and a balanced combination of skills, tools, and processes to extract and deliver value from data assets.

However, with the right resources at hand, organisations can overcome this challenge and unlock more of the potential in their data.

We’re great at collecting and storing data, but not so good at getting value from it

The term big data gained traction in the late 90s. By 2017 an article in The Economist had declared, and I paraphrase, that “data is the new oil”, alluding to the monetisation potential of data.

In 2021, the International Data Corporation (IDC) predicted that 2022 would see the world produce 94 zettabytes of data, which would cost something in the realm of $3 trillion to store. Come 2022, data monetisation was estimated to be worth a staggering $2.9 billion! Look closer, however, and the numbers are far less impressive: that is less than 0.1% of the estimated cost of storing the data.

Read another way, we’re really good at spending money to collect and store data, but we’re considerably less successful at extracting value from it. For the Tolkien fans amongst us, we look a lot like Smaug, sitting on a massive pile of gold. Unlike the dragon, however, we have to do something with our hoard to make it valuable.

Common wisdom is to gather all the data in the land and build ever more sophisticated abodes (marts, cubes, warehouses, lakes, etc.) to store it. The hope is that the domain wizards (read: data scientists) then convert it into algorithms, or data products, that make money.

That’s the theory. The reality is a bit different, as the following story highlights. Enter the CIO of a ~100-year-old German upstream energy services supplier. Eager to bring together diverse organic and inorganic datasets and unlock the value trapped within, the CIO embarked on a transformation project to create a state-of-the-art data warehouse. Two years later, the data warehouse went live and opened for business. Following the launch, the CIO reviewed the usage metrics to find that ~5% of the warehouse was being used, with peak usage only ever hitting 15%.

There are two takeaways from this story. First, more than 85% of the effort (i.e. 20 months) did not produce any returns; in other words, it was wasted. Second, the users of that 15% (the data that was actually monetised) could have started using it within 4 months but instead had to wait 20 more!

Architectural and data challenges

Data architecture is not special or different. Like other technical fields, it has fallen victim to a number of fads and fashions. We’ve moved from relational databases to NoSQL and, in some cases, back again. We’ve introduced systems that scale massively at the expense of introducing mind-boggling complexity. More recently, there’s been considerable marketing pressure to adopt the data lakehouse.

Seldom are these changes made for the right reasons. Buyers lack visibility into the specific use cases the products are built for, and sellers are incentivised to chase growth over fitness for purpose.

As with all things architecture, there are rarely right or wrong answers, rather a series of trade-offs around fitness for purpose, cost, and complexity.

In our experience, platforms are usually created with the “build it and they will come” mindset. We believe it’s beneficial to instead stop, step back, and re-evaluate the business priorities that the platform needs to service. This empowers us to make decisions that not only contribute to business outcomes but also simplify our ecosystems, resulting in a low friction environment and a faster time to market.

Assuming that platform issues can be resolved, there remain multiple challenges to handling data, both old and new. For example:

Operational complexity

Are pipelines healthy — are consumers able to get data when they need it, or are there delays? How do customers know what’s happening upstream? Do we see changes in system behaviour before our clients do, and is there a mechanism for managing reputational risk and trust in data?
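To make the kind of check this implies concrete, here is a minimal sketch in Python that flags pipelines whose latest delivery has breached a freshness threshold. The pipeline names and SLAs are hypothetical placeholders, not a prescribed monitoring design.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical freshness expectations per pipeline: how old the latest
# delivered data may be before consumers should be warned.
FRESHNESS_SLA = {
    "orders_feed": timedelta(hours=1),
    "inventory_snapshot": timedelta(hours=6),
}

def check_freshness(last_delivered: dict[str, datetime]) -> list[str]:
    """Return the pipelines that have breached their freshness SLA."""
    now = datetime.now(timezone.utc)
    stale = []
    for pipeline, sla in FRESHNESS_SLA.items():
        delivered_at = last_delivered.get(pipeline)
        if delivered_at is None or now - delivered_at > sla:
            stale.append(pipeline)
    return stale

if __name__ == "__main__":
    # Example: orders_feed last delivered 3 hours ago, inventory 2 hours ago.
    observed = {
        "orders_feed": datetime.now(timezone.utc) - timedelta(hours=3),
        "inventory_snapshot": datetime.now(timezone.utc) - timedelta(hours=2),
    }
    print(check_freshness(observed))  # ['orders_feed']
```

A check like this, surfaced to consumers as well as producers, is one way to spot problems upstream before clients do.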

Data quality

Is the data accurate, is it complete, is it delivered in a timely fashion — can clients use the data with confidence? A common idiom in machine learning is “rubbish in, rubbish out”, but this applies to all aspects of data, not just the fun stuff.
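As an illustration of what “fit for use” can mean in code, here is a minimal sketch that scores a batch of records for completeness, plausibility, and timeliness. The field names and rules are hypothetical.

```python
from datetime import datetime, timezone

def quality_report(rows: list[dict]) -> dict:
    """Score a batch of hypothetical order records on three simple dimensions."""
    total = len(rows)
    complete = sum(1 for r in rows if r.get("order_id") and r.get("amount") is not None)
    plausible = sum(1 for r in rows if isinstance(r.get("amount"), (int, float)) and r["amount"] >= 0)
    timely = sum(
        1 for r in rows
        if r.get("created_at") and r["created_at"] <= datetime.now(timezone.utc)
    )
    return {
        "completeness": complete / total if total else 0.0,  # no missing key fields
        "accuracy": plausible / total if total else 0.0,     # values in a sane range
        "timeliness": timely / total if total else 0.0,      # no future-dated records
    }

rows = [
    {"order_id": "A1", "amount": 42.0, "created_at": datetime.now(timezone.utc)},
    {"order_id": None, "amount": -5, "created_at": datetime.now(timezone.utc)},
]
print(quality_report(rows))
# {'completeness': 0.5, 'accuracy': 0.5, 'timeliness': 1.0}
```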

Data governance

Where is the data that we want? Do we know who owns it? Where do we turn if we have questions or concerns? Can the data be used safely and securely — do we need to worry about PII (personally identifiable information) or any legal frameworks? If we utilise ML (machine learning), can we explain to regulators and auditors where the data came from and how it came to be?
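A minimal sketch of the metadata these questions imply: each dataset records an owner, a PII flag, and upstream lineage, so provenance can be walked and explained. The dataset names and contacts are hypothetical, and a real catalogue would carry far more detail.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetRecord:
    name: str
    owner: str                      # who to turn to with questions or concerns
    contains_pii: bool              # does use require extra legal/privacy care?
    upstream: list[str] = field(default_factory=list)  # lineage for auditors/regulators

CATALOGUE = {
    "customer_orders": DatasetRecord(
        name="customer_orders",
        owner="orders-domain-team@example.com",
        contains_pii=True,
        upstream=["web_checkout_events", "crm_accounts"],
    ),
}

def lineage(name: str, catalogue: dict[str, DatasetRecord]) -> list[str]:
    """Walk upstream dependencies so we can explain where a dataset came from."""
    record = catalogue.get(name)
    if record is None:
        return []
    chain = []
    for parent in record.upstream:
        chain.append(parent)
        chain.extend(lineage(parent, catalogue))
    return chain

print(lineage("customer_orders", CATALOGUE))  # ['web_checkout_events', 'crm_accounts']
```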

Is the data easy to understand?

To derive value, the data needs to be understood beyond the immediate requirement — beyond the person or team who created the database table or defined the event on Kafka. Data dictionaries are often used to mitigate this problem, but, like all data resources, they tend to drift because keeping them up to date with changes within a sprawling system is tedious.
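One way to reduce that drift is to derive the dictionary from the live schema rather than maintain it by hand. The sketch below illustrates the idea against an in-memory SQLite table standing in for a real source system; column descriptions would still need human curation.

```python
import sqlite3

# An in-memory SQLite table stands in for the real source system.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE orders (
        order_id   TEXT PRIMARY KEY,
        amount     REAL NOT NULL,
        created_at TEXT NOT NULL
    )
""")

def data_dictionary(table: str) -> list[dict]:
    """Generate dictionary entries directly from the schema, so they cannot drift."""
    cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
    return [
        {"column": name, "type": ctype, "nullable": not notnull}
        for _, name, ctype, notnull, _, _ in cols
    ]

for entry in data_dictionary("orders"):
    print(entry)  # one dict per column: name, declared type, nullability
```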

Associated costs

The above points all have cost implications. People and computers are seldom cheap and rarely free. In summary, there are myriad problems and each comes with a potentially hefty price tag.

Mindsets for reducing risk and driving value

We have found two mindsets particularly useful for de-risking project delivery and driving value for our clients:

  • Product Thinking: By taking a customer-centric approach, we deliver more (and smaller) composable components faster, which enables our clients to leverage more of their data assets, sooner.
  • Platform Thinking: By creating flexible digital platforms that are not bound to business domains, we can create environments that allow companies to change direction in reaction to their needs and market demands.

Neither of these approaches is new, but what do they mean in terms of working with data?

Imagine enterprise data as a swimming pool full of Lego bricks of all different colours. Picture a child standing, staring longingly into the pool. Product Thinking is asking the child what toy they want, say a red Lego truck. Platform Thinking is sorting the Lego bricks by shape and colour so you can build different toys quickly.

Product Thinking = data products

A product should solve a specific want or need that stems from a pain point — it is the antithesis of the “build it and they will come” mindset. Product Thinking is a human-centric approach to product design. It places an emphasis on understanding what users need, defining the value proposition and designing a solution that can deliver that value. When we say “data products” we tend to mean reports, dashboards, models or applications.

Successful data products typically take a complex set of inputs, manipulate their shape, and distil a simple story that is presented to the end user in an easy-to-understand and intuitive way while providing valuable insights.

By placing the customer first, the way that we engage with the world changes. A customer has to be identified, requirements discovered, and a plan of operation created — gone are the days of “build it and they will come”. This ensures that we build the right things, that they behave as expected and, ultimately, deliver value, financial or otherwise.

A secondary benefit of this approach, much like Platform Thinking, is that data products are flexible: they can be composed, like functions in our code. This is a value multiplier.
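To illustrate the “composed like functions” point, here is a minimal sketch with two hypothetical data products: an aggregation, and a ranking built directly on its output.

```python
def sales_by_store(transactions: list[dict]) -> dict[str, float]:
    """Data product 1: aggregate raw transactions into per-store revenue."""
    totals: dict[str, float] = {}
    for t in transactions:
        totals[t["store"]] = totals.get(t["store"], 0.0) + t["amount"]
    return totals

def top_stores(revenue: dict[str, float], n: int = 3) -> list[tuple[str, float]]:
    """Data product 2: a ranking built on top of the first product's output."""
    return sorted(revenue.items(), key=lambda kv: kv[1], reverse=True)[:n]

transactions = [
    {"store": "Berlin", "amount": 120.0},
    {"store": "Munich", "amount": 80.0},
    {"store": "Berlin", "amount": 40.0},
]
# The second product consumes the first, just as a function consumes another's result.
print(top_stores(sales_by_store(transactions), n=2))
# [('Berlin', 160.0), ('Munich', 80.0)]
```

Because each product has a well-defined input and output, a new product (or dashboard) can be built by composition rather than starting from raw data each time.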

Building data platforms with Platform Thinking

Platform Thinking is a strategic approach to building technology solutions that helps deliver architectures that are flexible and scalable.

Through examining business needs and aligning key technologies, we create simpler environments and clear operating models. Simple, low friction environments enable rapid experimentation, which stimulates innovation and, in turn, leads to a happier culture and faster time to market.

When we talk about flexibility, we are weighing the merits of solving a single specific use case against enabling an organisation to support a variety of data-driven activities, e.g. business intelligence, analytics, and ML. By doing so, not only do we solve the organisation’s current problems but we also address its ability to react faster to changing business needs and market conditions — there’s rarely a need to couple the platform to a specific use case or industry, even in highly regulated environments.

It is critical to call out that loose coupling between the platform and the use case does not imply that the platform is not led by the use case; it very much is. However, the components and capabilities are designed to be modular and composable to avoid monolithic construction, limit the blast radius of changes, and also to promote reuse.

In order to deliver flexible platforms, there are three areas we lean into:

Self-service, fully automated infrastructure

Teams work best when they are empowered to act independently. It’s crucial that delivery teams can provision resources as and when they are needed with minimal process overhead. This requires equipping them with the necessary tools, as well as putting safeguards in place.
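As a sketch of what “minimal process overhead with safeguards” could look like, the example below auto-reviews a declarative resource request against guardrails instead of routing it through a manual ticket queue. The limits, fields, and regions are hypothetical.

```python
# Hypothetical guardrails the platform team encodes once, rather than
# re-litigating on every request.
GUARDRAILS = {
    "max_storage_gb": 500,
    "allowed_regions": {"eu-west-1", "eu-central-1"},
}

def review_request(request: dict) -> tuple[bool, str]:
    """Approve or reject a team's declarative resource request automatically."""
    if request["storage_gb"] > GUARDRAILS["max_storage_gb"]:
        return False, "storage quota exceeded; needs architectural review"
    if request["region"] not in GUARDRAILS["allowed_regions"]:
        return False, f"region {request['region']} not approved for this data"
    return True, "auto-approved; provisioning can proceed"

print(review_request({"team": "pricing", "storage_gb": 200, "region": "eu-west-1"}))
# (True, 'auto-approved; provisioning can proceed')
```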

Policy automation

A low friction environment is key to innovation. Through automation of security and governance, we enable teams to experiment while maintaining a healthy risk posture. As teams onboard new members, permissions are automatically managed. This enables them to contribute immediately, rather than wait for a series of ticket approvals.
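A minimal sketch of the onboarding idea: permissions are derived from team membership rather than granted ticket by ticket. The role names and grants here are hypothetical.

```python
# Hypothetical mapping from teams to the grants their members receive.
TEAM_ROLES = {
    "pricing-analytics": ["read:sales_mart", "write:pricing_sandbox"],
    "ml-platform": ["read:feature_store", "deploy:model_registry"],
}

def effective_permissions(user: str, memberships: dict[str, list[str]]) -> set[str]:
    """Union of grants for every team the user belongs to; joining a team is enough."""
    grants: set[str] = set()
    for team, members in memberships.items():
        if user in members:
            grants.update(TEAM_ROLES.get(team, []))
    return grants

memberships = {"pricing-analytics": ["alice"], "ml-platform": ["alice", "bob"]}
print(sorted(effective_permissions("alice", memberships)))
# ['deploy:model_registry', 'read:feature_store', 'read:sales_mart', 'write:pricing_sandbox']
```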

Domain-oriented

A frequent complaint when working with data is not knowing who owns what or whom to address when there are issues. A domain-driven approach clarifies ownership and responsibilities. It defines the seams that create empowerment within boundaries. Having clear lines of communication prevents siloes and reduces the need for complicated rules of engagement between teams.
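To show how a domain registry can make ownership a lookup rather than a hunt, here is a minimal sketch with hypothetical domains and contact channels.

```python
# Hypothetical registry: every dataset prefix maps to the domain team that owns it.
DOMAIN_OWNERSHIP = {
    "orders":    {"team": "order-management", "contact": "#orders-support"},
    "inventory": {"team": "supply-chain",     "contact": "#inventory-help"},
}

def owner_of(dataset: str) -> str:
    """Answer 'who do I talk to about this dataset?' from the registry."""
    domain = next(
        (info for prefix, info in DOMAIN_OWNERSHIP.items() if dataset.startswith(prefix)),
        None,
    )
    if domain is None:
        return "unowned; escalate to the platform team"
    return f"{domain['team']} (reach them at {domain['contact']})"

print(owner_of("orders_daily_snapshot"))
# order-management (reach them at #orders-support)
```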

Harnessing the power of Product and Platform Thinking

Data scientists in a €45bn European mega-retailer harnessed the power of this duality to achieve spectacular benefits.

They had developed several ML models for optimising inventory, improving store layouts to drive footfall, that sort of thing. In other words, data products focused on solving a business need and generating returns.

They needed a way to deploy these models, reuse them to build more complex ones and to do so faster. So we helped them build a platform that:

  • Provided them with a self-serve infrastructure to deploy the models themselves.
  • Enabled discovery of other models and their reuse to build new ones faster.
  • Provisioned tools for visualisation, rapid design, automated testing and other delivery automation functions.

As a result, they started building more models in less time and helping the business make money faster by saving costs, increasing footfall, improving margins, optimising promotions, and in many other ways.

We combine Product Thinking and Platform Thinking to deliver lasting value for our partners

Platform Thinking enables us to create a low friction environment in which teams can deliver at pace. Ease of use encourages rapid development of data products, each delivering quantifiable customer value. By continuously iterating on this pattern we can help close the data-dollar disconnect.

Product and Platform Thinking, applied to data, ensure rapid delivery of value to users, at scale. This duality drives adoption, creates flexibility, and enables scalability. It allows you to win users as well as adapt to business needs and market conditions.

Working alongside our clients, we have consistently leveraged this dual approach to deliver value, upskill teams, and have fun! If you would like to talk about our approaches, please don’t hesitate to reach out.
