
Solving the “data-dollar disconnect” unleashes the value of your data

Organisations require a robust business strategy to unlock the revenue-generating potential of their data

Organisations are spending trillions on storing data but monetising less than 0.1% of it. Bridge this “data-dollar disconnect” with a dual-thinking business strategy to boost revenue.

The data-dollar disconnect is the disparity between the value of data and the amount of revenue it generates for an organisation. Bridging this gap requires a balanced combination of skills, tools and processes to successfully extract and deliver value from data assets. 

A business strategy that integrates Product Thinking and Platform Thinking is one that is customer-centric and flexible. This dual approach enables organisations to leverage more of their data, extract its value more quickly and react better to changes in market demands.

Data isn’t being successfully monetised

In 2021, the International Data Corporation (IDC) predicted that the world would produce 94 zettabytes of data in 2022, at a storage cost of around $3 trillion. Yet data monetisation in 2022 was estimated to be worth just $2.9 billion. While nearly $3 billion may sound large, in relative terms it is far less impressive: on average, less than 0.1% of that data was successfully monetised!

Organisations are really good at spending money to collect and store data, but considerably less successful at extracting value and getting enduring business impact from it. For the Tolkien fans amongst us, we look a lot like Smaug, sitting on a massive pile of gold. Unlike the dragon, however, we have to do something with our hoard to make it valuable.

Common wisdom is to gather all the data available and build ever more sophisticated abodes (marts, cubes, warehouses, lakes and so on) to store it. The intent is that data scientists then convert it into algorithms, or data products, that can be monetised.

That’s the theory. The reality is a bit different, as the following anecdote highlights. 

Enter the CIO of a 100-year-old German upstream energy services supplier. Eager to bring together diverse datasets and unlock the value trapped within, the CIO embarked on a transformation project to create a state-of-the-art data warehouse. Two years later, the data warehouse went live and opened for business. Following the launch, the CIO reviewed the usage metrics and found that only 5% of the warehouse was being used, with peak usage never exceeding 15%. More than 85% of the effort produced no returns; in other words, it was wasted.

Investing in data systems is not in itself enough to guarantee a return.

Architectural and data challenges

Like other technical fields, data architecture has seen many fads and fashions. We’ve moved from relational databases to NoSQL and, in some cases, back again. We’ve introduced systems that scale massively at the expense of introducing mind-boggling complexity. More recently, there’s been considerable marketing pressure to adopt the data lakehouse.

Seldom are these changes made for the right reasons. Buyers lack visibility of the specific use cases the products are built for, and sellers are incentivised for growth over fitness for purpose.

As with all things architecture, there are rarely right or wrong answers, rather a series of trade-offs around fitness for purpose, cost and complexity.

Platforms are usually created with the “build it and they will come” mindset. At Nearform, we believe it’s beneficial to instead apply a strategy whereby you stop, step back and re-evaluate the business priorities that the platform needs to service. We are then empowered to make decisions that not only contribute to business outcomes but also simplify our ecosystems, which results in a low-friction environment and a faster time to market.

Two mindsets for unlocking the value of data

At Nearform, we have found two mindsets particularly useful for de-risking project delivery and driving value for our clients:

  • Product Thinking: By taking a customer-centric approach, we deliver smaller, composable components faster, which enables our clients to leverage more of their data assets sooner.

  • Platform Thinking: By creating flexible digital platforms, we can create environments that allow companies to change direction in reaction to their needs and market demands.

What do these approaches mean in terms of working with data?

Imagine enterprise data as a swimming pool full of Lego bricks of all different colours. Picture a child standing, staring longingly into the pool. Product Thinking is asking the child what toy they want, say a red Lego truck. Platform Thinking is sorting the Lego bricks by shape and colour to build different toys quickly.

Product Thinking enables data products

A product should solve a specific want or need that stems from a pain point — it is the antithesis of the “build it and they will come” mindset. Product Thinking is a human-centric approach to product design. It emphasises understanding what users need, defining the value proposition and designing a solution that can deliver that value. 

Successful data products typically take a complex set of inputs, reshape them and distil a simple story that is presented to the end user in an easy-to-understand, intuitive way, while providing real-time insights that bring real value. In practice, “data products” tend to be reports, dashboards, models or applications.

By placing the customer first, the way we engage with the world changes. A customer has to be identified, requirements discovered and a plan of operation created — gone are the days of “build it and they will come”. This ensures that we build the right things, that they behave as expected and, ultimately, deliver value, financial or otherwise.

A secondary benefit of this approach is that data products are flexible: they can be composed, like functions in code. This is a value multiplier.
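The composition idea can be sketched in a few lines of Python, assuming each data product is a plain function over records. The product names and sample data here are purely illustrative, not taken from any real engagement:

```python
# A minimal sketch of composable data products: each one is a function,
# and bigger products are built by chaining smaller ones.
from functools import reduce

def clean(records):
    """Drop records missing the required 'store_id' field."""
    return [r for r in records if r.get("store_id") is not None]

def daily_totals(records):
    """One data product: aggregate sales per store."""
    totals = {}
    for r in records:
        totals[r["store_id"]] = totals.get(r["store_id"], 0) + r["sales"]
    return totals

def top_stores(totals, n=2):
    """A second product built on the first: rank stores by sales."""
    return sorted(totals, key=totals.get, reverse=True)[:n]

def compose(*fns):
    """Chain data products like functions: the value multiplier."""
    return lambda x: reduce(lambda acc, f: f(acc), fns, x)

pipeline = compose(clean, daily_totals, top_stores)
sales = [
    {"store_id": "A", "sales": 120},
    {"store_id": "B", "sales": 340},
    {"store_id": None, "sales": 50},
    {"store_id": "A", "sales": 200},
]
print(pipeline(sales))  # ['B', 'A']
```

Because each product has a clear input and output, a new product like `top_stores` can be assembled from existing ones rather than built from scratch.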

Building data platforms with Platform Thinking

Platform Thinking is a strategic approach to building technology solutions that helps deliver architectures that are flexible and scalable.

By examining business needs and aligning key technologies, we create simpler environments and clear operating models. Simple, low-friction environments enable rapid experimentation, which stimulates innovation and, in turn, leads to a happier culture and faster time to market.

When we talk about flexibility we are focusing on the merits of solving a single specific use case versus enabling an organisation to support a variety of data-driven activities — e.g. business intelligence, analytics and machine learning (ML). By doing so, not only do we solve the organisation’s current problems but we also address its ability to react faster to changing business needs and market conditions — there’s rarely a need to couple the platform to a specific use case or industry, even when highly regulated.

It is critical to call out that loose coupling between the platform and the use case does not imply that the platform is not led by the use case; it very much is. However, the components and capabilities are designed to be modular and composable to avoid monolithic construction, limit the blast radius of changes and also to promote reuse.

To deliver flexible platforms, there are three areas Nearform leans into:

Self-service, fully automated infrastructure

Teams work best when they are empowered to act independently. It’s crucial that delivery teams can provision resources as and when they are needed with minimal process overhead. This requires equipping them with the necessary tools, as well as putting safeguards in place.
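One way to picture the combination of self-service and safeguards is a provisioning request that is validated automatically instead of waiting on manual approval. This is a hypothetical sketch; the guardrail limits and request fields are invented for illustration:

```python
# Hypothetical guardrails for self-service provisioning: teams request
# resources directly, and policy checks replace manual sign-off.
GUARDRAILS = {
    "max_cpus": 64,
    "allowed_regions": {"eu-west-1", "eu-central-1"},
}

def validate_request(request):
    """Return a list of policy violations; an empty list means auto-approved."""
    violations = []
    if request["cpus"] > GUARDRAILS["max_cpus"]:
        violations.append("cpu quota exceeded")
    if request["region"] not in GUARDRAILS["allowed_regions"]:
        violations.append("region not allowed")
    return violations

print(validate_request({"cpus": 16, "region": "eu-west-1"}))   # []
print(validate_request({"cpus": 128, "region": "us-east-1"}))  # two violations
```

A request within the guardrails provisions immediately; one outside them fails fast with a clear reason, rather than sitting in an approval queue.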

Policy automation

A low-friction environment is key to innovation. Through automation of security and governance, we enable teams to experiment while maintaining a healthy risk posture. As teams onboard new members, permissions are managed automatically, enabling them to contribute immediately rather than wait for a series of ticket approvals.
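The onboarding behaviour described above can be sketched as policy-as-code: permissions are derived from team membership, so adding someone to a team grants access with no tickets involved. The team names, members and permission strings below are all hypothetical:

```python
# Illustrative policy-as-code: permissions follow team membership,
# so onboarding a new member grants access automatically.
TEAM_POLICIES = {
    "analytics": {"warehouse:read", "dashboards:write"},
    "ml-platform": {"warehouse:read", "models:deploy"},
}

def permissions_for(member, memberships):
    """Union of the policies for every team the member belongs to."""
    granted = set()
    for team in memberships.get(member, []):
        granted |= TEAM_POLICIES.get(team, set())
    return granted

memberships = {"aoife": ["analytics"], "jan": ["analytics", "ml-platform"]}
print(sorted(permissions_for("jan", memberships)))
```

Onboarding then becomes a one-line change to `memberships`; the effective permissions update themselves, and offboarding works the same way in reverse.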

Domain-oriented

A frequent complaint when working with data is not knowing who owns what or whom to address when there are issues. A domain-driven approach clarifies ownership and responsibilities. It defines the seams that create empowerment within boundaries. Having clear lines of communication prevents silos and reduces the need for complicated rules of engagement between teams.

When a retailer harnesses Product and Platform Thinking

Data scientists in a €45bn European mega-retailer harnessed the power of this duality to achieve spectacular benefits.

They had developed a large number of ML models for optimising inventory and improving store layouts to drive footfall and shopper engagement. In other words, data products were focused on solving a business need and generating returns.

They needed a way to deploy these models, reuse them to build more complex ones and to do so faster. Nearform helped them build a platform that:

  • Provided a self-serve infrastructure to deploy the models themselves.

  • Enabled discovery of existing models to accelerate the build of new models.

  • Provisioned tools for visualisation, rapid design, automated testing and other delivery automation functions.


The data scientists started building more models in half the time, helping the business make money faster by saving costs, increasing footfall, optimising promotions and improving margins.

Dual thinking extracts lasting value from your data

Applying Product Thinking and Platform Thinking to an organisation’s data delivers rapid value at scale. This combination of thinking drives adoption, creates flexibility and enables scalability. It solves the “data-dollar disconnect” and allows organisations to win customers as well as adapt to business needs and market conditions.
