Taming the data beast for better Digital CX: one step at a time

Data is key to unlocking exceptional customer experience, yet enterprises face unprecedented data sprawl that challenges them to find meaning in the mess. To avoid the common pitfalls around data management and presentation, enterprises must break down their silos and build enduring, scalable solutions that let businesses make sense of their information now and in the future.

The increasing volume and variety of data flooding today’s enterprises makes information analysis more difficult with every passing year. Often the most data-intensive parts of an organisation have been left to the stewardship of highly technical analysts and engineers who are masters of their own domain, but who don’t have the tools or the skill set to help colleagues derive business value from “their” data.

Deriving value from data becomes a boardroom issue because of the triple pressures facing the enterprise: the need to keep the customer experience ahead of the competition; the swell of data, which mounts hourly; and the growing use of machine learning and AI by market-leading competitors, who are showing what best practice looks like when it comes to deriving intelligence from information.

Helpful tips for data management and presentation: one step at a time

In our engagements with enterprises, we are frequently brought in to help transformation executives and their developer colleagues tackle some truly enormous data challenges: decades of dispersed and siloed data that must urgently be normalised, combined and migrated to strict deadlines; tremendously valuable data sets on which (literally) billion-dollar investment decisions are made, but which have become so difficult to access and manage that the company controlling the data can grow no further.

Here are the common pitfalls we’d advise executives to watch for if they are planning to wrangle their own formidable data sets into shape.

  1. Recognise that your current tools are from a different era. When enterprises take a hard look at their systems, they often discover that the technology they are using to manage their data is 20 years old and was adopted in an era of small data. These tools can no longer support the modern, global and dynamic enterprise, particularly as it looks to exploit insights from the data and improve customer experiences.
  2. Use defensive requirements when proving your business case. There may still be boardroom colleagues who doubt whether now is the time to invest in data management platforms for the next decade. Prove your case in part by pointing to the governance and security requirements mandated by regulation and policed by hawk-eyed regulators. Loss or exposure of customer data can carry sizeable fines: more than large enough to warrant investing now in fit-for-purpose, secure data management systems.
  3. Be wary of adopting new data management tools that suit immediate requirements only – especially ‘off-the-peg’ packages, which are often built on inflexible foundations and assumptions from another era. If you’re tackling your data because of firefighting or a short-term emergency, the pressure may be on to find a quick fix. But the most effective data programs are those that cater to today’s inputs and tomorrow’s unknowns. Make sure the solution you implement provides end-to-end visibility, integrates across silos, and consolidates the unique variety of data your organisation uses into useful visualisations on which business decisions can be made.
  4. Don’t forget the end user. Remember: the people viewing the data aren’t necessarily the data stewards – they may be business colleagues or even end customers to whom the data will be exposed via a self-service web portal. Invest the time to get the visualisation right and you’ll automatically unlock more value from your data because you’ve enabled comprehension of those data sets by a wider range of users. This is a great way to help break down the silos that develop in every growing organisation.
  5. Don’t work in isolation from other departments. This can be a huge temptation because, as already mentioned, your organisation’s data experts may be used to working in splendid isolation. But collaboration is key, even if you’re afraid it will slow down the process. You need as much input as possible, and a variety of perspectives on the data, to ensure that what you’re building now isn’t just scalable but is also of genuine use to colleagues right across the organisation.

Collaboration is the key to unlocking data’s true potential

We’ve been fortunate to see the business value that enterprises can realise once their data is tamed, secure and available. In one such engagement, we worked with a company that provides market intelligence and analytical insight based on an array of data sources. The company was struggling because its data was stored and processed in ad-hoc formats and was difficult for its in-house experts to use. Although the data supported a valuable business process, the technology surrounding the data was unwieldy and not fit for purpose to meet the growing needs of the business.

We helped the company modernise its entire data platform. Today, that data offers much more flexibility, better service to internal users, and new revenue opportunities, as external customers can now see and work with the data more easily.

A key success factor in that engagement was the client’s commitment not to have NearForm resolve the data challenge independently, but to collaborate with its internal team and build its capability, so that the team could create and manage the new platform, and similar products, themselves.

Enable a self-serve model with a user-centric approach

In another client engagement, which you can read about in more detail here, we helped one of the world’s largest media companies tackle a giant data challenge: migrating decades of content from multiple international territories into a single content management system, and doing so in a timely way.

The solution there wasn’t a single, magical piece of software, but rather close collaboration with the client that involved reimagining the entire migration. In the end, we co-created a new tool that walks each territory and its data owners through the process of normalising their own data, allowing a rapid and accurate transition of their content into the new unified content management system.

A key success factor in that engagement was recognising and building upon the domain knowledge of the people closest to the data – the experts in each global territory who had created and owned that data – and enabling a self-service model. In the end, the people who knew the data best were able to ready it for a smooth migration, without having to understand the details of how the migration happened.
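
To make the idea concrete, here is a minimal sketch of what a self-service normalisation check can look like. The record shape, field rules and function names below are hypothetical illustrations, not the actual tool built for that engagement; the point is that validation produces plain-language messages the data owner can act on without knowing anything about the migration pipeline.

    // A hypothetical self-service normalisation check (Node.js).
    // Validate one record against the unified CMS's expectations and
    // return human-readable issues the data owner can fix themselves.
    function validateRecord(record) {
      const issues = [];
      if (!record.title || !record.title.trim()) {
        issues.push({ id: record.id, field: 'title', message: 'Title is empty' });
      }
      if (!/^\d{4}-\d{2}-\d{2}$/.test(record.publishedAt)) {
        issues.push({
          id: record.id,
          field: 'publishedAt',
          message: `"${record.publishedAt}" is not a YYYY-MM-DD date`,
        });
      }
      if (!/^[a-z]{2}-[A-Z]{2}$/.test(record.locale)) {
        issues.push({ id: record.id, field: 'locale', message: 'Locale must look like de-DE' });
      }
      return issues;
    }

    // The owning territory runs the check, sees exactly what to fix and
    // resubmits -- no knowledge of the migration pipeline required.
    console.log(validateRecord({
      id: '42',
      title: 'Jahresrückblick',
      publishedAt: '03/11/2018', // ambiguous local format: flagged, not guessed at
      locale: 'de-DE',
    }));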

Before you jump into the taming pen, plan your next steps

The power to tame and derive value from your enterprise’s data is within your grasp, but it starts with standing back. Recognise the assumptions you’re making about the data and how you use it, and question those assumptions.

  • Does the data really need to be stored, accessed and managed the way it is today?
  • Could the data be opened to other departments or direct customers and deliver value in that way?
  • Does your team have the necessary skills to code an entirely new platform where that data will reside and interface with other systems, people, and automation or intelligence tools?
  • What could you enable your internal and external customers to do if they could just see the intelligence locked away in your data?

Your organisation’s growing data stores are a unique source of value for you and your customers. Finding new and optimal ways to store, query and integrate that data is a doable task; it just requires a recognition that the end result will be worth the effort.

The business benefits of unlocking the power of your data can be transformative. One of the most recent data visualisation programs we’ve worked on is Clinic.js, a novel open source performance toolset that uses data analysis and visualisation to show engineers where applications could be made faster, with instant AI-enhanced recommendations. NearForm consultants have already used it not only to improve some of the world’s most complex Node.js applications for clients, but also to make the Node.js platform itself even faster. We advise the same approach for any visualisation program: empower the end user by putting their needs first.
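
As an illustrative sketch (not drawn from any specific engagement), the snippet below shows one way an engineer might exercise Clinic.js Doctor against a small Node.js service; the clinic doctor and autocannon commands in the comments follow the tool’s documented CLI usage. The toy server deliberately blocks the event loop so that Doctor has something to flag.

    // A toy Node.js server with a deliberately blocking handler, saved as
    // server.js. Typical Clinic.js Doctor usage (assuming Node.js and npm
    // are installed):
    //
    //   npm install -g clinic autocannon
    //   clinic doctor --on-port 'autocannon localhost:3000' -- node server.js
    //
    // Doctor samples the event loop, CPU and memory while the load runs,
    // then opens an HTML visualisation with its recommendations.
    const { createServer } = require('http');

    const server = createServer((req, res) => {
      // Busy-wait for 50 ms: this blocks the event loop on every request,
      // exactly the kind of issue Doctor is designed to surface.
      const end = Date.now() + 50;
      while (Date.now() < end) { /* intentionally empty */ }
      res.end('done\n');
    });

    server.listen(3000);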

At NearForm, we have experience in designing and developing efficient data architectures for data consolidation, integration and visualisation. We’ve built robust, scalable systems for organisations with thousands of users across dispersed sites requiring immediate access to real-time data. Talk to us today to discuss how we can collaborate on addressing your data challenges and opportunities.

Alan Slater is NearForm’s Senior Data Visualiser, with 10 years’ experience of teasing actionable intelligence out of unwieldy datasets and the heads of experts, turning it into usable tools and visualisations. He describes data visualisation development as an exploration, punctuated by joyful moments of discovery where a user or stakeholder sits down with a prototype and immediately sees a new nuance or opportunity in data they may have worked with for years.

Keep up to date with our latest tech observations and industry insights in our monthly newsletter.