Is a high-performing data architecture top of your digital agenda?

Conor O’Neill, Chief Product Officer at NearForm, recently spoke to the IT Directors Forum in the UK on how a high-performing data architecture can drive overall business performance and value – and what characteristics that architecture most needs. 

Much has been said about the importance of security, availability, integration, analysis and access when it comes to the role of data in today’s enterprise world – but you talk about data performance and the need to embrace performance as a philosophy.

Is this in the context of a piece of software code or of wider digital transformation in an organisation?

I’m really talking about the latter. It’s true to say that, in the early years, we grew the NearForm reputation based on our ability to solve major software performance problems, but we now address performance improvements across the entire enterprise system architecture from top to bottom, from people to processes to technology solutions. The imperative goes far beyond just code.

This organisational imperative is ultimately being driven by digitalisation across all industries, changing the way we manufacture, communicate, learn, work, and do business, and by the role of data as a growth driver in all of this.  Our message now to clients is this: you have to move “performance” to the top of your priority list, look at it in broad terms and bake it in from the start – just like security.

You’ve mentioned how data is what’s driving the performance imperative and the topic of your talk was about how companies can make their data architecture high-performing. Is that a key success factor?

I don’t want this to put off any decision makers who think, “data isn’t my thing.” What’s crucial to remember is that your business can’t be high-performing unless your data is. I’m talking about data that’s fast and available so that every person or system that draws on that data never needs to wait. They can get exactly what they want, first time, every time.

Can you give an example of “performance” in this wider sense?

Think about what your people could accomplish if they could complete a commonly-executed business process in five minutes instead of 45 minutes. We had a customer who was trying to achieve this performance boost in one of its web-based workflows. It was entirely doable, but everybody in the team, client-side and NearForm side, had to decide that this performance metric was our focus: it was front and centre.

If all the data that the workflow uses is computerised, how could anything take 45 minutes? Is this a legacy systems problem?

Yes, partly. It was about both systems and processes. I’ll give you another example: think about how modern shopping websites work. If you want to offer a personalised experience at checkout – this is something we’ve done for several clients – you have to pull in data from an array of different systems. But it gets more complex than that, because those systems could have come into your organisation through one or more companies that you’ve acquired, or from separate business units, or from outside partners. These systems might be legacy systems, but they might simply be wildly different, with a huge variance in their APIs and response times. That complexity and heterogeneity can be a nightmare when your customers expect a slick personalised checkout but, behind the scenes, everything is straining at the seams.

That’s where a tool like GraphQL really helps: it wraps up all of those APIs and systems and delivers only the exact data needed at a particular moment. That makes it hugely easier for everyone, including front-end developers who are trying to deliver an excellent, consistent customer experience — and a fast one. Fast data delivers a faster experience.
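To make that concrete, here is a minimal sketch of the pattern – the schema, the backend services (a CRM, a loyalty system and a recommendation engine) and all of the field names are invented for illustration, not taken from any client project – showing how a GraphQL layer can fan out to several systems while the checkout page sends one query and gets back only the fields it asked for:

```typescript
import { graphql, buildSchema } from 'graphql';

// Illustrative schema: the checkout page asks only for the fields it needs.
const schema = buildSchema(`
  type Recommendation {
    sku: String!
    title: String!
  }
  type Checkout {
    customerName: String!
    loyaltyPoints: Int!
    recommendations: [Recommendation!]!
  }
  type Query {
    checkout(customerId: ID!): Checkout
  }
`);

// Stand-ins for the real systems behind the scenes: a CRM, a loyalty service
// and a recommendation engine, each with its own (invented) API shape.
const crm = { getCustomer: async (_id: string) => ({ name: 'Ada Lovelace' }) };
const loyalty = { getPoints: async (_id: string) => 1200 };
const recs = { forCustomer: async (_id: string) => [{ sku: 'A-1', title: 'Gift wrap' }] };

const rootValue = {
  checkout: async ({ customerId }: { customerId: string }) => {
    // Fan out to the underlying systems in parallel; the caller never sees them.
    const [customer, points, recommendations] = await Promise.all([
      crm.getCustomer(customerId),
      loyalty.getPoints(customerId),
      recs.forCustomer(customerId),
    ]);
    return { customerName: customer.name, loyaltyPoints: points, recommendations };
  },
};

// One query from the front end, one tailored response back.
graphql({
  schema,
  source: `{ checkout(customerId: "42") { customerName loyaltyPoints recommendations { title } } }`,
  rootValue,
}).then((result) => console.log(JSON.stringify(result.data, null, 2)));
```

The front end never needs to know how many systems sit behind that single query, or how different their APIs are – which is exactly what makes the checkout feel fast and consistent.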

What about big data? Why has the focus moved from big data to “fast” data?

Because it’s all about performance. Take Condé Nast International, for instance. They had a big data problem that was also a fast data problem, but the focus on speed and performance helped us drive through the best solution. We migrated decades of content from multiple international territories into a single content management system, in a way that allowed a rapid and accurate transition of its content.

The solution we co-developed with them allowed them to do it much faster, with 99.9% accuracy before any manual intervention was needed. That significantly sped up the entire modernisation programme, which meant a faster time to value for an organisation that ultimately wanted to deliver its content across devices to end users faster, and in a more consistent, more reliable way.

That better, high-performance data architecture ultimately meant more business agility for the organisation.  It removed duplication of work and systems. 

So, a high-performance data architecture can deliver on business goals? 

Absolutely. New business opportunities are unleashed. Once your data is in great shape, it becomes a fluid friend that you can do anything with inside your organisation, including pushing it out in new commercial offerings to customers. But first you need that data, and you need context around it, in order to give it value in terms of new concepts for your business or new products for customers.  

In the Banking sector, the requirements of PSD2 are causing much wailing and gnashing of teeth in some organisations, but many others are using it as an opportunity to modernise access to their customer data and create routes to new customers and new services.

Customers want personalisation, and that demands data, available in context and in real time. Customers also want speed, of course. They want immediate gratification from the web (and mobile) services they use. A high-performance data architecture can deliver that speed as a noticeable differentiator for an online service.

So, how can companies get their data in shape to support better performance?

There are a few aspects to that challenge. One is making sure your data is understandable to the people who are accessing it. Visualisations help there. We’ve blogged before at NearForm about Clinic.js, a data visualisation tool we created to help Node.js developers who weren’t necessarily performance experts diagnose performance problems in their code. We put a lot of work into figuring out how to visualise performance problems in a way that would be useful: in brief, know your audience, and understand that simple user experiences are hard to deliver.

One of the Clinic.js tools, Clinic BubbleProf, is the kind of thing companies can use to diagnose the root cause of slowness in their systems. For example, in one of our demos for BubbleProf, we show that a “slow” application isn’t, in fact, slow: it’s trying to retrieve data from a database table with no index. The app is really just waiting for a response. That’s the kind of performance issue that can be resolved simply by adding a database index; once it’s there, the app performs as it should.
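As a rough illustration of that kind of fix – using an invented orders table and better-sqlite3 rather than the actual BubbleProf demo – the same query goes from a full table scan to an index lookup the moment the index exists:

```typescript
import Database from 'better-sqlite3';

// Invented example: an orders table with no index on the column we filter by.
const db = new Database(':memory:');
db.exec('CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)');

const insert = db.prepare('INSERT INTO orders (customer_id, total) VALUES (?, ?)');
const seed = db.transaction(() => {
  for (let i = 0; i < 500_000; i++) insert.run(i % 10_000, Math.random() * 100);
});
seed();

const query = db.prepare(
  'SELECT COUNT(*) AS n, SUM(total) AS spend FROM orders WHERE customer_id = ?'
);

console.time('without index'); // the "slow" app is simply waiting on this scan
query.get(1234);
console.timeEnd('without index');

// The fix: one index on the column the query filters by.
db.exec('CREATE INDEX idx_orders_customer ON orders (customer_id)');

console.time('with index'); // same query, now answered from the index
query.get(1234);
console.timeEnd('with index');
```

The application code doesn’t change at all; the query is simply no longer left waiting on a scan of the whole table.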

It’s easy to see how performance and visualisation are interconnected: one makes it so much easier to monitor and improve the other.

Definitely. Data visualisations that really work – that are transformative – can actually be viral. If you can show actionable insights and value, that visualisation will spread to other parts of your organisation.

What’s important to remember is that the number of data points organisations can access – or could potentially access, with sensor and other IoT data – is going to increase exponentially. The time to make sure your organisation is positioned to make the most of that data, master it, and build on it, is now.

Conor O’Neill is Chief Product Officer at NearForm, where he is responsible for all productisation activities and works closely with NearForm’s Open Source and R&D teams to evolve the web platform. His projects at NearForm include Clinic.js and the NodeConf EU Digital badge.

Feel free to connect with him on LinkedIn.
