
10th January 2025
Article originally published on theintermediary.co.uk, January 2025
For years, the data debate has focused on the quality and quantity of what you put in.
Both matter: without enough comparables, conclusions cannot be accurate, and without being discerning about which comparables you use, even statistically accurate conclusions are useless. As technology has advanced, companies have come to use data to make faster, smarter decisions.
To borrow from a recent Harvard Business School article, consider Starbucks, which closed hundreds of poor-performing stores in 2008. Rather than continuing with a suck-it-and-see approach, then-CEO Howard Schultz decided to partner with an analytics company, using data such as demographics and traffic patterns to determine a new location's likelihood of success before investing.
Consider, too, Amazon, which uses reams of data to power its ‘you may be interested in’ algorithms. McKinsey estimated that, in 2017, 35% of Amazon’s consumer purchases could be tied back to the company’s recommendation system.
Financial services companies have also got better at using data, with open banking regulation opening up access to vast amounts of customer data without compromising security. Not only has this allowed companies to offer customers more targeted products and services, it has also helped protect firms from transacting with customers who present the wrong type of risk for their appetite.
As any retail bank’s compliance department will tell you, data has been a gamechanger. More recently, the debate has matured: it is no longer just about where you access the data you put in, or about having diverse sources and proven algorithms to make sense of it. As any technology develops, the infrastructure beneath it has to evolve too.
The first iPhone made it possible to search the web from the palm of your hand. On a 1997 Nokia? Not so much. How data is accessed and processed relies on the same principle, but unlike replacing a mobile phone handset each year, upgrading the technology infrastructure that supports data flow in retail banking is far more complex.
Think of it like this. In 1859, London’s sewer system needed to serve the city’s population of two and a half million. By 1868, it was serving four million. Today, the capital is home to almost nine million people and the sewer system is literally cracking up. Water companies spend billions upgrading pipes and tunnels that are now more than 150 years old.
Climate change has brought hotter temperatures and wetter winters, putting even more strain on water pipes and supply. Flooding becomes more common, water wastage through underground leakage is vast, and the more damage is done, the harder it is to keep patching up the cracks.
Data is no different, and managing its flow is increasingly critical to banks’ and building societies’ risk management. Poorly performing pipes are bad for everyone, and in financial services things are in dire need of a strategic rethink.
Customers expect their data to be protected. Corporate governance requires companies to guard against data theft or loss. Regulation demands that firms use data more effectively to identify risk and remediate failure. Profitability relies on operational efficiency.
For years, banks and mutuals have patched software together as takeovers forced multiple systems to integrate with one another. The huge sensitivity of protecting customer account data and history has made moving from legacy systems to new platforms all but prohibitive.
The cautionary tale of TSB’s IT migration to Banco Sabadell’s systems in 2018 is testament to the sheer scale of what can go wrong when you try to replace creaking technology infrastructure. That risk has bred apathy when it comes to the business-to-business pipework that sits underneath banking institutions. Now the apathy itself is the risk, and regulators are keenly aware of it. Like a garden left to its own devices, it needs some overdue attention. Sprawling legacy infrastructure cannot be replaced overnight, but the work to put this right must start, and in many instances it has begun.
We expect 2025 to illustrate not only the importance of the right data but also of the right interoperability to support it. Data security is paramount, but if firms are to retain their competitive edge, so too is maximising how efficiently data is used to drive business decisions and functions. For lenders, this means considering who to partner with to access quality data at minimal risk. It also requires a broader rethink about how that is achieved to deliver the best result in the most efficient manner.
The quantity of data companies have access to is growing exponentially, and firms must have the operational capacity to process it quickly enough to keep pace. Meanwhile, regulation and policy mean that what we measure, and how, is increasingly nuanced, requiring more sophisticated thinking to overcome time lags in data accuracy.
Credit and property risk data are key among the interoperability points lenders have with a myriad of suppliers. These interfaces, and the infrastructure that supports them, need to be fit for purpose in the modern age. That age is here, and we are investing heavily in new ways of delivering competitive advantage to our clients.
Mark Blackwell is chief operating officer at CoreLogic