Joined-up thinking and doing is the only way to deliver value

8th February 2023

Article originally published in Mortgage Introducer January 2023 – page 18

If the last year taught us anything, it was the importance of joining things up. Whether in thinking or decision-making, a wilful disregard for the interconnectivity of things led, in one very famous example, to a near disaster in financial markets that the UK is still working through.

With joined-up thinking comes a need for greater interoperability. De-coupling is very much in fashion in many areas of thought, but the pandemic showed us that cloud-based interoperable solutions were not only available but also capable of being swiftly implemented. These now offer even greater opportunity to build and scale new APIs quickly and at low risk, with improved interoperability and better interconnectivity than old infrastructure can ever deliver.

Moving data around is putting pressure on legacy systems. Bandwidth, or the size of the pipe required to move data between the many parties in the value chain, is a real blocker in some instances. And yet the need to access, interpret and analyse data is more pressing than ever.

Lenders and valuers are acutely aware of the increased value of data in mortgage valuations and, consequently, in their underwriting decisions. And the need to understand new types of data input and their value is increasing – not decreasing. Whether we are assessing the portfolios of landlords or deciding upon lending to residential homeowners, different data points about property are informing decisions.

And where the data is unknown, another set of decisions and inputs may be required. Properties without an energy performance certificate, for example, but which refinance on a product transfer basis present a wholly unknown risk. There are around 11 million such properties in England and Wales; that’s a lot of unknown risk. Algorithms and other data points to deliver benchmark scoring will be required – and much of it will demand effective interoperability with external data sources.
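
To make that idea concrete, the short sketch below shows one way a benchmark energy score might be derived for a property with no certificate, using proxy attributes that an external data source could supply. The field names, weights and score bands are illustrative assumptions only, not a description of any particular product or methodology.

    # Illustrative sketch only: estimates an indicative energy-efficiency band
    # for a property with no EPC, from proxy attributes an external property-data
    # service might supply. All fields, weights and bands are hypothetical.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class PropertyRecord:
        postcode: str
        build_year: Optional[int]        # assumed to come from an external data provider
        floor_area_sqm: Optional[float]
        wall_type: Optional[str]         # e.g. "solid" or "cavity"
        epc_rating: Optional[str]        # None for properties without a certificate

    def benchmark_energy_band(prop: PropertyRecord) -> str:
        """Return an indicative A-G style band where no certificate exists."""
        if prop.epc_rating:
            return prop.epc_rating       # a real certificate always takes priority

        score = 50.0                     # neutral starting point
        if prop.build_year:
            # Newer housing stock tends to be more efficient; the weight is illustrative.
            score += min(max(prop.build_year - 1970, 0) * 0.5, 30)
        if prop.wall_type == "cavity":
            score += 10
        if prop.floor_area_sqm and prop.floor_area_sqm > 150:
            score -= 5                   # larger homes often score lower

        for threshold, band in [(90, "A"), (80, "B"), (65, "C"), (50, "D"), (35, "E"), (20, "F")]:
            if score >= threshold:
                return band
        return "G"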

Some of this, of course, already goes on, but the manner in which it is done is not always the most efficient. In my recent conversations with valuers and lenders, there is a common call for a one-stop shop for much of this work going forward.

Part of the reason for this one-stop-shop approach, of course, is that property, and more pertinently the data around property, is always changing. Assessing energy efficiency risk is just one of the challenges lenders and valuers face when it comes to climate change. Properties in coastal locations or on flood plains are flagged for geographical risks; now that Britain has experienced a summer with temperatures above 40 degrees Celsius and winter temperatures falling below -10 Celsius in the south of England, climate change risk looks rather more complex.

So, interoperability and interconnectivity require an infrastructure that can deliver them efficiently. You won’t be surprised to know our Buy-to-Let Hub and Lender Hub are doing just that for many lenders. They offer new, fit-for-purpose infrastructure that connects the mortgage value chain with data providers through APIs. The result is that they streamline mortgage lending operations and improve the process of making sound risk decisions.
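
As a purely illustrative sketch of that pattern, the snippet below shows how a lender’s system might make a single API call to a hub that then gathers data from several providers on its behalf. The base URL, endpoint, parameters and dataset names are hypothetical assumptions for illustration; they do not describe the actual Lender Hub or Buy-to-Let Hub interfaces.

    # Illustrative sketch only: one request to a hub-style API which fans out to
    # external data providers and returns a single consolidated response.
    # Endpoint names, parameters and fields are hypothetical.
    import requests

    HUB_BASE = "https://api.example-hub.co.uk/v1"    # placeholder base URL, not a real endpoint

    def request_consolidated_data(case_id: str, postcode: str, token: str) -> dict:
        """Submit one case and let the hub query its data providers."""
        payload = {
            "case_id": case_id,
            "postcode": postcode,
            # The hub, rather than the lender, calls each provider (comparables,
            # flood risk, energy performance) and consolidates the results.
            "datasets": ["comparables", "flood_risk", "energy_performance"],
        }
        response = requests.post(
            f"{HUB_BASE}/valuation-data",
            json=payload,
            headers={"Authorization": f"Bearer {token}"},
            timeout=30,
        )
        response.raise_for_status()
        return response.json()

The point of the pattern is that each lender maintains one connection to the hub rather than many separate connections to individual data sources.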

As part of that journey, we learned through building our Buy-to-Let Hub the importance of regulatory change to the decision-making process, and built a solution that not only speeds up the underwriting process but also significantly reduces the administrative burden on brokers when submitting buy-to-let portfolios to lenders. Infrastructure that works delivers for multiple parties in the value chain.

We can see it coming again in the form of the Consumer Duty, which will in due course require more data points to evidence good outcomes. Much of this will be internal, but some of that decision-making will require interaction with other datasets. Property prices, affordability and borrowers’ personal credit circumstances will all impact a decision to lend.

Access to these datasets relies on good-quality inputs and better-quality infrastructure to carry the breadth and depth of data that good lending decisions demand. In the world of valuations, value judgements for the UK housing market cannot, in many cases, be distilled into an algorithmic binary judgement. But the joining up of data and processing can support and inform the rapid scaling of lower-risk decisions. All methodologies can use a lot of data, but it must support the most appropriate decision-making process for the risk under consideration.

Some elements will be climate-related or location-related, and comparable and database analytics will usually be the most appropriate for those assessments. But others will need a more nuanced blend. Whatever the process, the data underlying the ultimate decision needs to be accessed, processed, interpreted and understood quickly and reliably if the customer outcome is to offer lenders and borrowers proper value.
