Back to basics
17 August 2016
Some institutions are finding themselves tangled up in legacy
systems and duplicated functionality. If they want to run smoothly,
it’s time to strip back their infrastructure, says Tim Brazier of GFT
How is technological complexity affecting financial institutions?
Financial institutions have amassed huge technical debt: a bulk of legacy technology that is very complex. This affects their ability to innovate and to implement change, it has cost and risk implications, and it could ultimately affect the experience of the end user. It is a problem that extends beyond the financial industry, but it seems to be felt more acutely here.
There are two main drivers for this. The first is the effect of historical acquisitions where the acquired organisation has never been entirely integrated, leaving duplicate technologies. The second is that, over the last 10 to 20 years, financial services firms have grown very quickly, and their primary drivers have been to develop new products, new markets and new customers.
This means institutions haven’t placed enough emphasis on their architecture or on maintaining their technology estate.
Now, as we move into a world of capital constraint, these banks are under huge cost pressures and they have to meet very stringent regulatory agendas, so the problem of technical debt and legacy technology can become quite acute.
Has the issue worsened in recent years? Why?
This is a problem that has been building up as technology keeps evolving and moving. It is natural for technology to become legacy if it is not updated and renewed, but the issue has been exacerbated by acquisitions and by goals for growth.
Firms have been trying to address this for some years, with varying degrees of success. Increasingly, as cost pressures and capital constraints continue, along with continued pressure to innovate and to keep up with the regulatory agenda, there comes a point at which an institution cannot build a future on something that is not a solid foundation.
It’s not an easy problem to address. There is a tendency to deal with the most immediate set of problems: regulatory compliance, firefighting and cost-cutting measures. But now, I think, many organisations are running out of things they can do in that respect. They have outsourced, they have offshored, and they have pushed their suppliers and contractors to reduce costs, but none of that actually helps with making a change within this kind of legacy environment.
What can institutions do to simplify?
Every organisation is different in terms of maturity, so the approach will be different. However, there are some simple components that can make a big difference.
Having a good understanding of the technical estate, and the state of applications, is an absolute must, but actually many organisations don’t have a complete grasp of that. That is the starting point: understanding what the estate does, where the duplication is, and what is really driving the cost.
We believe firms have the opportunity to reduce their applications by up to 50 percent, simply because of the level of duplication throughout the enterprise. It seems ridiculous, but banks have thousands and thousands of applications, and a lot of that functionality is doubled up. There is a general understanding that knowing the cost drivers within the technology estate is important, but many financial institutions still don’t have a clear idea of the impact technology has on costs, both directly and indirectly.
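As a rough illustration of the kind of estate analysis this implies, the sketch below groups a hypothetical application inventory by business capability, flags capabilities served by more than one application, and shows where the annual run cost is concentrated. The inventory, the field names and the figures are invented for illustration; they are assumptions, not GFT’s methodology, and a real exercise would draw on a CMDB or application portfolio tool.

```python
# Illustrative sketch only: a toy application-inventory analysis that flags
# duplicated business capabilities and ranks them by annual run cost.
# All applications, capabilities and costs below are hypothetical.
from collections import defaultdict

inventory = [
    # (application, business capability, annual run cost in GBP)
    ("PayEngine-A", "payments processing", 1_200_000),
    ("PayEngine-B", "payments processing",   900_000),
    ("KYC-Legacy",  "client onboarding",     750_000),
    ("KYC-New",     "client onboarding",     600_000),
    ("RiskCalc",    "market risk",         1_500_000),
]

by_capability = defaultdict(list)
for app, capability, cost in inventory:
    by_capability[capability].append((app, cost))

# Capabilities served by more than one application are duplication candidates;
# the potential saving is (crudely) everything beyond the cheapest application.
for capability, apps in sorted(by_capability.items()):
    if len(apps) > 1:
        total = sum(cost for _, cost in apps)
        cheapest = min(cost for _, cost in apps)
        print(f"{capability}: {len(apps)} apps, "
              f"total £{total:,}, potential saving £{total - cheapest:,}")
```

The arithmetic is trivial; the hard part in practice is building and maintaining the mapping of applications to business capabilities and costs in the first place, which is exactly the understanding of the estate described above.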
Effective governance is also important: someone has to make sure the firm isn’t adding to the problem, and that new solutions being built aren’t creating more duplicates or more inefficiencies. Ideally, they would start to build that process into programmes, actively reducing the technical debt as they go. While, historically, programmes have been measured on functionality and capability, perhaps they should also be measured on whether they have made the foundation of the technology estate any better.
Will this require a long-term culture change?
There is no silver bullet here. It’s a very difficult problem to tackle, and it will always compete with other initiatives. A simplification programme would typically be multi-year, but people tend to lose confidence in these long-term projects, and IT departments don’t have a good reputation for delivering them successfully. But there are quick wins to be had, which can help prime the pump and prove that this can be done.
Removing duplicate functionality, and creating some savings that way, can help justify a longer programme, so that firms can then start to tackle the medium-level complexity. Longer-term, transformational programmes need to focus both on addressing the legacy problems and on delivering new functionality, and that requires a subtle change in governance and culture.
Both business sponsors and IT people like new things. They want to see new technology, new features and new products, and so there is a natural tendency to focus on the exciting things—it takes discipline to focus on the underlying architectures.
So, to make these changes effectively, it’s important to have fairly senior sponsorship within the organisation. They’re going to need board-level support to make the changes on an enterprise level.
Banks are under pressure to innovate, but should they tidy up their underlying infrastructure first?
I don’t think it’s an either/or situation. Banks will always feel that if they don’t innovate they will lose a competitive advantage. However, there is an obvious correlation between the complexity of existing infrastructure and the ability to innovate. They need to do both.
The simpler an institution’s underlying estate, and the more solid the architecture, the more likely it is to successfully innovate and to meet new regulatory and cost requirements. If you try to build a blockchain application, for example, on an infrastructure that is not well architected, then you’re clearly going to have a suboptimal outcome.
The evolution of applications should be tied very closely to innovation in infrastructure, such as virtualisation and cloud technologies, but often we see that those two activities are not completely aligned. If they are more harmonised and more coordinated, firms can optimise both of those programmes much more successfully.