LEAVING LEGACY SYSTEMS BEHIND
What are the key challenges facing Financial Services organisations today? In the second of our four-part series examining what FS organisations need to do to survive and thrive in today’s challenging marketplace, John Yonker, CEO, Simplitium, highlights the need to leave legacy systems behind.
Financial Services firms have experienced significant growth in transaction volumes and instrument complexity over many decades, and that trend seems likely to continue. Yet many institutions still run their core operations on systems that are 10, 20 or even 30 years old. At a time when organisations are becoming ever more data-centric, they need to process more data, faster and more accurately, to satisfy both clients and regulators. Legacy systems are increasingly struggling to meet the demands of the marketplace and represent a major cost overhead.
An obvious weakness is that patching and maintaining legacy systems is expensive and resource-heavy, and may pose security risks. A report in Financial News last year highlighted that banks typically spend 80% of their IT budgets on legacy technology maintenance, and that a tier-one bank can easily spend up to $300m a year on existing software which constantly needs expensive updates to comply with regulatory requirements1. These systems are written in virtually obsolete code and have long since stopped being widely supported. Specialists who still know them are costly and increasingly difficult to find.
Critically, these systems don’t scale or integrate easily with newer solutions, limiting the organisation’s ability to handle more business or expand into new markets. As the rate of change accelerates, limitations in extending capability become real pain points as institutions seek to keep up with demand for new and complex instruments, such as handling cryptocurrencies or implementing real-time risk management. The historic batch processing model looks increasingly archaic as clients come to expect real-time interactions as the norm.
Financial Services is increasingly data-driven, and the ability to process, interrogate and visualise data is key to supporting effective risk management and improving decision-making. Handling data efficiently starts with core systems that allow easy data mining and enable search and business rules to be executed. Legacy systems often struggle with such actions, as their limited data fields inhibit what can readily be achieved.
Legacy systems are pervasive across all major sub-sectors of financial services, yet if they have so many weaknesses, why haven’t institutions replaced them? Some organisations have managed to change their processes without changing their systems but, ultimately, they risk becoming less effective operationally as their processes become constrained by the way the system works. There is also the concern that, as cyber-attacks grow in both number and sophistication, these systems may be increasingly vulnerable.
For many, the cost and risk of a major systems transformation has simply been too great. Mergers and acquisitions add further challenges, as acquiring firms end up with yet more legacy systems, compounding the difficulties. The time and effort required is significant and, despite their apparent weaknesses, these systems may still be stable and reliable. Moreover, changing even one of them introduces considerable risk to the business, as was seen in several UK banking system migrations in 2018. If a migration goes wrong, trust can be quickly eroded, devastating the business both operationally and reputationally, potentially beyond recovery.
While firms have been able to bolt on functionality through integrations and interfaces to avoid the big upgrade, this usually comes at a cost: are they as productive as they could be? Are they fully meeting customers’ requirements and managing their data appropriately? There must be a high likelihood that they are not. It seems we must be approaching a tipping point where continuing to delay addressing the weaknesses in these systems will compromise the business. An obvious solution is to identify and work with fintech partners who specialise in hosting these core platforms. Failure to fundamentally address the root issues surely leads only to an inability to serve customers effectively. The opportunity to optimise operations on a modern, proven infrastructure that is cost-effective, secure and scalable, offers access to the latest functionality, and provides a base from which to build modern applications is surely too good to ignore.
About John Yonker:
John Yonker was appointed chief executive of Simplitium in September 2018. Prior to joining the firm in 2016, John’s background was in Equities technology. During his 9 years at Credit Suisse (New York / London / Singapore / Hong Kong), he progressed to regional management positions for European and Asia Equities IT Trading and Execution. John then spent 4 years at Macquarie (Hong Kong) as regional Asia IT Operations manager before completing an MBA at HEC Paris (entrepreneurship specialisation). Upon graduation, he joined Simplitium to develop its insurance service (ModEx) and thereafter served as COO.