
Optimising resources & efficiency in financial services - how to survive & thrive (Part 1)

Blog - 20.02.2019

DATA AT THE HEART OF THE BUSINESS

What are the key challenges facing financial services organisations today, and how can they survive and thrive? In this blog series, John Yonker, CEO of Simplitium, offers his thoughts. In the first part of this four-part series, John looks at how data has become a core asset for firms and why operating models must be optimised around it.


Data is increasingly at the heart of all modern businesses, and the financial markets are no different. Electronic trading generates millions of messages daily, while regulatory and risk management requirements challenge financial services firms to capture, store and analyse data spanning multiple years, departments and regions, at ever-increasing levels of granularity. Transactional data continues to grow rapidly as trading volumes rise: IDC forecasts that by 2025 the global 'datasphere' will reach 163 zettabytes (1 zettabyte = 1 trillion gigabytes), roughly ten times the 16.1 ZB of data generated in 2016. Even structured data is often scattered across multiple departments, geographies and systems, and is of variable quality. So, faced with this data 'tsunami', how can financial services organisations best prepare themselves not only to survive, but to thrive?

Adding to the data burden is the growing volume of unstructured data, such as emails, social media and information from third-party providers. While more data should offer more opportunities to generate customer value, the perceived operational challenges of harnessing it are so daunting that many financial services firms shy away from the task, despite data being potentially one of their most valuable assets.

Data-driven organisations make decisions based on facts and data, which dramatically increases their speed of decision-making and enables them to be more responsive and proactive. Rapid responsiveness and the ability to accurately forecast likely scenarios are key sources of competitive advantage. A holistic approach that delivers consistency in the collection, storage and use of data as soon as it enters the organisation is essential, not just for regulatory compliance but also to reduce reconciliations and waste across business processes.

Data-driven organisations make decisions based on facts and data, which dramatically increases their speed of decision-making, as well as enabling them to be more responsive and proactive.

As data volumes continue to expand, investment firms can enhance their models. Incorporating new data points and unstructured data improves model performance, resulting in better trades and, subsequently, higher ROI. The challenge for managers is finding appropriate new data points to use as "trading signals" and being able to rely on exchanges and clearing houses that can handle increased volumes quickly and accurately.

Underpinning all this is the need for robust, scalable infrastructure that firms can rely on. To cope with this ever-increasing deluge of data, firms need technology platforms with intuitive user interfaces that can quickly and accurately process millions of data points and present them in an accessible format, allowing users to analyse and interrogate the data easily.

A modernised infrastructure capable of handling these data volumes will support the industry in meeting growing calls for transparency in the way it conducts business. Pressure on firms to be more transparent comes both from regulatory interventions, such as MiFID II, and from clients across the financial services industry who are increasingly demanding transparency around the services they license. Improved transparency benefits not only individual stakeholders, who will be able to make more informed financial decisions, but also businesses, which will gain from improved levels of trust and more open knowledge sharing in the market.

To cope with this ever-increasing deluge of data, firms need technology platforms with intuitive user interfaces that can quickly and accurately process millions of data points.

Data is now a core asset for firms, and operating models must be optimised around it, especially for functions like Risk. Looking forward, it seems highly unlikely that the flood of data will lessen: financial services firms will increasingly need to be more agile, leaner and capable of both handling and making sense of all the data available to them. Most likely, we will see them partnering with fintech specialists and other experts in their fields to efficiently access the resources, skills and technologies they need.

However, these external providers must always remember that the data being processed is the property of the client (unless explicitly stated otherwise). It is the client's private data being outsourced and optimised to create competitive advantage, and this relationship can only flourish when built on the principles of trust and security.

John Yonker
CEO
Simplitium


About John Yonker:

John Yonker was appointed Chief Executive of Simplitium in September 2018. Prior to joining the firm in 2016, John's background was in equities technology. During his nine years at Credit Suisse (New York / London / Singapore / Hong Kong), he progressed to regional management positions for European and Asian Equities IT Trading and Execution. John then spent four years at Macquarie (Hong Kong) as regional Asia IT Operations manager before completing an MBA at HEC Paris (entrepreneurship specialisation). Upon graduation, he joined Simplitium to develop the firm's insurance service (ModEx) and thereafter filled the role of COO.