As trading processes become ever more sophisticated and regulators race to catch up, the end-to-end testing of trading infrastructure is an increasingly crucial component of compliance. Financial services firms face severe penalties for trading errors – including fines, loss of reputation, potential bankruptcy and even personal repercussions for senior executives. The cost of failure is high – yet many organizations are struggling to implement comprehensive testing systems successfully.
A number of cautionary tales in recent years have highlighted the dire consequences of ineffective testing. In March of this year the US Securities and Exchange Commission (SEC) fined the New York Stock Exchange and two affiliate exchanges a combined US$14 million as the result of five separate investigations into multiple regulatory failures, including a four-hour trading halt back in July 2015 caused by a flawed software rollout. During the Facebook IPO in May 2012, Nasdaq’s IPO Cross application and central matching engine went into an infinite loop due to a fault in its system reconciliation check, which prevented the market from opening and caused almost 40,000 inbound orders to become stuck in the processing queue.
And in August of the same year Knight Capital, then the largest trader in the US equities market (with an NYSE market share of around 17%), faced bankruptcy after a technician made a manual error while updating old code, resulting in a US$460 million loss within 45 minutes of trading.
Rigorous testing is no longer a choice but a necessity – and regulators around the world are cracking down. Europe in particular is feeling the strain, due to the demanding new testing specifications implemented by MiFID II that came into effect in January 2018. Article 17 of the new regulations specifically requires that all investment firms engaged in algorithmic trading ensure their systems are fully tested, while Article 48 requires regulated markets and trading facilities to ensure their systems can continue trading under conditions of severe market stress and that their members carry out appropriate testing of algorithms.
These changes have required both investment firms and trading venues to implement substantial changes to their existing processes in order to comply – and not everyone has been successful. Research from the FCA released in February 2018 found that a majority of UK firms under review did not have appropriate policies or governance in place around algorithmic trading testing, nor did they apply their testing process consistently across all areas of the business. In addition, it found that many firms were not providing the right management information to executives, did not have suitable sign-off procedures in place, and were not capable of producing an audit trail of their testing process.
Clearly, regulations alone are not enough to inspire effective change. So what’s the hold-up?
The problem is that over time, trading operations have become complex networks of different applications using diverse technologies tied together into an overall trading infrastructure. Even the most basic orders today bounce through multiple gateways, and the numerous processes required to speed them on their way must all occur within milliseconds. Testing has inevitably evolved in a similarly incoherent fashion, and as a result firms often lack an over-arching 'enterprise' perspective or a coherent testing strategy, relying instead on a multitude of different testing agents. More often than not, the testing strategy relies heavily on manual testing, thereby increasing the total cost of operations, error rates, and the time required.
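In practice, even a basic manual gateway check can be captured as a repeatable automated test. The sketch below is a minimal, hypothetical illustration in Python: it hand-rolls a FIX 4.2 tag=value message (the order fields and identifiers are invented for illustration) and asserts that it survives an encode/decode round trip – the kind of check a regression suite would run automatically on every build rather than by hand.

```python
# Hypothetical sketch: automating a manual FIX gateway check as a
# repeatable regression test. The wire format follows FIX 4.2 tag=value
# conventions; the order details are invented for illustration.

SOH = "\x01"  # FIX field delimiter

def build_fix_message(msg_type: str, fields: list[tuple[int, str]]) -> str:
    """Assemble a FIX message with BodyLength (9) and CheckSum (10) computed."""
    body = f"35={msg_type}{SOH}" + "".join(f"{tag}={val}{SOH}" for tag, val in fields)
    header = f"8=FIX.4.2{SOH}9={len(body)}{SOH}"
    partial = header + body
    checksum = sum(partial.encode()) % 256  # checksum covers all bytes before tag 10
    return partial + f"10={checksum:03d}{SOH}"

def parse_fix_message(raw: str) -> dict[int, str]:
    """Split a raw FIX message back into a tag -> value map."""
    return {int(tag): val for tag, _, val in
            (field.partition("=") for field in raw.split(SOH) if field)}

# Regression check: a NewOrderSingle (35=D) must survive an encode/decode
# round trip with its key economics intact.
order = build_fix_message("D", [(11, "ORD-1"), (55, "TEST"), (54, "1"), (38, "100")])
fields = parse_fix_message(order)
assert fields[35] == "D" and fields[38] == "100"
```

Once a check like this lives in code, it can run on every release rather than relying on a tester's availability – which is precisely the shift from manual to automated testing described above.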
The cost of consolidating multiple testing platforms can be high – but the cost of maintaining them can be even higher. Many firms now spend more time managing their complex testing environment than on the testing itself, and as a result the overall expense associated with the testing process – in time, resources, and cash – has risen significantly, without producing any discernible increase in trading system reliability. The knock-on effect of this web of complex systems, complicated testing environments, regulatory demands and regulatory change is that the number of tests and associated results is expanding rapidly.
Analyzing all of this test data is also challenging – particularly since the teams charged with analyzing the data are often busy running the systems themselves. As a result, firms are not reaping the potential benefits that a good testing program could bring, either – there just isn’t time to focus on adding value. The challenges are many, spanning test design, test management, test configuration, and test-results analysis. It’s clear that firms need to approach this differently.
The good news is that solutions are now emerging that can assist firms in their upgrading process and actually reduce the total cost of operations rather than further increase complexity. In May this year, for example, the recently merged Itiviti and Ullink launched a brand new integrated platform for automated, enterprise-level testing of trading systems, helping firms to automate their manual testing procedures. Itiviti's VeriFIX offers continuous integration and regression testing, resulting in faster, more reliable testing cycles and improved quality. Improved scalability and software utilization allow multiple users to test simultaneously on multiple agents, while multi-protocol unit and isolation testing breaks testing down into smaller units for improved accuracy, both for single modules and when performing end-to-end testing.
All of this allows smaller teams to oversee a larger testing operation through increased automation. By means of service virtualization, functional as well as behavioral changes can be tested and emulated, and real-time report tracking enables issues to be handled instantly. This enterprise-wide approach reflects a growing trend towards a more inclusive and coherent strategy for algorithmic testing – about time, given the increasingly stringent regulatory requirements.
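Service virtualization can be pictured as a lightweight stand-in for the real venue. The sketch below is a hypothetical Python illustration – the class names and message shapes are invented, not any vendor's actual API: a stubbed exchange emulates fill/reject behavior so that order-handling logic can be exercised against a behavioral change, such as a new order-size limit, without a live connection.

```python
# Hypothetical sketch of service virtualization: a stubbed exchange that
# emulates venue behavior so order-handling logic can be tested without a
# live connection. Class names and message shapes are invented.

from dataclasses import dataclass

@dataclass
class ExecutionReport:
    order_id: str
    status: str      # "FILLED" or "REJECTED"
    filled_qty: int

class VirtualVenue:
    """Emulates an exchange: fills orders up to a configurable size limit."""
    def __init__(self, max_order_qty: int = 1000):
        self.max_order_qty = max_order_qty

    def submit(self, order_id: str, qty: int) -> ExecutionReport:
        if qty <= 0 or qty > self.max_order_qty:
            return ExecutionReport(order_id, "REJECTED", 0)
        return ExecutionReport(order_id, "FILLED", qty)

# Behavioral tests run against the stub instead of the production venue,
# so a change to the size limit can be exercised before release.
venue = VirtualVenue(max_order_qty=500)
assert venue.submit("A1", 100).status == "FILLED"
assert venue.submit("A2", 600).status == "REJECTED"
```

Because the stub's behavior is configurable, the same test suite can be replayed against different emulated venue rules – the essence of testing behavioral changes through virtualization.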
As the financial services industry finally starts to settle into the new MiFID II environment, solutions like VeriFIX will become ever more crucial in order to maintain market competitiveness.