OMS Implementation: Top Six Pitfalls from a Compliance Perspective

Implementations and conversions present many challenges in the form of unexpected gaps, regulatory or business changes that must be accommodated, and complex problems that can tax the skills of your team. In our previous article, we looked at the overall pitfalls of implementation. In this article, we’ll take a closer look at the top six pitfalls from a compliance perspective.

1. Treating Your Automated Compliance Library as an Afterthought

Most implementations and conversions are done under tremendous time pressure. The automated compliance library is often tied to the go-live of the trading system. In an effort to save time, many implementations defer a thorough review of the source documentation. Additionally, compliance teams often find it impossible to write and test the rules themselves. Vendors have attempted to address that gap by offering rule-writing services. The vendor provides a resource that codes the rules and works with the compliance team to test them and move them quickly into production.

However, this can introduce risk because the rules in the library will be based on the vendor’s understanding of the requirements from the compliance team rather than a comprehensive working knowledge of the source documentation.

2. Ignoring Proper Testing Methodology

When testing resources are constrained, rules may be migrated into production without rigorous testing. Instead, if the test results look like they are within an acceptable tolerance of the results of the prior system, the rules are assumed to be operating correctly.

Proper testing methodology should include two distinct types of testing:

  1. Unit testing, which tests the math of a rule to ensure the results are correct.
  2. Use case testing, which tests the appropriateness of a rule for an account.

Even when unit-tested rules are mathematically correct, skipping use case testing often results in rules that are poorly suited to the way an account trades. For example, a rule restricting the use of currency spots may be correct and working properly for an equity account, yet create hundreds of “false positive” results for the emerging markets desk as it goes about its normal trading day. This creates the kind of friction that slows down the trading desk and results in frustrated users and missed opportunities. Worse, the resulting “noise” can hide legitimate violations, which are drowned out by the false and misleading alerts, leading to a major breakdown in a critical business process.
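To make the distinction concrete, here is a minimal Python sketch of the two layers of testing for an imaginary rule that restricts currency spot exposure. The rule, the account profile, the field names, and the 500,000 limit are illustrative assumptions, not taken from any particular OMS or compliance library.

```python
# Minimal sketch, assuming a hypothetical rule and account profile;
# names, fields, and limits are illustrative only, not from any specific OMS.

def currency_spot_exposure(positions):
    """Sum the notional exposure of currency spot positions (the 'math' of the rule)."""
    return sum(p["notional"] for p in positions if p["asset_class"] == "FX_SPOT")

def rule_breached(positions, limit):
    """The rule itself: flag a violation when FX spot exposure exceeds the limit."""
    return currency_spot_exposure(positions) > limit

# --- Unit test: is the math of the rule correct? ---
def test_exposure_math():
    positions = [
        {"asset_class": "FX_SPOT", "notional": 600_000},
        {"asset_class": "EQUITY",  "notional": 1_000_000},  # should be ignored
    ]
    assert currency_spot_exposure(positions) == 600_000
    assert rule_breached(positions, limit=500_000) is True

# --- Use case test: is the rule appropriate for how this account trades? ---
def test_rule_assignment_for_emerging_markets_desk():
    # An emerging markets account routinely settles trades via currency spots,
    # so a blanket FX spot restriction would generate constant false positives.
    account = {"name": "EM_DESK_01", "normal_asset_classes": {"EQUITY", "FX_SPOT"}}
    restricted = {"FX_SPOT"}
    assert not (restricted & account["normal_asset_classes"]), \
        "Rule restricts an asset class this account trades every day - review assignment"

if __name__ == "__main__":
    test_exposure_math()  # passes: the arithmetic is correct
    try:
        test_rule_assignment_for_emerging_markets_desk()
    except AssertionError as exc:
        print(f"Use case test flagged a poorly suited rule: {exc}")
```

In this sketch the unit test passes because the arithmetic is right, while the use case test still flags the rule because it conflicts with the way the emerging markets account normally trades.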

Robust testing will ensure that the rules in production are accurate and appropriate for your firm’s lines of business.

3. Waiting for the Elusive “Phase 2” to Close Data Gaps

Data feeds are never perfect, but some gaps impact results more than others. Missing or incorrect shares outstanding data can lead to regulatory violations if a firm owns more of an issue than the regulations allow. Similarly, gaps in issuer data can leave a firm overexposed to an entity, either through direct holdings or derivative positions. The nuances of how a compliance system treats issuer and security data vary from product to product, and gaps can often go unnoticed until they result in a trading error. Too often, the closure of data gaps discovered during the project is deferred until the “Phase 2” that unfortunately never happens. When the inevitable error emerges, it is traced back to bad data. Prioritizing and closing critical data gaps can avoid future errors.
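As a simple illustration of why shares outstanding data matters, the hypothetical sketch below compares a firm’s holdings to the reported float against an assumed 5% ownership threshold; the threshold, field names, and figures are illustrative only.

```python
# Minimal sketch, assuming a hypothetical 5% ownership threshold and simple
# position data; the threshold and field names are illustrative assumptions.

OWNERSHIP_LIMIT = 0.05  # e.g. an assumed 5% regulatory ownership threshold

def ownership_check(firm_shares_held, shares_outstanding):
    """Return (status, pct_owned). A missing shares outstanding figure means the
    check cannot be evaluated, so it is flagged rather than silently passed."""
    if not shares_outstanding:  # None, 0, or a missing feed value
        return "DATA_GAP", None
    pct = firm_shares_held / shares_outstanding
    return ("VIOLATION" if pct > OWNERSHIP_LIMIT else "OK"), pct

# Correct data: 6 million shares held of 100 million outstanding -> 6% -> breach
print(ownership_check(6_000_000, 100_000_000))   # ('VIOLATION', 0.06)

# Stale or incorrect data overstates the float and hides the same breach
print(ownership_check(6_000_000, 250_000_000))   # ('OK', 0.024)

# Missing data: the gap itself should be surfaced, not assumed to be compliant
print(ownership_check(6_000_000, None))          # ('DATA_GAP', None)
```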

4. Neglecting Closed Accounts and Obsolete Rules

One task that can be easily overlooked in the rush to go-live is the deactivation of accounts that have been closed, or rules that are no longer valid. Although a seemingly harmless decision, leaving closed accounts active on your systems can create system performance and data issues. Active, obsolete rules can trigger "false positives," slowing trade processing and frustrating portfolio managers (much like improper testing, mentioned above).

Obsolete accounts should be archived (or deactivated) in your system wherever possible. Good maintenance will avoid the frustration and confusion that can arise from scrolling through obsolete accounts and responding to “noise” from outdated rules.

5. Not Putting the End-User Back in End-User Testing

Once the unit and use case testing has been completed, user acceptance testing (UAT) should be performed by the end-users who will use the system on a day-to-day basis. If resources are tight during implementation, experienced consultants or testers with front-office experience can help save your traders and portfolio managers time and frustration by serving as a proxy before the UAT stage, but end-users should test the system to their own satisfaction before going into production.

6. Failing to Provide for Annual Maintenance

Once the implementation is complete, compliance departments often neglect to put a plan in place to test the compliance library on a periodic basis. Uncontrolled library growth, inconsistent interpretation of mandates, rules that don’t match actual trading activity or regulations, false positives, excessive overrides, and data issues are just a few of the reasons that money managers miss violations and risk monetary and reputational loss. Without rigorous annual testing of the automated library and supporting data, the results portfolio managers and compliance teams rely upon are statistically likely to harbor error rates of 25% or more. Budgeting for, and implementing, an annual review process will ensure that your firm is adhering to best practices.

Contact IMP to learn more about how our team of expert consultants can help you to avoid these pitfalls and instead implement industry best practices for your future system implementations.