Insight
Wednesday 27th April 2022

CDM myth #3: a solution looking for a problem

For the next instalment in my Common Domain Model (CDM) myth-busting series, I want to respond to one of the most frequent criticisms levelled at it. For a primer on the CDM, please check my first post on the topic.

Recalling the goal of this blog series, which is to uncover some fundamental truths about the CDM by dispelling some myths, there is probably no better way to do so than to take criticisms head-on and answer them. They often reflect a market perception that cannot simply be brushed aside.

The one that I hear most often can be summarised as this: “The CDM is a solution looking for a problem.”

Houston, we still have a problem

As in earlier posts, I like to go back to the source, i.e. ISDA's White paper on the future of derivatives processing, where the idea of the CDM first emerged. In this paper from 2016, ISDA laid out a starkly clear diagnosis of the current state of the trade processing landscape, which had grown incredibly brittle and inefficient. Before the CDM development even started in 2018, an industry working group estimated the cost savings that its widespread implementation in post-trade processing could generate at around $3-4BN.

Whether or not that estimate was accurate is beside the point. The sheer scale of the problem demanded a fresh approach, and the remedy put forward was based on common-sense principles of standardisation and collaboration. My point is this: the CDM has always been the analysis of a problem before being the outline of a solution.

Since then, unless I’ve missed a headline, there hasn’t been any market, regulatory or technological development that could have fundamentally improved that assessment or changed its recommendations. If you work for a financial institution that no longer suffers from any of these trade processing inefficiencies, you need not read any further – you have no problem that the CDM can solve. Lucky you!

But if you acknowledge that the problem persists, then the question is: does the CDM solve it?

The Inbetweeners

This question warrants a finer-grained analysis of the inefficiencies’ root cause. Look closely at the trade processing landscape and you will find solutions at every point in the trade lifecycle, from matching to reporting to collateral management, some of them widely adopted by the industry. Indeed in most cases, you will be able to choose between many available solutions.

But the diagnosis is not about a lack of solutions – it’s about a lack of inter-operability. To quote the white paper: “The lack of a coordinated approach with clearly defined, unbiased objectives is at the heart of these challenges, as the industry’s limited resources are not always focused on developing and delivering common solutions.”

So in fact, the last thing that the world needs right now is yet another trade processing solution (in the same way as it doesn’t need another standard, as I argued in my earlier post). The industry has an in-betweener type of problem that can only be solved by an in-betweener type of approach. That is precisely why the CDM is not a solution – and never should be.

What? It’s not even a solution?

Let me explain. Suppose the CDM were geared towards solving a specific pain point in the trade lifecycle – take matching and confirmation, for example. The model would rapidly become customised to that specific use case and its restricted set of stakeholders (execution venues and banks’ middle-office staff, to name a few), rather than adhering to holistic principles.

Why bother with a composable product model (harder to design) if I can simply drive my entire trade execution flow by declaring a product type upfront? Well, it matters for trade reporting, where the reporting logic – think of the challenge of reporting the “price” attribute – would then have to be specialised for potentially many, many different product types. But if the reporting crowd is not in the room, who cares?
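To make that composability point concrete, here is a minimal sketch (in Python, with entirely hypothetical type and attribute names – the actual CDM is far richer): a flat product-type approach forces the reporting logic to branch per product, whereas a composable model lets the “price” attribute be extracted uniformly.

```python
from dataclasses import dataclass
from typing import List

# Flat approach: the product type is declared upfront, so the
# reporting logic must grow one branch per product type it meets.
def report_price_flat(product_type: str, payload: dict) -> float:
    if product_type == "InterestRateSwap":
        return payload["fixed_rate"]
    if product_type == "FxForward":
        return payload["forward_rate"]
    raise ValueError(f"unsupported product type: {product_type}")

# Composable approach: any product is assembled from common
# building blocks, so "price" lives in one place for all products.
@dataclass
class Payout:
    price: float  # the economic "price" attribute of this component

@dataclass
class Product:
    payouts: List[Payout]  # a product is a composition of payouts

def report_prices(product: Product) -> List[float]:
    # One uniform reporting rule, with no per-product branching.
    return [payout.price for payout in product.payouts]
```

With the composable model, a new product type is just a new composition of existing building blocks, and the reporting rule above keeps working unchanged.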

Before we know it, the CDM would have replicated the very issue that current solutions suffer from: the lack of inter-operability between them at different points in the trade lifecycle.

Lock-up by API

There is a natural response to this issue, though. It is for the provider of said solution to gradually extend its product offering to cover other parts of the trade lifecycle – in the example above, by offering reporting services too. Hence the tendency, in the way the market is currently structured, towards vertically integrated solutions – or “platforms”, as they’re often called (they’re anything but). But this narrows end users’ choice: there certainly isn’t one provider that is best at doing everything.

I often call this reduction of competition “lock-up by API”. Here is a typical scenario. You have just successfully integrated vendor X in your architecture, after months (if not more) of painstaking work. A little while later, another vendor Y emerges that provides the same solution, only better, faster and cheaper. Your response to Y’s sales pitch? “Sorry, we’d really love to use your product, but we can’t be bothered going through that integration all over again…” In fact, even if the price-quality ratio of X’s service degrades over time, you’re still kind of stuck. Sound familiar?
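The integration trap, and the way a common model defuses it, can be sketched as follows (hypothetical interfaces, not any real vendor API): if your systems speak the common model, switching from vendor X to vendor Y means swapping one thin adapter rather than redoing the whole integration.

```python
from dataclasses import dataclass
from typing import Protocol

@dataclass
class CommonTrade:
    # Hypothetical stand-in for a common-model trade representation.
    trade_id: str
    notional: float

class MatchingService(Protocol):
    # Any vendor qualifies simply by accepting common-model trades.
    def match(self, trade: CommonTrade) -> bool: ...

class VendorX:
    def match(self, trade: CommonTrade) -> bool:
        return trade.notional > 0  # stand-in for X's matching logic

class VendorY:
    def match(self, trade: CommonTrade) -> bool:
        return trade.notional > 0  # Y can be dropped in unchanged

def process(service: MatchingService, trade: CommonTrade) -> bool:
    # The architecture depends on the common model, not on X or Y,
    # so switching vendors is a one-line change, not a re-integration.
    return service.match(trade)
```

The design point is that the integration effort attaches to the common model once, rather than to each vendor’s proprietary API in turn.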

Again, the answer to this problem was clearly laid out in the white paper: the industry needs to develop common foundations for “the process, behaviours and data elements” of the trade lifecycle. Such foundations should be solution-agnostic but, “once agreed, these processes should be technically encoded as common domain models (CDMs)”. This is how the CDM was born.

Six degrees of separation

That the CDM is “coded” probably explains why it’s often mistaken for a “solution”, based on the confusion that code = system = solution. In fact, it’s more accurate to think of the CDM as a library (in programming-speak) distributed in multiple languages and accompanied by visual representations. That code library is directly usable in solution implementations, but critically remains independent of any particular solution or system (see my previous post that addresses this point).

ISDA’s ongoing Digital Regulatory Reporting (DRR) programme provides the best illustration of this separation in action. DRR delivers a standardised, coded interpretation of the trade reporting rules, using CDM-based data to represent the transaction inputs.

Firstly, DRR is only an application of the CDM – it doesn’t leak into the CDM, so the latter remains free of any reporting perspective. For instance, the enrichment of transaction inputs with static reference data, often jurisdiction-specific, is part of DRR but not of the CDM. Keeping the CDM as a genuine common denominator is key to ensuring its status as a pivot between different trade processes, without it being encumbered by any one in particular.

And secondly, DRR is distributed as an executable code library, just like the CDM. DRR is not a compliance solution, but it ensures that the market applies reporting rules consistently, regardless of the specifics of any implementation (internal or vendor-provided).
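That separation can be illustrated with a minimal sketch (hypothetical rule and field names, not the actual DRR logic): the reporting rule is itself executable code taking a common-model transaction as input, while jurisdiction-specific enrichment happens outside the model.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    # Hypothetical common-model transaction: the model side stays
    # free of any reporting perspective.
    notional: float
    currency: str

@dataclass
class EnrichedTransaction:
    # Enrichment with jurisdiction-specific reference data belongs
    # to the reporting layer, outside the common model itself.
    transaction: Transaction
    jurisdiction: str

def notional_report_field(enriched: EnrichedTransaction) -> str:
    # A standardised, executable interpretation of one reporting
    # rule: every implementation running this code reports the
    # field identically, whatever its internal architecture.
    tx = enriched.transaction
    return f"{tx.notional:.2f} {tx.currency}"
```

For example, `notional_report_field(EnrichedTransaction(Transaction(1_000_000.0, "USD"), "EU"))` returns `"1000000.00 USD"` in every implementation that runs this shared rule.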

In summary, not making the CDM a solution to a specific problem is what allows it to tackle the inefficiencies it was designed to address in the first place.

Ready to roll?

Please get in touch with REGnosys and see how we can help you benefit from the CDM or DRR.

Leo Labeis

Founder & CEO at REGnosys
