It’s Monday morning as I write this, and I came across a very interesting piece related to banking regulatory changes in the US. Chances are it will in some shape or form carry over to other countries (and vice versa).
If you haven’t heard of it, it’s called the Volcker Rule, named after the former chairman of the Federal Reserve Board in the US.
It goes live on June 30, 2014.
Here’s the kicker:
This regulation requires banks to provide detailed reports on seven different metrics. If they have the data that feeds those metrics in a Data Warehouse designed as a Data Vault, then …
Ta-da … they’re already compliant (they’ll still have to build the reports, though).
However, that holds only as long as they’ve implemented their DV correctly and followed the methodology as taught here:
The funny thing is prominent financial analysts have said, “Why weren’t they doing this anyway?”
After all, these types of requests are pretty common in that particular industry (and in many others as well).
The DV methodology requires you to keep all the data in raw form (albeit integrated), housed within DV structures. A very similar compliance initiative was kicked off for banks in India and is still ongoing.
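To make “raw form, housed within DV structures” concrete, here is a minimal sketch of the core pattern: a hub that carries only the business key, and a satellite that carries the raw descriptive attributes stamped with a load date and record source. Changes arrive as new satellite rows rather than updates, which is what preserves the audit trail. The table and column names below are illustrative assumptions, not taken from any particular bank’s implementation.

```python
import sqlite3

# In-memory database purely for illustration; names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hub: one row per business key, nothing else.
cur.execute("""
    CREATE TABLE hub_customer (
        customer_hk   TEXT PRIMARY KEY,  -- surrogate/hash key
        customer_id   TEXT NOT NULL,     -- business key from the source
        load_date     TEXT NOT NULL,
        record_source TEXT NOT NULL
    )""")

# Satellite: raw descriptive attributes, never updated in place --
# each change from the source lands as a new row, preserving history.
cur.execute("""
    CREATE TABLE sat_customer (
        customer_hk   TEXT NOT NULL REFERENCES hub_customer(customer_hk),
        load_date     TEXT NOT NULL,
        record_source TEXT NOT NULL,
        name          TEXT,
        risk_rating   TEXT,
        PRIMARY KEY (customer_hk, load_date)
    )""")

cur.execute("INSERT INTO hub_customer VALUES "
            "('hk1', 'CUST-001', '2014-01-01', 'core_banking')")
# Two loads of the same customer: the earlier row is kept, not overwritten.
cur.execute("INSERT INTO sat_customer VALUES "
            "('hk1', '2014-01-01', 'core_banking', 'Acme Corp', 'LOW')")
cur.execute("INSERT INTO sat_customer VALUES "
            "('hk1', '2014-03-01', 'core_banking', 'Acme Corp', 'HIGH')")

# The full audit trail is a plain query over the satellite:
history = cur.execute(
    "SELECT load_date, risk_rating FROM sat_customer "
    "WHERE customer_hk = 'hk1' ORDER BY load_date"
).fetchall()
print(history)  # [('2014-01-01', 'LOW'), ('2014-03-01', 'HIGH')]
```

Because nothing is ever overwritten, answering an auditor’s “what did you know, and when?” is a query, not a reconstruction exercise.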
And the DV has already been proven in banks; DV 2.0 is being implemented in at least one bank.
Regulatory reporting and audits don’t just touch banks. Utilities, telcos, non-banking financial institutions and many other industries have to be compliant as well. It helps to have a DW designed as a Data Vault and implemented correctly …
… because it automatically takes care of audit and regulatory issues with the data AND it still provides a foundation for quickly building out any of your “managed” self-service data marts.
Banks requested two years to become compliant and were given only six months. To banks who already have a DV that covers their bases, I have to say, “Congratulations!” And to those who don’t, I hate to say, “I told you so.”
For folks working in business intelligence who aren’t taking account of these things, I’d say, “You probably should, because you’re there to serve the businesses you design your solutions for.” Too many IT people don’t pay heed when I talk about compliance and audit readiness, and some have even opposed me, saying it’s unnecessary overhead. It’s not overhead in the Data Vault and DV 2.0.
Another interesting thing to note: there were originally 20 different metrics, which were reduced to 7 after the banks started complaining. If you think they won’t ask for the other 13 at some point, then you have another think coming, because they’ve already slated that to be done by July 21, 2015.
For most of the core data, you should be able to get everything done with a Data Vault and specific regulatory marts on top of it. If the core data is already available in a DV, the mart builds should be fairly quick. If not, DV builds are also fairly quick these days, thanks to automation tools that keep getting more mature.
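As a sketch of why the mart builds on top of a vault can be quick: a regulatory or self-service mart can often start life as a simple view over the raw satellite, such as “current state per business key”, while the full history stays untouched underneath. The schema and names below are illustrative assumptions, not any specific regulatory layout.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A minimal raw-vault satellite to build on; names are hypothetical.
cur.execute("""
    CREATE TABLE sat_customer (
        customer_hk TEXT NOT NULL,
        load_date   TEXT NOT NULL,
        risk_rating TEXT,
        PRIMARY KEY (customer_hk, load_date)
    )""")
cur.executemany("INSERT INTO sat_customer VALUES (?, ?, ?)", [
    ('hk1', '2014-01-01', 'LOW'),
    ('hk1', '2014-03-01', 'HIGH'),
    ('hk2', '2014-02-15', 'MEDIUM'),
])

# The "mart" here is just a view over the vault: the latest satellite
# row per key, derived on demand from the unchanged raw history.
cur.execute("""
    CREATE VIEW mart_current_risk AS
    SELECT s.customer_hk, s.risk_rating
    FROM sat_customer s
    WHERE s.load_date = (
        SELECT MAX(load_date) FROM sat_customer
        WHERE customer_hk = s.customer_hk
    )""")

current = cur.execute(
    "SELECT customer_hk, risk_rating FROM mart_current_risk "
    "ORDER BY customer_hk"
).fetchall()
print(current)  # [('hk1', 'HIGH'), ('hk2', 'MEDIUM')]
```

Since the mart is derived rather than loaded, changing a regulator’s definition means changing a view, not rebuilding the warehouse.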
If you have data on big data systems involved in these reports, you may need to look at DV 2.0.
Dan Linstedt (Inventor of the Data Vault and DV 2.0)
with Sanjay Pande (Co-Founder, LearnDataVault.com)