[Many thanks to Shield on whose blog the full version of this article has been published, available here]

The world is in awe of the transformative power of data. It promises to unlock mind-blowing insights, unveil lucrative new opportunities, and save our planet from oblivion, spreading its tentacles into pretty much every corner of our business and personal lives. So it may seem surprising that, lurking beneath the 21st-century veneer of blockchain, AI, NLP and ML, there are vast swathes of banking that have fallen beneath the radar of the data revolution, with little change over the last 20, 30 or even 40 years.

In the mid-90s, I began my career as a business analyst putting risk systems into banks. I was cool with the envisioning stage, where we helped clients see how our tech could transform the way they did their jobs, but I struggled inordinately when it got to the dreaded data mapping. Weeks of work, field by field, instrument by instrument, writing the rules to take data from each source system and convert it into the structure required by the new system. I wanted to cry, curl up in a ball, close my eyes, wake up in another life. And to rub salt into the wound, I knew that much of the logic I wrote would never be used, or would become obsolete before it was needed because something somewhere had changed.
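To make that mapping work concrete, here is a minimal sketch of what one such hand-written rule looks like. Every field name, code and layout below is a hypothetical illustration, not any real bank's schema; the point is the shape of the work, repeated for every field of every instrument across every source system:

```python
# A single hand-written mapping rule: convert one trade record from a
# (hypothetical) source system's layout into the structure a new risk
# system expects. Real projects repeat this field by field, instrument
# by instrument, for every source system.

def map_trade(source: dict) -> dict:
    return {
        # Direct rename: same meaning, different field name.
        "trade_id": source["DEAL_REF"],
        # Type conversion: source stores the amount as a string.
        "notional": float(source["NOMINAL_AMT"]),
        # Value translation: source uses single-letter codes.
        "buy_sell": {"B": "BUY", "S": "SELL"}[source["BS_FLAG"]],
        # Reformatting: DDMMYYYY string -> ISO 8601 date.
        "trade_date": "-".join([
            source["TRD_DATE"][4:8],   # year
            source["TRD_DATE"][2:4],   # month
            source["TRD_DATE"][0:2],   # day
        ]),
    }

record = {"DEAL_REF": "T-1001", "NOMINAL_AMT": "2500000",
          "BS_FLAG": "B", "TRD_DATE": "05031996"}
print(map_trade(record))
```

Multiply this by thousands of fields and dozens of systems, and a change to any source layout silently breaks the rules downstream, which is exactly why so much of this logic goes stale before it is ever used.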

Over 20 years on, that laborious process for taking data from one system to another remains largely unchanged, but at a far greater scale: more systems, more data, more flows. Armies of analysts are doing a similar job to the one that I found utterly soul-destroying 20-odd years ago. Vastly more effort going to waste.

So just why is this process so painful? Might there be another way?

Click on the link below to find out: