By: Levi Folk
Published on: 2022-11-16
When it comes to fund reporting, it sometimes feels like the universe conspires against you. The universe experiences entropy, meaning that disorder steadily increases. Systems move from order to chaos, and that certainly rings true for data and reporting.
Financial data changes rapidly and has multiple stakeholders, so it degrades in quality over time if not properly tended to, and the consequences of this disorder add up in errors, costs, and reporting delays. Gartner found that “organizations estimate the average cost of poor data quality at $12.8 million per year.”
Many firms lack a clear process around data governance. I recently asked a potential customer about her process for updating fees on her regulatory reports. Her response: “I ask the guy with the moustache in fund accounting.” I didn't have the heart to ask what happens when she gets an out-of-office alert.
Unless key stakeholders can access their data in a timely manner while ensuring its quality, asset managers will suffer the consequences: operational inefficiencies, reporting errors, and poor outcomes.
The reality is that fund reporting is decentralized across multiple departments at most companies. It cuts across marketing, legal, compliance, and sales, and each department is responsible for fund reports that often draw on overlapping data, leading to operational inefficiencies in the best case and mixed messaging and errors in the worst.
Using spreadsheets for data storage is akin to using your dishwasher for dish storage: it works, but that is not what it was designed for, and it's not very practical. Spreadsheets invite concurrency errors when there are multiple users, and the more people collaborating on a file, the more likely chaos is to ensue. Ideally, each department accesses the same data through a centralized, on-demand solution. Even if everyone in the company knows to get the fees data from the guy with the moustache, the manual approach creates several inefficiencies: wasted time, more errors, and higher operational costs.
One of the key challenges across reporting is oversight and awareness: knowing when data was updated and by whom. If these questions tend to get asked through police-style interrogation upon discovery of an error, and alibis are required to establish one's innocence, then oversight and awareness are lacking.
Centralized data with an audit trail of changes takes the guesswork out of reporting. The result is higher accountability for changes to reports and, as a knock-on effect, fewer errors because the process is clearer. And it saves you having to bug the guy with the moustache.
A system that allows users to query data at a point in time, to see what the data looked like at that moment (sometimes referred to as bi-temporal data), is an indispensable tool for improving data quality, minimizing errors, and maintaining a proper audit trail of that data.
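To make the bi-temporal idea concrete, here is a minimal sketch in Python. The `FeeRecord` fields and the `fee_as_known_on` helper are hypothetical illustrations, not any particular product's API: each record carries both the date a fee took effect and the date it was entered into the system, so you can ask what a report would have shown on any given day.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class FeeRecord:
    fund: str
    fee_bps: int            # management fee in basis points
    valid_from: datetime    # when the fee became effective
    recorded_at: datetime   # when the change was entered (the audit trail)

def fee_as_known_on(records: List[FeeRecord], fund: str,
                    as_of: datetime) -> Optional[int]:
    """Return the fee the system would have reported on `as_of`: the latest
    record for the fund that was both effective and already recorded then."""
    candidates = [r for r in records
                  if r.fund == fund
                  and r.valid_from <= as_of
                  and r.recorded_at <= as_of]
    if not candidates:
        return None
    return max(candidates, key=lambda r: (r.valid_from, r.recorded_at)).fee_bps

records = [
    FeeRecord("Fund A", 100, datetime(2022, 1, 1), datetime(2022, 1, 1)),
    # A correction entered six months later, backdated to January:
    FeeRecord("Fund A", 95, datetime(2022, 1, 1), datetime(2022, 6, 1)),
]

# In March the correction had not yet been recorded, so the report showed 100.
print(fee_as_known_on(records, "Fund A", datetime(2022, 3, 1)))  # 100
# By July the correction is visible, and recorded_at shows when it landed.
print(fee_as_known_on(records, "Fund A", datetime(2022, 7, 1)))  # 95
```

Because both timestamps are kept, an erroneous report from March can be explained months later: the query reproduces exactly what was knowable at the time.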
That system also needs to offer the multiple stakeholders at your firm easy access to that data through a centralized API: essentially, a software interface that connects another application, whether inside your firm or at a third party, to your data. The API is the means of accessing the data you have centralized.
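As a rough sketch of what "access through a centralized API" can look like, the snippet below stands up a tiny read-only HTTP endpoint over a fee table and reads a value back. The endpoint path, payload shape, and fund code are all invented for illustration; a real system would add authentication, versioning, and a proper data store.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical centralized store: one source of truth for fee data.
FEES = {"FUND-A": {"management_fee_bps": 95}}

class FeeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # e.g. GET /fees/FUND-A -> the fund's fee record as JSON
        fund = self.path.rstrip("/").split("/")[-1]
        if fund in FEES:
            body = json.dumps(FEES[fund]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the example quiet

# Serve on an ephemeral port in a background thread.
server = HTTPServer(("127.0.0.1", 0), FeeHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Any department (marketing, legal, compliance) reads the same value.
with urlopen(f"http://127.0.0.1:{port}/fees/FUND-A") as resp:
    data = json.loads(resp.read())
print(data["management_fee_bps"])  # 95

server.shutdown()
```

The point is not the plumbing but the pattern: every consumer queries one endpoint instead of emailing the guy with the moustache, so updates propagate everywhere at once.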
Once you have a means to centralize, update and access your data from one place, you are off to the races.