Regulatory reporting requirements, spanning capital and liquidity reports, stress testing, resolution planning, and more, have grown many-fold over the last few years, and ensuring the accuracy and integrity of all these reports is challenging for those accountable.
As noted in the Basel Committee's Principles for Effective Risk Data Aggregation and Risk Reporting ("BCBS-239"), a key aspect of accuracy and integrity is the requirement to reconcile risk data with accounting data, mandating the same level of controls for risk data as applies to financial data. However, this has proved to be the proverbial Achilles' heel of banks' implementation efforts.
As per the latest progress report, published in 2020 (see chart below), 'Accuracy and Integrity' has the fewest fully compliant banks; indeed, the number decreased from six in 2017 to four in 2019. This is perhaps because the requirements were initially only partially understood, or banks were deemed compliant merely on the basis of a roadmap whose actual implementation was later delayed.
As noted above, reconciliation between regulatory and accounting data remains difficult to achieve. Indeed, when BCBS-239 was published, the British Bankers' Association (BBA) commented that this was not feasible; other industry bodies, such as the French Banking Federation and the Japanese Bankers Association, echoed this concern.
Highlighted below are a few examples of why reconciliation between regulatory and accounting data is so difficult:
1. Scope and granularity of data sourcing: Credit, liquidity, interest rate risk, and stress testing reports require transaction-level granular data across assets, liabilities, and contingent exposures; climate risk reporting demands yet more data.
This requires data sourcing and transformations from a variety of IT systems, spanning loan booking systems for different product types, trading and collateral management, deposits, core banking, asset-liability management (ALM), general ledger (GL), etc., further fragmented across lines of business and geographies.
Legacy systems, manual processes, and end-user computing (EUC) applications make the data sourcing even more complex.
2. Exposure-at-default (EAD) and adjusted carrying value (ACV) calculations: Various accounting sub-components, e.g., loan provisions and charge-off amounts, accrued fees and interest, unrealized gains/losses, etc., need to be combined to derive EAD and ACV amounts; additional adjustments are needed for specific accounting rules, e.g., leases, purchased credit impaired loans, available-for-sale and fair value assets, unsettled transactions, trade date vs settlement date basis, etc.
Further, various types of off-balance sheet exposures such as undrawn exposure, notional amounts, guarantees issued, non-cash collateral, etc. may not be captured in finance systems, making sourcing and reconciliation difficult.
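The sub-component arithmetic described above can be sketched as follows. This is illustrative only: the field names, sign conventions, and the flat credit conversion factor are assumptions for the example, not prescribed regulatory treatment.

```python
# Illustrative sketch: field names, signs, and the CCF are placeholders,
# not an actual regulatory calculation.

def adjusted_carrying_value(position: dict) -> float:
    """Combine accounting sub-components into a single carrying value."""
    return (
        position["principal_outstanding"]
        + position["accrued_interest"]
        + position["accrued_fees"]
        + position["unrealized_gain_loss"]
        - position["specific_provision"]
        - position["charge_offs"]
    )

def exposure_at_default(position: dict, ccf: float = 0.5) -> float:
    """On-balance-sheet value plus a credit-conversion-factor share of the
    undrawn commitment (the off-balance-sheet piece the GL may not hold)."""
    return adjusted_carrying_value(position) + ccf * position["undrawn_commitment"]

loan = {
    "principal_outstanding": 1_000_000.0,
    "accrued_interest": 12_000.0,
    "accrued_fees": 3_000.0,
    "unrealized_gain_loss": 0.0,
    "specific_provision": 25_000.0,
    "charge_offs": 0.0,
    "undrawn_commitment": 400_000.0,
}

print(adjusted_carrying_value(loan))  # 990000.0
print(exposure_at_default(loan))      # 1190000.0
```

Each sub-component typically lives in a different GL account or source system, so every term in the sum is a separate reconciliation point.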
3. Additional data attributes and calculation complexity: Basel calculations require many additional data attributes that are not typically captured in finance systems, such as maturity, cash-flow schedules, collateral information, etc.
Many calculation approaches, e.g., for securitizations, investment funds, eligible minority interest, etc., require a look-through to underlying assets. Another key challenge is maintaining common customer identifiers across all systems, so that the bank's exposures to each customer/counterparty group can be aggregated bank-wide.
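The common-identifier challenge can be sketched minimally as below; the system names, local IDs, and mapping table are made up for illustration. Exposures from each source system are translated to a shared counterparty group identifier, and any exposure without a mapping surfaces as a gap rather than silently dropping out of the aggregate.

```python
# Illustrative sketch: system names, IDs, and the cross-reference map are
# hypothetical placeholders.
from collections import defaultdict

# Exposures sourced from different systems, each with its own local customer ID.
exposures = [
    {"system": "loans",    "local_id": "LN-00042", "amount": 5_000_000.0},
    {"system": "trading",  "local_id": "TRD-9731", "amount": 1_250_000.0},
    {"system": "deposits", "local_id": "DEP-5510", "amount":  -750_000.0},
]

# Cross-reference table mapping each system's local ID to one group identifier.
id_map = {
    ("loans", "LN-00042"):   "GROUP-ACME",
    ("trading", "TRD-9731"): "GROUP-ACME",
    ("deposits", "DEP-5510"): "GROUP-ACME",
}

totals = defaultdict(float)
unmapped = []
for exp in exposures:
    group = id_map.get((exp["system"], exp["local_id"]))
    if group is None:
        unmapped.append(exp)      # breaks in the identifier mapping surface here
    else:
        totals[group] += exp["amount"]

print(dict(totals))  # {'GROUP-ACME': 5500000.0}
```

In practice the cross-reference table is itself a maintained data asset, and incomplete or stale mappings are a common source of aggregation breaks.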
4. Intra-company transactions: Given the complexity of legal entity (LE) structures, booking models, and various intra-group transactions, reconciliation of intra-company transactions is difficult: they need to be excluded at the holding company level but included at the individual LE level.
Further, inter-company exposures need to be included in certain regulatory reports even though they are eliminated in accounting consolidation.
5. Accounting vs Basel classification differences: Regulatory definitions for various asset classes, such as retail and wholesale classification, defaulted exposures, repo-style exposures, securitization, etc., may differ from accounting classification, making reconciliation difficult.
6. Jurisdictional rule differences: Global banks must calculate and report capital numbers in multiple jurisdictions, under different accounting regimes and differing regulatory calculation rules, which necessitates customised reconciliation approaches.
7. Finance close processes: Timing gaps between end-of-month data sourcing and the finance close process, with its period-end adjustments, give rise to reconciliation differences.
Illustrative reconciliation challenges for derivatives:
Derivatives, as an asset class, provide a good example of why transparent reconciliation between risk and accounting data remains very difficult, for the reasons discussed below:
There are certain classification differences: for example, long settlement transactions are classified as derivatives for regulatory purposes but not for accounting, and some derivatives are classified as securitizations.
Derivative transactions can have multiple legs, and are often booked back-to-back with external or intra-group counterparties, or as offsetting transactions when the bank acts as a clearing member. All of these present reconciliation challenges.
The on-balance sheet value (fair value/mark to market) may be calculated differently between accounting and risk, employing different pricing formulas and inputs.
Additionally, the scope and methodology of accounting netting, which itself varies across jurisdictions (e.g., IFRS vs US GAAP), differ from Basel netting eligibility and calculation rules; and the netting rules for cash collateral differ again for the leverage ratio calculation.
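A toy illustration of how netting scope alone creates reconciliation differences follows. The figures and rules are deliberately simplified placeholders, not IFRS, US GAAP, or Basel text: the accounting presentation here reports positive and negative mark-to-market values gross, while the regulatory replacement cost nets them within a netting set and floors the result at zero.

```python
# Simplified placeholder rules, not actual accounting or Basel standards.
mtm_values = [10.0, -4.0, 7.0, -9.0]  # trade-level mark-to-market in one netting set

# Accounting presentation (simplified): where offsetting criteria are not met,
# assets and liabilities are reported gross.
accounting_asset = sum(v for v in mtm_values if v > 0)       # total positive MtM
accounting_liability = -sum(v for v in mtm_values if v < 0)  # total negative MtM

# Regulatory replacement cost (simplified): net across the netting set,
# floored at zero.
regulatory_rc = max(sum(mtm_values), 0.0)

print(accounting_asset, accounting_liability, regulatory_rc)  # 17.0 13.0 4.0
```

Even in this four-trade example, the accounting and regulatory views produce three different numbers from the same trade population, before collateral or jurisdictional variations are considered.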
Further, cash collateral margin, receivable or payable, is posted in different GL accounts than the derivatives (i.e., typically in Other Assets/Liabilities or Due From/Due To accounts); and notional amounts, non-cash collateral, and reference asset data for credit and equity derivatives, are off-balance sheet, and may not be captured in GL systems.
The credit valuation adjustment (CVA) calculation is also different between accounting and regulatory reports.
The final calculation requires many additional data attributes not captured in GL, e.g., exchange traded (ETD) / cleared (CCP) / over the counter (OTC), underlying asset class details, option delta, if applicable, etc.
Reliability of regulatory reports continues to dominate the supervisory agenda, as can be seen from multiple 'Dear CEO' letters from the PRA, or from the ECB's 2023-2025 supervisory priorities. As supervisors increasingly expect banks to apply the same standards of controls to regulatory reporting that apply to financial reporting, banks are likely to face more stringent reconciliation requirements.
1. Increased reconciliation requirements: Some reporting templates (such as the leverage ratio, and FR Y-9C Schedule HC-R, Part II) explicitly require reconciliation details; others (such as COREP and FINREP) have reconciliation validations.
The '6G' liquidity report in the US (FR 2052a) introduces reconciliation requirements; although, to address industry pushback that these are overly burdensome, the preamble clarifies that reconciliation is not required at transaction level but only at an aggregated product and counterparty level, subject to reasonable assumptions.
The ECB annually performs reconciliation between individual banks' Pillar 3 disclosures and their regulatory reporting data. Increasingly, detailed reconciliation will become mandatory for the preparation and attestation of key regulatory reports.
2. Capital buffers and other penalties: Disclosure of additional data quality metrics, especially manual adjustments, reconciliation breaks, etc., may be mandated under Pillar 3.
Ultimately, regulators may require Pillar 2 buffers linked to data quality metrics, either a qualitative buffer to compensate for governance weaknesses, or more quantitatively defined risk weights, as a function of data quality metrics, within operational risk; they may even impose other penalties for data shortcomings and reporting errors.
While a range of practices is observed across the industry, many banks have created a central data repository where all finance and risk data are collected and reconciled centrally, and all reporting is prepared from this centrally reconciled and cleansed dataset.
This, in turn, requires maintenance of data definitions from source systems, and designing of data transformation rules accordingly.
This requires significant collaboration from business stakeholders to define the rules correctly, and is a prerequisite for technology transformation programmes such as cloud migration. Ultimately, such projects present an opportunity to implement reconciliation-by-design, and provide longer-term competitive advantage in the form of streamlined processing and high-quality reporting capabilities.
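As a minimal sketch of what such centralised reconciliation produces, the following compares a hypothetical finance (GL) extract with a risk extract, keyed on a shared account identifier, and emits a break report. The dataset shapes, key names, and tolerance are assumptions for illustration.

```python
# Illustrative sketch: account keys, balances, and tolerance are made up.
finance = {"ACC-1": 100.0, "ACC-2": 250.0, "ACC-3": 80.0}  # GL extract
risk    = {"ACC-1": 100.0, "ACC-2": 240.0, "ACC-4": 60.0}  # risk extract

TOLERANCE = 0.01  # ignore immaterial rounding differences

breaks = []
for key in sorted(set(finance) | set(risk)):
    f, r = finance.get(key), risk.get(key)
    if f is None:
        breaks.append((key, "missing in finance", r))
    elif r is None:
        breaks.append((key, "missing in risk", f))
    elif abs(f - r) > TOLERANCE:
        breaks.append((key, "amount break", f - r))

for b in breaks:
    print(b)
# ('ACC-2', 'amount break', 10.0)
# ('ACC-3', 'missing in risk', 80.0)
# ('ACC-4', 'missing in finance', 60.0)
```

The value of doing this once, centrally, is that every downstream report inherits the same cleansed, explained set of breaks instead of each reporting team reconciling independently.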
About the Author:
Pushpam is an expert in Risk and Prudential (Basel) regulation at EY's London office. He has over 20 years of global experience across the US, the UK, the Middle East, and India.