On this page OSIS™ will post comments, blogs and case studies focusing on bank loan portfolios and the modelling and consistency of risk-weighted assets. OSIS™ has devised multi-period Bayesian economic capital models that capture the accuracy of internal rating systems, back-testing data and loan-level data. We focus on parameter uncertainty, especially of the steady-state or through-the-cycle risk parameters, asset correlation and autocorrelation. Ultimately, we aim for investors to better understand the risk of bank failure through a bottom-up approach.
In other words: are the dykes sufficiently high to withstand conceivable flood levels?
Basel 3 Standardised floor and default observations on residential mortgages in the EU
On 3 January 2017 the Basel Committee on Banking Supervision (BCBS) announced that more work is needed before the Committee can reach agreement on the package of proposals that would lead to Basel IV. One of the proposals is the introduction of a Standardised floor, with which the Committee wants to limit unexplained differences in the Risk-Weighted Assets (RWA) calculations of individual banks. Banks that have been approved for the Internal Ratings Based Approach (IRBA) can produce their own Probability of Default (PD), Loss Given Default (LGD) and Exposure at Default (EAD) estimates, which are the inputs for the Basel RWA calculations. With the introduction of the Standardised floor, banks would need to use the higher of the RWA determined through a newly developed Standardised approach multiplied by a still-to-be-determined percentage, and the RWA based on their IRBA calculation.
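Mechanically, the floor is a simple maximum. A minimal sketch, with the caveat that the 75% floor used below is purely an assumed placeholder (the percentage was still to be determined at the time of writing):

```python
def rwa_with_floor(rwa_irba, rwa_standardised, floor_pct):
    """Apply the proposed Standardised output floor: the bank must use
    the higher of its own IRBA-based RWA and a percentage of the RWA
    under the new Standardised approach."""
    return max(rwa_irba, floor_pct * rwa_standardised)

# Hypothetical numbers: IRBA RWA of 40, Standardised RWA of 100,
# and an assumed floor percentage of 75%.
print(rwa_with_floor(40.0, 100.0, 0.75))  # 75.0 -- the floor binds
print(rwa_with_floor(90.0, 100.0, 0.75))  # 90.0 -- IRBA RWA prevails
```

When the bank's internal estimates produce a low RWA, the floor binds; otherwise the IRBA figure stands, which is exactly the limiting effect the Committee intends.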
Regarding the Standardised Approach for residential mortgages, the Committee has proposed to make the RWA calculation dependent on the size of the loan relative to the value of the property, the so-called loan-to-value (LtV) ratio. The higher this ratio – loan value divided by house value – the higher the RWA, and thus the more regulatory capital the bank needs to set aside.
OSIS has analysed the default performance of c. 10 million European residential mortgage loans published by the European DataWarehouse and compared the observations per LtV bucket across countries.
The observations show large discrepancies between two groups of countries: on one side Italy, Spain, Portugal and Ireland; on the other Sweden, Germany, the Netherlands, Belgium, France and the UK. On average, default rates in the first group are three times higher than in the second. This is valuable information for the Committee as it gives the proposals further thought.
In the OSIS tool below, the user can select countries and LtV buckets to make any comparison – default rates are shown on a quarterly basis. The source data are derived from loan-level data of securitisation transactions. In these transactions, banks have the option to repurchase loans from the underlying pool of assets, which can hide defaults that would otherwise be observed. Not all banks choose to repurchase, and therefore the user can filter out transactions in which any repurchases have taken place.
In the debate on regulatory capital, many people talk about capital increases; few talk about facts. In this article we looked at a large set of historical default data and showed how regulators and market participants could learn from it.
We updated the assumptions in the Basel formula and the banks’ PD estimates with historical data from PECDC in a Bayesian framework.
The results are surprising and will hopefully shift the debate towards a better understanding of bank credit risk: a more sustainable way of preventing the next banking crisis than just increasing capital levels.
The learning Basel formula
Case study asset correlation
Banks have to hold capital as a buffer against unexpected losses in their asset portfolios. It is important that capital buffers are sufficient even under severe stress, as continued trust in bank solvency is essential for a well-functioning economy. In the late nineties, the Basel Committee introduced a formula to calculate the minimum amount of capital based on the banks’ assessment of expected losses and the asset correlation factor. Asset correlation measures the volatility of losses and hence the systematic risk of large simultaneous bank loan losses during an economic downturn. A higher asset correlation assumes a higher sensitivity to systematic risk and results in a higher capital charge.
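The role of asset correlation can be made concrete with a simplified version of the Basel IRB capital formula (the maturity adjustment is omitted for clarity, and the inputs below are purely illustrative):

```python
from statistics import NormalDist

N = NormalDist()  # standard normal distribution

def basel_capital(pd, lgd, rho, q=0.999):
    """Simplified Basel IRB capital requirement per unit of exposure.
    The conditional (stressed) PD at the 99.9% confidence level is
    derived from the one-factor model; capital covers the gap between
    stressed loss and expected loss."""
    stressed_pd = N.cdf(
        (N.inv_cdf(pd) + rho ** 0.5 * N.inv_cdf(q)) / (1 - rho) ** 0.5
    )
    return lgd * (stressed_pd - pd)

# Illustrative inputs: PD 1%, LGD 45%, asset correlation 20%
k = basel_capital(0.01, 0.45, 0.20)
```

Raising the correlation input raises the stressed PD and hence the capital charge, which is why the calibration of this single parameter matters so much.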
Correlations of credit defaults and losses are difficult to measure as representative data are often lacking. Many market participants, such as banks, investors and credit rating agencies, have used the Basel formula as an important benchmark against which to calibrate their own capital models. The financial crisis cast significant doubt on the adequacy of the Basel formula. Recent amendments (Basel 3) have pushed bank capital levels higher, but the uncertainty around the correlation parameter has not diminished.
At the time the Basel formula was designed, no default data were available in line with the Basel definition that could have been used to calibrate the asset correlation. Today, more data are available, but time series are still too short for accurate correlation estimates. Furthermore, it is unclear to what extent asset correlations change over time.
OSIS™ analysed default data from 2003 to 2012 collected by 17 banks from Europe, South Africa, Australia and North America, containing over 37,000 default observations from 2.9 million obligors. Of the different methods available to estimate the asset correlation, we have used methods presented by Fitch and Löffler as well as a Bayesian approach. For the Fitch method, we studied the sensitivities using a standard non-parametric bootstrap, while for the Bayesian approach the parameter uncertainties are obtained directly by sampling from the posterior distribution via Markov Chain Monte Carlo simulations. Exhibit 1 shows the implied correlations for SMEs and Large Corporates as bars, with the lines representing 95% confidence intervals.
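To give a flavour of how an implied correlation is extracted from a default-rate time series, here is one simple frequentist estimator (a probit moment-matching method under the one-factor Vasicek model; this is a sketch, not the exact Fitch, Löffler or Bayesian implementation used in the study, and the default-rate series below is invented):

```python
from statistics import NormalDist, variance

def implied_asset_correlation(default_rates):
    """Moment estimator under the one-factor Vasicek model: the
    probit-transformed default rates x_t = Phi^-1(DR_t) have variance
    rho / (1 - rho), so rho = s^2 / (1 + s^2) where s^2 is the sample
    variance of the transformed series."""
    N = NormalDist()
    x = [N.inv_cdf(dr) for dr in default_rates]
    s2 = variance(x)
    return s2 / (1 + s2)

# Hypothetical annual default rates (one observation per year)
rates = [0.010, 0.014, 0.009, 0.031, 0.018, 0.012, 0.008, 0.011, 0.015, 0.010]
rho_hat = implied_asset_correlation(rates)
```

The more volatile the default-rate series, the higher the implied correlation; a perfectly stable series implies a correlation of zero, which illustrates why ten annual observations leave wide confidence intervals around any estimate.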
Exhibit 1: Source OSIS, based on data from PECDC
The Basel II formula prescribes a correlation for large corporates between 12% and 24%, depending on the probability of default, while for SMEs it allows a lower range of 8% to 24%. Traditional (frequentist) estimates have been criticised for their bias in small samples. We tested the robustness of our Bayesian estimate and could not find such a downward bias in simulation runs with artificially generated data. Our analysis suggests that the Basel assumptions for correlation are conservative and differ significantly from empirical estimates.
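The supervisory ranges quoted above come from the Basel II correlation function for corporate exposures, which interpolates between 24% at very low PD and 12% at high PD, with a firm-size adjustment of up to 4 percentage points for SMEs:

```python
import math

def basel_correlation(pd, firm_size=None):
    """Basel II supervisory asset correlation for corporate exposures.
    Interpolates between 0.24 (very low PD) and 0.12 (high PD).
    SMEs with annual sales S between EUR 5m and EUR 50m receive a
    size adjustment of up to -0.04."""
    w = (1 - math.exp(-50 * pd)) / (1 - math.exp(-50))
    rho = 0.12 * w + 0.24 * (1 - w)
    if firm_size is not None:
        s = min(max(firm_size, 5.0), 50.0)  # sales in EUR millions
        rho -= 0.04 * (1 - (s - 5.0) / 45.0)
    return rho
```

For example, the smallest SMEs (sales of EUR 5m) at high PD reach the 8% lower bound, while large corporates never fall below 12%.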
What can we learn from this? Certainly we have to acknowledge that ten years of data are valuable but not nearly enough to draw conclusions strong enough to revise a regulatory framework. Although the data capture the downturn of 2009, more detailed studies should be conducted in peripheral Europe, which has exhibited much larger than average volatility. On the other hand, the empirical evidence is important for the current discussion about how to encourage lending to smaller companies. We also used the data in a Bayesian analysis in combination with an informed expert prior based on the Basel assumption. As expected, the data drag the posterior correlation mean down, away from the Basel prior, resulting in a coherent estimate that can serve as a compromise for economic risk calculations.
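This pull of the posterior away from the prior can be illustrated schematically with a conjugate normal-normal update (a deliberate simplification of the MCMC model used in the study; all numbers below are invented for illustration):

```python
def posterior_mean(prior_mean, prior_var, data_mean, data_var, n):
    """Conjugate normal-normal update: the posterior mean is a
    precision-weighted average of the prior mean and the data mean.
    More observations (larger n) pull the estimate further from the
    prior and towards the data."""
    prior_prec = 1.0 / prior_var
    data_prec = n / data_var
    return (prior_prec * prior_mean + data_prec * data_mean) / (prior_prec + data_prec)

# Schematic numbers: a Basel-style prior of 20% correlation versus an
# empirical estimate of 8% from ten annual observations.
post = posterior_mean(0.20, 0.05 ** 2, 0.08, 0.06 ** 2, 10)
```

The posterior lands between the Basel prior and the empirical estimate, closer to the data, which is the "drag down" effect described above.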
We believe that historical data are indispensable for understanding the risks we face. However, when data are scarce, the combination with expert knowledge of banks, supervisors, credit rating agencies or any other analyst becomes critical. The Bayesian analysis yields a coherent “middle ground” estimate that reduces the risk of being overly aggressive or overly conservative.
The assessment of bank capital adequacy requires a thorough understanding of the other Basel II risk parameters as well. Our analysis has been extended to include measured and estimated uncertainties for those parameters, to be reported elsewhere. In combination, our analysis contributes to the current investigation of bank asset quality and helps re-establish confidence in bank solvency, an important prerequisite for improving economic activity.
For a Dutchman it is like living behind a dyke. Without the benefit of historical experience, we would not live in the hinterland even if the dykes were raised by, say, 15m. Only data combined with expert assessments about the future can make us comfortable that an increase of 3m provides adequate security in a changing world.
Economists have helped to cause this crisis; what are their arguments to get out of it?
There is much debate between bankers, politicians and economists about whether capital buffers need to increase. Some economists claim that banks haven’t learned anything from this crisis and question their resistance to higher buffers. To what extent can the public have faith in economists, and what was their role in causing the current crisis?
For years, economists – supported by high priest Alan Greenspan – preached the efficient market theory. In short, this theory says that the market is always right and that price bubbles don’t exist. As a consequence, new accounting rules from the early 2000s forced banks to value their assets at market prices. The market thus did the thinking, as did the Credit Rating Agencies with their ratings, causing mental laziness amongst bankers, investors, supervisors and, last but not least, economists.
In late January 2010 the ABS market required a risk premium of 4% on a portfolio of Dutch government-guaranteed residential mortgage loans, of which the first 3% of potential losses was absorbed by subordinated tranches. Set against the margin that banks receive on their mortgages, that market premium would require a market value write-down of about 25%.
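The order of magnitude of that write-down can be reproduced with a first-order bond-pricing approximation: the price discount is roughly the yield shortfall times the duration of the asset. The margin and duration below are our own assumptions, not figures from the article:

```python
def markdown_pct(required_spread, earned_margin, duration_years):
    """First-order price approximation: the market value discount is
    approximately the shortfall in yield (required spread minus the
    margin actually earned) multiplied by the asset's duration."""
    return (required_spread - earned_margin) * duration_years

# Assumed inputs: the market demands a 4% premium, the bank earns a
# margin of roughly 0.7%, and the mortgage portfolio has a duration
# of roughly 7.5 years.
wd = markdown_pct(0.04, 0.007, 7.5)  # about 0.25, i.e. a 25% write-down
```

Under these assumptions the implied discount is close to the 25% figure quoted above; different margin or duration assumptions would shift the result proportionally.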
To suggest that a higher buffer would protect banks against these market losses is like proposing that we raise the dykes by 20 metres – not because we believe the water will rise that high, but because an allegedly omniscient person claims it might. A much better and cheaper solution is to do more research on the likelihood of high flood levels and share the findings with a broad audience.
The same should happen with the banks. European legislation requires that banks publish their historical losses, but this is rarely enforced. Not only is this a better and cheaper buffer, it also keeps people conscious that risk exists and can never be completely ruled out. And if these buffers are overtopped, we will know better how to get back to safe ground.