After the credit crisis erupted in the summer of 2007, it soon became clear that the write-downs on sub-prime exposures would be unprecedented in scale, that they would wipe out large portions of banks’ capital, and that they would force regulators to reassess the way they define and measure risk. Behind these losses were new products, such as securitization, structured finance and credit derivatives, which had resulted in the build-up of significant credit risk within the banks’ own trading books. Yet too little capital had been set against these credit exposures because they benefited from the ‘trading book’ treatment. In theory, credit risk should be treated consistently regardless of whether it is a direct exposure (a loan within the banking book) or an indirect exposure (an MBS, a CDO or a synthetic position in credit derivatives held in the trading book). Regulators are now busy defining new rules which should eliminate the regulatory capital arbitrage between banking book and trading book. The Basel Committee published the proposed ‘Enhancements to the Basel II Framework’, including the capital regime for trading book positions, on 16 January 2009 (see: www.bis.org). The result? A potential increase in the minimum capital for trading activities by a large multiple, estimated in the range of 3 to 5 times current levels. The new requirements could come into effect in 2010 or 2011.
Before securitization and structured finance, banking and trading were two distinct activities, easy to identify, with separate accounting and regulatory treatments for the ‘banking book’ and the ‘trading book’. The distinction still underpins current capital requirements, but it no longer works as well as initially intended.
The banking book should include mainly loans, which used to represent the single largest component of banks’ balance sheets. Loans are direct exposures to credit risk which banks hold until they are repaid (to maturity). As they are long term, they are accounted for at amortized cost (essentially historic cost plus accrued interest less provisions), but are subject to strict regulatory capital requirements.
The trading book, on the other hand, should include short-term investments: very liquid securities, bonds, shares, derivatives, long and short positions, held for short-term gains. The trading book is a mixture of assets and liabilities. It should be marked to market through the profit and loss statement, resulting in significant volatility in the banks’ results. From the regulators’ point of view, however, the trading book should be relatively safer and, therefore, subject to lighter capital requirements.
For example, suppose a bank has a customer loan portfolio of €100bn. Regulators require this bank to hold a minimum of €8bn of capital (an 8% capital ratio). This limits leverage (assets to equity) to a maximum of 12.5 times, assuming that capital consists entirely of common equity. In practice, leverage can be higher to the extent that capital itself can be levered with a combination of equity and hybrids (quasi-equity).
Now, where does the 8% capital ratio come from? Why 8%? Is it sufficient?
| | |
|---|---|
| Expected losses | |
| Expected default rates | 5% |
| Expected recovery rates | 75% |
| Expected losses | €1.25bn |
| Worst case losses | |
| Worst case default rates | 15% |
| Worst case recovery rates | 50% |
| Worst case losses | €7.50bn |
| Minimum regulatory capital | €8.00bn |
| Maximum leverage (assuming capital is all equity) | 12.50x |
Well, if we expect that 5% of customer loans will go bad and that 75% of bad loans will be recovered, then expected losses amount to €1.25bn, which should be covered mainly by provisions (in reality, accounting standards do not allow recognition of losses until there is evidence of actual deterioration in the borrowers’ financial situation, i.e. only ex-post). To avoid duplication, capital should cover only unexpected losses, in a worst case scenario, in excess of what has already been covered by provisions. So let’s assume that in the worst case scenario default rates increase to 15% and recovery rates fall to 50%. Losses would then increase to €7.5bn, i.e. €6.25bn more than what has been set aside as provisions. To summarize: the bank has €100bn of customer loans, it has already provided for €1.25bn of losses, and losses could be, at worst, an additional €6.25bn. So €8bn of capital should be enough.
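The arithmetic above can be sketched in a few lines of Python. The figures are purely illustrative, mirroring the worked example rather than any official Basel II formula:

```python
def expected_loss(exposure, default_rate, recovery_rate):
    """Loss = exposure x default rate x (1 - recovery rate)."""
    return exposure * default_rate * (1 - recovery_rate)

exposure = 100e9  # EUR 100bn customer loan portfolio

# Expected scenario: losses should be covered by provisions
provisions = expected_loss(exposure, default_rate=0.05, recovery_rate=0.75)

# Worst case scenario: capital must cover losses in excess of provisions
worst_case = expected_loss(exposure, default_rate=0.15, recovery_rate=0.50)
unexpected = worst_case - provisions

print(provisions / 1e9)  # ~1.25 -> EUR 1.25bn of provisions
print(worst_case / 1e9)  # ~7.5  -> EUR 7.5bn worst case losses
print(unexpected / 1e9)  # ~6.25 -> EUR 6.25bn to be covered by capital
```

The appeal of this framework, as noted below, is precisely that each input (default rate, recovery rate, worst case assumptions) can be inspected and challenged.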
These figures are just an illustration, and the key issue is evidently the definition of a sufficiently conservative worst case scenario (defined by the Basel II rules as a 1-in-1000-years event). But the beauty of this analysis is that it is based on common sense: it can be scrutinized and subjected to easy reality checks.
Now, moving away from traditional lending, let’s consider a bank with a trading book of €100bn. What is the risk of this trading book? This has always been a more difficult question to answer. A trading book is much less homogeneous: it is not even a set of positions that the bank is sitting on, but a flow of buy and sell decisions, involving conscious risk-taking as well as a substantial amount of hedging. Trading instruments are exposed to a variety of market risks: the original list included interest rate risk, equity risk, commodity risk and currency risk, but it was more recently extended to include credit risk as well. One trading book could be structured to be virtually risk free (though not a very profitable one), yet a single position could wipe out 100% of the initial investment, or even more if there is leverage embedded in it. The point is: it is impossible to quantify risk from the outside without opening the book. On the other hand, the trading book has two attractive features which should support proper monitoring and measurement of risk in a standard, generally accepted and scientific way.
First of all, traded investments tend to be easy to value, with easily verifiable prices. It is therefore possible to track historic prices and returns, to use statistical models to measure average returns and volatility, and to estimate expected and unexpected losses at different confidence levels. So value at risk, defined as the maximum loss that a bank should not exceed over a certain period of time and at a certain confidence level, soon emerged as the ‘best available’ risk measurement and risk management tool. Regulators became comfortable using it as the preferred basis for capital requirements, provided it was calibrated to a sufficiently conservative holding period and confidence level.
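As a minimal sketch of how such a measure can be computed, the historical-simulation approach simply reads value at risk off the empirical distribution of past daily profit and loss. The P&L series below is simulated purely for illustration; a real bank would use its own P&L history or revalued positions:

```python
import numpy as np

def value_at_risk(daily_pnl, confidence=0.99):
    """1-day VaR: the loss exceeded on only (1 - confidence) of days."""
    return -np.percentile(daily_pnl, 100 * (1 - confidence))

# Hypothetical year (250 trading days) of daily P&L, in EUR m
rng = np.random.default_rng(seed=42)
daily_pnl = rng.normal(loc=0.0, scale=40.0, size=250)

# The 99% VaR is larger than the 95% VaR by construction
print(round(value_at_risk(daily_pnl, 0.99), 1))
print(round(value_at_risk(daily_pnl, 0.95), 1))
```

The same outside-observer problem discussed below applies here: the output depends entirely on the P&L history and correlation assumptions fed into the model.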
The second feature of trading instruments is that they should be very liquid. If a position proves to be wrong and makes losses, the bank should be able to close it quickly, in a matter of days, limiting the damage to a few days’ losses. For instance, after Société Générale uncovered the unauthorized positions accumulated by Jérôme Kerviel in January 2008, it took the bank approximately 3 days to unwind them. On this assumption of liquidity, regulators calibrate the ‘regulatory value at risk’ for the trading book to a 10-day holding period and a 99% confidence level. Regulatory capital is then simply a 3 to 4 times multiple of that figure (depending on the robustness of the bank’s risk models), plus an additional charge for specific risk within the trading book.
Going back to our €100bn trading book: how much regulatory capital would be required?
| | |
|---|---|
| Value at risk (1 day, 99%) | €100m |
| Value at risk (10 days, 99%) | €320m |
| Capital charge for general market risk | €950m |
| Minimum regulatory capital | €1.9bn |
Well, if the value at risk over a 1-day holding period at 99% confidence is, say, €100m, then it can be scaled to a 10-day holding period by multiplying it by the square root of 10. The general market risk capital charge would then be 3 times that figure, and the specific risk charge would come on top of it, and could be, say, just as big. Adding the general and specific capital charges together, the bank would be able to operate with less than €2bn of capital and almost 60 times leverage.
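The square-root-of-time scaling and the resulting capital charge can be sketched as follows. The 3x multiplier and the ‘just as big’ specific risk charge are the assumptions stated above, not fixed regulatory constants:

```python
import math

one_day_var = 100e6  # stated 1-day, 99% value at risk: EUR 100m

# Square-root-of-time rule: scale to the regulatory 10-day holding period
ten_day_var = one_day_var * math.sqrt(10)  # ~EUR 316m

general_charge = 3 * ten_day_var  # regulatory multiplier of 3 (can be up to 4)
specific_charge = general_charge  # assumed 'just as big' as the general charge
capital = general_charge + specific_charge

leverage = 100e9 / capital
print(round(capital / 1e9, 2))  # 1.9 -> ~EUR 1.9bn of capital
print(round(leverage))          # 53  -> over 50 times leverage
```

Halving the stated 1-day VaR to €50m halves every figure downstream, which is exactly the point made below: the whole capital requirement hangs on a number that cannot be verified from the outside.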
Is this reasonable?
Why not? It all depends on the stated value at risk (as well as a less than transparent specific risk capital charge), and there is limited scope to argue against a stated value at risk from the outside, based on financial statements. The bank could have declared a €50m value at risk, simply by claiming lower correlations and higher diversification benefits across the trading portfolio, resulting in less than €1bn of capital and a leverage ratio of over 100 times. Indeed, leverage ratios above 100 times are not uncommon at the level of the investment banking divisions within large European universal banks. In other words, in theory there is no minimum reasonable value at risk figure, and therefore no maximum acceptable leverage, for trading activities.
When the current capital requirements for trading activities were introduced under the Capital Adequacy Directive in 1996, these activities represented a minor proportion of European banks’ activities (less than 10% in terms of assets). By 2007, in the most sophisticated universal banks they accounted for a large part of the balance sheet; in some cases, most of it.
Unfortunately, the two assumptions behind the ‘trading book’ treatment for regulatory capital proved disastrously wrong. First of all, the fastest growing exposures, in particular credit derivatives, CDOs and subprime MBSs, are not transparently priced: they require complex models, unobservable parameters and a significant number of assumptions. The difficulty of valuing troubled assets is the principal reason why the TARP project didn’t fly. Second, these positions were not liquid at all, especially once everybody turned out to be overexposed to them in a less than transparent way. If banks had been able to sell CDOs within 10 days at the onset of the credit crisis, the fall-out would have been much less severe. Instead, they had to hold on to their exposures, or even increase them (by buying back assets sold to SIVs to protect investors’ money and their own reputation).
All of this has proved that securitization and structured finance did not transform illiquid credit exposures into tradable liquid securities, but into non-transparent, less liquid securities which ended up clogging the system. Risk cannot be eliminated; it can only be passed around. Securitization and structured finance ended up transferring risk away from the banking book, where it was properly monitored, managed and regulated, to non-regulated financial institutions, such as mutual funds and hedge funds, and to the banks’ own trading books, which became, in essence, huge hedge funds within the universal banks.
As a result, massive leverage was allowed to build up in the financial system, virtually undetected. In the US and Canada, at least, regulators monitored the leverage ratio of banks and securities firms. In Europe, on the other hand, such was the faith in banks’ sophisticated internal models that leverage ratios were allowed to balloon as long as regulatory capital ratios were kept under control. These flawed risk and regulatory capital metrics based on value at risk gave a false sense of security that European banks were properly capitalized when, in fact, they were already financially stretched.
As all of this becomes clearer by the day, regulators are busy redefining higher capital requirements, in particular for the activities identified as the main contributors to the crisis. The ‘incremental risk capital’ charge, introduced in the recent ‘Enhancements to the Basel II Framework’, is intended to eliminate the regulatory capital arbitrage between banking and trading activities by raising capital requirements for the latter to a level consistent with the former (starting by calibrating the regulatory value at risk for the trading book to the same one-year holding period and 99.9% confidence level applied to the banking book).
In addition, it is now apparent that the complexity of some innovative products has reached a level that is too difficult not only to monitor and regulate from the outside, but also to manage from the inside. For this reason, regulators and analysts can no longer blindly trust the output of sophisticated, black-box-like internal risk models. In particular, value at risk might have been the ‘best available’ tool, yet it has not been ‘good enough’ to estimate risk. Two parallel regulatory developments are now taking place. On the one hand, regulators are trying to catch up with the increasing complexity of banks’ businesses with tighter, more sophisticated rules, such as stress testing. On the other, European supervisors are evaluating the benefits of introducing a maximum leverage ratio, as in the US and Canada (and, recently, Switzerland). The leverage ratio is simple and crude, and it is not risk sensitive at all, but at least it enables a minimum of reality checking from the outside, and it should help to contain systemic risk within acceptable levels.