But as Mr Nakaso watched western markets in July 2007, he had a sense of déjà vu. “I see striking similarities in what I see today with the early stages of our own financial crisis [in Japan] more than a decade ago,” he privately warned international contacts shortly after IKB, a German lender, imploded as a result of subprime losses. “Probably we will have to be prepared for more events to come ... the crisis management skills of central banks and financial authorities will be truly tested.”
His fears proved well-founded. On August 9 2007, the European Central Bank sent shock waves around world financial capitals when it injected €95bn ($150bn, £75bn) worth of funds into the money markets to prevent borrowing costs from spiralling sharply. The US Federal Reserve soon followed suit. But while the central banks had billed these moves as “pre-emptive” actions to quell incipient market tensions, they did not bring the panic to an end.
On the contrary, as markets that were crucial for raising funds started to dry up last August, a network of financial vehicles slid into crisis, causing the price of many debt securities to collapse. That started a chain reaction that created liquidity and solvency crises at US and European banks – on a scale last seen in Japan almost exactly a decade ago.
A year later, there is still no sign of an end to these problems. Instead, the sense of pressure on western banks has risen so high that by some measures this is now the worst financial crisis seen in the west for 70 years.
What has made this upheaval so shocking is not simply its scale and duration but the fact that almost all western policymakers and bankers were caught unawares. “If you had said a year ago that America could suffer a banking crisis on the scale of Japan, people would have laughed,” one former senior US regulator admits.
Or as the Bank for International Settlements, which groups central banks, observes in its latest annual report: “The duration of the turmoil, its scope and the growing evidence of effects on the real economy have come as a great surprise to most commentators, private as well as public.”
Adding that it “is essential we understand what is going on”, the BIS points out that the crucial question is: “How could problems with subprime mortgages, being such a small sector of global financial markets, provoke such dislocation?”
The answer to this seeming mystery lies in the slippery concept of financial “faith”. Over the past decade, western banking has experienced an extraordinary burst of innovation, as financiers have discovered ways to slice and dice their loans – such as the now controversial subprime mortgages – and then turn these into securities that can be sold to investors all over the world.
Tracking the scale of this activity with any precision has always been hard, since much of it occurs in private deals. However, industry data suggest that between 2000 and 2006, nominal global issuance of credit instruments rose twelvefold, to $3,000bn (£1,519bn, €1,929bn) a year from $250bn. This activity appears to have become particularly intense from 2004, partly because investors were searching for ways to boost returns after a long period in which central banks had kept interest rates low.
To be sure, ahead of last summer’s crisis some policymakers and investors were uneasy about the scale of this explosion. In particular, there was growing concern that “slicing and dicing” was fuelling a credit bubble, leading to artificially low borrowing costs, spiralling leverage and a collapse in lending standards. When world leaders gathered in Davos for the World Economic Forum’s annual meeting in January 2007, Jean-Claude Trichet, president of the ECB, complained about the opacity of some financial innovation and warned that there could soon be some “repricing of credit risk”.
From 2005 onwards, Timothy Geithner, president of the New York Federal Reserve, called on banks to prepare for so-called “fat tails” – a statistical term for extremely negative events that occur more often than standard banking models suggest. Behind the scenes, a few bankers and investors also prepared for a crash. Deutsche Bank, for example, started betting on subprime defaults as early as 2006, while JPMorgan Chase placed trades to protect itself from a crash in spring 2007 and asset managers such as Pimco and BlackRock stopped purchasing many debt instruments in early 2007. Yet most investors, bankers and even regulators did not change their behaviour to any significant degree, owing to a widespread adherence to three big assumptions – or articles of faith – that have stealthily underpinned 21st-century finance.
The first of these was a belief that modern capital markets had become so much more advanced than their predecessors that banks would always be able to trade debt securities. This encouraged banks to keep lowering lending standards, since they assumed they could sell the risk on. “Abundant market liquidity led some firms to overestimate the market’s capacity to absorb risk,” says the Institute of International Finance, a Washington-based lobby group, in a recent report. “The same buoyant environment resulted in market pressure for high returns ... and high levels of competition among financial firms.”
Second, many investors assumed that the credit rating agencies offered an easy and cost-effective compass with which to navigate this ever more complex world. Thus many continued to purchase complex securities throughout the first half of 2007 – even though most investors barely understood these products.
But third, and perhaps most crucially, there was a widespread assumption that the process of “slicing and dicing” debt had made the financial system more stable. Policymakers thought that because the pain of any potential credit defaults was spread among millions of investors, rather than concentrated in particular banks, it would be much easier for the system to absorb shocks than in the past. “People had looked at what had happened to the Japanese banks and said, ‘this simply cannot happen here’, because the banks were no longer holding all the credit risk,” one senior European policymaker recalls.
In private, some central bank officials harboured doubts about this new creed. From 2003, senior officials at the BIS in Basel, for example, repeatedly warned that risk dispersion might not always be benign. However, such warnings were largely kept out of public view, partly because the US Federal Reserve was convinced that financial innovation had changed the system in a fundamentally beneficial way.
Consequently, no attempt was made to force banks to boost their capital reserves to offset exploding debt issuance; instead, regulatory rules permitted banks to cut their capital levels sharply, which they duly did. “People really believed that the world was different,” recalls Larry Fink, head of BlackRock investment group. “There was this huge trust in the intellectual capital of Wall Street – and that appeared to be supported by the fact that banks were making so much money.”
As a result, when high rates of subprime default emerged in late 2006, there was initially a widespread assumption that the system would absorb the pain relatively smoothly. After all, the system had easily weathered shocks earlier in the decade, such as the attacks of September 11 2001 or the collapse of the Amaranth hedge fund in 2006. Moreover, the US government initially estimated that subprime losses would be just $50bn-$100bn – a tiny fraction of the total capital of western banks or assets held by global investment funds.
In fact, the subprime losses started to hit the financial system in the early summer of 2007 in unexpected ways, triggering unforeseen events such as the implosion of IKB. And as the surprise spread, the three pillars of faith that had supported the credit boom started to crumble.
First, it became clear to investors that it was dangerous to use the rating agencies as a guide for complex debt securities. In the summer of 2007, the agencies started downgrading billions of dollars of supposedly “ultra-safe” debt – causing prices to crumble. Last week, for example, Merrill Lynch sold a portfolio of complex debt at 22 per cent of its face value, even though it had carried the top-notch triple-A rating.
Then, as bewildered investors lost faith in ratings, many stopped buying complex instruments altogether. That created an immediate funding crisis at many investment vehicles, since most had funded themselves by issuing notes in the asset-backed commercial paper market. It also meant that banks were no longer able to turn assets such as mortgages into subprime bonds and sell these on. That in turn meant the second key assumption that had underpinned 21st-century finance – that the capital markets would always stay liquid – was overturned.

Worse still, the third pillar of faith – that banks would be better protected from a crisis because of risk dispersion – also cracked. As investment vehicles lost their ability to raise finance, they turned to their banks for help. That squeezed the banks’ balance sheets at the very moment that they were facing their own losses on debt securities and finding it impossible to sell on loans.
As a result, western banks found themselves running out of capital in a way that no regulator or banker had ever foreseen. Peter Fisher, a managing director of BlackRock and former US Treasury undersecretary, wrote in a recent paper: “It seems clear that risk dispersion did not work as expected. Major financial institutions did not succeed in shedding risks so much as transferring them among their own business lines.”
Banks started hoarding cash and stopped lending to each other as financiers lost faith in their ability to judge the health of other institutions – or even their own. “Firms became reluctant to participate in money markets ... as a result subprime credit problems turned into a systemic liquidity crunch,” says the IIF.
Then a vicious deleveraging spiral got under way. As banks scurried to improve their balance sheets, they began selling assets and cutting loans to hedge funds. But that hit asset prices, hurting those balance sheets once again. What made this “feedback loop” doubly intense was that the introduction of mark-to-market accounting earlier this decade forced banks to readjust their books after every panicky price drop – in contrast to the pattern seen in the 1990s Japanese banking crisis, or the Latin American debt debacle of the 1980s.
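The “feedback loop” described above can be illustrated with a toy simulation. The sketch below is not from the article: the balance-sheet figures, the leverage target and the price-impact rule are all hypothetical assumptions, chosen only to show how mark-to-market losses and forced asset sales can reinforce each other.

```python
# Illustrative sketch of a mark-to-market deleveraging spiral.
# All numbers and the price-impact rule are hypothetical assumptions.

def deleveraging_spiral(assets=100.0, equity=10.0, target_leverage=10.0,
                        initial_shock=0.05, price_impact=0.5, rounds=5):
    """Each round: mark assets down, then sell assets to restore the
    leverage target; the fire sale itself depresses prices further,
    feeding the next round's markdown."""
    history = []
    shock = initial_shock
    for _ in range(rounds):
        # Mark-to-market: the price drop hits assets and equity one-for-one.
        loss = assets * shock
        assets -= loss
        equity -= loss
        if equity <= 0:
            # Insolvent: the spiral has consumed the capital base.
            history.append((round(assets, 2), round(equity, 2)))
            break
        # Sell assets to bring leverage (assets/equity) back to target;
        # sale proceeds are assumed to pay down debt.
        sales = max(assets - target_leverage * equity, 0.0)
        assets -= sales
        # Fire sales depress prices in proportion to the volume sold.
        shock = price_impact * sales / 100.0
        history.append((round(assets, 2), round(equity, 2)))
    return history

# With these assumed parameters, an initial 5 per cent markdown
# wipes out the bank's equity within two rounds.
print(deleveraging_spiral())
```

Under these stylised assumptions, a modest initial markdown triggers sales large enough to make the second-round loss bigger than the first – which is the intensity the mark-to-market regime added, in contrast to the slower write-down patterns of the 1990s Japanese crisis.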
At several points over the past year, policymakers have hoped that this vicious cycle might be coming to an end. Last autumn, for example, conditions briefly improved; early this year brought another respite when central banks pumped more liquidity into the system. Similarly, when the Fed stepped in to prevent the implosion of Bear Stearns in March, sentiment stabilised for a period.
However, in practical terms, the real challenge for financiers and policymakers now – as in Japan a decade ago – is how to build a new sense of trust in finance. In the medium term, regulators are preparing reforms that aim to make the system look credible, even in a world where the benefits of risk dispersion are no longer taken as a creed. These would force banks to hold more capital and ensure that the securitisation process is more transparent. Separately, groups such as the IIF are trying to introduce measures that could rebuild confidence in complex financial instruments.
More immediately, the banks are trying to rekindle investor trust by replenishing their capital bases. The IIF calculates that in the year to June, banks made $476bn in credit writedowns, as debt prices plunged in the panic (although tangible credit losses so far amount to just $50bn). However, they have also raised $354bn in capital. Financiers are also trying to restart trading in frozen debt markets. Experience from earlier financial crises suggests that this will only occur when investors are convinced that they have seen true “clearing prices”. Events such as Merrill Lynch’s recent fire sale of its CDO portfolio may be a step in this direction.
But while confidence is returning in some areas, it continues to be undermined in others. A decade ago in Japan, the banking woes started with a property slump but later spread when banks were forced to cut their lending – which unexpectedly created more bad loans. Western banks have not yet encountered this “second round” effect on a significant scale.
Defaults are rising on consumer loans, for example, but losses on corporate debt remain modest. However, most bankers and policymakers fear that a second wave is simply a matter of time. That makes it hard to predict when the credit crunch will end, how big the total losses may eventually be or even whether the banks are yet adequately capitalised.
“What we learnt in Japan is that banks have a tendency to underestimate how their assets could deteriorate due to the feedback problems,” Mr Nakaso recalls.
A year into the credit crisis, in other words, trust remains a rare commodity in the banking world. It will take years, not months, to restore that crucial ingredient – particularly given that so many of the assumptions underlying 21st-century finance have turned out to be so dangerously wrong.