Friday, February 01, 2008

Risky business
[My latest Muse column for Nature online news…]

Managing risk in financial markets requires a better understanding of their complex dynamics. But it’s already clear that unfettered greed makes matters worse.

It seems to be sheer coincidence that the multi-billion-dollar losses at the French bank Société Générale (SocGen), caused by the illegal dealings of rogue trader Jérôme Kerviel, come at a time of imminent global economic depression. But the conjunction has provoked discussion about whether such localized shocks to the financial market can trigger worldwide (‘systemic’) economic crises.

If so, what can be done to prevent it? Some have called for more regulation, particularly of the murky business that economists call derivatives trading and the rest of us would recognize as institutionalized gambling. “If our laws are not extended to control the new kinds of super-powerful, super-complex, and potentially super-risky investment vehicles, they will one day cause a financial disaster of global-systemic proportions”, wrote John Lanchester in the British Guardian newspaper [1]. But how well do we understand what we’d be regulating?

The French affair is in a sense timely, because ‘systemic risk’ in the financial system has become a hot topic, as witnessed by a recent report by the Federal Reserve Bank of New York (FRBNY) and the US National Academy of Sciences [2]. Worries about systemic risk are indeed largely motivated by the link to global recessions, like the one currently looming. This concern was articulated after the Great Depression of the 1930s by the British economist John Maynard Keynes, who wanted to understand how the global economy can switch from a healthy to a depressed state, both of which seemed to be stable ‘equilibrium’ states to the extent that they stick around for a while.

That terminology implies that there is some common ground with the natural sciences. In physics, a change in the global state of a system from one equilibrium configuration to another is called a phase transition, and some economists use such terms and concepts borrowed from physics to talk about market dynamics.

The analogy is potentially misleading, however, because the financial system, and the global economy generally, is never in equilibrium. Money is constantly in motion, and it’s widely recognized that instabilities such as market crashes depend in sensitive but ill-understood ways on feedbacks within the system, which can amplify small disturbances and which enforce perpetual change. Other terms from ‘economese’, such as liquidity (the ability to exchange assets for cash), reveal an intuition of that dynamism, and indeed Keynes himself tried to develop a model of economics that relied on analogies with hydrodynamics.

Just as equilibrium spells death for living things, so the financial market is in trouble when money stops flowing. It’s when people stop investing, cutting off the bank loans that business needs to thrive, that a crisis looms. Banks themselves stay in business only if the money keeps coming in; when customers lose confidence and withdraw their cash – a ‘run on the bank’ like that witnessed recently at the UK’s Northern Rock – banks can no longer lend, have to call in existing loans at a loss, and face ruin. That prospect sets up a feedback: the more customers withdraw their money, the more others feel compelled to do so – and if the bank wasn’t in real danger of collapse at the outset, it soon is. The French government is trying to avoid that situation at SocGen, since the collapse of a bank has knock-on consequences that could wreak havoc throughout a nation’s economy, or even beyond.
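
That runaway feedback is easy to caricature in a few lines of code. The toy simulation below is a minimal sketch of my own: the depositor numbers, panic threshold and withdrawal probabilities are all invented for illustration, not drawn from any real bank or any model economists actually use. It shows how a small initial wave of withdrawals can either fizzle out or cascade into a full-blown run, depending on whether it crosses a tipping point.

```python
import random

def simulate_bank_run(n_depositors=1000, initial_withdrawals=20,
                      panic_threshold=0.05, rounds=20, seed=1):
    """Toy feedback model: each remaining depositor withdraws with a
    probability that grows with the fraction who have already withdrawn.
    All parameters are hypothetical, chosen only for illustration."""
    random.seed(seed)
    withdrawn = initial_withdrawals
    for _ in range(rounds):
        fraction_out = withdrawn / n_depositors
        # Panic probability rises steeply once withdrawals pass the threshold
        p_withdraw = min(1.0, max(0.0, (fraction_out - panic_threshold) * 5))
        new = sum(1 for _ in range(n_depositors - withdrawn)
                  if random.random() < p_withdraw)
        if new == 0:
            break
        withdrawn += new
    return withdrawn / n_depositors

# A small shock below the panic threshold fizzles out...
print(simulate_bank_run(initial_withdrawals=20))   # stays at 2%
# ...while one just above it feeds on itself and becomes a full run.
print(simulate_bank_run(initial_withdrawals=80))   # climbs towards 100%
```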

In other words, bank runs may lead, as with Northern Rock, to the ironic spectacle of gung-ho advocates of the free market appealing for, or even demanding, state intervention to bail them out. As Will Hutton puts it in the Observer newspaper, “financiers have organised themselves so that actual or potential losses are picked up by somebody else – if not their clients, then the state – while profits are kept to themselves” [3]. Even measures such as deposit insurance, introduced in the US after the bank runs of the 1930s, which ensures that depositors won’t lose their money even if the bank fails, arguably exacerbate the situation by encouraging banks to take more risks, secure in the knowledge that their customers are unlikely to lose their nerve and desert them.

Economists’ attempts to understand runaway feedbacks in situations like bank runs draw on another area of the natural sciences: epidemiology. They speak of ‘contagion’: the spread of behaviours from one agent, or one part of the market, to another, like the spread of disease in a population. In bank runs, contagion can even spread to other banks: one run leads people to fear others, and this then becomes a self-fulfilling prophecy.
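
The epidemiological analogy can be sketched just as crudely with a toy interbank network, in which one bank’s failure imposes losses on its creditors and topples any of them whose capital buffer is too thin. The banks, exposures and buffers below are entirely hypothetical, and the cascade rule is a simplification of my own; the point is only to show how a single default can propagate.

```python
# Hypothetical interbank network: each bank's capital buffer and its
# exposures (money lent) to other banks. All figures are made up.
capital = {"A": 10, "B": 4, "C": 6, "D": 3}
exposure = {            # exposure[x][y] = amount x has lent to y
    "A": {"B": 5, "C": 2},
    "B": {"C": 3, "D": 4},
    "C": {"D": 2},
    "D": {"A": 1},
}

def cascade(initial_failure):
    """Propagate defaults: when a bank fails, its creditors write off
    their exposure to it; any creditor whose accumulated losses reach
    its capital buffer fails in turn."""
    failed = {initial_failure}
    losses = {bank: 0.0 for bank in capital}
    frontier = [initial_failure]
    while frontier:
        bankrupt = frontier.pop()
        for creditor, loans in exposure.items():
            if creditor in failed or bankrupt not in loans:
                continue
            losses[creditor] += loans[bankrupt]
            if losses[creditor] >= capital[creditor]:
                failed.add(creditor)
                frontier.append(creditor)
    return failed

# In this made-up network, D's failure also topples thinly buffered B,
# while A and C absorb their losses.
print(cascade("D"))
```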

Regardless of whether the current shaky economy might be toppled by a SocGen scandal, it is clear that the financial market is in general potentially susceptible to systemic failure caused by specific, local events. The terrorist attacks on the World Trade Center on 11 September 2001 demonstrated that, albeit in a most unusual way – for the ‘shock’ here was not a market event as such but physical destruction of its ‘hardware’. Disruption of trading activity in banks in downtown Manhattan in effect caused a bottleneck in the flow of money that had serious knock-on consequences, leading to a precipitous drop in the global financial market.

The FRBNY report [2] is a promising sign that economists seeking to understand risk are open to the ideas and tools of the natural sciences that deal with phase transitions, feedbacks and other complex nonlinear dynamics. But the bugbear of all these efforts is that ultimately the matter hinges on human behaviour. Your propensity to catch a virus is indifferent to whether you feel optimistic or pessimistic about your chances of infection; but with contagion in the economy, expectations are crucial.

This is where conventional economic models run into problems. Most of the tools used in financial markets, such as how to price assets and derivatives and how to deal with risk in portfolio management, rely on the assumption that market traders respond rationally and identically on the basis of complete information about the market. This leads to mathematical models that can be solved, but it doesn’t much resemble what real agents do. For one thing, different people reach different conclusions on the basis of the same data. They tend to be overconfident, to be biased towards information that confirms their preconceptions, to have poor intuition about probabilities of rare events, and to indulge in wishful thinking [4]. The field of behavioural finance, which garnered a Nobel prize for Daniel Kahneman in 2002, shows the beginnings of an acknowledgement of these complexities in decision-making – but they haven’t yet had much impact on the tools widely used to calculate and manage risk.
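
One concrete consequence of those idealized assumptions concerns rare events. The rough sketch below, with all parameters invented for illustration (this is not how practitioners actually model returns), compares the chance a plain Gaussian model assigns to a five-standard-deviation daily loss with the chance assigned by a toy ‘fat-tailed’ alternative in which the market occasionally switches into a turbulent regime.

```python
from statistics import NormalDist
import math, random

random.seed(0)

# Tail probability of a 5-sigma daily loss under a plain Gaussian model
gaussian_tail = NormalDist(0, 1).cdf(-5)

# Toy fat-tailed alternative: returns usually come from a calm regime,
# occasionally from a turbulent one (parameters invented for illustration)
def mixed_return():
    return random.gauss(0, 0.8) if random.random() < 0.95 else random.gauss(0, 3.0)

n = 1_000_000
samples = [mixed_return() for _ in range(n)]
sigma = math.sqrt(sum(x * x for x in samples) / n)   # overall volatility
empirical_tail = sum(x < -5 * sigma for x in samples) / n

print(f"Gaussian model:  P(loss > 5 sigma) ~ {gaussian_tail:.1e}")
print(f"Fat-tailed toy:  P(loss > 5 sigma) ~ {empirical_tail:.1e}")
# The mixture assigns the 'impossible' loss a probability thousands of
# times higher than the Gaussian model does.
```

The mixture is only a caricature, but it illustrates why risk tools built on Gaussian assumptions can dramatically understate the likelihood of extreme losses.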

One can’t blame the vulnerability of the financial market on the inability of economists to model it. These poor folks are faced with a challenge of such magnitude that those working on ‘complex systems’ in the natural sciences have it easy by comparison. Yet economic models that make unrealistic assumptions about human decision-making can’t help but suggest that we need to look elsewhere to fix the weak spots. Perhaps no one can be expected to anticipate the wild, not to mention illegal, behaviour of SocGen’s Kerviel or of those who brought low the US power company Enron in 2001. But these examples are arguably only at the extreme end of a scale that is inherently biased towards high-risk activity by the very rules of engagement. State support of failing banks is just one example of the way that finance is geared to risky strategies: hedge fund managers, for example, get a hefty cut of their profits on top of a basic salary, but others pay for the losses [3]. The FRBNY’s vice president John Kambhu and his colleagues have pointed out that hedge funds (themselves a means of passing on risk) operate in a way that makes risk particularly severe and hard to manage [5].

That’s why, if understanding the financial market demands a better grasp of decision-making, with all its attendant irrationalities, it may be that managing the market to reduce risk and offer more secure public benefit requires more constraint, more checks and balances, to be put on that decision-making. We’re talking about regulation.

Free-market advocates firmly reject such ‘meddling’ on the basis that it cripples Adam Smith’s ‘invisible hand’ that guides the economy. But that hand is shaky, prone to wild gestures and sudden seizures, because it is no longer the collective hand of Smith’s sober bakers and pin-makers but that of rapacious profiteers creaming absurd wealth from deals in imaginary and incredible goods.

One suggestion is that banks and other financial institutions be required to make public how they are managing risk – basically, they should share currently proprietary information about expectations and strategies. This could reduce instability caused by each party trying to second-guess, and being forced to react to, the others. It might reduce opportunities to make high-risk killings, but the payoff would be to smooth away systemic crises of confidence. (Interestingly, the same proposal of transparency was made by nuclear scientists to Western governments after the development of the US atomic bomb, in the hope of avoiding the risks of an arms race.)

It’s true that too much regulation could be damaging, limiting the ability of the complex financial system to adapt spontaneously to absorb shocks. All the more reason to strive for a theoretical understanding of the processes involved. But experience alone tells us that it is time to move beyond Gordon Gekko’s infamous credo ‘greed is good’. One might argue that ‘a bit of greed is necessary’, but too much is liable to bend and rupture the pipes of the economy. As Hutton says [3], “We need the financiers to serve business and the economy rather than be its master.”


References

[1] Lanchester, J. ‘Dicing with disaster’, Guardian 26 January 2008.
[2] FRBNY Economic Policy Review special issue, ‘New directions for understanding systemic risk’, 13(2) (2007).
[3] Hutton, W. ‘This reckless greed of the few harms the future of the many’, Observer 27 January 2008.
[4] Anderson, J. V. in Encyclopedia of Complexity and Systems Science (Springer, in press, 2008).
[5] Kambhu, J. et al. FRBNY Economic Policy Review 13(3), 1-18 (2008).
