Tuesday, January 26, 2010

The illusion of stability, part 1

“The real trouble with this world of ours is not that it is an unreasonable world, nor even that it is a reasonable one. The commonest kind of trouble is that it is nearly reasonable, but not quite. Life is not an illogicality; yet it is a trap for logicians. It looks just a little more mathematical and regular than it is; its exactitude is obvious, but its inexactitude is hidden; its wildness lies in wait.” - G.K. Chesterton
We have seen this happen many times. A financial crisis erupts and everybody, including the New York Times, remembers risk management. Lengthy expositions appear on the falsity of assuming normally distributed returns, and everyone loudly wonders why the risk models did not warn the industry. However, as soon as the situation stabilizes – or rather appears to stabilize – risk management is relegated back to the specialized conferences and, incredible as it may seem after the last twenty years, tracking error is once again treated as a complete description of a portfolio’s risk profile.

This is not a conspiracy; rather, it is the effect of what John Cassidy calls the illusion of stability. This illusion rests on a few pillars, each of which I intend to discuss in this and the next few posts. The first pillar is the economic theorizing that comes from looking at the economy the way a physicist looks at elementary particles or an astronomer at a galaxy. This is the logician’s trap taught in every institution of higher learning.

In January 2009, the Basel Committee on Banking Supervision finally attempted to disconnect the feeding tubes and get out of the matrix when it proclaimed that:
“most risk management models, including stress tests, use historical statistical relationships to assess risk. They assume that risk is driven by a known and constant statistical process. Given a long period of stability, backward-looking historical information indicated benign conditions so that these models did not pick up the possibility of severe shocks nor the build up of vulnerabilities within the system.”
I know I have used the above quote before. Nevertheless, I keep coming back to it because I believe it not only captures in capsule form many of the key misconceptions about risk management, but also gives a glimpse of possible ways to deal with them. I have written before about the problems with the present paradigm and the ways of correcting for them. Now that I have put my logical cart before my imaginary horse, let me go back and find the horse. In other words, I would like to briefly discuss the sources of the misconception as I see them, mainly in the economic theories of equilibrium. Why did much of the financial industry believe that “risk is driven by a known and constant statistical process”? This is hardly an obvious observation; it requires a certain mindset, a view of financial markets as a kind of galaxy that we can observe with a telescope, trusting that the resulting calculations will not need to change from day to day.
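To make the Basel Committee’s point concrete, here is a minimal sketch of the kind of backward-looking calculation it describes: a one-day historical value-at-risk estimate read off a trailing window of returns. The window length, confidence level and synthetic data are my own illustrative assumptions, not any particular institution’s model.

```python
import numpy as np

def historical_var(returns, confidence=0.99, window=250):
    """One-day VaR from the trailing `window` of daily returns.

    The estimate implicitly assumes a constant statistical process:
    if the window covers only a calm period, the reported VaR looks
    benign regardless of what is building up outside that window.
    """
    recent = np.asarray(returns)[-window:]
    # Loss at the chosen confidence level, read off the empirical tail.
    return -np.percentile(recent, 100 * (1 - confidence))

# Illustrative synthetic data: two calm years followed by a turbulent month.
rng = np.random.default_rng(0)
calm = rng.normal(0.0003, 0.006, 500)    # low-volatility regime
stress = rng.normal(-0.002, 0.03, 21)    # sudden high-volatility regime

print(historical_var(calm))                            # benign estimate
print(historical_var(np.concatenate([calm, stress])))  # only partially catches up
```

A model of this kind is not wrong so much as incomplete: it answers the question “what did risk look like over the last year?”, not “what could it look like tomorrow?”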

To understand the source of this view we need to look no further than Léon Walras, a brilliant French economist who was the first to attempt to build a mathematical equilibrium model of the economy. His friends later recalled how inspired he was by a book on physics he had recently read. It fascinated him so much that he vociferously proclaimed his intention to create a new science of political economy, one that would be governed by the equations of calculus, just like physics. “Equilibrium” is a term borrowed from physics, and given his premise it was only natural to look for equilibrium in the economic system. One key feature of his model has had far-reaching implications, and it affects virtually every area of economic and financial thinking, including risk management. This idea, known by its French name “tâtonnement,” describes the process of gradual adjustment by which participants in the economy slowly move it toward its equilibrium. The process, sketched in code after the list, runs roughly as follows:
  1. Sellers and buyers announce the prices at which they are willing to transact.
  2. If the prices match, then equilibrium is reached according to a set of equations written down by Walras.
  3. If demand and supply are mismatched, prices are altered in increments until balance is reached, a gradual process called “tâtonnement.”
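
Here is a minimal sketch of that adjustment process, with linear demand and supply curves and an adjustment speed chosen purely for illustration (Walras, of course, wrote equations rather than code):

```python
def tatonnement(demand, supply, price=1.0, speed=0.1, tol=1e-6, max_iter=10_000):
    """Nudge the price in the direction of excess demand until
    demand and supply (approximately) balance."""
    for _ in range(max_iter):
        excess = demand(price) - supply(price)
        if abs(excess) < tol:
            break
        price += speed * excess  # the gradual adjustment step
    return price

# Illustrative linear curves: demand falls with price, supply rises with it.
demand = lambda p: 10.0 - 2.0 * p
supply = lambda p: 1.0 + 1.0 * p

print(tatonnement(demand, supply))  # converges near p = 3, where the curves cross
```

Note that the loop converges only because the two curves are fixed and well behaved; that is precisely the assumption examined below.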

As we can see, this model appears to simply follow common sense, something we observe in our daily lives. However, we need to ask a question that is extremely relevant today for any finance practitioner: what makes this change gradual? Why would we assume that the process is constant and stable? The basic answer to these questions given by quantitative economics and quantitative finance is the assumption that supply and demand are relatively stable.

It is interesting to note that Walras explicitly applied this model to financial markets, despite the fact that Europe saw a number of financial bubbles in the 18th and 19th centuries. If we are talking about wheat or corn, it might be reasonable to suppose that neither the consumers’ desire to consume them nor the producers’ ability to produce them will change very quickly. The first is limited by human physiology, the second by the physical capacity of planet Earth. But the situation is quite different for financial assets. There is no obvious limit on the amount of financial assets buyers are willing to purchase other than the supply of liquidity and credit in the economy. And as we recently saw, the supply of financial assets can grow very quickly if the demand is there (the MBS and CDS markets are only two recent examples).

What is more, as the German economist Gustav Schmoller observed more than a hundred years ago, demand can actually increase with price. By now it should be obvious that situations in which demand rises with rising prices, or falls with falling prices, are so common in finance that they almost constitute the rule. This fact lays waste to any attempt to view financial markets as stable or constant. As risk practitioners, we should be aware that there have been, and will be, periods when supply and demand change drastically, rendering models calibrated in normal times useless. When demand for financial assets falls along with prices, it creates a wave of demand for, and a subsequent shortage of, liquidity, which is really what is hiding behind the frequently mentioned “rise in correlations.”
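Although the point is conceptual, a toy simulation makes it visible. The numbers below are entirely synthetic, and the shared “liquidity” factor and its weights are my own illustrative assumptions; the sketch only shows how measured correlations jump when everything is being sold at once:

```python
import numpy as np

rng = np.random.default_rng(1)
n_days, n_assets = 250, 5

def simulate(liquidity_weight):
    """Asset returns = idiosyncratic noise + a shared liquidity factor."""
    common = rng.normal(0, 0.02, n_days)
    idio = rng.normal(0, 0.01, (n_days, n_assets))
    return idio + liquidity_weight * common[:, None]

def mean_pairwise_corr(returns):
    c = np.corrcoef(returns.T)
    return c[np.triu_indices_from(c, k=1)].mean()

calm = simulate(liquidity_weight=0.2)    # liquidity factor barely matters
crisis = simulate(liquidity_weight=2.0)  # forced selling dominates everything

print(mean_pairwise_corr(calm))    # low average correlation
print(mean_pairwise_corr(crisis))  # much higher, with no change in the assets themselves
```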

The fact that we do not have a stable process that easily lends itself to modeling should not deter us from quantifying the risk of our portfolios and our exposure to such extreme situations. Our primary answer to this problem has been the development of the Event Weighted method of stress testing. The method assumes that situations in which demand falls with price share many similarities across time, so we can look for similar periods of liquidation to estimate how our portfolio would respond to future instances. This assumption has held up in empirical tests and certainly does not require the leap of faith involved in assuming that financial markets are as orderly as the solar system in its motion. The financial system may not be stable and constant, but the reactions of its participants in times of instability can be modeled to supply a risk manager with valuable input for decision making.
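
The details of the Event Weighted method are beyond the scope of this post, but the general historical-scenario idea can be sketched in a few lines. Everything below – the function name, the similarity measure, the choice of k – is an illustrative assumption, not the actual implementation:

```python
import numpy as np

def stress_test(weights, factor_returns, current_state, historical_states, k=5):
    """Score a portfolio against the k historical periods most similar to today.

    weights           : portfolio exposures per factor, shape (n_factors,)
    factor_returns    : factor returns in each past period, shape (n_periods, n_factors)
    current_state     : vector describing today's market conditions
    historical_states : the same descriptors for each past period, shape (n_periods, d)
    """
    # Rank past periods by similarity (Euclidean distance) to current conditions.
    distances = np.linalg.norm(historical_states - current_state, axis=1)
    nearest = np.argsort(distances)[:k]
    # Apply the returns from those periods to today's exposures.
    scenario_pnl = factor_returns[nearest] @ weights
    return scenario_pnl.mean(), scenario_pnl.min()

# Toy example: 3 factors, 100 past periods, 4 market-condition descriptors.
rng = np.random.default_rng(2)
expected, worst = stress_test(
    weights=np.array([0.5, 0.3, 0.2]),
    factor_returns=rng.normal(0, 0.02, (100, 3)),
    current_state=rng.normal(size=4),
    historical_states=rng.normal(size=(100, 4)),
)
print(expected, worst)
```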

Don't miss the next post in this series.
