Regulating Complacency: Human Limitations and Legal Efficacy

Steven L. Schwarcz is the Stanley A. Star Professor of Law & Business at the Duke University School of Law. This post is based on a recent article by Professor Schwarcz.

The limitations of human rationality impose critical constraints on the efficacy of law. Recent studies have shown, however, that irrationality can be anticipated and sometimes mitigated. This article examines how insights into human irrationality can improve financial regulation.

The article identifies four categories of human limitations that can impair financial regulation: herd behavior, cognitive biases, overreliance on heuristics, and proclivity to panic. Herd behavior refers to the tendency of people to follow what others are doing. That tendency is not necessarily irrational or bad; it can improve financial markets if a firm’s managers follow the behavior of other firms whose managers have more or better information. But herd behavior becomes problematic to the extent that some followers are not acting in their own interest or in the interest of the parties they serve.

A firm’s managers might follow the behavior of other firms’ managers, for example, thinking such other managers have more or better information. In reality, though, they may merely be following a misleading information cascade—a convergence of action based on a belief that the prior actors have better information, when the convergence actually reflects imitation more than good information. The frenzied worldwide demand to purchase certain mortgage-backed securities (“MBS”) in the years prior to the financial crisis partly reflected the herd behavior of investors following a misleading information cascade.

Cognitive biases are psychological coping mechanisms that simplify our perception of reality. Availability bias is the tendency to overestimate the frequency or likelihood of a recent or especially vivid event. Optimism bias is the tendency to be unrealistically optimistic about negative events with which we have no recent experience, devaluing both their likelihood and their potential consequences. Cognitive biases can combine to create a tendency to define future events by the recent past.

Prior to the financial crisis, for example, banks made loans to risky “subprime” borrowers who used the loan proceeds to purchase homes and then mortgaged their homes as collateral to the lenders. Although these loans were not initially overcollateralized, the banks expected housing prices to continue rising—reflecting the tendency to define future events by the recent past. If prices had continued rising, the increasing collateral value would have protected the loans. In reality, a collapse in housing prices caused many subprime borrowers to default on their now-undercollateralized mortgage loans.
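The collateral arithmetic behind this point can be illustrated with a minimal sketch (the loan and home-value figures below are hypothetical, chosen only to show how a modest price decline leaves a loan undercollateralized):

```python
# Hypothetical illustration: a housing-price decline turns a fully
# collateralized mortgage loan into an undercollateralized one.

def loan_to_value(loan_balance: float, collateral_value: float) -> float:
    """Return the loan-to-value (LTV) ratio; above 1.0 means the loan
    is undercollateralized."""
    return loan_balance / collateral_value

loan = 200_000.0                  # loan with no initial overcollateralization
home_at_origination = 200_000.0   # collateral exactly covers the loan
home_after_decline = 170_000.0    # hypothetical 15% drop in housing prices

print(loan_to_value(loan, home_at_origination))  # 1.0 -> exactly covered
print(loan_to_value(loan, home_after_decline))   # ~1.18 -> undercollateralized
```

Because the loan starts at a loan-to-value ratio of 1.0 rather than below it, there is no cushion: any decline in collateral value immediately pushes the ratio above 1.0.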

Overreliance on heuristics refers to undue reliance on explicitly adopted simplifications of reality. Heuristics are especially important in areas of complexity, such as financial markets. Investors, for example, use credit ratings to help estimate risks associated with securities. Without reliance on heuristics, financial markets could not operate.

Problems can occur, however, when there is overreliance on heuristics. Prior to the financial crisis, investors rarely questioned the accuracy of credit ratings, which had a long record of reliably assessing creditworthiness on simple debt instruments such as corporate bonds. But that unquestioning faith continued even when ratings were extrapolated to new, much more complex, highly leveraged, high-yield MBS.

Proclivity to panic, the fourth category of human limitations that can impair financial regulation, is triggered by sudden change and an influx of new information that causes information overload, activating a flight reflex to remove oneself from the perceived danger. Thus, when presumably safe investment-grade rated MBS defaulted or were downgraded in 2007, investors panicked and stopped investing not only in MBS but also in all debt securities. The resulting loss of credit contributed to the financial crisis.

Because these human limitations can trigger and transmit systemic risk, they should be regulated. Human nature cannot be easily changed, but some steps can be taken. For example, requiring more robust disclosure and due diligence could help to increase and strengthen the reliability of market information, thereby reducing the herd behavior that results from information cascades.

Cognitive biases could be regulated through an approach that Professors Jolls and Sunstein call “debiasing through law.” This involves making an event more “available” to individuals, such as by exposing them to a concrete instance of the event’s occurrence. Smokers, for example, are more likely to be convinced by poignant and concrete narratives than by general information about health risks. Cognitive biases could also be reduced by framing information about an event. A person is more likely to choose to have an operation if told “[o]f one hundred patients who have this operation, ninety are alive after five years” than if told “[o]f one hundred patients who have this operation, ten are dead after five years.”

Applying these insights to financial regulation suggests that regulation should require more concrete investor warnings in prospectuses. The financial crisis might have been less likely to occur, for example, if regulators had required stronger market awareness that loans that are not initially overcollateralized are inherently risky, given that even a small decline in collateral value could jeopardize repayment.

Overreliance on heuristics could be regulated, for example, by requiring firms to engage in more self-aware operational risk management and reporting. Even a simple reminder that negative economic shocks have occurred in the past can encourage more critical reflection and more accurate risk assessments. The Basel III capital-adequacy guidelines thus require banks to engage in periodic financial “stress” testing, in order to motivate them to consider the possibility of, and to better prepare for, future periods when previously adequate liquidity and capital resources might prove inadequate. Officials from the Federal Reserve have similarly touted stress tests as helping to create a “strong, accountable, and proactive risk culture.”

If these debiasing techniques are inadequate, regulators could also consider banning reliance on a given heuristic. The post-crisis attempt by regulators to reduce reliance on credit ratings illustrates such a ban. Because bans may not always be realistic, regulatory responses should also focus on attempting to increase the accuracy of heuristics.

Panics could be regulated by attempting to promote market stability and calm the out-of-control feeling that activates the flight reflex. The classic example is a government guarantee of bank accounts to help deter the collective flight of depositors known as a bank run. But irrationality can exceed even the best regulatory controls. Although federal deposit insurance has been a successful strategy for reducing instability caused by panic-induced bank withdrawals, during the financial crisis some depositors nevertheless feared that their funds would not be safe in any banking system.

In short, notwithstanding the best regulatory efforts, we do not yet understand human nature well enough to anticipate all the causes of panic or even to fully address the other categories of human limitations. Insights into human irrationality thus can improve financial regulation but cannot make it perfect. Financial regulation should therefore have an additional goal: to mitigate the harm from systemic shocks that inevitably will occur.

The complete article is available for download here.
