
Werner Heisenberg, Economist

March 18th, 2011 · 6 Comments

Interesting passing observation from Yglesias:

[F]or all the horrors of the current recession it’s been managed much better than the Great Depression of the 1930s was. Progress is happening. The only way to make more rapid progress on the science of macroeconomic stabilization would be to have many more recessions so as to gather better data. Paul Krugman emphasizes that to understand the problems facing the American economy today you have to focus on the special economic properties of a large economy in a liquidity trap. But (fortunately), human history isn’t littered with examples of such a situation, so it’s challenging for him to compile a quantity of data sufficient to persuade all of his colleagues.

It seems like this should be generalizable to any disciplines that share two features: (1) They study complex, large-scale real-world systems where controlled lab experiments are effectively impossible, at least when it comes to the emergent macro-phenomena, and (2) They are practically effective, in that the evolving state of the discipline powerfully affects how players in the system—here, financiers and regulators—behave within it. (This is basically just Hayek reduced to fortune-cookie size.) As the discipline advances, actors become better at avoiding or preventing undesirable outcomes—but as a result, have less data to guide the response to what shocks do occur. And since those shocks are, by definition, the ones that swamp whatever increasingly sophisticated countermeasures have been put in place to prevent them, they’re apt to be particularly large and severe, with more dire consequences if a particular “experiment” doesn’t pan out. (A similar point is often made about dependence on technology: When it fails, it often turns out that the once-commonplace skills or knowledge we’d previously used to get by have atrophied.)

This isn’t a perfect analogy, but the problem made me think of forest management, where it’s long been understood that attempting to wholly prevent forest fires is usually a bad idea—the dry growth builds up until the fires you don’t manage to prevent quickly rage out of control, to far more devastating effect. Instead, we have periodic “controlled burns.” I suppose live vaccination is another example of the same idea: Deliberately expose enough of the population to a weakened pathogen at staggered intervals (sometimes causing mild symptoms) and a serious epidemic becomes much less likely. (Though this, too, creates the same trade-off a level up: When an epidemic does happen, the society that’s done a good job at prevention may be ill-equipped to respond.)

Obviously, it would be perverse for any number of moral and practical reasons (not to mention a political non-starter) to suggest deliberately creating small economic crises, whether the purpose was to gather data, experiment with different policy responses, or create some kind of general vaccine effect (whatever that might mean). But it might be worth counting this as one possible cost of policy designed to preserve macroeconomic stability. The nature of the problem, alas, is that it’s probably impossible to estimate the magnitude of the cost very well, precisely because you don’t observe the counterfactual.

Tags: Economics · General Philosophy

6 responses so far ↓

  • 1 Tim Lee // Mar 18, 2011 at 4:55 pm

    A similar point is often made about dependence on technology: When it fails, it often turns out that the once-commonplace skills or knowledge we’d previously used to get by have atrophied.

    I know people say that, but is there evidence it’s actually true? I can’t think of very many examples of technologies displacing their predecessors and then catastrophically failing in a way that the predecessor technology wouldn’t have done.

  • 2 Adrian Ratnapala // Mar 20, 2011 at 6:15 am

    It’s also really hard to know whether it’s true that “actors become better at avoiding or preventing undesirable outcomes”. Perhaps they just get better at delaying them, like a fire crew getting better at preventing medium-sized forest fires without doing controlled burns.

    The big difference between now and the ’30s seems to be that central bankers are more able and willing to expand the money supply. We know this has prevented bank collapses. What we can’t yet know is whether this merely builds up the fuel for a bigger disaster, such as a run on the dollar.

  • 3 Andrew Clarke // Mar 21, 2011 at 9:26 pm

    To me this dilemma has a more obvious homology with the discipline of organizational safety science – trying to prevent undesirable outcomes from occurring within large, complex organizations. You can’t possibly learn how to manage every single series of events that might lead to a catastrophe. Each one is simply too rare. But you can learn a lot about organizations that seem to have better safety records than others. Rittel and Webber’s 1973 article on so-called “wicked problems” is a great read, as is Snowden and Boone’s 2007 update for HBR, and most of Karl Weick’s writing about mindfulness in organizations.

  • 4 DivisionByZero // Mar 22, 2011 at 8:00 am

    Isn’t this simply the problem of induction? It may be an extreme case, but it’s always possible that we’re missing the exception that proves to be the rule, regardless of how much data we have. And this problem is no different from, say, the problems we have in anthropology with the fossil record. We assume that the *relatively* small sample we have is representative of the whole. It’s a problem all positive sciences have to one degree or another.

  • 5 DivisionByZero // Mar 22, 2011 at 8:17 am

    To add a bit: we can never eliminate risk through understanding, and therefore we should accept that disaster is always possible and prepare for it. Of course, that raises the thorny problem of which disaster to prepare for, and this prioritization is usually based on the theory, which may be completely fallacious. So I suppose there are two categories: disasters predicted by theory and those that disprove the theory. I suppose the only thing to do for the latter is to increase resiliency and redundancy, as seen in some biological systems and in distributed computing.

  • 6 Barry // Mar 28, 2011 at 12:49 pm

    “When an epidemic does happen, the society that’s done a good job at prevention may be ill-equipped to respond.”

    I disagree – the society that’s doing a good job at prevention would probably also be the society that responds faster and cares more about dealing with epidemics when they do occur.

    “And since those shocks are, by definition, the ones that swamp whatever increasingly sophisticated countermeasures have been put in place to prevent them, they’re apt to be particularly large and severe, with more dire consequences if a particular “experiment” doesn’t pan out. ”

    Part of what we’re experiencing is that, and part is that we rolled back a bunch of safeguards and are getting the kinds of things that happened before.
