Like many other anti-war types, I’ve joined the lament about the apparent lack of accountability for pundits who got it grievously wrong about Iraq. The familiar form of this complaint is that one ought generally to replace people who’ve been demonstrably poor at something with people who seem to be better at it. And that, in turn, has provoked folks like Megan McArdle to wonder whether it’s really clear that war opponents actually had more generally reliable epistemic machinery or just lucked out on this one. That someone made a mint playing lotto numbers from a fortune cookie doesn’t entail that they ought to be your financial advisor, after all.
But it strikes me there’s an entirely distinct reason for thinking a shakeup in the pundit class might be helpful: cognitive dissonance. Psychologist Leon Festinger, who coined the term, had studied the puzzling behavior of a benign-seeming UFO cult. This group was relatively quiet and complacent, eschewing proselytizing, until their leader and prophet’s prediction of a sci-fi rapture, in which the faithful would be beamed up on a certain date, failed to come true. At that point they began a fervent outreach program, desperate to salvage their own past commitment, and the self-respect they’d now bound up with it, in the face of this new countervailing evidence.
For a grimmer example, consider that psych 101 staple, the Milgram experiment, in which unwitting subjects were prepared to deliver increasingly powerful and dangerous electric shocks to actors posing as volunteers in a study of learning and punishment. It was crucial, Milgram concluded, that the subjects were instructed to proceed by tiny increments, from quite mild (fake) shocks to ostensibly lethal ones.
In part, of course, this is because it makes it more difficult to say that this is the point at which one must call a halt to things, even though it’s only a little further than the previous step. But there’s a slightly more subtle secondary effect that depends on looking backward rather than forward. For if I decide it would be wrong to push this switch, I have to reconsider whether I haven’t already acted wrongly by pushing the previous one, only slightly weaker. I might be blameworthy for not having stopped earlier. And the further I go, the greater the blame I’d have to accept if I decide it’s wrong to continue.
If you’ve ever found yourself suddenly, irrationally angry and resentful toward someone you’ve treated poorly, then you’re familiar with this phenomenon. If we’d have to reproach ourselves for mistreating someone who didn’t deserve it, we’ll often try very hard to convince ourselves they must have deserved it.
The cross-application to punditry should be obvious enough. Assume away the warping influence of ideology to the maximum extent possible, and stipulate that everyone occupying a perch at every major opinion rag or televised gabfest did their level best to assess the wisdom, necessity, and probable outcome of invading Iraq on the basis of the available intelligence. What happens as evidence mounts that the whole enterprise was and is an ill-conceived clusterfuck?
Well, if you’re an ordinary citizen who was largely taking his cues from others, you can probably change your mind and decide, without too much dissonance, that it’s time to stop hurling bodies at a doomed endeavor. And the swing in the general population’s opinion on the war has indeed been pretty dramatic.
For the pundit class, the stakes are higher. If providing foreign policy analysis is your bailiwick, you can’t easily appeal to the excuse that you were misled by others. And the more strongly you pushed the war in advance, or the greater your influence was, the more you’re apt to discover that changing your mind means accepting your share of the responsibility for what’s turned out to be a massive, futile waste of life. Moreover, as with Milgram’s switches, there’s a feedback effect: The further you go without changing your stance, the stronger the pressure to go further still, both because the body count keeps rising and because it becomes more plausible that you should have known better sooner.
Perversely, then, the pundit class, though presumably more attentive to new data, will be slower to be swayed by it than the general public. Worse, the more prominent and influential a pundit is, if this account is right, the greater will be her motivation to dig in her heels. So never mind accountability for bad predictions; maybe this is just an argument for establishing blanket term limits for op-ed columnists. Until Andrew Rosenthal calls with an offer, anyway. At that point, I expect self-interest will trump any cognitive dissonance I might feel about reversing course.