In Defense of Sophistication

Soon turning into a rant against pundits

Via Andrew Gelman comes a piece by American Enterprise Institute writer Steven Hayward on "The Irrelevance of Modern Political Science" and a response from John Sides. I am in no position to defend current American political science, but Hayward's point is relevant to the social sciences, and indeed to epistemology, more generally.

Hayward writes:
The real problem with academic political science is its insistence on attempting to emulate the empiricism of economics and other social sciences, such that the multiple regression analysis is considered about the only legitimate tool of the trade. Some regressions surely illuminate, or more often confound, a popular perception of the political world, and it is these findings Klein rightly points out. But, on the other hand, I have often taken a random article from the American Political Science Review, which resembles a mathematical journal on most of its pages, and asked students if they can envision this method providing the mathematical formula that will deliver peace in the Middle East. Even the dullest students usually grasp the point without difficulty.
The silliest stuff first. Responds Gelman:
[T]he U.S. Army didn't deliver peace in the Middle East either, and at a far higher budget than the American Political Science Association!
More to the point, first, it is a bit mysterious why multiple regression analysis, which Hayward seems to consider the latest in statistical sophistication, should be less appropriate for political science than for "economics and other social sciences". (Is he thinking of Queer Studies?) Second, if maths is used, that's somehow not okay, though he never explicitly says why. Third, the ultimate test for the veracity of a statement is whether the dullest students - not the brightest, the dullest - agree with it.

I have no doubt that fancy mathematical methods are overused in some social science work, and I have derided such excesses on the pages of this very blog. But to conclude that something must be wrong, or at least irrelevant, because it is hard to understand is utterly silly. Even when the topic is something relatively simple, such as the weather, scientists reach for the simplifying tools of mathematical modeling; shouldn't we expect these tools to be all the more necessary when the topic is a system as complicated as a polity? Should the criterion for using a method be whether Steven Hayward can understand it?

The fact of the matter is that some stuff is genuinely hard to understand. Students don't spend a lot of time learning it for no reason. Political scientists, one hopes, are the experts on the methods their peers deem appropriate for the study of politics. No wonder Tom, Dick, or Steven Hayward can't understand them; presumably Hayward doesn't understand the contents of your average condensed matter physics journal either.

The right has no monopoly on the if-it's-hard-to-understand-it-can't-be-right heuristic. Here's Barbara Ehrenreich on a paper on the development of female happiness:
Only by performing an occult statistical manipulation called "ordered probit estimates" do the authors manage to tease out any trend at all
Comments Justin Wolfers, co-author of the paper in question:
O.K., so her first criticism is that we use an appropriate statistical technique for dealing with ordered responses
One might, of course, want to criticize the use of ordered probit regression on the data at hand. But Ehrenreich is in no position to do this. All she knows is that she's never heard of it, which is what "occult" means in the sentence above.
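For what it's worth, "ordered probit" is less occult than it sounds: the model posits a continuous latent index that is chopped into ordered categories (say, "not happy" / "pretty happy" / "very happy") by estimated cutpoints, with category probabilities given by differences of the normal CDF. Here is a minimal stdlib-only sketch of that probability calculation; the index value and cutpoints below are made up purely for illustration and have nothing to do with the actual paper.

```python
import math

def norm_cdf(z):
    """Standard normal CDF, via the error function (stdlib only)."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def ordered_probit_probs(xb, cutpoints):
    """Category probabilities under an ordered probit model.

    xb        -- the latent index x'beta for one observation
    cutpoints -- increasing thresholds c_1 < ... < c_{K-1}

    Returns K probabilities: P(y = k) = Phi(c_k - xb) - Phi(c_{k-1} - xb),
    with c_0 = -infinity and c_K = +infinity.
    """
    bounds = [float("-inf")] + list(cutpoints) + [float("inf")]
    return [norm_cdf(hi - xb) - norm_cdf(lo - xb)
            for lo, hi in zip(bounds, bounds[1:])]

# Illustrative numbers only: three response categories, two cutpoints.
probs = ordered_probit_probs(xb=0.3, cutpoints=[-0.5, 0.8])
# probs is a list of three probabilities that sum to 1.
```

In a real application the cutpoints and coefficients are estimated jointly by maximum likelihood; the point here is simply that the technique is the standard way to respect the ordering of survey responses, not a conjuring trick.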

This leads us to a strange phenomenon observable in the US media. It is common there to employ writers as pundits who churn out opinions on everything from evolutionary psychology to global warming. I have a hard time thinking of anyone qualified to write on both topics, let alone all those in between. Yet this appears to be of no concern to the writers themselves.

If they were honest, pundits in papers and on the web would routinely have to produce paragraphs like the following:
So, what about the death penalty - am I for or against it? Well, an important question in this respect is whether or not it saves lives. This question is studied by scholars from a number of social science disciplines, and I had a look at some of their research. But, frankly, their debates soon turn to questions such as the assumptions behind "exclusion restrictions" for "instrumental variables regressions", and this stuff is over my head. So I declare ignorance. As a consequence, I don't have any strong views on the death penalty.
But a pundit - a glorified bloke at the bar - knows no uncertainty. On and on and on he opines. Ignorance is bliss. Finis.
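The "instrumental variables regressions" that defeat our imagined honest pundit are, at bottom, a two-step idea: find an instrument that moves the suspect regressor but affects the outcome only through it (that is the "exclusion restriction"), regress the regressor on the instrument, then regress the outcome on the fitted values. Here is a toy two-stage least squares sketch with made-up numbers; it is a sketch of the general technique, not of any model in the death penalty literature.

```python
def ols(x, y):
    """Simple one-regressor OLS; returns (slope, intercept)."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, ybar - slope * xbar

def two_stage_least_squares(z, x, y):
    """2SLS with one instrument z for one endogenous regressor x."""
    # First stage: regress x on the instrument, keep the fitted values.
    b1, a1 = ols(z, x)
    x_hat = [a1 + b1 * zi for zi in z]
    # Second stage: regress the outcome on the fitted values.
    b2, _ = ols(x_hat, y)
    return b2

# Toy check: with x exogenous (x = z) and y = 2*x exactly,
# the IV estimate recovers the slope 2.
z = [1.0, 2.0, 3.0, 4.0, 5.0]
x = list(z)
y = [2.0 * xi for xi in x]
beta_iv = two_stage_least_squares(z, x, y)  # beta_iv = 2.0
```

Whether a given instrument actually satisfies the exclusion restriction is exactly the sort of question the scholarly debates turn on, and it cannot be settled by the code.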
