I'm not a great fan of Daniel Davies, but his post on the Lancet '650,000 deaths' study is well worth a read and raises points which need answering. People (like me) quote death counts from Sudan and the Congo estimated using pretty much the same methodology.
There was a sample of 12,801 individuals in 1,849 households, in 47 geographical locations. That is a big sample, not a small one.
In the 18 months before the invasion, the sample reported 82 deaths, two of them from violence. In the 39 months since the invasion, the sample households had seen 547 deaths, 300 of them from violence. The death rate expressed as deaths per 1,000 per year had gone up from 5.5 to 13.3.
There has been debate about whether the pre-invasion death rate measured by the study is lower than other figures - thus making the increase seem larger. I'm not sure how much that affects the headline figures: 547 deaths in 1,849 households in three and a bit years. That is saying that pretty much one household in three has had a death in the last three years. If those figures are correct, worrying about the exact pre-invasion toll doesn't cut much ice.
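The 'one household in three' figure can be checked on the back of an envelope (a rough sketch using the survey figures quoted above; it ignores household turnover and the study's own person-month weighting):

```python
# Back-of-the-envelope check of the "one household in three" claim,
# using the survey figures quoted above.
households = 1849
post_invasion_deaths = 547

deaths_per_household = post_invasion_deaths / households
print(f"Deaths per household since the invasion: {deaths_per_household:.2f}")
# ~0.30, i.e. roughly one death for every three households
```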
This is the question to always keep at the front of your mind when arguments are being slung around (and it is the general question one should always be thinking of when people talk statistics). How Would One Get This Sample, If The Facts Were Not This Way? There is really only one answer - that the study was fraudulent. It really could not have happened by chance. If a Mori poll puts the Labour party on 40% support, then we know that there is some inaccuracy in the poll, but we also know that there is basically zero chance that the true level of support is 2% or 96%, and for the Lancet survey to have delivered the results it did if the true body count is 60,000 would be about as improbable as this. Anyone who wants to dispute the important conclusion of the study has to be prepared to accuse the authors of fraud, and presumably to accept the legal consequences of doing so.
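Davies's 'about as improbable' claim can be made concrete with a crude Poisson sketch. The assumptions here are mine, not the study's: an Iraqi population of roughly 26 million, and a hypothetical 'true' violent-death toll of 60,000. On those numbers a random 12,801-person sample should contain about 30 violent deaths, yet the survey found 300:

```python
import math

# Crude Poisson sketch of Davies's improbability argument.
# Assumed numbers (mine, not the study's): population ~26 million,
# hypothetical "true" violent-death toll of 60,000.
population = 26_000_000
sample_size = 12_801
true_toll = 60_000
observed = 300  # violent deaths actually found in the sample

expected = true_toll * sample_size / population  # roughly 30

# Log-probability of seeing exactly `observed` deaths if `expected`
# were the true mean (Poisson log-pmf, kept in logs to avoid underflow).
log_p = observed * math.log(expected) - expected - math.lgamma(observed + 1)
print(f"Expected violent deaths in sample: {expected:.0f}")
print(f"log10 probability of seeing {observed}: {log_p / math.log(10):.0f}")
# On these assumptions the probability is below 10**-150 --
# effectively zero, which is Davies's point.
```

The exact population figure barely matters: vary it by a few million either way and the probability remains astronomically small.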
I'm sure there are a lot of people, including some at the Lancet, who want the figures to be large. Everyone against the Coalition presence wants the figures to be large, because the larger the figures the more likely the Coalition are to throw up their hands and leave. That's different from fiddling them. I want them not to be large. But did the authors collect the data themselves? I'm presuming not - I'm presuming Iraqis did that. What's difficult to ascertain is the quality of the data. Were the death certificates (90% available, apparently) photographed or scanned? How were the households chosen? I'd have thought some people would have moved or otherwise disappeared, given the fear of ethnic cleansing I read of. Who went out to ask the questions? Given that security in Iraq seems pretty poor at present (i.e. you can't trust the police in many areas), how do you know the people asking were honest? Are there statistical methods that could be used to compare data collected by different people, to spot potential exaggerations?
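On that last question: yes, in principle. A standard first pass would be to compare the death counts reported by each interviewer (or each cluster) against what you'd expect if the deaths were spread evenly, using something like a Pearson chi-square statistic. A minimal sketch with invented numbers - three hypothetical interviewers covering roughly equal numbers of households:

```python
# Hypothetical per-interviewer death counts (invented for illustration;
# they happen to sum to the survey's 547 but the split is made up).
counts = [180, 175, 192]

total = sum(counts)
expected = total / len(counts)

# Pearson chi-square statistic: a large value would suggest one
# interviewer's figures are out of line with the others'.
chi2 = sum((c - expected) ** 2 / expected for c in counts)
print(f"chi-square = {chi2:.2f}")

# With 2 degrees of freedom the 5% critical value is about 5.99,
# so a statistic well above that would flag a suspicious interviewer.
```

Whether the Lancet team actually ran checks like this I don't know - it's the sort of thing one would hope to find in the paper itself.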
Dunno mate. I gather the report is available as a pdf. Anyone read it ?
UPDATE - DFH looks at the car-bombs.
Of the 300 violent deaths, 30 (10%) were the result of car bombs in the year June 2005-June 2006. Using the survey's methodology, I believe that equates to 60,000 people killed by car bombs in one year. The most recent data available on the Iraq Body Count website lists 15 car bombs in the first half of September (ignoring bombs which targeted non-Iraqi forces); taking the highest figure for reported deaths, these bombs killed 75 people. That’s an average of 5 people killed per car bomb. On that basis, 60,000 deaths would require 12,000 car bombs in one year, or 33 per day. Either that or there are hundreds of massive car bombs killing hundreds of people which are going totally unreported.
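DFH's arithmetic, spelled out (all figures are the ones quoted above - the survey's scaled-up car-bomb toll and the Iraq Body Count sample for early September):

```python
# DFH's back-of-the-envelope car-bomb arithmetic, as quoted above.
implied_car_bomb_deaths = 60_000  # survey's 30 sample deaths, scaled up
ibc_bombs = 15                    # car bombs, first half of September (IBC)
ibc_deaths = 75                   # highest reported toll for those bombs

deaths_per_bomb = ibc_deaths / ibc_bombs                  # 5 per bomb
bombs_needed = implied_car_bomb_deaths / deaths_per_bomb  # 12,000 a year
per_day = bombs_needed / 365                              # ~33 a day
print(f"{deaths_per_bomb:.0f} deaths/bomb -> "
      f"{bombs_needed:.0f} bombs/year, ~{per_day:.0f} a day")
```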
Shuggy adds his groat's worth.
Horton (Lancet editor, anti-war-by-us - LT) in particular takes the view that the escalation of violence is largely a function of the presence of coalition troops in Iraq. Yet the situation described in the report is essentially one where violence is increasing because no one group in society has the capacity to monopolise its legitimate use. It is, in other words, a function of the fact that Iraq presently does not have, post-Saddam, a properly functioning state. Specifically the report records an increase in the casualty rate but a decline in the proportion that can be attributed to the actions of coalition troops.
It is clearly the absence of government that is the problem, which leads directly to the positions taken by those currently using these statistics as a basis to analyse recent history and prescribe future solutions. For one, since the accusation of denial - not always unjustified - has been spread abroad, it is worth considering whether there isn't another kind at work here. Pre-2003 was preferable, the argument goes, because while Saddam Hussein was violent in the extreme, he held the monopoly on violence, and so there was less of it.