On the afternoon of Dec. 30, I was sitting in D.C. Metropolitan Police Department’s (MPD) command center with more than a dozen other journalists waiting for Chief Cathy Lanier and Mayor Vincent Gray to arrive. They had called the New Year’s Eve eve press conference to talk about the year in crime and policing, and, in part, to talk about MPD’s incredible 94 percent homicide case closure rate in 2011.
I knew that nowhere near 94 percent of 2011’s homicides listed in the Homicide Watch database were closed. I pulled up the site and did a quick check to make sure. According to our data, 61 of 108 homicides — or 56 percent — had been closed with an arrest.
When I checked with Lanier later that afternoon, I learned that the case-closure arithmetic MPD was using included the closures of homicides from previous years in the calculation of the 2011 calendar year’s closure rate. It was math that conformed to federal guidelines, but not math that I thought the public understood. To help, I wrote a quick post in our Year in Review package explaining the much-repeated case closure rate.
On Feb. 18, the Washington Post followed that explainer piece with an in-depth investigative feature headlined “The trick to D.C. police force’s 94% closure rate for 2011 homicides.”
Wrote reporter Cheryl Thompson:
The closure rate [Lanier] presents for the District is 154 percent higher than Boston’s and at least 104 percent higher than Baltimore’s, and it gives residents reason to believe that D.C. police have been remarkably successful at solving homicide cases under her watch.
But an examination of District homicides found that the department’s closure rate is a statistical mishmash that makes things seem much better than they are. The District had 108 homicides last year, police records show. A 94 percent closure rate would mean that detectives solved 102 of them. But only 62 were solved as of year’s end, for a true closure rate of 57 percent, according to records reviewed by The Post.
D.C. police achieved the high closure rate last year by including about 40 cases from other years that were closed in 2011.
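The difference between the two numbers comes down to which closures get counted. A minimal sketch of the two calculations, using the figures reported above (the function names are illustrative, not anything MPD uses):

```python
def calendar_year_rate(closed_this_year, total_homicides):
    """Closures of the current year's homicides only."""
    return closed_this_year / total_homicides

def ucr_style_rate(closed_this_year, prior_year_closures, total_homicides):
    """UCR-style: every case closed during the year, whatever year the
    homicide occurred, divided by the year's homicide count."""
    return (closed_this_year + prior_year_closures) / total_homicides

homicides_2011 = 108       # homicides recorded in 2011, per the Post
closed_2011_cases = 62     # 2011 homicides closed by year's end
prior_year_closures = 40   # roughly 40 older cases closed during 2011

print(f"{calendar_year_rate(closed_2011_cases, homicides_2011):.0%}")
# 57%
print(f"{ucr_style_rate(closed_2011_cases, prior_year_closures, homicides_2011):.0%}")
# 94%
```

Because prior-year closures are added to the numerator while the denominator stays at the current year’s homicide count, the UCR-style figure can in principle exceed 100 percent in a year with many cold-case arrests.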
Lanier’s editorial in the Washington Post, titled “There is no ‘trick’ to D.C.’s homicide closure rate,” fought back:
On Feb. 19, The Post published a front-page article headlined “The trick to D.C.’s homicide closure rate,” suggesting that the Metropolitan Police Department (MPD) was somehow tricking the public by announcing that it had a 94 percent homicide closure rate. To support its slanted claims, the article used misleading and inflammatory quotes from ill-informed sources. Furthermore, the writer left out information supplied by my department that would have invalidated the assertions contained in the story.
The MPD’s homicide clearance rate is calculated, as it is in most police departments in the country, using the Uniform Crime Reporting (UCR) guidelines established by the FBI in the 1930s — guidelines that are the national standard for reporting homicide clearance rates. The UCR closure standard is not a new development in the District; it has been used by the D.C. police since the early 1980s.
Select letters to the editor were published in the Washington Post on Feb. 21. Among them was a letter from Gregory R. Sullivan, a retired MPD homicide detective. He wrote:
I was a homicide detective in the 1990s when the Metropolitan Police Department transitioned from straightforward crime reporting to the FBI’s Uniform Crime Reporting (UCR) standard. While statistically speaking the UCR numbers are misleading, the approach has its merits. Before UCR, the success of the police department was based only upon current-year case closures, so detectives were given little incentive to pursue prior-year cases.
After UCR, every case counted, freeing detectives to put their focus where needed. For families of past victims, it made an enormous difference. And for the detective, UCR allowed for a truer picture of actual productivity and success.
The Post’s ombudsman looked into the matter in his March 2 column.
Rather than suggesting that Lanier was fudging numbers, I think the story would have worked far better as a straightforward explanation of how the Metropolitan Police Department, other major police departments and the FBI keep homicide statistics — and of some of the pitfalls in that method.
Lanier, as you might imagine, was hopping mad after the original story; I know because she and her senior aides met with me after meeting with top editors. They made several valid complaints, but I’m going to stick to the statistics for this column because they go to the heart of this story.
The problem with homicide statistics is that there isn’t a perfect way to express them.
The original story now includes an editor’s note.
The article characterized the department’s reporting of homicide-closure rates as a “statistical mishmash that makes things seem much better than they are.” It also suggested that the department’s methodology produced a number that was not “a true closure rate.” As a result, the article, as well as elements of the headline and an accompanying graphic, implied that the department artificially inflated public data on the number of cases that are closed each year.
In fact, as the article reported, the department has followed practices consistent with federal crime-data guidelines and relied upon the same methodology used by other major municipal police agencies. The department hasn’t altered the ways it calculates homicide-closure rates since Cathy L. Lanier became chief in 2007, and it discloses its methodology in its annual report.
The Post’s story was picked up and rewritten or reported by several news outlets, including the Associated Press, the Baltimore Sun, WAMU, and DCist.
Lanier has also spoken publicly about the story at least twice: once in an hour-long interview with News Channel 8, and again with Mark Seagraves on WTOP’s Ask the Chief program.
Journalism think tank Poynter explored the Post story — and editor’s note — in a March 1 column.
Jeff Leen, a Post assistant managing editor in charge of investigative projects, said in that report:
We set out to write a story about how police were touting a statistic that could be misleading. We did not intend to suggest that police had manipulated data to derive that statistic. The note was intended to address that confusion. We stand by the thrust of the story, that police choose to report a certain statistical approach to homicide closure rates that makes those rates look much better than a simple accounting would. The methodology used by the D.C. police conforms to FBI standards, and can be statistically valid but, without further explanation, it tends to make things look much better than they are.
I told Poynter:
While I think the Post’s story suffered from an unfortunate headline that overstated MPD’s actions, and from oversimplifying the mathematical calculations, calling them a ‘statistical mishmash,’ for example, the Post’s story added depth to our story by investigating how other police agencies report the same number. For the public, understanding these numbers is critical to how safety, and the effectiveness of policing, is perceived. Good math or bad math, I’ll leave that to the experts, politicians (and The Washington Post) to debate. What is important is that the public knows how the math is done.
For an easy explainer on that math, see Homicide Watch DC’s original post.