June 27, 2012 | Celeste Monforton, DrPH, MPH | 3 Comments

It's not the first time that Kenneth Rosenman, MD, has provided scientific evidence of deficiencies in the Bureau of Labor Statistics (BLS) annual survey of occupational injuries and illnesses, and it won't be the last.  His latest study, written with Joanna Kica, MPA, of Michigan State University's (MSU) Department of Medicine, reports that the Labor Department's methods for estimating work-related burns miss about 70% of them.  Their analysis focused on cases occurring in the State of Michigan in 2009.  The MSU researchers used data from the State's 134 acute-care hospitals, which are required by State public health regulations to report certain work-related injuries (e.g., burns, amputations), as well as data from Michigan's workers' compensation agency and the poison control center, to identify all cases of work-related burns.  They merged the data from the multiple sources, matched and eliminated duplicates, and identified 1,461 burns occurring among Michigan workers in 2009.
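For readers curious what that kind of multi-source matching looks like in practice, here is a minimal sketch (not the authors' actual code, and with hypothetical field names): reports from the different sources are pooled, and records that refer to the same worker and the same injury date are collapsed into a single case.

```python
import pandas as pd

# Toy records standing in for the three Michigan reporting sources.
hospital = pd.DataFrame({
    "name": ["J. Smith", "A. Lee"],
    "dob": ["1975-02-01", "1981-07-12"],
    "injury_date": ["2009-03-14", "2009-06-02"],
    "source": "hospital",
})
workers_comp = pd.DataFrame({
    "name": ["J. Smith"],
    "dob": ["1975-02-01"],
    "injury_date": ["2009-03-14"],
    "source": "workers_comp",
})
poison_control = pd.DataFrame({
    "name": ["B. Cruz"],
    "dob": ["1990-11-30"],
    "injury_date": ["2009-09-21"],
    "source": "poison_control",
})

# Pool all sources, then treat records with the same person and injury
# date as one burn case, keeping the first report encountered.
pooled = pd.concat([hospital, workers_comp, poison_control], ignore_index=True)
cases = pooled.drop_duplicates(subset=["name", "dob", "injury_date"])

print(len(cases), "unique work-related burn cases identified")  # 3 in this toy example
```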

But if policymakers, the public and journalists want the "official" tally from BLS's annual Survey of Occupational Injuries and Illnesses (SOII), the number of work-related burns in Michigan in 2009 was 450.  (You can use this search tool to obtain that number.)

Why is the "official" tally only 450 cases when there were more than three times as many?  Because the Labor Department relies on a voluntary survey of employers for its estimate.  Let me repeat: for this critical public health surveillance data, the U.S. relies solely on a voluntary survey of employers.  If we assume, and I believe it is a safe assumption, that the BLS system's undercount of work-related burns in Michigan is repeated across the country, then instead of the Labor Department's estimate of 24,500 occupational burns nationwide in 2009, the figure would be closer to 73,500 cases.
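To make the arithmetic behind that national figure explicit (this is back-of-the-envelope extrapolation, not an official estimate), the Michigan numbers imply a multiplier of roughly three:

```python
michigan_multisource = 1461   # burns found by the MSU multi-source count
michigan_bls = 450            # burns in the BLS SOII estimate for Michigan
national_bls = 24_500         # BLS national burn estimate for 2009

undercount = 1 - michigan_bls / michigan_multisource   # ~0.69, i.e. ~70% missed
multiplier = michigan_multisource / michigan_bls        # ~3.25

print(f"undercount: {undercount:.0%}")
print(f"national total with a round 3x factor: {national_bls * 3:,}")              # 73,500
print(f"national total with the exact ratio:   {national_bls * multiplier:,.0f}")  # ~79,500
```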

A previous analysis by the MSU researchers, published in 2006, examined all work-related injuries among Michigan workers from 1999 through 2001.  They compared the BLS SOII case count to their multi-source surveillance counts and reported that about 68% of cases were missing from the "official" BLS system.  Now Rosenman et al's 2012 analysis, with a 70% undercount, and those conducted by other researchers (e.g., here, here, here, here, here, here, here) demonstrate yet again that our nation's current system for estimating the incidence of work-related injury and illness is deeply flawed.  MSU's Ken Rosenman, MD, put it this way:

“This is another report that highlights the need for OSHA and BLS to address the undercount inherent in a surveillance system solely based on employer reporting.”

I could not agree more.  Regrettably, this is not a new problem, but Labor Secretary after Labor Secretary has failed to address it.

As I wrote in a previous post, the House Committee on Government Operations held a series of hearings in 1984 on the Executive Branch’s inability to address occupational illness surveillance.  They noted:

“Since the passage of the OSH Act nearly 15 years ago, a bipartisan failure of four administrations has thwarted the mandated development of an information and data collection system on occupational diseases. No reliable national estimate exists today, with the exception of a limited number of substance-specific studies (such as on asbestos), on the level of occupational disease, cancer, disability, or deaths. It cannot be meaningfully determined if diseases from chronic exposure to hazardous substances represents a greater problem today than when the OSH Act was passed in 1970.”

In 1987, an expert panel convened by the National Research Council issued its report entitled "Counting Injuries and Illnesses in the Workplace: Proposals for a Better System."  Some of the NRC's recommendations from 25 years ago described using multi-source surveillance systems, and that's just what Dr. Rosenman and others have done.  They've shown not only that it's feasible, but that it yields consistent results with respect to the degree of missing cases in the BLS system.

The public health researchers who know these data systems best understand the value of using multiple systems to try to capture the entirety of work-related injury and illness cases.  They are also pragmatists who understand that significant resources would be required to develop a multi-source surveillance system.  As Rosenman correctly points out, the current BLS SOII is a statistical sample of U.S. employers.  As a statistics-driven agency, there's no reason the Labor Department's BLS couldn't improve its estimate with additional statistical techniques to adjust for SOII's undercount.  There are at least 13 State-based programs across the country, from the ones reported on by Rosenman in Michigan to those in California, North Carolina, Oregon, Wisconsin and elsewhere, with talented experts who could assist BLS in developing "correction factors" for its SOII.
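A correction factor of that kind could be as simple as the ratio of a state's multi-source count to its SOII count for the same injury type, applied to the national SOII estimate. Here is a minimal sketch using the Michigan burn figures from this study; everything else (the function name, any additional injury categories) is a hypothetical placeholder:

```python
# Correction factor = multi-source count / SOII count, derived from
# state-based surveillance studies. Only the burn figure comes from
# Rosenman & Kica's 2009 Michigan data; factors for other injury types
# would have to come from other state programs.
correction_factors = {
    "burns": 1461 / 450,  # ~3.25
}

def adjusted_estimate(soii_count: float, injury_type: str) -> float:
    """Scale a raw SOII estimate by the state-derived correction factor."""
    return soii_count * correction_factors[injury_type]

print(round(adjusted_estimate(24_500, "burns")))  # roughly 79,500 burns nationwide
```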

How many more studies are needed to demonstrate that the Labor Department’s annual estimate of occupational injury and illnesses grossly misses the mark?

===

*Note: When the OSH Act of 1970 became law and the Secretary of Labor assigned certain functional responsibilities through his Department, he delegated authority (36 Fed Reg 8754) for the collection, analysis and reporting of health and safety statistics to BLS.  It's been with BLS ever since, and BLS's confidentiality policies prohibit OSHA from obtaining any of its raw data.


3 thoughts on "Researchers challenge Labor Dept to fix its annual count of injuries, misses 70% of work-related burns"

  1. This blog draws too strong a conclusion that voluntary employer reporting is responsible for the much lower BLS estimate of burns as compared to the Rosenman/Kica (R/K) count. There is an important difference in the scope of the cases being counted. The BLS estimates only the number of burns that result in at least one day away from work following the day of the burn event. Other OSHA recordable burns, resulting in days of only restricted work or in no lost work time after the day of the burn, are counted in the aggregate total recordable case counts (and in sub-categories), but are not published separately as burns. In contrast, the R/K count includes all burns, regardless of whether a burn resulted in a subsequent day away from work. For example, R/K estimates that about 29 percent of burns for which a degree of burn was available involved first-degree burns. It is reasonable to think that these may not have resulted in a day away from work (after the day of the burn). Other more serious burns may also not result in a day away from work. These burns may well be counted in the BLS aggregate “summary” estimate for Michigan, though they would not be reported separately in BLS estimates as burns. Thus, while the R/K study provides valuable information on work-related burns that complements and expands on the BLS data, it is incorrect to conclude from the study that the BLS misses 70% of work-related burns in Michigan.

  2. While one may discuss why the BLS missed 70% of the work-related burns in Michigan in 2009 (i.e., incomplete reporting by employers, differences in scope such as the self-employed, time away from work), one fact is very clear: the BLS survey is an undercount of the true number of work-related injuries and illnesses that occur each year. When one reads the BLS press releases, one does not see in bold type that the numbers for specific injuries are only a small percentage of the total number. Not only does the BLS survey provide a vast undercount, but it cannot be used to target intervention activity at specific facilities. At the minimum, it is time for BLS to incorporate a statistical estimate for the undercount in their statistical estimate of work-related injuries or illnesses. Otherwise OSHA should find better ways to spend the BLS survey funds than on a system that markedly undercounts and cannot be used to target facilities.

  3. For readers, the commenter John Ruser was the assistant commissioner at BLS who was in charge (2006-2012) of occupational safety and health statistics, including the annual Survey of Occupational Injuries and Illnesses (SOII). He is now in a new position at BLS, as assistant commissioner for productivity and technology. (Congratulations.)

    Mr. Ruser is correct that the scope of the SOII (e.g., injuries requiring at least 1 day off of work, and exclusions for the self-employed, independent contractors and workers on small farms) may account for some of the discrepancy between the true toll of work-related harm and what is estimated in SOII. But, as Rosenman & Kica explain in their analysis, these factors alone do not make up for the difference between the "official" count of 450 burns and the multi-source surveillance count, which identified 1,461 burns. (Rosenman & Kica identified only 16 burns among the self-employed and workers on small farms.) They offer alternative reasons for the imprecise count (an understatement), such as employers not providing complete reports, employers failing to classify injuries and illnesses as work-related, and inadequate statistical sampling procedures.

    I realize that developing and implementing a multi-source surveillance system may take a few years. In the meantime, as Rosenman suggests in his comment, the Secretary of Labor and BLS should qualify their statements about the annual SOII data by explaining that their estimates represent only a fraction of actual work-related injuries and illnesses.
