New research finds flaws in veterans' claims system

SIEPR's Daniel Ho examines the costly backlog of veterans' appeals and finds that a program meant to help reduce the errors behind those appeals has failed.

A new study by Stanford scholars and their colleagues shines a stark spotlight on governance issues that have plagued a cornerstone of the nation's administrative system for years: rampant errors and a backlog of appeals cases involving veterans' benefits.

Law Professor Daniel Ho is lead author of a study showing no improvement in veterans' claim resolutions.

Photo by Rod Searcey

The volume of veterans' appeals, the vast majority of which relate to disability compensation claims, is huge. Some 90 judges on the Board of Veterans' Appeals (BVA) historically decided about 50,000 cases per year, with an inventory of over 425,000 cases pending.

It takes an average of seven years for a veteran's disputed claim to get resolved. In fact, the Inspector General's Office at the Department of Veterans Affairs estimated that in one quarter of 2016, 7 percent of cases were deemed "resolved" at the Veterans Benefits Administration because the veterans had died while waiting.

Part of the problem, the study suggests, stems from a mismanaged trade-off between quantity and quality. Thousands of cases that need not go up the chain of appeal end up there, prolonging the wait for decisions and adding to the backlog. And the quality review program that was intended to reduce the rate of appeals and erroneous decisions largely failed to do so, the study finds.

"What was shocking about our findings is that the quality review program had none of its intended effects to reduce errors," said Daniel Ho, the William Benjamin Scott and Luna M. Scott Professor of Law at Stanford and a senior fellow at the Stanford Institute for Economic Policy Research (SIEPR).

"When a veteran challenges the denial of disability benefits, Supreme Court cases require accurate decision-making as a matter of due process," Ho said. "Based on our study, it's hard to believe the BVA is meeting that goal of accuracy. Or it is overstating it in pretty dramatic terms."

The stakes go beyond veterans鈥 lives, too.

The study raises broader questions about government oversight and the value of these kinds of review systems, which have become a management linchpin for federal agencies, Ho said. Under the Government Performance and Results Act of 1993, federal agencies are required to provide performance measures, and many have instituted quality assurance systems to improve and measure the quality of service delivery. The deficiency of BVA opinions may even rise to the level of a constitutional problem, the authors note.

"Our results paint a sobering picture about the ability for an agency to internally develop such quality assurance initiatives," the study states. And the BVA's quality review program, as it is structured today, is "unlikely" to address the longstanding quality problems in veterans' adjudication.

The study is detailed in a working paper forthcoming in the Journal of Law, Economics, and Organization; a companion paper on the broader crisis in mass adjudication is forthcoming in the Stanford Law Review.

This research is the first to rigorously examine the effectiveness of quality assurance systems, long espoused by scholars and policymakers, using data on nearly 600,000 veterans' appeals cases from 2003 to 2016 that had never before been accessible to outside researchers. Ho and his fellow researchers also interviewed a wide range of officials and secured a rich set of internal records through the Freedom of Information Act.

Ho's co-authors are David Ames, former chief of the Office of Quality Review at the U.S. Department of Veterans Affairs; David Marcus, a law professor at the University of California, Los Angeles; and Cassandra Handan-Nader, a doctoral candidate in political science at Stanford.

Ames developed concerns about the effectiveness of quality review during his time overseeing the process. "I found it increasingly difficult to shake the suspicion that our work was not benefiting veterans," he said.

Detecting errors

At issue, the researchers say, are decision errors, including those based on legally inadequate explanations, inaccurate documentation, and due process mistakes. Their analysis also shows evidence of discrepancies, or inconsistent judgments between similar cases. The BVA established its quality review program in 1998 to try to fix some of these problems, but the program has had no substantial effect toward that end, the researchers found.

Under the program, 5 percent of appeals cases were randomly selected to undergo an additional layer of review by an elite slate of staff attorneys to detect and correct any errors before a decision moved forward. Random selection enabled researchers to test whether such review in fact reduced subsequent appeals and remands, the rate at which disputed BVA decisions are sent back to the agency for further review.

Yet appeal and remand rates for cases that went through quality review were indistinguishable from those for cases that did not, despite the extra scrutiny that was supposed to catch and deter errors to begin with.

"The caseload makes it difficult to guarantee no errors, but intensive review by an elite set of attorneys to correct errors had little effect," Ho said.
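To see the logic of that test in miniature, consider the sketch below, which compares remand rates for randomly reviewed and unreviewed cases. It is a minimal illustration with made-up counts and a standard two-proportion test; neither the numbers nor the code come from the study.

    # Illustrative only: hypothetical counts, not data from the study.
    # Because quality review was assigned at random, its effect on downstream remands
    # can be checked by comparing remand rates for reviewed vs. unreviewed cases.
    from statsmodels.stats.proportion import proportions_ztest

    remanded = [8200, 156000]   # remanded cases: reviewed group, unreviewed group (hypothetical)
    decided = [11000, 210000]   # total decided cases in each group (hypothetical)

    z_stat, p_value = proportions_ztest(count=remanded, nobs=decided)
    print(f"Remand rate, reviewed cases:   {remanded[0] / decided[0]:.1%}")
    print(f"Remand rate, unreviewed cases: {remanded[1] / decided[1]:.1%}")
    print(f"Two-proportion z-test p-value: {p_value:.3f}")
    # A large p-value means the two remand rates are statistically indistinguishable,
    # consistent with the study's finding that the extra review had little effect.

Because review was assigned at random, any sizable gap between the two rates could be attributed to the review itself; a large p-value, as the researchers found with the actual data, signals no detectable effect.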

Measuring accuracy or gaming statistics?

The reasons for the ineffectiveness, Ho explained, lie in the cross-purposes of quality review. The same agency charged with its own quality review faces a competing interest in trying to keep case numbers and accuracy high for its performance measures, which in turn affect its funding allocations. For over a decade, the BVA has published and touted its "accuracy rate" to Congress and the public as being between 91 and 95 percent.

In their analysis, however, the researchers found that the BVA deployed an extremely deferential way of counting errors, inflating the agency's measure of accuracy. When the quality review team deemed a decision error-free and the case was appealed further, it was still remanded, or sent back to the agency, nearly three-fourths of the time.

"It is well-known in the social science literature that creating your own performance measures poses conflicts of interest," Ho said. "We found that over time, the quality review process was used to generate the appearance of effectiveness [rather] than to actually improve performance."

The findings also bolster criticisms that the VA's Office of General Counsel and others have raised in public records and internal documents, according to the study.

Back in 2010, for instance, the general counsel questioned the BVA's reported accuracy rates in light of high remand rates. And some 100 staff attorneys submitted a loss-of-confidence statement to congressional committees in 2017, contending that the BVA's increased production quota, "gross mismanagement" and inadequate training have failed to deliver accurate decisions to veterans.

Despite increasing the output to 85,000 cases over the last year, the BVA continues to tout a 94 percent accuracy rate.

The study concludes that this accuracy statistic is inaccurate.

"We were wasting some of the Board's most talented attorneys on producing an essentially arbitrary number that glossed over quality problems, when those same attorneys could have been proactively working to reduce errors and inefficiencies," Ames said.
