States investing more in social services and public health had healthier residents.
Policy makers are now placing more emphasis on the role of social determinants in influencing individual and population health. To date, though, little has been known about the correlation between state spending on social services and public health and the health of individuals. Elizabeth Bradley of Yale University and coauthors broke new ground by finding that people living in states that spend more on public health and on social programs such as education, income support, recreational programs, and housing fare significantly better on a range of health outcomes than people in states that spend less in these areas. The data were obtained through a retrospective longitudinal study of all states and the District of Columbia for the period 2000-09. The authors suggest that their findings underscore the value of broadening the debate beyond what should be spent on health care to include what should be invested in social services and public health.
Opioid spending in the United States shifts from consumers to insurers.
As policy makers seek to address the current opioid crisis, Chao Zhou of the Centers for Disease Control and Prevention and coauthors examined data on trends in expenditures for these drugs. They found a major shift: According to the authors, consumer out-of-pocket spending on opioids per 100 morphine milligram equivalents (a standard reference measure of strength for various opioids) declined from $4.40 in 2001 to 90 cents in 2012, with insurers paying an increasingly larger share of the cost. Another key study finding: After the establishment of Medicare Part D in 2006, the program spent far more on opioid pain relievers for the small number of beneficiaries younger than age sixty-five ($1.8 billion) than for those ages sixty-five and older ($637 million). This study is the first to comprehensively examine the financial aspects of the use of opioid pain relievers at the population level.
United States cancer drug prices rise after market launch.
In the rapidly changing climate of anticancer drug development, the costs of developing and bringing a new pharmaceutical product to market are sizable. Previous studies have documented the rapidly rising prices of cancer drugs at launch, but less attention has been paid to how costs change once the drugs are on the market. Caroline Bennette of the University of Washington and coauthors used a large database of pharmaceutical claims to examine monthly costs for newer oral anticancer drugs approved by the Food and Drug Administration (FDA) between 2000 and 2012. According to the authors, the average per patient monthly costs of these drugs increased approximately 5 percent per year above inflation. In addition, they identified several milestones that affected costs beyond these average annual increases. Following receipt of a supplemental approval from the FDA, monthly drug costs increased an additional 10 percent. In contrast, FDA approval of a competitor product was associated with a 2 percent decrease in average monthly costs. The authors conclude that their findings show that competition is unlikely to meaningfully rein in the escalating costs of oral anticancer drugs in the near future.
Cancer drugs provide value in nine countries, but the United States lags behind.
While examining real-world cancer drug consumption in nine countries, Sebastian Salas-Vega and Elias Mossialos, both with the London School of Economics and Political Science, compared the value in lives saved for cancer drug spending in Australia, Canada, France, Germany, Italy, Japan, Sweden, the United Kingdom, and the United States for the years 2004-14. The authors found wide variation in how cancer patients in the different countries benefited from their cancer drug care, with the United States spending more than any of the other countries yet witnessing one of the smallest improvements in cancer-related mortality. Assigning a conservative value to extended life-years, the authors calculated a $32.6 billion net positive return from cancer drug care in the United States in 2014 under base-case assumptions. While net returns from cancer drug care remain positive under most circumstances, the authors also found that the United States obtains a lower return per cancer drug dollar spent on individual patients than do the other countries analyzed.
Lower priority review voucher prices could undermine the incentive to develop new treatments.
To encourage the development of drugs for neglected diseases, Congress created the priority review voucher program in 2007. In addition to faster review of a drug for a neglected disease, the program offers the drug developer a voucher for faster review of a different drug, and the right to sell the voucher. David Ridley of Duke University and Stephane Régnier of Novartis Pharma AG examined US sales of new treatments approved in the period 2007-09 to estimate the commercial value of the voucher. According to the authors, that value could exceed $200 million if only one review voucher was available in a given year. However, if four vouchers were available, the value could fall below $100 million. The study details some of the actual voucher sales and changes in the regulatory and commercial market for the voucher, including faster overall review times by the Food and Drug Administration and an expansion of the number of diseases eligible for a voucher.
Also of interest in the May issue: