
Opinion: Inappropriate Behavior


Pharmaceutical Executive

April 1, 2007

Inappropriate care may account for 25 to 30 percent of the US healthcare bill. Former NEJM editor Arnold Relman used to write that if we could eliminate the unnecessary care, we could save enough to insure all the uninsured.

What constitutes appropriate and inappropriate care? Very often, the answer depends on whom you ask.


At two recent industry conferences, I had occasion to ask a different question: "How much inappropriate and unnecessary care is provided?" There, I found a consensus—many people thought it was "a lot." But when I countered with the question, "Have you ever sought care which you thought was inappropriate or unnecessary?" not a single hand went up.

Actually, when I think about it, I'm pretty sure I have received inappropriate care—but I only figured that out after the fact. When a physician told me that I needed this test or that treatment, I did what I was told without question. Noncompliance, I knew, was a sin. Only later did I wonder why the doctor prescribed an antibiotic, asked for another office visit, or scheduled more diagnostic tests.

Magnify that experience across the nation, and it's easy to see why the issue of inappropriate care, both what it is and how to reduce it, is likely to become much more important.

We will continue to believe that costs are out of control. For the last 50 years, healthcare costs have grown more than twice as fast as the gross domestic product (GDP). But with the costs of employer-provided insurance, Medicare, and Medicaid outpacing the economy, we have to figure out a way to pay for the services we're using. Perhaps economist Herbert Stein said it best: "If a trend is unsustainable, it will end."

It sounds obvious that the least painful way to slow the growth of healthcare costs is to cut down on inappropriate, unnecessary, wasteful, or marginal care, which some experts estimate may account for as much as 25 to 30 percent of all the money spent on healthcare in the United States. Arnold Relman, when he was editor of the New England Journal of Medicine, used to write that if we could eliminate all the unnecessary care in the United States, we could save enough money to insure all the uninsured, and then some.

Survey Says: How to Lower the Healthcare Bill

Fifteen years ago, we thought we had the answer to this problem: managed care. Using preauthorization and utilization review, we would "just say no" to inappropriate care. The people who tried to do this were very unpopular, and they still have the scar tissue to prove it.

The latest silver bullet for cost management is consumer-driven health plans. Of course, these plans are not really consumer driven. They are employer driven, and they are ideologically driven by those who believe the US healthcare system is so expensive because most of the costs are borne by employers, insurers, and the government. As a result, the thinking goes, the insured population uses too many medical services because they are almost "free." (This argument ignores the fact that in Europe, Japan, and other countries where out-of-pocket costs are much smaller, healthcare accounts for a much lower percentage of GDP.) Milton Friedman was one of the many advocates who believed that the solution to the nation's healthcare problem is to require patients to pay much more for their care and thereby take more responsibility for their health, their healthcare, and their healthcare costs, and that this would lead to a big cut in inappropriate care.

Others point to rising healthcare costs as a provider-driven problem and say that the way to contain costs is to reduce the supply of doctors. There is compelling evidence: researchers Jack Wennberg, Elliott Fisher, and the Dartmouth Atlas Project, for example, have documented big differences in the utilization of medical services between areas with more doctors and areas with fewer, and have found little difference in health outcomes between them.

This research, of course, conflicts with the idea of market-based and consumer-driven health plans, whose advocates want to "empower consumers" to decide for themselves. But we have always tended to believe that physicians, not patients, are the right people to decide what is and is not appropriate. By all means, doctors should review treatment options with patients, and even let patients make the final decision; but the physician is the expert we trust to tell the patient about the likely consequences of different treatment options, or of no treatment.

Needed: Discourse and Debate

There are several reasons why cutting inappropriate care will be very difficult.

One is the culture of American medicine, so well described some years ago by Lynn Payer in her book Medicine and Culture. American doctors are, it seems, the most aggressive in the world and are trained to believe that "if in doubt, do something." Payer contrasted this approach with the British medical culture of "watchful waiting," in other words, "take an aspirin and come back if it still hurts."

Furthermore, most physicians and hospitals make money from doing more rather than less. The financial incentives to deliver more care are very powerful. Doctors' productivity and compensation are usually based on how many dollars they generate. The more care they deliver, appropriate or inappropriate, the more money they and their hospitals make. And some inappropriate care is generated by defensive medicine practiced to avoid malpractice suits.

Whatever the causes of inappropriate care, there is almost universal consensus on the desirability of reducing it substantially. It is the one goal on which both the left and the right can agree. They disagree passionately on the roles of government, the market, global budgeting, single-payer systems, and tax incentives. But who could be against cutting unnecessary care?

Health information technology, quality measures, patient protocols, and pay-for-performance are all steps toward improving the quality and appropriateness of care. The unresolved question is who should be mainly responsible for cutting inappropriate care: government regulators, employers, insurers, hospitals, doctors, or consumers?

Advocates of market-based solutions—those who think patients should pay a much larger share of their costs—argue that this is something the patient should take responsibility for. This sounds good, but real-world data show that when people have to pay more for medical care, they reduce their use of both inappropriate and appropriate care. Higher co-pays and deductibles increase noncompliance. And noncompliance often increases healthcare costs.

Can we really believe that patients who are sick, in pain, or frightened can tell what is and is not appropriate? The idea that patients should be expected to know better than their doctors whether tests, treatments, surgeries, or medications are inappropriate, before they receive them, seems fanciful. Yet many are swallowing this pill of personal responsibility without stopping to think about it. For some reason, the question of how, and by whom, the appropriateness of care should be decided is not the subject of much debate. It should be, and it will be.

Humphrey Taylor is chairman of the Harris Poll, Harris Interactive. He can be reached at htaylor@harrisinteractive.com
