Socialize Big Pharma

Analysis piece originally published by Jacobin magazine and syndicated in Salon on 29 June 2013.

The pharmaceutical industry, like oil companies and arms manufacturers, does not fare well in the public imagination.

And for good reason. There is growing awareness of an inherent conflict of interest in the testing of drugs by the companies that manufacture them — like Pfizer, Merck and Eli Lilly — and a steady stream of tales from journalists, researchers and doctors of deliberately dodgy trials, buried unfavorable results, and purchased academic journals.

Yet the greatest crime of the world’s major private pharmaceutical companies is not what they do, but what they don’t do. In the ongoing war against bugs and infection, these companies have abandoned their posts at the most critical time: when the enemy is mounting its most ferocious attack in generations. As these firms continue to shirk their duties — effectively abandoning antibiotic research for some 30 years now — senior public health officials are warning that the world could soon return to the pre-antibiotic era, a miserable, fearful time that few people alive now remember.

Market reports, medical journals, philanthropic organization analyses, government studies, and the pharmaceutical sector’s own assessments prefer a more delicate approach, attributing the danger to “insufficient market incentive.” My solution is a bit more elegant: socialization of the entire industry.

Policy options such as fresh regulation and keener oversight could work to moderately temper areas of Big Pharma malfeasance such as research massaging. But in the War on Bugs, these measures are either radically insufficient or of no use. There are a handful of emergency preventative steps that hospitals and livestock farmers can take to slow the advance of the enemy, but these attempts can do no more than postpone the impending doom. Socializing drug development is the only way of solving this problem.

A THREAT AKIN TO CLIMATE CHANGE

In March, the director of the US Centers for Disease Control and Prevention, Thomas Frieden, warned authorities of their “limited window of opportunity” to deal with the “nightmare” presented by the rise of a family of bacteria highly resistant to what are often our last line of antibiotic defense: the suite of drugs known as carbapenems. A few months earlier, the UK’s chief medical officer, Sally Davies, used similar language to describe a future “apocalyptic scenario” in 20 years’ time, when people will be dying from infections that are currently understood to be trivial, “because we have run out of antibiotics.”

Davies described how the phenomenon “poses a catastrophic threat” to humanity akin to that of climate change and imagined a scenario in the coming decades in which “we will find ourselves in a health system not dissimilar to the early 19th Century,” where any one of us could go to the hospital for minor surgery and die from an ordinary infection that can no longer be treated. Major interventions like organ transplants, chemotherapy, hip replacements and care for premature babies will become impossible.

For generations we have grown accustomed to what are, frankly, superhuman feats of medicine, treating them as unexceptional and permanent, when in fact they rest on our ability to prevent microbial infection. Antibiotics revolutionized healthcare: the treatment of trauma, heart attacks, strokes and other illnesses requiring extensive care with catheters, intravenous feeding and mechanical ventilation cannot proceed without access to antimicrobial drugs. As the population ages, demand for this sort of intensive care will only increase.

So what did the pre-antibiotic era look like? Mortality from pneumonia ran to 30% for those who didn’t have surgery; from appendicitis or a ruptured bowel, it was 100%. Before Alexander Fleming’s serendipitous discovery of the first antibiotic, penicillin, hospitals were filled with people who had contracted blood poisoning through cuts and scratches that developed into life-threatening infections. Amputation, or surgery to scrape out infected tissue, is hardly a pleasant or preferred treatment, but these were the only options left to the doctors of 19-year-old David Ricci of Seattle following his train accident in India a few years ago. Ricci suffered infections from drug-resistant bacteria that even highly toxic last-resort antibiotics could not treat.

We have forgotten how common and deadly infectious disease once was. We’ve taken antibiotics for granted, but we can hardly blame ourselves for such complacency. US Surgeon General William H. Stewart is infamous for declaring it “time to close the book on infectious diseases and declare the war against pestilence won.” By the 1980s, cases of tuberculosis — humanity’s first known infectious disease and one of our deadliest foes, killing 1.4 million people in 2011 — had dropped to such low rates that policymakers frequently spoke of eradicating the disease.

For the rest of the article, visit the Salon or Jacobin website.
