Getting Better at Getting Better
There’s a famous parable in medicine: two doctors are standing by a river when they spot a drowning man. One doctor immediately dives in and pulls the hapless swimmer out. A second drowning swimmer soon appears, and the same physician quickly rescues her as well. After observing this pattern continue for the rest of the afternoon, the (still dry) second doctor starts slowly walking upstream.
“Where are you going?” the first doc asks, struggling to catch his breath. “I could sure use your help here.”
“I know,” the second doctor replies. “I’m gonna try to figure out who’s throwing all these people in.”
The story endures and resonates because it captures so well the dilemma confronting both individual doctors and the broader healthcare community, especially in the context of a health crisis such as COVID-19. Should we allocate our resources to the immediate needs of the moment, focusing on the delivery of care and how we can iteratively improve it? Or should we seek more transformational change and turn our attention to the hunt for cures, the development of vaccines, the exploration of mechanistic science?
At first blush, the answer seems obvious: it’s the vaccine, stupid. We didn’t stop worrying about polio when someone made a better iron lung. Effective vaccines offer the possibility of rapidly enabling large populations to acquire the herd immunity necessary to stop a virus from spreading, limiting its impact and allowing life to safely return to normal. Some vaccines have proved so effective that they’ve either eliminated (the smallpox virus) or nearly eliminated (the polio virus and some others) savage pathogens from the face of the earth.
Vaccination may also be working its way through the middle school canon: In 1938, William Carlos Williams published his much-anthologized story, “The Use of Force,” about a doctor’s attempt to diagnose a young girl with diphtheria; today, thanks to vaccination, there are only a handful of diphtheria cases reported in the United States every few years. Similarly, with the advent of vaccines, very few domestic species now meet the sorry fate of poor Old Yeller.
Vaccination is also, and understandably, the focus of Trump’s Operation Warp Speed. That is arguably the administration’s best-conceived effort to combat the coronavirus, as I’ve discussed, and is led by an experienced industry scientist, former GSK R&D exec Moncef Slaoui, who is exquisitely matched to this specific challenge.
In contrast, the president’s continued efforts to undercut the FDA and CDC threaten to erode the trust so vital to the effective functioning of these agencies. Rolling out a novel vaccine to a nation is an inherently monumental challenge, even with the involvement of a highly touted military logistics expert such as OWS’s Chief Operating Officer, General Gustave F. Perna, Commander of the Army Materiel Command. Without complete confidence in the quality and integrity of the FDA’s review of any administered vaccine, acceptance and adoption are likely to be hindered.
While an effective vaccine that’s widely adopted (vaccinologists are fond of pointing out, “vaccines don’t save lives; vaccination does”) would, indeed, be a transformative advance, it’s also not a sure thing. Candidate medical products fail or disappoint all the time—far more often than not—and not every pathogen can be managed as effectively as smallpox and polio. We require a new flu shot each year because the influenza virus itself changes each year, for example. Our own immune systems respond differently to different vaccines; in some cases, we seem to develop long-lasting immunity, but in other cases, our response rapidly peters out, and it’s hard to know in advance the category in which a potential COVID vaccine might fall.
As appealing as transformative approaches are, most of the progress in medicine, arguably, has resulted not from one or another brilliant discovery. Rather, it tends to come via incremental innovation, through the efforts of what legendary surgeon-scientist Judah Folkman referred to as “inquisitive physicians”: curious, front-line doctors (and other providers) who are intrigued by something they’ve noticed in a patient, and are driven to pursue it.
Despite a tendency to view progress through the lens of transformative scientific discoveries and game-changing inventions, most measurable progress results from all the subsequent work: figuring out how to actually, meaningfully use a new discovery or emerging technology, and apply it gainfully.
The theme of “learning by doing,” as economist James Bessen has emphasized in a book of the same name, is seen in areas from the power loom to petroleum refinement to the generation of energy from coal. In each case, the ultimate gains in productivity came not from an original, transformative discovery (to the extent such an entity even exists), but rather through the sequential efforts of motivated, front-line workers consumed with trying to solve pressing, real-world problems and creatively looking to anything that will help.
MIT innovation researcher Eric von Hippel coined a term for this approach to generating progress: “field discovery,” a process centered not around scientists in a lab conceptualizing a novel product, but rather the end-users who are focused on solving day-to-day problems. He demonstrated the importance of field discovery in a range of domains, including medicine: in 2006, von Hippel showed that the majority of new uses for existing medicines were originally discovered by practicing clinicians.
Similar themes emerge from Matt Ridley’s engaging new book, How Innovation Works, as he takes us through a series of vignettes in areas from energy and transportation to food and public health, urging us to recognize the gradual, incremental nature of innovation. We think of the light bulb, for example, as a singular triumph of Thomas Edison, but Ridley urges us to reconsider:
The truth is that the story of the light bulb, far from illustrating the importance of the heroic inventor, turns out to tell the opposite story: of innovation as a gradual, incremental, collective yet inescapably inevitable process. The light bulb emerged inexorably from the combined technologies of the day. It was bound to appear when it did, given the progress of other technologies.
Ridley feels Edison “deserves his reputation,” but urges us to focus on the 99% perspiration aspect of his famous adage, rather than the 1% inspiration. The value of pragmatic trial and error seems especially important; Ridley notes Edison “tested more than 6,000 plant materials in his bid to try to find the ideal material for making a carbon filament.”
Similarly, Ridley explains, key efforts in the early twentieth century to develop fertilizer involved “no single moment of breakthrough,” according to one of the leading scientists involved, Fritz Haber. Instead, there were “just a number of small improvements and incremental advances.”
Grand discoveries take the headlines, but implementation and dogged, trial-and-error incrementalism seem to be what actually account for most progress—and they are certainly responsible for the realization of a discovery’s promise.
Which brings us back to COVID.
Since the outbreak began, it’s become apparent that the fatality rate from the virus seems to be decreasing; it’s still an extremely dangerous virus, especially in vulnerable people, but it seems to be picking off fewer of those it infects than was the case in the earliest days.
As the New York Times reports, many had assumed this reflected simply a difference in who was infected; as we’ve gotten better at protecting our oldest patients, more of the infections are in younger people who are intrinsically more resilient. Perhaps this accounts for the declining fatality rate.
Yet a fascinating new research publication by New York physician-scientist Leora Horwitz suggests this explanation may not be correct—or, at least, is incomplete.
Horwitz and her team examined 5,121 patients hospitalized for COVID-19 at NYU-Langone’s 3-hospital academic health system from March through August of this year, during which time the mortality rate dropped significantly, consistent with national figures. The CDC, for instance, reported in April that 6.7% of cases resulted in death, compared with 1.9% in September. Yet, nationally, the average age of those afflicted had also demonstrably decreased, from 46 years of age at the start of May to 38 by the end of August.
To figure out whether the reduced mortality in their hospital center was due to characteristics of those afflicted, the authors statistically adjusted for age, sex, and a range of pre-existing conditions. Even after this adjustment, they found a profound decrease in the mortality rate, from 25.6% in March (among hospitalized patients) to 7.6% in August—a huge and encouraging decline. The authors note similar results were recently reported from a study of ICU patients in the U.K.; the data showed “mortality was significantly and progressively lower over the course of the study period.”
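The intuition behind this kind of adjustment can be sketched with a toy direct-standardization calculation: because the crude death rate can fall simply when later cohorts skew younger, you reweight each month’s age-specific mortality to a fixed reference age mix before comparing. All numbers and age strata below are hypothetical, for illustration only; the NYU study used its own adjustment across age, sex, and pre-existing conditions.

```python
# Toy direct age standardization. Every number here is hypothetical,
# chosen only to illustrate the method, not drawn from the NYU study.

reference_mix = {"<50": 0.3, "50-69": 0.4, "70+": 0.3}  # fixed reference age shares

# Hypothetical age-specific mortality rates among hospitalized patients
march = {"<50": 0.10, "50-69": 0.25, "70+": 0.45}
august = {"<50": 0.03, "50-69": 0.08, "70+": 0.15}

def standardized_rate(rates, mix):
    """Weight each age stratum's mortality rate by its reference share."""
    return sum(rates[age] * mix[age] for age in mix)

for label, rates in [("March", march), ("August", august)]:
    print(f"{label}: adjusted mortality = {standardized_rate(rates, reference_mix):.1%}")
```

Because the August rates are lower within every age stratum, the adjusted rate falls too—showing improvement that a shift toward younger patients alone cannot explain.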
Horwitz and her co-authors can only speculate about why they saw “incremental improvements in outcomes.” They posit a range of possible factors, from “increasing clinical experience” to tweaks in the approach to treatment to the possibility of “lower viral load exposure from increased mask wearing and social distancing.”
Even without the ability to attribute causation, these results are encouraging, and they highlight the often underappreciated value of incremental innovation: small improvements, driven by inquisitive physicians and other front-line providers, that are likely responsible for much of the progress we’ve seen to date.
By all means, we should continue to press for vaccines, cures, and transformational treatments. But let us also remember the critical role of incremental improvements—and ensure we are doing what we can to develop both the technological and cultural infrastructure required to support and advance this critically important, chronically unheralded work.