USNews: Doximity’s ‘Residency Navigator’ Injects Transparency into GME

Source: USNews

A snapshot from Doximity's Residency Navigator tool.

Graduates of the University of Utah’s neurosurgery residency program are in the 95th percentile or higher on academic publications, NIH grants and clinical trial participation.

As med students prepare residency applications, a new search tool and rankings reveal new data.

It’s crunch time for fourth-year medical students to decide where they want to do their post-graduate training. Next week, they’ll begin submitting applications to the national system that determines residency assignments. For many, the choice of training program will heavily influence what medical specialty they ultimately practice – and where they practice it.

But fourth-years have long faced a dilemma. They’ve had no way to objectively assess which residency programs offer the best prospects of gaining the clinical training and career opportunities they may covet. For example, a student hoping to go into cardiology would be blind to which internal medicine residencies’ graduates tend to ultimately subspecialize in that field.

A new online tool changes that, putting subspecialization rates, board-pass rates and other useful data at the fingertips of tomorrow’s residents. The Residency Navigator tool, developed by the physician network Doximity, also ranks top programs based on a large survey of physicians. Doximity, which has a relationship with U.S. News, gave U.S. News editors an early look at Residency Navigator and access to the data behind it. What follows are our first impressions – and the reasons we think Doximity’s latest project will be a catalyst for good in medical education.

The push for transparency

The need for transparency and assessment of graduate medical education (GME) programs has recently become the subject of a national conversation. At stake is the public’s ability to ensure program value in exchange for continued public investment in GME.

In July, a report from the Institute of Medicine underscored that little information is available on how GME programs annually spend some $12 billion to $15 billion in federal funds. The authors contended that without a mechanism to track and evaluate this investment, little can be done to ensure the money is well spent preparing doctors for mainstream practice. For example, while the majority of residency and fellowship training occurs in hospitals, certain types of care are increasingly delivered in outpatient settings. And while swathes of the country appear to face looming doctor shortages, most of Medicare’s spending on GME occurs in the Northeast, which is crowded with urban teaching hospitals and the attending physicians who’ve settled around them.

The IOM report proposed that Medicare invest in building data infrastructure to track how teaching hospitals use GME dollars. Such a project, the report concluded, could lead to better allocation of funding for training in non-hospital settings and could support efforts to draw physicians to rural areas, where they are acutely needed.

Following the IOM report, the Association of Health Care Journalists wrote to the Accreditation Council for Graduate Medical Education, which accredits residencies, and asked it to release data on how it evaluates the programs. Residency programs need ACGME accreditation in order for their graduates to sit for board exams. In the letter (available to AHCJ members via healthjournalism.org), AHCJ president Karl Stark also asked ACGME to disclose the percentage of residents at each program who pass their board exams. This information could be used to assess how effectively each program builds on incoming residents’ clinical knowledge.

In a written response (available to AHCJ members via healthjournalism.org), ACGME executive director and CEO Thomas Nasca stated that information relating to board pass rates and how the organization evaluates residency programs could not be disclosed. Nasca wrote that publicizing such information “would have a chilling effect” on ACGME’s ability to retain “confidence in the quality assurance aspect of accreditation.” The organization did not make Nasca available in response to a U.S. News interview request.

At least four specialty boards reveal pass rates, said Doximity co-founder Nate Gross. “Unfortunately, not every specialty board has made data public,” he said. “There’s a lot of tension right now in graduate medical education circles about what to share.”

Finding a ‘best match’ residency

Doximity’s Residency Navigator aggregates and presents various types of data that medical students may want to know, including information Doximity has gleaned from its proprietary database of profiles created by thousands of individual physicians.

U.S. News was not formally involved in the development of Residency Navigator or the methodology behind Doximity’s residency rankings. (Disclosure: U.S. News and Doximity have an agreement to collaborate on other initiatives, including the U.S. News Doctor Finder, which uses physician-level data provided by Doximity, and U.S. News editors offered nonbinding input on Doximity’s methodology.)

To evaluate residency programs, Doximity conducted an online survey of current and former medical residents who are members of Doximity’s free physician network. The survey made a single request of each doctor: Nominate up to 5 residency programs in your medical specialty that offer the best clinical training. Dermatologists nominated dermatology residencies, general surgeons nominated surgery residencies, and internists nominated internal medicine residencies, which are steppingstones into subspecialties such as cardiology and infectious disease.

More than 17,000 Doximity members responded to the survey, which was conducted between January and July. That represents more than 10 years’ worth of medical trainees; nationwide, approximately 1,600 newly minted M.D.’s enter residency each year. (By comparison, slightly more than 10,000 Doximity members responded to an earlier survey that U.S. News used in its 2014-15 Best Hospitals rankings.) In all, respondents provided more than 50,000 residency nominations, according to Doximity. By any measure, that’s an impressive data set.

Doximity weighted each nomination to account for regional differences in response rates and in the proportion of physicians who are Doximity users. Doximity then ranked the top 10 programs in each specialty, both nationally and regionally, based on the weighted number of nominations each program received. Top-ranking programs in internal medicine, for example, included Massachusetts General Hospital, Johns Hopkins Hospital and the University of California, San Francisco. In emergency medicine, Indiana University School of Medicine, University of Cincinnati Medical Center and University of Southern California topped Doximity’s rankings. And the top-ranked general surgery residencies are at Johns Hopkins University, Mass General and the University of Michigan.
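To make the weighting step concrete, here is a minimal sketch of how nominations might be tallied and ranked under that general approach. The survey records, regional weights and program names below are hypothetical illustrations, not Doximity’s actual data or formula.

```python
from collections import defaultdict

# Hypothetical survey responses: (specialty, nominated program, respondent's region).
nominations = [
    ("internal_medicine", "Massachusetts General Hospital", "Northeast"),
    ("internal_medicine", "Johns Hopkins Hospital", "Northeast"),
    ("internal_medicine", "UC San Francisco", "West"),
]

# Hypothetical per-region weights, standing in for corrections to regional
# differences in response rates and in Doximity membership among physicians.
region_weight = {"Northeast": 0.8, "West": 1.3, "Midwest": 1.1, "South": 1.2}

def rank_programs(noms, specialty, top_n=10):
    """Sum weighted nominations per program and return the top_n programs."""
    scores = defaultdict(float)
    for spec, program, region in noms:
        if spec == specialty:
            scores[program] += region_weight.get(region, 1.0)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

print(rank_programs(nominations, "internal_medicine"))
```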

Rankings, we at U.S. News have found, are sometimes controversial. But whether you like what Doximity has done or not, it has opened important data to public scrutiny – and made it possible for this year’s graduating medical students to make better-informed decisions than their predecessors could.

Arguably even more useful than the rankings are the objective data Doximity has assembled to allow apples-to-apples comparisons among programs. A medical student seeking a particularly intensive training experience, for example, might be glad to know that the internal medicine residency at Emory University in Atlanta involves spending half of one’s time at Grady Memorial Hospital, a large public hospital.

Quantitative measurement of residency quality

Doximity isn’t the only group exploring how to evaluate the quality of residency programs. Adam Wilson, a researcher in the department of surgery at the Indiana University School of Medicine, recently reached out to U.S. News and described his group’s proposed approach to quantitatively evaluating surgical residencies. The Indiana methodology would involve four domains comprising as many as 30 measures of each program’s quality, many of which are not publicly available and would require voluntary participation by individual programs. One domain would measure program reputation as determined by surveying program directors. “Doximity has an advantage here,” Wilson noted. In addition to surveying program directors, “they can ask other attendings” because so many doctors use Doximity’s online network, he said. For practical reasons, the Indiana investigators would limit their survey to residency program directors.

Another domain would cover scholarly research, NIH funding and other measures of what Wilson called “departmental vitality.” That approximately parallels what Doximity has done in embedding academic-productivity data in Residency Navigator. Doximity users can apply filters such as “Research Focus” to customize the rankings to their interests. In doing so, a med student angling for an academic career in neurosurgery might elect to look past the 10 top-ranked programs and apply to the University of Utah’s neurosurgery program. According to Residency Navigator, the Utah program’s recent graduates rank in the 95th percentile among their peers for publication productivity and in the 97th percentile on a measure of alumni participation in grants and clinical trials.
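As a rough illustration of how a percentile comparison like that can be computed, the sketch below ranks programs on a hypothetical publications-per-graduate metric. The values and program list are invented and do not reflect Residency Navigator’s actual data or methodology.

```python
# Hypothetical publications-per-graduate values for a handful of peer programs.
pubs_per_graduate = {
    "Program A": 2.1,
    "Program B": 4.7,
    "Program C": 1.3,
    "Program D": 9.8,  # stand-in for a highly productive program
}

def percentile_rank(metric, program):
    """Percent of peer programs whose value falls at or below this program's value."""
    target = metric[program]
    at_or_below = sum(1 for value in metric.values() if value <= target)
    return 100.0 * at_or_below / len(metric)

print(percentile_rank(pubs_per_graduate, "Program D"))  # 100.0 in this toy example
```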