From the June ACP Internist, copyright © 2013 by the American College of Physicians
By Molly Cooke, MD, FACP
Internal medicine residency programs will face a big transition in July when the Accreditation Council for Graduate Medical Education (ACGME) implements its “new accreditation system,” or NAS.
NAS is the most recent of a series of changes that have roiled residency training over the past decade, beginning with duty-hour reduction in July 2003. Across all specialties, these regulations limited the on-duty time of residents to 80 hours a week averaged over a four-week period and restricted continuous in-hospital service to not more than 30 hours per shift.
A flowchart illustrates the complex matrix of organizations and relationships within academic medicine. AAMC=Association of American Medical Colleges; ABMS=American Board of Medical Specialties; ACCME=Accreditation Council for Continuing Medical Education; ACGME=Accreditation Council for Graduate Medical Education; AHA=American Heart Association; AHME=Association for Hospital Medical Education; AMA=American Medical Association; CME=continuing medical education; CMSS=Council of Medical Specialty Societies; FSMB=Federation of State Medical Boards; LCME=Liaison Committee on Medical Education; MCAT=Medical College Admission Test; MOC=Maintenance of Certification; NBME=National Board of Medical Examiners; NCCA=National Commission for Certifying Agencies; TJC=The Joint Commission; USMLE=United States Medical Licensing Examination. Graphic provided courtesy of the AAMC, original concept by M. Brownell Anderson
Residency programs were further challenged when, in 2008, the Institute of Medicine, in its publication “Resident Duty Hours: Enhancing Sleep, Supervision and Safety,” argued that restrictions on resident work hours had not gone nearly far enough. In 2011, interns’ duty hours were further reduced to no more than 16 hours per shift.
Then, in June 2010, the Medicare Payment Advisory Commission (MedPAC), a nonpartisan group that advises the Centers for Medicare and Medicaid Services, issued a report concluding that $3.5 billion, or 54%, of the $6.5 billion in indirect medical education funding that Congress spends annually to assist hospitals and medical centers with the cost of training residents could not, in fact, be traced to educational efforts or activities.
Consequently, MedPAC recommended that this money be held back and used to develop incentive programs rewarding “performance-based GME.” MedPAC’s recommendation caught the attention of policymakers. In the face of pressure to reduce the deficit, a number of budget proposals have included a reduction in GME funding.
Residency programs in all specialties have been under unprecedented scrutiny, while the hard-working program directors who run them have had to deal with challenging mandates regarding the deployment of the housestaff they supervise, all at a time of flat, if not declining, support for educational programs.
As they always do, program directors have labored to keep their residents cheerful and satisfied with their educational experience and their department chairs happy with the quality of the residents recruited, while at the same time guiding residents’ career choices at the completion of the program and seeing that clinical services operate smoothly. They have organized complex resident schedules, dealt with the personal and professional difficulties some trainees inevitably experience, and ensured that their residents’ training was appropriately preparing them as skilled and humane physicians.
The NAS, then, is only the latest in the series of changes that residency programs have had to accommodate. Traditionally, residency program accreditation has been conducted on a cycle, with strong programs awarded the longest accreditation cycle of five years. The key feature of the NAS is a shift, similar to that undertaken by The Joint Commission, to a more continuous verification of a sustained high level of performance.
Under the NAS, the periodic compilation of data in the form of a self-study called the Program Information Form and the subsequent high-stakes accreditation site visit—hallmarks of the old accreditation system—will be replaced by twice-yearly submission of data documenting resident achievement against 22 milestones. This first year, only one cycle of documentation will be required.
The ACGME’s purpose in making these changes is to promote education and foster innovation in the learning environment; to increase the focus of the accreditation process on the outcomes of residency education (i.e., what the residents learned to do, rather than where they rotated and how long they spent there); to improve collaboration and communication across the full range of stakeholders, including the public; and to increase efficiency and decrease the burden associated with the accreditation process.
Worthy goals all, but will the NAS accomplish them?
Frankly, we are in dire need of radical innovation in GME, the kind that Clay Christensen, coauthor of the Harvard Business Review article “Disruptive Technologies: Catching the Wave” and the book “The Innovator’s Prescription: A Disruptive Solution for Health Care,” termed “disruptive innovation.”
We continue to work primarily in a “front-loaded” model, emphasizing what residents learn during GME far more than the more critical abilities to use clinical experiences to identify gaps in knowledge and skills and to keep up with medical advances over a lifetime in practice. Because of this front-loading and the ever-increasing knowledge base of medical practice, post-MD training has become longer and longer, delaying the entry of young physicians into practice.
Despite the best efforts of educational leaders and the program directors who work with them, today’s residency programs continue in the format that developed in the 1930s and was essentially locked into place when Medicare was enacted in 1965, with its provision for indirect medical education and direct medical education funding for hospitals.
Regulation of medical education has become intense. In many ways, the role of program directors, chosen for their exemplary clinical skills and their high standing among, and unusual sympathy for, their residents, has devolved to that of a compliance officer. Program directors are answerable to their residents; their department chairs; the services that have come to depend on resident labor; their program’s designated institutional official, who is the official link between each medical center’s residency programs and the ACGME; and their residency review committee. While the American Board of Internal Medicine certifies residents, rather than accrediting training programs, residents must complete approved programs to sit for board examinations. All in all, residency program directors serve an intimidating number of masters.
While I believe that the ACGME’s NAS was entirely well motivated, it remains to be seen whether it mitigates or worsens the problems it was intended to address.
Perhaps it is simply a case of the devil one knows being preferable to a new devil, but internal medicine program directors are expressing considerable anxiety about the upcoming “go-live” date in July, and particularly the workload associated with the documentation of each resident’s status twice yearly. Frequently expressed concerns include the faculty development needed to ready clinical supervisors to assess residents’ performance against internal medicine’s milestones and the time required to document trainee achievement on a twice-yearly basis.
We need innovation. Let’s hope that the NAS moves us in the right direction.