How to select surgical residents: The evidence

Source: Skeptical Scalpel

On Twitter a while ago, a medical student asked me how surgical program directors select new residents. Then a discussion arose among some academic surgeons on the same topic. Someone suggested that medical school grades were the best way to tell whether an applicant would be a successful resident.

The fact is that we aren’t really sure what the best way to choose residents is.
First, here’s what we really do.
A 2011 paper from the Journal of Surgical Education reported on a survey of general surgery program directors, associate program directors, and department chairs, with 262 (65%) responding. http://www.ncbi.nlm.nih.gov/pubmed/21292219
USMLE Step 1, used by 37% of programs, was the most common applicant screening criterion, with USMLE Step 2 second at 24% and graduation from an LCME-accredited US medical school third at 15%. The least important criteria were previous research experience and publications.
Final selection criteria were assessed using a Likert scale. The number one factor was the interview, followed by the USMLE Step 1 score, letters of recommendation, and the USMLE Step 2 score. The least important factor by far was whether an applicant had done a preliminary year. Research, publications, and a previous rotation at the institution also ranked near the bottom. Class ranking, the dean’s letter, and, surprisingly, Alpha Omega Alpha status were in the middle.
Responses came from university programs (49%), university-affiliated hospital programs (38%), and independent community hospital programs (13%). The average number of applicants per program was 571.
The problem is that proof of the value of these selection methods is lacking. A 2011 paper from Academic Medicine reviewed nine studies of USMLE scores and resident performance and found no correlation between those scores and the acquisition of clinical skills by students, residents, or fellows. http://www.ncbi.nlm.nih.gov/pubmed/21099388
A 2013 meta-analysis of 80 studies and over 41,000 participants from the journal Medical Education found that USMLE Step 1 scores and medical school grades were associated with better resident performance. However, if you eliminate the studies showing only that better USMLE scores led to better scores on in-training exams and to passing licensing tests, just two studies found that USMLE scores correlated well with subjective ratings of residents.
What about grades?
The authors pointed out that “These data could potentially be more useful to program directors if grading systems across medical schools were standardized.” I said the same thing in a previous post.
A study of 348 categorical general surgery residents at six West Coast residency programs looked at resident attrition and the need for remediation. The need for remediation was associated with receiving a grade of honors in the medical school surgery clerkship and with slightly, but statistically significantly, lower USMLE Step 1 scores. For example, PGY-1 residents needing remediation averaged 225 on USMLE Step 1 vs. 232 for those who did not.
A major issue is that we don’t have good data on the clinical performance of surgical residents or of training program graduates. I know from personal experience that a good USMLE score or a high score on the surgery in-training exam had a “halo effect” when it came time for faculty to evaluate a resident’s overall performance.
According to the Wall Street Journal, some businesses are asking job applicants of all ages “to provide SAT or ACT scores, results from graduate-school entrance tests and grade-point averages along with their work history.”
Google, a successful company by any measure, does not care much about grades. Here’s what Laszlo Bock, senior vice president of people operations, had to say in the NY Times about employee performance: “One of the things we’ve seen from all our data crunching is that G.P.A.’s are worthless as a criteria [sic] for hiring, and test scores are worthless — no correlation at all except for brand-new college grads, where there’s a slight correlation. Google famously used to ask everyone for a transcript and G.P.A.’s and test scores, but we don’t anymore, unless you’re just a few years out of school. We found that they don’t predict anything.”
But the relationship between college grades and performance at Google isn’t the same as the relationship between med school grades and performance as a surgeon. Or maybe it is.
I suppose one could argue that applicants for residencies, who are recent graduates of medical schools, would fall into Google’s “slight correlation” category.
I know one thing—I don’t know how to select applicants who will become good surgeons. Do you?
