
Medical schools get an “F” at grading graduates, study suggests

Performance evaluations, an important piece of the medical residency application packet, are often incomprehensible, sometimes useless and, at worst, misleading and unfair, according to a recent Stanford study published in Academic Medicine.

The study, which examined performance evaluations — commonly referred to as the “Dean’s letter” — from 131 medical schools across the nation, found that about half don't follow recommended guidelines set by the Association of American Medical Colleges in 2002.

"This has real consequences as it leaves residency programs in the dark about how well an applicant performed," says Ronald Witteles, MD, senior author of the study and director of the internal medical residency program at Stanford. "Some of the examples are actually rather humorous, such as one school having 33 percent of its students in the 'top quartile' and only 8 percent in the 'bottom quartile.' "

AAMC guidelines recommend that medical schools include "easily interpretable comparative data on core clerkship performance and overall academic performance," the study states.

To quantify whether the medical schools in the study achieved this goal, researchers examined the grading and ranking systems used, if any. Among the results, they found that 14 of the schools didn’t use any ranking system at all. Among the 83 medical schools that did assign key words to rank students, there was "tremendous variability" in the terms used — a total of 72 — making it extremely difficult to compare students across institutions.

Adding to the confusion, those 83 medical schools used 27 different words and phrases to describe "top tier" students, such as: exemplary, superior, distinguished, outstanding, exceptional, most outstanding, recommended highly, recommended with distinction, extraordinary and enthusiastically recommended. The meanings of the words varied from institution to institution, Witteles says, and were often difficult to interpret.

"For those schools that gave a keyword, the percentages of graduates who got the top key word were all over the place, ranging from 1 percent of the graduates to 60 percent getting the top key word. Thus, getting the top keyword can mean a whole lot if you are at the school where that word or phrase means you are ranked in the top 1 percent, or it can mean very little if it means that you are ranked in the top 60 percent,” Witteles says.

As director of a residency program for the past seven years, Witteles has experienced firsthand how difficult it can be to sort through hundreds, sometimes thousands of applications when the performance evaluations don’t provide helpful information, he says.

"By providing confusing or meaningless data it forces program directors to rely on something else to pick the best applicants to interview," Witteles said. "Either we rely on test scores, or we have to make very arbitrary decisions or call colleagues at other institutions asking about applicants. It is not a fair way to review applicants."

Image courtesy of Academic Medicine
