Thousands of medical students across the country are demanding the elimination of a licensing exam that tests their patient care skills, arguing that it’s an expensive waste of time.

Led by students from Harvard Medical School, the effort has drawn strong support nationwide. More than 6,000 students, residents, and faculty at 130 medical schools have signed a petition calling for the National Board of Medical Examiners to abolish the test.

The target of their anger is the Step 2 CS, a role-playing test that asks students to examine and diagnose patients — or actors pretending to have specific conditions — and fill out their medical charts. It’s meant to test their ability to interview patients, do physical exams, and explain their findings.


The exam is only offered in five cities: Atlanta, Chicago, Houston, Los Angeles, and Philadelphia. That means many students need to spend hundreds of dollars on flights and hotel accommodations. The registration fee for the exam itself is $1,275.

“We all believe there needs to be accountability for competencies among medical students,” said Lydia Flier, one of the Harvard students organizing the petition, “but we do not think that [this exam] is a cost-effective, fair, or reasonable way to do that.”


National Board statistics show that 96 percent of students trained at US medical schools pass Step 2 CS on their first try. The student activists said such data prove the exam isn’t effective at weeding out incompetent physicians.

Dr. Peter Katsufrakis, the senior vice president of the National Board of Medical Examiners, agreed that the exam isn’t difficult, but pointed out that 871 students did fail it in the 2013-14 academic year. Besides, he said, most medical school faculty don’t have time to observe third- and fourth-year students doing a complete physical exam, so it’s important to test those skills as part of the licensing process.

“It’s really just a part of what we do to become physicians and to demonstrate to the public that we have earned their trust — that they can put their faith in us and feel comfortable with it,” Katsufrakis said.

The exam also serves as a sort of quality assurance test for medical schools, to make sure they’re teaching patient care skills, said Dr. Lia S. Logio, president of the Association of Program Directors in Internal Medicine. “I think everyone coming to my residency program should pass it, and pass it on the first attempt,” Logio said.

But many students believe that the exam’s staged patient interactions do not reflect the realities of actually seeing, treating, and learning from patients in the hospital or clinic.

“Is it worth having tens of thousands of people every year spend $2,000 to go prove that they can remember to wash their hands, introduce themselves, and ask, ‘Do you have any questions?’ at the end of an interview?” said Isaac Jaben, a third-year medical student at Tulane who will take the exam in June.

Adam Buckholz, a fourth-year medical student at the University of Virginia, said he was frustrated that he had to spend more than $1,600 to get to a testing center and take the exam. He spent barely 10 minutes studying — and still passed, he said, which made the whole thing feel like a worthless exercise. Medical licensing officials “need to be honest with students as to why they are doing what they are doing and to make it worth their time and money,” he said.

The American Medical Student Association, which has about 43,000 members, hasn’t taken a position on the clinical skills exam but does support increasing the number of testing sites and improving the quality of feedback that students get after taking the exam, said Dr. Deborah Hall, the association’s president.

The National Board of Medical Examiners doesn’t give feedback to test takers — in part because that would be expensive and in part because it would make it too easy for students to cheat by sharing their feedback forms, which would likely contain hints about the specific scenarios being tested, Katsufrakis said.

The board is, however, analyzing the exam to identify which sections routinely trip up students, so medical schools can improve their teaching of those topics.