by Lisa Sanders
But here’s the thing: not all abnormal heart sounds are important. Up to 50 percent of people who have a heart murmur—the most common abnormal heart sound—have completely normal hearts. These patients don’t need additional testing. What we really need are doctors who can reliably distinguish between those who need more testing and those for whom further testing is simply a waste of time and money. How well do we do here, where it really counts? Can we distinguish between those murmurs that need further evaluation and those that are benign or innocent? Cardiologists can. In a study done by Christine Attenhofer of the University Hospital in Zurich, cardiologists correctly identified ninety-eight out of one hundred pathologic heart sounds. Can primary care docs match that? Somewhat surprisingly, there’s very little research addressing this important question. One study of emergency room physicians suggests that they can—though not as well as the subspecialists. In this study, two hundred patients with heart murmurs were evaluated by ER physicians. Each physician took a history, examined the patient, and got a chest X-ray and an EKG, then documented—in writing—whether the patient needed further evaluation or had an innocent murmur. After this evaluation all patients had echocardiograms. Of the two hundred patients, 65 percent had normal echocardiograms and thus innocent murmurs. The ER doctors were able to identify those who didn’t need additional studies nine times out of ten, erring mostly in sending too many patients with a normal heart for further evaluation. But they missed fourteen of the patients who had abnormal hearts.
Can we get better? Several studies have evaluated programs designed to better teach the cardiac exam. Not surprisingly, all showed that if you teach these doctors-in-training, they will learn. One course used recorded sounds that participants were required to listen to five hundred times. Their test scores increased more than fourfold—from a downright pathetic 20 percent correct to a respectable 85 percent correct. Other studies had students examine actual patients who had a variety of heart murmurs. These doctors doubled their test scores. So it is a skill that can be learned. We have the tools we need to bring back a reasonable, workable version of the heart exam. The question is, will we do it?
Carol Pfeiffer is a tall, slender brunette with a husky voice and a warm smile. She is sitting at the head of a table in a small conference room crammed with a half dozen second-year medical students dressed in their short white coats. A few of the students sit; the others move restlessly around the room. They chat nervously as they wait. Tension fills the air like a bad smell. The students are there to take their end-of-the-year final but there are no blue books, no number 2 pencils, no desks. This exam consists of a half dozen simulated patient encounters.
The patients these students will be seeing are actually actors who have been trained to depict one or more of the 320 medical conditions on which the students will be tested. Carol is the head of the Medical Skills Assessment Program at the University of Connecticut. She explains the test to the anxious students, even though these guys are old hands at this—they took a similar test at the end of their first year and have learned from these patient-instructors throughout their first two years.
The test is set up to simulate an outpatient doctor’s practice. The students will visit the six rooms in the order given on each one’s schedule. Outside the door there is a little card listing the patient’s chief complaint. When the bell rings the students will enter the rooms and begin collecting the essential information on each patient. They will get the patient’s history, perform a physical exam, explain to the patient what they think is going on. Once they leave the room they will write a brief medical note on the patient.
The rooms are equipped with the usual doctor’s office stuff—a small table with a couple of chairs, an exam table, a blood pressure cuff, and thermometer—plus some equipment not usually found in an office—a small camera and a microphone. The entire encounter will be videotaped and the students and their teacher will review it after the test. After reminding the students about how the test works, Carol asks for questions. When there are none she sends them to the corridor around the corner, to find the room with their first patient.
I follow Pfeiffer into what looks like the control room of a TV studio. It’s dominated by a wall of small black-and-white monitors. I don a set of headphones and plug in to watch one of the encounters. Most of the scenarios require the student to recognize a common illness and recommend the appropriate study or treatment. In one room there’s a young man complaining of shortness of breath—his history reveals that he has had an accidental exposure at work to toxic chemicals. Diagnosis: asthma due to occupational exposure. In another room a fifty-something-year-old man complains of chest pain with any exertion for the past day. Diagnosis: likely unstable angina. Some need a diagnosis and counseling: a worried mom brings in her daughter, who has a cold and ear pain. She wants antibiotics for her little girl. The student’s job is to explain why antibiotics are not appropriate. A young woman complaining of trouble sleeping is found to have a pattern of binge drinking, putting her at risk for alcohol-related disease and disability. The student’s job in this case is to counsel the woman about the risks from her behavior.
After checking in on a few of the rooms, I settle in to watch a young man who is speaking with a heavyset patient with graying hair. The student introduces himself and washes his hands as he’s been taught. He sits and asks the man what brought him in. It’s his stomach, the man tells Chris, the young doctor-to-be. Every now and then he gets this pain that comes on an hour or so after he eats. It doesn’t happen all the time but a couple of nights before it woke him up from sleep and he almost went to the emergency room but decided to come in to get it checked out instead. The pain was severe and constant, lasting several hours. That time he thought he had a fever as well. Sometimes he has diarrhea when he has the pain.
As the student asks questions, more details come out. He sometimes takes an antacid for the pain but it doesn’t seem to do any good. The pain seems more common after a meal of fatty foods. The other night he’d had fried chicken. The pain seems to be mostly on his right side and doesn’t worsen when he lies down; he’s never noticed black or tarry stools, which would suggest a bleeding ulcer. The student gets the rest of the patient’s history. He has high blood pressure and takes two medications for that; he’s married, works in an office, doesn’t drink or smoke. He’s been on a health kick lately and lost twenty pounds over the past couple of months. The fried chicken was a little treat to celebrate his success.
Now it’s time for the exam. The student, a beefy young man with light brown hair and an open pleasant face, asks the man to move to the exam table. The exam is perfectly normal until he gets to the abdomen. Chris presses gingerly on the right side, just below the rib cage. The man grunts in (mock) pain. He asks the patient to take a deep breath and as he’s inhaling the student pushes briskly in the same area. The man grunts again. Chris tells the middle-aged man that he thinks maybe he has a gallstone and that the pain is caused when the stone blocks the duct leading out of the gallbladder. He’ll need to get some tests before he can confirm that diagnosis, he concludes somewhat vaguely. The student shakes the man’s hand again and steps out of the room.
I watch on the monitor as the “patient” opens a drawer and removes a form and a pen. He quickly moves through the yes/no answers by which he evaluates the student. Yes he introduced himself, and yes he washed his hands. No he didn’t always use simple language. Yes he examined the abdomen. Yes he listened for the presence of bowel sounds and pressed on the right upper quadrant.
Suddenly there’s another knock on the door and Chris walks back into the room. I forgot to do a rectal, he tells the surprised patient. Invasive exams such as this are not actually performed in these tests. Instead the student tells the patient he would like to do one and the patient gives him a card with the results of the exam written on it. But not this time. “It’s too late for you to ask for that,” the patient tells him. “You’re out of here.”
After Chris finishes up his note, he returns once more to the patient’s room. The patient reviews how the student did in the encounter. He notes that Chris opened the encounter well but stumbled as he was asking questions about the pain. “Don’t worry about making sure you ask every single question on the list,” he tells the student. “You know this material. Let your instincts tell you where to go with your questions.” And another point. “Be sensitive to the patient. Once you have figured out where the pain is, don’t keep pressing on the spot.”
After the test I sought out Chris as he was collecting his backpack from the conference room. The room was filled again but the difference was immediately apparent. The med students were laughing and talking about the mistakes they made. There was the giddiness of pressure relieved. “The hardest thing is that you can’t write anything down while you’re in with the patient,” Chris tells me. “You have to hold it all in your head. You know I kind of dread these exams but we all know we need it.” He’s planning to go into surgery, but, he quickly adds, that doesn’t mean he doesn’t need to know how to do all this. “Surgeons see patients at the office too.”
Certainly there is some pretty good evidence that these skills will come in handy no matter what area of patient care a doctor goes into. But these students will need to know the clinical exam well before they go into whatever specialty they have planned. At the end of their four years of medical school each of these students will be tested on these very same skills in the very same way.
Since 2004, all medical students have been required to pass an exam that tests their clinical skills: their ability to take a history, perform an appropriate physical exam, and collect the data needed to diagnose and treat a patient. The United States Medical Licensing Examination—known as the USMLE—is the test physicians must pass to get licensed in most states. When I took the exam it was made up of just two parts. The first, given at the end of my second year of med school, tested knowledge of the basic sciences of medicine—anatomy, physiology, pharmacology, genetics. The second part of the test was given after graduation and focused on the understanding of basic patient care concepts—could I interpret the patient data that was provided? Was I able to formulate an appropriate differential diagnosis? What studies should be ordered based on what was known? Which medicines would be appropriate in the given setting? Which would be dangerous and must be avoided? Students must still prove their mastery of the book knowledge of medicine, but now they must demonstrate their skill with patients as well.
In adding this component to competency testing, the USMLE is hearkening back to an older model. As early as 1916 the licensing exam included an evaluation of a real patient, observed by an experienced physician-grader. After taking a history and performing a physical exam, the students were questioned about what they found. This component was dropped in 1964 because of the lack of standardization intrinsic to this kind of test.
But twenty years later the licensing board was asked to design a new test of these skills that would be reliable. The National Board of Medical Examiners, which oversees the USMLE, spent another twenty years trying to develop a system for testing these skills that was fair and reproducible. The medical school class of 2005 was the first to have to jump through this additional hoop.
Medical schools didn’t exactly embrace this new test with open arms. The American Medical Association (AMA) was against it. So was its student branch, as well as the student arm of the American Academy of Family Physicians. Opponents argued that most medical students already learn this stuff and most institutions already test it, so what’s the point of repeating the testing? To the students it seemed like just one more expensive test—they have to pay to travel to one of a dozen centers across the country, and the test itself costs over $1,000. But ultimately everyone takes it because that’s what you need to do to become a doctor.
Has it done any good? It’s still too early to tell if the test has made any real difference in what doctors do, but if my own institution is any example, it’s having a tremendous impact on how doctors are trained—at least in medical school.
Eric Holmboe now heads the department that evaluates medical residents at the American Board of Internal Medicine (ABIM), the organization that certifies doctors specializing in internal medicine. Until 2004 he was associate program director of the Primary Care Internal Medicine Residency Program at Yale. (That’s when he saw my patient Susan Sukhoo.) At a recent meeting of directors of clinical teaching from medical schools in the Northeast, Holmboe described Yale’s preparation for the clinical skills exam part of the USMLE. The faculty had arranged for all of the fourth-year medical students to go to the University of Connecticut in Farmington, where they could take the kind of test that Chris took as preparation for the real thing.
Before the test several of the Yale faculty traveled to northwest Connecticut to check out the facilities and the test. They chose seven clinical scenarios, giving them a few tweaks until everybody was comfortable with the setup. And students from Yale traveled up in groups of six to take the test over the course of several weeks.
When the scores came back, the faculty was shocked. Twenty percent of these fourth-year Yale medical students—seventeen out of eighty-five test takers—had flunked the test. Eric described the reaction when he presented the scores to the faculty. “It was god-awful—the grief reaction in spades,” Eric told me. “Kübler-Ross was hovering over the room,” referring to the psychiatrist’s famous stages of grief. “It was anger, denial, and bargaining all rolled up in one.” There were concerns about the test—even though they had signed off on it before the students had gone up—and there was plenty of skepticism—this could not represent the real performance of fourth-year Yale medical students. But amid grumbling and skepticism, everyone agreed to view the tapes of the students who failed.
When they met again, four weeks later, attitudes had changed. “The anger and denial had evolved into deep, deep depression,” Eric reported. In one tape, a Yale medical student who was planning to go into neurology completely botched the cardiac exam. He was listening for heart sounds in all the wrong places. When he was given this feedback by the patient-instructor, the student’s response was breathtaking in its arrogance and ignorance: he didn’t need to know the heart exam—he was going into neurology. Stroke, the most common neurological disease, is often caused by problems originating in the heart. “When he said that,” continued Eric, “it pretty much cinched the deal and suddenly it was Houston, we’ve got a problem.”
In response, Yale revamped the way the physical exam was taught. When I was a student, the physical exam was taught at the end of the second year, just before we began our clinical clerkships that took us into the hospital wards. It was a twelve-week course with lectures a couple of times a week. During the lecture the physiology of the organ system was briefly reviewed and the exam technique was explained and sometimes (but not usually) demonstrated. Essentially I learned about the physical exam the way I learned about sex and menstruation—I got a brief, very nonspecific chat and a book. And did I have any questions? No. Great. The end. All the real info I was left to gather on my own. I figured it out at puberty and I figured it out again in medical school. So I spent hours roaming the halls of the hospital looking for medical students already doing their clerkships to ask them to show me interesting physical exam findings. Like everyone I knew, I learned what I knew about the physical exam on my own, with a patient, a book, and the help and “wisdom” of a student just one or two years ahead of me.
Now Yale begins teaching their medical students from day one. In the very first year there are classes on the techniques of interviewing and examination. Students meet in small groups weekly to review and practice these techniques for the first two years of school—first on each other, then on patients in offices and in the hospital. By the time medical students enter the hospital in their third year, they have the basics of these key data-collecting tools down. They are ready to build on a sound foundation. Unfortunately, there is frequently no one there to help them start construction.
I graduated from medical school with a set of physical exam skills that was spotty and idiosyncratic, and might have been considered unacceptable—had the doctors I then worked with ever observed me. I wasn’t worried, though. I figured I’d learn the proper way to examine a patient when I was a resident. I was wrong. Studies show that by the end of residency training a physician’s exam skills may be no better than they were in medical school.
Some of this is undoubtedly due to the time and access constraints already discussed. But some of this is due to an underlying attitude that the physical exam is already history. I accompanied Holmboe to a meeting with several directors from medical school and residency programs to discuss a new ABIM initiative to shore up the clinical skills of doctors in training. At this meeting Dr. Raquel Buranosky from the University of Pittsburgh voiced a common complaint. “Med students in our program get hours and hours of training in the physical exam in their first and second years. They do great at our final exam. Then they go into their clinical clerkships and, poof, it’s gone.” There was general head nodding around the room and many of the directors told similar stories. Eric added one of his own. A colleague had worked with a medical student several times and been happy with his skills. Several weeks into the student’s first clinical rotation—an internal medicine clerkship—the young student returned to have one last class with his teacher. The teacher watched him evaluate a patient and was horrified to see the student do absolutely everything wrong. He interrupted the patient’s story, he asked closed-ended questions, he examined the patient through his clothes. He skipped much of the exam. The teacher couldn’t believe it. He asked the student what had happened since they last met. Oh, replied the student, “my resident says we don’t have time to do all that. I mean, what’s the point?”