Canada’s Clinical Skills Experience

Journal of Medical Regulation, September 2002, 88(3), 106–107; DOI: https://doi.org/10.30770/2572-1852-88.3.106

I read with interest the articles by Peter Scoles of the NBME and Gerry Whalen of the ECFMG regarding the plan to introduce a clinical skills assessment component to the USMLE in 2004. The authors and their respective teams have a great deal of experience in this type of assessment since the NBME has been carefully planning and piloting this methodology for a decade or more and the ECFMG introduced such a process into their current international graduates’ certification process some four years ago. So there is no doubt that they are well prepared to undertake this next step.

Since they do make reference to the introduction of the objective structured clinical examination (OSCE) using standardized patients into the Qualifying Examinations (licensure-related examinations) of the Medical Council of Canada (MCC), I thought that a few observations about the MCC experience with the OSCE format over the last 10 years would be of interest to your readers.

The re-introduction of a clinical examination in Canada in 1992 was not an uneventful process. It was the first time a patient-based clinical examination had been part of licensure examinations in Canada since the 1960s when, like the NBME, the MCC stopped using the clinical oral because of its inherently poor reliability. But during the period from the late 1960s to the late 1980s, the use of in-training ratings had proved unsatisfactory, again owing to poor reliability and to doubts about their validity. In addition, the licensure bodies in Canada had witnessed a worrisome increase in complaints from patients and other physicians about poor clinical skills and, in particular, poor communication skills amongst the practicing profession.

In 1987, through the Federation of Medical Licensing Authorities of Canada, the MCC was asked to develop a method of assessing these “basic skills” amongst all physicians prior to entry into practice. After a task force report and a three-year development process, greatly encouraged by the success of the Quebec College of Physicians’ use of the OSCE format in its assessment process for family physicians, a 20-station OSCE, using standardized patients and operated at multiple sites, was introduced into the MCC qualification process in the fall of 1992. It was widely opposed, especially by certain groups within the medical community, some of them from the very disciplines documented to be most at risk for complaints to the licensure bodies. Since the MCC OSCE took place after the first 12 months of post-graduate training, the post-graduate training directors were most unhappy, as they saw it as intrusive in the clinical training process. But persist we did. Now, 10 years later, the MCC still receives one or two complaints per year from some specialty groups, but these typically come from a new training director who does not grasp why the test was introduced in the first place: to promote and assure better basic clinical skills and communication skills across all of our practicing community. It is NOT designed to assess primary care skills or, as we occasionally hear, to see if a psychiatrist can deliver a baby!

Is there evidence that we were successful? Complaints about clinical and communication skills are down in Canada. It would be nice to claim total responsibility, but that change is likely multi-factorial. Our follow-up studies indicate that we are clearly measuring a different set of skills than the clinical reasoning and multiple-choice components of Part 1 of the MCC Qualifying Examination. We clearly identify candidates who pass the traditional portion of the process but who cannot pass the OSCE portion. That group is typically in the 7.5 to 8.5 percent range for all takers and, in particular, we identify about 2% of individuals who cannot pass the OSCE component on repeated attempts, despite passing the traditional components and completing 12 months of post-graduate clinical training.

But perhaps most pleasing has been the response of the faculties of medicine. Unlike in 1992, all faculties in Canada now have clinical skills assessment programs, almost all using standardized patients as part of their in-training assessment processes, especially at the undergraduate level. While this may be due in part to the MCC’s role in partnering with the faculties to establish MCC assessment sites at every faculty, something more than the fruits of partnership is at work. The faculties realized that they had to do a better job, especially when failure rates in the first year or two after the OSCE component was introduced ran as much as 50–60% higher than those typically seen amongst first-time Canadian graduates on the traditional pencil-and-paper parts of the MCC process. Not long ago, a recently retired Dean approached me at a medical dinner honoring a distinguished colleague and said that the MCC’s insistence on specifically promoting the importance of assessing these fundamental skills had changed their approach to clinical assessment. They then realized that they had to invest in clinical skills assessment if their graduates were to meet the public need. And that they did.

I want to take this opportunity to wish the USMLE team well and we look forward to collaborating with them on the development and improvement of methods of clinical assessment to ensure that all physicians are solidly prepared in the basic skills expected of the good clinician.
