Standardized Assessment of Pharmacists' Patient Care Competencies:

Journal of Medical Regulation, January 2016, 102(4): 17–27. DOI: https://doi.org/10.30770/2572-1852-102.4.17

Abstract

Assessing the ongoing competence of practicing health care professionals requires regulators to balance complex demands of governments and the public, as well as interests and concerns of practitioners. A proliferation of models has evolved across professions and jurisdictions. In this article, we report on a model utilizing standardized assessment using best-practice measurement techniques and methods for evaluation of ongoing (i.e., post-registration) clinical competencies in the profession of pharmacy in Ontario, Canada. This model involves categorization of the profession into an active patient-facing and non patient-facing register, implementation of a learning portfolio requirement to replace mandatory continuing education credit accumulation, and the use of standardized assessment techniques, such as a multiple-choice test of clinical knowledge and an objective structured clinical examination (OSCE) of clinical reasoning and interpersonal skills. Lessons learned from the development, implementation and retrospective analysis of almost two decades of data from this program can provide regulators in diverse professions and different jurisdictions with tools for standardized assessment of patient care competencies.

Background

The profession of pharmacy in Ontario, Canada, is regulated by the Ontario College of Pharmacists (OCP). Pharmacists in Ontario are the third largest group of regulated health professionals in the province (after nurses and physicians); as outlined in Table 1, in 2015, there were approximately 15,000 registered pharmacists serving a population of approximately 12 million people.1 In the mid-1990s, in accordance with sweeping regulatory changes affecting all regulated health professions in the province, OCP introduced a novel model for assessing ongoing competency of practicing pharmacists. Though the context of this model is specific to one profession in one jurisdiction, the findings and lessons learned through the development and implementation process are of general relevance to many health professions in different jurisdictions. A unique feature of this model was its application to all pharmacists in the province, regardless of specialization, practice setting or professional context.

Table 1

Demographic Profile of the Pharmacy Profession in Ontario, Canada (2015)

Historically, pharmacy (like many other professions) had relied upon a self-directed and self-reported continuing competency system focused on collection of mandatory continuing education (CE) hours. As increasing evidence pointed to a negligible association between compulsory CE hours and practice improvement or change,2,3 the regulator considered this historical model unsustainable, particularly in the context of the new legislative framework within the province, requiring regulators to implement and publicly report on quality assurance programs specifically designed to measure and ensure ongoing competency of practitioners.4

For all regulators, the challenge lay in implementing a quality assurance (QA) program that met legislative expectations for a fair, transparent, and robust mechanism for assessing practitioners' competencies that would also be supported by practitioners themselves.5 OCP anticipated that a summative assessment process could give rise to resistance from the profession, most of whom had not undergone any type of formal practice-based assessment since they had graduated from university programs and had passed initial entry-to-practice licensing examinations. As a regulator, this presented both a challenge and an opportunity to innovate in the area of regulator-led competency assessment, providing opportunities to share experiences and practices with other professions and regulators.

The Peer Review Model of Quality Assurance

OCP undertook an extensive consultation process with the profession in the mid-1990s to try to identify and address concerns, and to determine optimal ways of balancing these with the legislative requirements. This consultative process took various formats, including province-wide meetings with pharmacists, consultations with psychometricians and other experts, and reviews of best and promising practices from other professions nationally and internationally. Several key program design principles were identified as being crucial for successful implementation:

  • 1. Gaining buy-in from the profession. Regardless of legislative mandates and requirements, it was clear from the outset that the profession itself would need to accept any changes to the existing processes. To achieve this buy-in, it would be necessary to address both the rational and emotional concerns of pharmacists, and to ensure that the profession itself was involved in every step of program development.

  • 2. Grounding the consultation and program development process in “real-world” professional practice. From the outset, pharmacists involved in consultation revealed concerns about an “ivory-tower” academic competency assessment exercise devised by experts that would not be aligned with their day-to-day clinical realities in practice.

  • 3. Using an evolutionary, not revolutionary, approach. Using an existing system of targeted pilot-testing — with abundant opportunities for pharmacists to provide input — was advantageous. As part of this evolutionary process, it became clear that one of the OCP's greatest assets was its structure as a self-regulating professional body, in which practitioners themselves could fully participate in discussion and program development.

  • 4. Embracing and communicating a message emphasizing the professional value of the formative (rather than the personal impact of the summative) assessment received during the process. Focusing on the theme that quality assurance is educational and not punitive helped to address the concerns of many pharmacists that they could potentially lose their registration or their ability to operate a pharmacy due to an unsuccessful assessment. Framing the competency assessment process as a win-win for pharmacists individually and for the profession collectively, rather than as a simple pass-fail evaluation, was essential.

Ultimately, a quality assurance model evolved, with the following core components:6

  • 1. Use of a two-part register. A two-part register was developed in which pharmacists would self-declare annually whether they were in Part A (“active patient-focused practice”) or Part B (“non patient-focused practice”). The nature of pharmacy practice itself means that some pharmacists are simply not involved in direct provision of care and services to patients: Some pharmacists run complex businesses, or manage diverse organizations, or work in research or drug information centers. For these individuals without a patient-facing practice, “competency” may have a very different meaning than for pharmacists who actually work on a daily basis with real patients. The two-part register meant that only pharmacists who were providing direct patient care for a minimum of 600 hours over a 3-year period were required to participate in the quality assurance program; the resources of the regulatory body would not be used to assess pharmacists in Part B, who were by definition not in patient-facing practice.

  • 2. No specific quality assurance assessments for pharmacists practicing in different specialties or sub-specialties. Within this model, and after considerable discussion and debate, the decision was made not to implement specific quality assurance assessments for pharmacists practicing in different specialties or sub-specialties (such as hospital practice or primary care) within the Part A register. This was based on the fact that in Ontario all pharmacists are registered in one “broad-based” class — there are no sub-specialty or setting-specific classes of registration. While pharmacists may be practicing in one setting today (e.g., a community-based drug store), their license permits unrestricted movement to any other setting (e.g., a tertiary care hospital).

  • 3. Introduction of a continuous professional development (CPD) framework to replace the historical mandatory continuing education credit requirement. The consultation process with pharmacists highlighted the ambivalence within the profession towards compulsory CE credits: While pharmacists appreciated the simplicity and clarity of this requirement, they concurred with the available evidence that the impact of such a requirement on practice change was negligible. Shifting towards a CPD approach, in which pharmacists self-assessed their learning needs and gaps, identified appropriate resources to address these gaps, undertook and self-assessed their learning and development, then implemented change in practice — all through a reflective process — was challenging. The CPD model was built upon a learning portfolio, in which all pharmacists (Part A or Part B) are required to document their learning activities in a retrievable format, and to submit this record to the regulator upon request. Maintenance of such a learning portfolio is a condition for annual renewal of a license to practice.

  • 4. Development of a peer-review competency assessment process. Perhaps the most innovative — and controversial — part of this evolution was the decision to include, as part of the competency assessment process, standardized practice-based assessments grounded in day-to-day practice and focused on published standards of practice. During the consultative process, consideration was given to assessing individual pharmacists' performance (rather than competence) through practice-site visits and on-site clinical audits, where pharmacists would demonstrate their competency (i.e., perform) in their own pharmacy with their own patients. While the contextual richness of this approach was positive, there were concerns about the significant random variations in pharmacy practice that could never be controlled or accounted for when in-practice evaluations were used as a foundation for competency assessment. Further, the variations between individual pharmacists' practices (such as prescription volumes and complexity of cases seen in a given day) were so large that any attempt to actually benchmark or compare pharmacists to each other or to a defined standard of practice would be very difficult. As a result, OCP adopted a model in which a common, single assessment for all candidates on a given day would be utilized in order to facilitate intra-professional comparisons and to ensure each individual practitioner had an opportunity to demonstrate competency as defined by the standards of practice. This standardized assessment consisted of two components:7

    • Open-book test. A case-based, open-book, multiple-choice test of clinical knowledge was used, in which the core fund of knowledge for a practicing pharmacist would be assessed using clinical cases to provide context. Since, in the real world, pharmacists always have recourse to drug information and other resources, the decision was made to allow pharmacists to use any resources they wished to complete this 54-question test. The test itself consisted of 18 cases, each of which led to three multiple-choice questions with four options each. To emphasize the peer-review nature of this test, practicing pharmacists—not academics or specialists—participate in the writing, reviewing, validation, and psychometric standard-setting of all cases and questions used in this process.

    • Multi-station clinical examination. A multi-station objective structured clinical examination (OSCE) was used, in which the problem-solving, interpersonal, communication, and clinical decision-making skills of pharmacists are evaluated using standardized patients and trained peer assessors applying both analytical checklists and global/holistic assessment. A total of six stations were used for this portion of the peer review. Similar to the open-book test, individual stations and analytical checklists were written, reviewed, and validated by practicing pharmacists, not by academics or experts. Cut scores for analytical checklists were established by panels of practicing pharmacists using a modified Angoff method.
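The standard-setting step described above can be illustrated with a minimal sketch. In an Angoff-style procedure, each panelist estimates, item by item, the probability that a borderline (minimally competent) candidate would perform the item correctly; the cut score is the mean of the panelists' summed estimates. The ratings below are hypothetical, and this is an illustration of the general technique, not OCP's actual data or procedure.

```python
# Rows: panelists; columns: checklist items (hypothetical probability
# estimates that a borderline candidate performs each item correctly).
ratings = [
    [0.7, 0.9, 0.6, 0.8],   # panelist 1
    [0.6, 0.8, 0.7, 0.9],   # panelist 2
    [0.8, 0.9, 0.5, 0.7],   # panelist 3
]

def angoff_cut_score(ratings):
    """Mean over panelists of the sum of their per-item estimates."""
    panelist_totals = [sum(row) for row in ratings]
    return sum(panelist_totals) / len(panelist_totals)

cut = angoff_cut_score(ratings)
print(round(cut, 2))  # raw cut score out of 4 checklist items -> 2.97
```

In a modified Angoff procedure, panelists typically also review item statistics and discuss discrepant estimates before the final round; the arithmetic of the cut score itself remains as sketched.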

Importantly, this regulator-led model focused on competence rather than performance, and emphasized standardized assessments using best-practice evaluation methods rather than non-standardized in-service/in-practice site visits or practice audits. The decision to utilize standardized assessments benchmarked to core competencies (rather than site- or context-specific assessments) was made in recognition of the generalist nature of the pharmacy license. Pharmacists can move between different practice contexts (e.g., from a community pharmacy/drug store to an ambulatory care clinic, a family-practice site, or a tertiary-care hospital) without any regulatory limitations or requirements. Consequently, a standardized assessment of the core competencies central to safe and effective practice in any setting was deemed essential for this process. The decision to focus on “competence” (i.e., measurement of core pharmacy practice knowledge and skills within a standardized and objective context) rather than “performance” (i.e., real-world demonstration of practice knowledge and skills in unique, subjective contexts that are non-standardized and highly situational) was made to emphasize both the fairness and transparency of the process to both pharmacists and members of the public.8 As this quality assurance function was considered integral to OCP's role of public protection, no separate cost accounting was undertaken regarding direct program costs to implement and maintain this program (i.e., costs were allocated across the entire regulatory body rather than to this specific activity). All pharmacists, as part of their annual renewal of registration with the regulatory body, pay an annual fee of approximately CDN$750. Individual pharmacists randomly selected to participate in this program do not pay any additional fees or charges, as participation is considered part of the regulator's mandate and costs are covered by the annual registration fees paid by all members of the profession.

Taken together, the multi-station clinical examination and open-book test provided the most robust, objective, and defensible form of clinical skills assessment for pharmacists available.8 Recognizing the concerns previously expressed by pharmacists that a standardized competency assessment, administered by a regulator, could give rise to disciplinary, fitness-to-practice, or other proceedings, considerable efforts were taken to provide reassurance to participants.5 In particular, a key provision of the legislative framework ensured confidentiality of all information gathered under the auspices of the quality assurance program. Strict rules were embedded within the regulations to prevent disclosure of information gathered through the QA process to any other areas within the regulatory body.9 This message was communicated frequently — the program was educational, not punitive.

Framing the overall competency assessment process — referred to as Practice Review — as educational and developmental was reinforced through inclusion of an additional component: a facilitated learning portfolio and sharing session in which pharmacists discussed and shared their continuous professional development needs, challenges, and successes with one another.10 The learning portfolio was initially introduced as a tool to support CPD — and in particular, self-reflection, self-assessment, identification of learning gaps and documentation of learning outcomes. Within the competency assessment process, however, the opportunity to use the learning portfolio as a tool for peer benchmarking and sharing of experiences through a facilitated discussion format became both apparent and uniquely valuable in situating the assessment in a constructive, educational context rather than a summative, evaluative one.10

It also prompted the regulatory body to engage in education/remediation for those practitioners who were not able to meet competency standards through the assessment process.11 Meeting or exceeding standards in the peer review process was viewed as demonstration of the success of the pharmacist's continuing professional development activities and approaches. Those pharmacists who successfully completed the assessments were listed as being in the “self-directed” continuing professional development group. Those pharmacists who did not meet expectations required additional support and education, and were identified to participate in “peer-directed” continuing education. Specifically, these pharmacists worked with peer practitioners and regulatory-body staff to develop a structured learning plan to address knowledge and skills deficits identified through the assessment process. To this end, OCP partnered with educational institutions and other providers to develop a menu of remedial and skills-development supports. Once the plan was successfully implemented and pre-defined milestones were achieved, the pharmacist was eligible to re-challenge the assessment, in whole or in part, as directed by the individual's results from the standardized assessment and, if successful, to be listed in the self-directed continuing professional development group. This support, in which the regulatory body worked collegially with its members to address knowledge and skills deficits rather than simply punishing individuals, was crucial to building acceptance from the profession.11

Results

The Quality Assurance program was launched by the College in 1996, after more than three years of consultations and pilot testing. A significant profession-wide education and consultation process was also introduced to inform pharmacists of the process and the safeguards in place to balance public protection with member interests. For logistical reasons, approximately 260 Part A pharmacists (representing approximately 5% of the practitioners in the province) were randomly selected, initially and each year thereafter, to participate in the peer-review process. In addition, those pharmacists who elected to transfer from Part B to Part A of the register were also required to successfully complete the peer review. The process itself required between five and six hours to complete, and consisted of three major components: first, the case-based multiple-choice test of clinical knowledge; second, the objective structured clinical examination (focusing on clinical reasoning and communication/interpersonal skills); and third, the facilitated learning portfolio sharing session, in which practitioners discussed and shared relevant practice-focused continuing professional development issues and activities.

Table 2 outlines overall performance in the peer review program: Between 1997 and 2012, close to 90% of pharmacists in Ontario met or exceeded the competency expectations established by the profession and measured by OCP through this process, and were deemed to be part of the self-directed continuing professional development group.

Table 2

Overall Performance for Peer Review (Period Covering 1997–2012)

Of those who were initially identified as requiring peer-guided continuing education, 73% were able to successfully re-challenge the competency assessment a second time and meet or exceed expectations, resulting in a shift to the self-directed continuing professional development group. A number of resources had been put into place to assist pharmacists requiring reassessment. First, a peer support group provided these pharmacists an opportunity to meet with a small group of peers to discuss the nature of their practice and the challenges they were encountering with the Practice Review. In addition, OCP developed a Professional Development Workshop focusing on problem-solving skills related to the clinical knowledge assessment and communication skills related to the Standardized Patient scenarios. Based on the results of the Practice Review, common knowledge and skills deficits identified included: 1) difficulty in conducting a structured interview to gather information in an efficient and comprehensive manner; 2) lack of appropriate monitoring parameters or discussion of a follow-up plan with the patient, caregiver, or prescriber; 3) difficulties using standardized pharmacy reference texts and electronic resources to answer drug information or clinical questions; and 4) difficulty in effectively applying a clinical reasoning process to identify, resolve, or prevent drug therapy problems. These identified deficits were used to provide a curricular structure for the remediation process. Overall, the impact of attending the workshop was positive (particularly with respect to participants' confidence in structuring a comprehensive patient interview and in using drug information resources), but notably, pharmacists in practice for fewer than 25 years demonstrated a greater benefit than pharmacists in practice for more than 25 years.11

Since inception, the program has been tracking performance of pharmacists based on years since graduation from their professional degree program. As seen in Figure 1, the number of individuals who do not meet competency expectations, and therefore require peer-guided continuing education, clearly increases with time since original graduation with a pharmacy degree. A total of 207 individuals were identified as requiring peer support through this program (based on years in practice: 0–5 years = 0; 6–15 years = 24; 16–24 years = 58; 35–44 years = 96; and 45 years or more in practice = 29).
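The cohort tally described above can be sketched in a few lines. The counts and bin labels below are exactly those reported in the text (the published bins are reproduced as-is); the dictionary name is illustrative.

```python
# Illustrative tally: pharmacists identified as requiring peer-guided
# continuing education, grouped by years in practice, using the counts
# reported above (bin labels follow the published text).
counts_by_years_in_practice = {
    "0-5": 0,
    "6-15": 24,
    "16-24": 58,
    "35-44": 96,
    "45+": 29,
}

total = sum(counts_by_years_in_practice.values())
print(total)  # 207, matching the reported overall figure
```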

Figure 1

Performance by Years Since Graduation

Within the profession of pharmacy in Ontario, only one type of license/registration is provided; there are no provisions for specialty licenses or for different registrations based on site of primary practice (e.g., community vs. hospital). Since only 12% of registrants in the province self-identify as primarily hospital pharmacists,1 there were some initial concerns that the competency assessment (and in particular, the OSCE component) would be biased towards community practitioners or primary care. Cumulative data, however, suggests that hospital pharmacists perform significantly better across the competency domains than their community-based colleagues (Figure 2).

Figure 2

Competency Assessment Results by Practice Site

The strong performance of hospital pharmacists did not diminish the fact that the Practice Review, from their perspective, was outside their comfort zone. Nevertheless, their performance does argue for the position that the underlying skills that the Practice Review assesses are common for the pharmacy community as a whole.

Pharmacy in Ontario is notable for having a significant reliance upon internationally educated health professionals (those educated outside Canada or the United States) within the domestic workforce.12 Since the early 2000s, more than 40% of all newly registered pharmacists in the province have consistently been international pharmacy graduates (IPGs).13 In large part, this has been attributed to the fact that there are only two schools of pharmacy in the entire province, and only eight schools of pharmacy across English Canada. As illustrated in Figure 3, the data reveals a differential pattern of performance on competency assessments based on place of graduation.

Figure 3

Competency Assessment Results by Place of Graduation

These differences may in part reflect the fact that professional preparation in pharmacy genuinely differs around the world. Some programs place greater emphasis on the technical or clinical aspects of practice, while others emphasize pharmacist-patient relationships and the need to exercise professional judgment. The Practice Review was necessarily grounded in the practice of pharmacy in Ontario. Therefore, it would be expected that those trained in Ontario would have both education and experience better aligned to the underlying expectations of the Practice Review than those who were educated elsewhere.14

These differences continue to exist despite the fact that pharmacists may have registered in Ontario years previously with the expectation that they would eventually adapt to Ontario practice. While this adaptation most certainly occurs for the majority of international candidates, the results indicate that a number of international pharmacists experienced greater difficulty in meeting the competency level as determined by their peers for the Practice Review.15

Discussion

The direct, standardized, and objective assessment of practitioners' clinical competencies by a regulatory body is unique, and has not been attempted in such a systematic and sustainable manner in other health professions or jurisdictions. While the psychometric strength of this approach is both clear and defensible,8 it is also important to acknowledge that such an approach gives rise to levels of concern and anxiety among many registrants. The fear of “being tested” is pervasive and there were added concerns about how results would be used by OCP — both of these were significant challenges that needed to be addressed and overcome in order to allow this model to be successfully implemented. Through the combination of extensive and ongoing consultation with members of the profession, policy decisions that reinforced the notion that this was an educational (rather than punitive) intervention, and a structure that ensured confidentiality through clear protocols for sharing of results across the regulatory body, the profession collectively moved from grudging acceptance of this model to actual buy-in. While few pharmacists would say they “liked” this approach, eventually most pharmacists would agree that it was a fair, transparent, appropriate — and necessary — process to ensure ongoing safe and effective professional practice.16

A key contributing factor to the success of implementation involved a strong commitment on the part of the regulatory body to data gathering, analysis, and dissemination. Based on this analysis, several important and consistent trends have been identified regarding competency drift within pharmacy in Ontario. Those at highest risk of not meeting competence standards over time include: 1) pharmacists who were 25 years or more away from graduation (accounting for 36% of pharmacists); 2) pharmacists working in sole-practitioner environments with limited opportunities to interact with peers in daily practice (49% of registered practitioners); and 3) international pharmacy graduates (IPGs) who received their initial education and training outside Canada or the United States (approximately 37% of pharmacists in Ontario).7,8

Importantly, almost 90% of practitioners randomly selected for this process were able to meet or exceed competency expectations on their first attempt and did not need to return for reassessment. For those who were unable to meet expectations, the peer-supported professional development process appeared successful, with 73% of these individuals being able to meet or exceed expectations on their second attempt.

While the data generated and feedback provided by peer review candidates over 20 years are of interest, the development of the program model itself and the experience of a regulator in direct competency assessment of practitioners is perhaps of greater interest outside the profession of pharmacy. In most jurisdictions, regulators are challenged to balance the competing priorities of ensuring public safety and facilitating engagement of members in their CPD processes. Competency assessment — especially for seasoned practitioners who have not been tested in some time — can be daunting and stressful, and navigating the shoals of pharmacists' misconceptions and fears about this process was challenging. Focusing on competency assessment using best-practice evaluation techniques in a standardized and objective manner, rather than situational performance assessment of a more variable nature, supported the regulator's objective of ensuring safe and effective practice of the profession of pharmacy in a fair, transparent, and psychometrically defensible manner.

Practice Review as a Catalyst in Promoting Continuing Professional Development

Two elements were not directly part of the standardized assessment process, but each played an important role in the overall peer review program. The first was the learning portfolio sharing session, where practitioners shared with each other the activities they had engaged in to promote continuing professional development. The feedback respecting the perceived importance of this component of the Practice Review was overall very positive. Specifically, pharmacists overwhelmingly confirmed that they saw value in learning from their peers about the activities and approaches they were engaging in. The sessions, which were facilitated by a peer practitioner, provided participants with an opportunity to consult each other in terms of best practice and to share, with their peers, changes in approaches to practice that they were considering implementing.16

Over the years, the learning portfolio sharing session evolved to where the candidates themselves influenced its content and structure. Initially, the session was designed to assist members in understanding the principles in maintaining a learning portfolio, but in the later years the session also became an opportunity for pharmacists to talk about important issues such as advanced practice. In fact, specific cases involving challenging issues related to patient care were developed and these simulations formed the basis for group discussion and emergence of examples of best practice. Taken together, this activity served as a catalyst for many members to explore a variety of learning activities that would benefit both their practice and the patients that they would care for on a day-to-day basis.16

A second, and most surprising, result of the Practice Review, in terms of serving as a catalyst for professional development, was brought to light during the feedback sessions that pharmacists participated in at the end of each review. Facilitated by both College staff and an assessment consultant, the sessions were designed to elicit from pharmacists important feedback respecting the impact the Practice Review had on them, personally and professionally. Interestingly, candidates commonly described the experience as initially stressful (when first selected to participate), but just as frequently described it as a valuable experience and a necessary program. Further, one of the most interesting and prevalent comments identified the very existence of the Practice Review as a key catalyst for motivating members to engage in further learning.16 The types of learning pharmacists reported engaging in varied; what was consistent, however, was the notion that without having been selected for the Practice Review, the learning activity would not have occurred at the level that it did.16 Most encouraging was the candidates' unsolicited positive feedback about the impact of feeling the need to prepare for the Practice Review. In the course of doing so, many pharmacists stated that they rediscovered the confidence to learn and grow professionally and considered this beneficial not only for themselves but also for the patients they served.


Other Impacts

The Practice Assessment and Peer Review process implemented in Ontario for pharmacists has attracted interest as a model program for continuing competency assessment from health care regulators in Canada and around the world. Data gathered and analyzed over the program's years of administration have yielded measurable, reportable results and revealed the consistent trends reported in this paper.

The positive impact of OCP's QA program — with its use of standardized patient scenarios — strongly influenced the College to support the Pharmacy Examining Board of Canada's (PEBC) addition of an OSCE component to its existing written clinical examination. The PEBC examination, with OSCE, was implemented in 2001.17 In addition, the findings related to particular challenges faced by International Pharmacy Graduates (IPGs) led the regulatory body to partner with a university to develop one of the world's first International Pharmacy Graduate Programs, a 16-week intensive bridging program designed to better support immigrant pharmacists (particularly from non-English speaking jurisdictions) in applying their previous knowledge and skills to the Canadian pharmacy practice context.15 This program has, as of 2016, graduated more than 1,000 pharmacists. Performance on this Practice Review is comparable between Canadian graduates and IPGs who have completed the program.15

Ontario's program has also garnered much international attention and has been the focus of numerous presentations at conferences of organizations such as the International Pharmaceutical Federation (FIP), the Royal College of Surgeons in Ireland (RCSI), the Pharmaceutical Society of Ireland (PSI) and the Council on Licensure, Enforcement and Regulation (CLEAR). In 2010, Ontario's continuing competency assessment process for pharmacists was chosen by the Pharmaceutical Society of Ireland as the model on which the Quality Assurance Program for Pharmacists in Ireland would be built.18 Ontario's program was selected by the PSI after a comprehensive review of international programs in multiple health care professions. Unlike Ontario, where the program is under the direct auspices of the pharmacy regulator, in Ireland the program is to be administered through the Irish Institute of Pharmacy (IIOP), an independent body accountable to the regulator.

In 2006, the province of Ontario began examining a potential expansion of pharmacists' scope of practice to enhance patient care and to optimize use of health care resources. Part of this scope-of-practice review involved greater independence for pharmacists in adapting, modifying, renewing and initiating prescriptions within a collaborative care framework. In 2012, this expansion became a reality, due in part to the recognition that pharmacists were well qualified and that the continuing competency assessment/quality assurance/peer review process for pharmacists was robust and meaningful, providing the public, physicians and government with assurance that pharmacists were capable of greater independent responsibility for direct patient care.19,20

The long-term impact of this program on patient care and on the quality of pharmacists' practice has not been fully established. While general acceptance and satisfaction are strong, and specific learning of new skills has occurred,11 it is less clear whether this has translated into behavioral change among pharmacists and whether any such change has resulted in improved outcomes for patients (e.g., a decrease in drug therapy problems or errors). Anecdotally, pharmacists have reported that preparing for the Practice Review results in new learning that translates into behavioral change;11 whether this learning and change are sustained beyond the immediate period of the Review itself is not known.

While the pharmacy profession in Ontario was, as noted above, initially wary of the practice and peer review process, most pharmacists today agree that the program and its accompanying assessments are fair, valid and necessary, and that the very existence of the Practice Review has had a positive impact on their professional development and practice.

Conclusion

The peer review model of quality assurance described here has been successfully used by the profession of pharmacy in Ontario for more than two decades. The impact of this program on individual practitioners' professional development has been meaningful. Data from this program have been used provincially to support expansion of pharmacists' scope of practice, and internationally to implement similar programs in other jurisdictions. The use of a standardized assessment model employing best-practice measurement techniques and methods for the evaluation of clinical competencies has helped OCP balance its responsibilities to the public and government with those to the practitioner community.

Chaudhry et al (2013) have highlighted the need to “…thoughtfully explore pathways and procedures by which Maintenance of Licensure (MOL) may be implemented for physicians in the years ahead.”21 As medicine — and all of the health professions — consider ways of balancing these responsibilities and needs, the standardized assessment model implemented by pharmacy in Ontario may be of value.

Acknowledgments

The authors gratefully acknowledge the contributions of the staff and Council members of the Ontario College of Pharmacists, who participated in the development and delivery of the program described in this manuscript.

About the Authors

  • Zubin Austin, BScPhm, PhD, is a Professor and Murray Koffler Chair in Management, Leslie Dan Faculty of Pharmacy, at the University of Toronto, Ontario, Canada.

  • Deanna Williams, BScPhm, is a Consultant with Dundee Consulting, Dunnville, Ontario, Canada, and was formerly Registrar of the Ontario College of Pharmacists.

  • Anthony Marini, PhD, is President of Martek Assessments Ltd. and Associate Professor Emeritus, University of Calgary, Canada.

References

  1. Ontario College of Pharmacists. 2015 Annual Report. Pharmacy Connection 2016;23(2):54-61.
  2. Davis DA, Thomson MA, Oxman AD, Haynes RB. Evidence for the effectiveness of CME: a review of 50 randomized controlled trials. JAMA 1992;268(9):1111-1117.
  3. Davis DA, O'Brien MA, Freemantle N, Wolf FM, Mazmanian P, Taylor-Vaisey A. Impact of formal continuing education: do conferences, workshops, rounds and other continuing education activities change physician behaviour or health care outcomes? JAMA 1999;282(9):867-874.
  4. Alderson D, Montesano D. Regulating, de-regulating, and changing scopes of practice in the health professions: a jurisdictional review. A report prepared for the Health Professions Regulatory Advisory Council (HPRAC), Government of Ontario. April 2003. Accessed at: http://www.ramblemuse.com/mps/documents/HPRAC_2003_WhyRegulate_Ontario.pdf on July 25, 2016.
  5. Adjusting the balance: a review of the Regulated Health Professions Act. Report to the Minister of Health and Long-Term Care, Government of Ontario. Health Professions Regulatory Advisory Council. March 2001. Accessed at: http://www.hprac.org/en/reports/resources/RHPA_Review_2001_Report.pdf on July 25, 2016.
  6. Ontario College of Pharmacists. Quality Assurance Program. Accessed at: http://www.ocpinfo.com/practice-education/qa-program/ on July 25, 2016.
  7. Austin Z, Croteau D, Marini A, Violato C. Continuous professional development: the Ontario experience in professional self-regulation through quality assurance and peer review. Am J Pharm Educ 2003;57:Article 56.
  8. Austin Z, Croteau D, Marini A, Violato C. Assessment of pharmacists' patient care competencies: validity evidence from Ontario (Canada)'s quality assurance and peer review process. Pharm Educ 2004;4(1):23-32.
  9. The Regulated Health Professions Act, 1991. Accessed at: http://www.health.gov.on.ca/en/pro/programs/hhrsd/about/rhpa.aspx on July 25, 2016.
  10. Austin Z, Marini A, Desroches B. Use of a learning portfolio for continuous professional development: a study of pharmacists in Ontario (Canada). Pharm Educ 2005;5(3):1-7.
  11. Austin Z, Marini A, MacLeod Glover N, Tabak D. Peer mentoring workshop for continuous professional development. Am J Pharm Educ 2006;70(5):Article 117.
  12. Office of the Fairness Commissioner, Government of Ontario. Fair Registration Practices Report — Pharmacists (2015). Accessed at: http://ort.fairnesscommissioner.ca/report.php?qid=29&year=2015 on July 25, 2016.
  13. Austin Z, Dean MR. Bridging education in pharmacy: the international pharmacy graduate program in Ontario, Canada. Am J Pharm Educ 2004;68(5):Article 108.
  14. Austin Z, Galli M. Assessing communicative competency of international pharmacy graduates in Ontario, Canada. J Soc Admin Pharm 2003;20(6):225-231.
  15. Austin Z, Galli M, Diamantouros A. Development of a prior learning assessment for pharmacists seeking licensure in Canada. Pharm Educ 2003;3(2):87-96.
  16. Austin Z, Marini A, MacLeod Glover N, Croteau D. Continuous professional development: a qualitative study of pharmacists' attitudes, behaviours, and preferences in Ontario, Canada. Am J Pharm Educ 2005;69(1):Article 4.
  17. Austin Z, O'Byrne C, Pugsley J, Quero Munoz L. Development and validation processes for an objective structured clinical examination (OSCE) for entry-to-practice certification in pharmacy: the Canadian experience. Am J Pharm Educ 2003;67(3):Article 76.
  18. The Pharmaceutical Society of Ireland: the pharmacy regulator. Accessed at: http://www.thepsi.ie/gns/home.aspx on July 25, 2016.
  19. Health Professions Regulatory Advisory Council (HPRAC). Review of the scope of practice of pharmacy — interim report to the Minister of Health and Long-Term Care on mechanisms to facilitate and support interprofessional collaboration among health colleges and regulated health professionals: Phase II, Part 1. September 2008:19-77.
  20. Health Professions Regulatory Advisory Council (HPRAC). The prescribing and use of drugs in the profession of pharmacy: HPRAC Critical Links document. 2009. Accessed at: http://www.hprac.org/en/reports/resources/hpraccriticallinksenglishjan_09.pdf on July 25, 2016.
  21. Chaudhry HJ, Cain FE, Staz ML, Talmage LA, Rhyne JA, Thomas JV. The evidence and rationale for maintenance of licensure. J Med Reg 2013;99(1):19-26.