ABSTRACT:
Background: Group medical practice has grown globally, necessitating evaluation methods to enhance patient care and physician well-being. The College of Physicians & Surgeons of Alberta (CPSA) launched a pilot project, the Group Practice Review (GPR), to assess family medicine and general practice (FM/GP) clinic performance, focusing on groups or clinics rather than at the individual level.
Methods: Eight volunteer clinics with a total of 65 FM/GPs in Alberta, Canada participated in the GPR pilot. Compliance with CPSA's Standards of Practice (SOP), chart scores, complaints, and risky prescribing were evaluated. SOP compliance was assessed through office observation and policy reviews. Prescription flags identified the number of patients on high doses of opioids and benzodiazepines.
Results: None of the clinics met all CPSA SOP. On average, 91.7% of SOPs were met. Common deficiencies included patient recordkeeping and drug storage standards. Post-visit feedback suggested improvements in process definitions and access to SOP compliance reports before facilitation visits. The average cost per physician to conduct the GPR as a part of the pilot project was $300 Canadian dollars ($225 USD).
Conclusions: The GPR pilot demonstrated a feasible, cost-effective approach for evaluating FM/GPs in group settings, fostering proactive environments and promoting timely corrective actions.
Introduction
Since its inception, group medical practice has experienced significant global growth, driven by various reforms and incentives. 1 This trend is particularly pronounced in western countries such as Finland, Sweden, and the United Kingdom (UK), where over 80% of family medicine practitioners are in group settings. 2 Transitioning to group practice is believed to enhance patient care by potentially improving overall performance and outcomes and supporting physicians' well-being. 3, 4 Despite its widespread adoption, the full impact on patient care remains unclear. Study results are mixed: while some discussions emphasized the protective nature of group practice, 3-10 others highlighted interpersonal and clinic dynamic barriers to improving overall quality of care. 6, 9, 11 In Canada, the percentage of physicians in group practices grew from 53% in 2012 to 65% in 2019, 12 underscoring the need for effective evaluation methodologies.
One useful framework for understanding and delineating physician performance in practice is the "Cambridge Model." 13 This theoretical model provides a structured approach for analyzing physician performance by identifying three core domains:
Competence factors — the physician's fundamental knowledge, skills, and training acquired through medical education and professional development;
Individual factors — personal characteristics such as age, specialty, gender, country of medical training;
System-level or group factors — the workplace environment, including workload, access to resources, processes, remuneration model(s), and institutional culture.
By interpreting physician performance according to these three domains, the Cambridge Model allows for a more comprehensive understanding of how individual, systemic, and competence-related factors link to performance outcomes. This model emphasizes the significant role that system-level factors can play in shaping physician performance. 13 For instance, the Bawa-Garba case in the UK revealed how systemic failures, such as inadequate staffing and poor communication, played a pivotal role in a tragic outcome; yet the focus remained on individual culpability. 14 Similarly, the story of Greg Price in Alberta, Canada highlighted how delays and miscommunication within the healthcare system contributed to a young man's death. 15 These cases demonstrate that poorer-performing physicians might benefit from the support of higher-performing groups, while higher-performing physicians could be hindered by dysfunctional systems. Therefore, it is crucial for medical regulatory authorities (MRAs) to evaluate system issues, including group performance, as part of their mandate to maintain and enhance the quality of medical care and to protect the health and safety of the public.
The practice of medicine is dynamic, requiring licensed physicians to engage in lifelong practice reflection and continuous professional development. This commitment is supported in Canada by the national certifying colleges—the College of Family Physicians of Canada and the Royal College of Physicians and Surgeons of Canada—which provide a standards-based framework for ongoing professional development to maintain certification. In alignment with this framework, the Canadian revalidation process emphasizes collaboration between national colleges, MRAs, and other organizations to develop tools and strategies that integrate learning into daily practice. These efforts ensure that physicians remain accountable and equipped to meet the evolving needs of patients and the public. 16
In response to this need, the College of Physicians & Surgeons of Alberta (CPSA), Alberta's MRA, initiated a pilot project: the Group Practice Review (GPR). This project is one component of CPSA's overarching continuing competence program re-design initiated in 2016 and aimed to assess family medicine and general practice (FM/GP) clinic performance, with a focus on group processes, patient safety, and system sustainability. The goal of the GPR pilot was to engage with a limited number of FM/GP clinics to determine how CPSA can best assess and intervene to improve the level of quality and safety provided by a group practice. By aligning resources to support practices in need and offering constructive feedback, GPR seeks to enhance both group and individual physician practice within a cost-effective, evidence-based framework, and contribute to the ongoing improvement of healthcare delivery.
Methods
CPSA conducted the pilot phase of the GPR program in 2016. Purposeful sampling of anecdotally "high performing" clinics, according to experienced CPSA staff, was used to select clinics for the GPR pilot. The inclusion criteria involved FM/GP groups consisting of at least two physicians providing entirely or mixed family practice and walk-in care services. Clinics were located in Edmonton and Calgary, the two large metropolitan cities in the province of Alberta, for ease of in-office assessments by CPSA staff. Clinics were excluded if they provided only walk-in care without comprehensive family medicine services, or had been formed within the last two years. The pilot phase included eight volunteer clinics in Alberta, comprising a total of 65 FM/GPs. The GPR pilot, as part of the larger overarching continuing competence program re-design, was approved by the University of Alberta's Health Research Ethics Board (Pro00065137).
Procedures and Follow-Up
After recruiting clinics, detailed information was provided to participating groups (Figure 1).
Workflow of GPR Pilot
Pre-visit questionnaires gathered information about the background of the clinic including years of operation, the number of full-time physicians, group meetings, after-hours coverage and call services, the use of electronic and/or hard-copy medical records, available procedures, usage of reusable and single-use instruments, and patient characteristics (episodic or comprehensive full spectrum). Assessments of adherence to CPSA's Standards of Practice (SOP) were then scheduled for the clinics. CPSA databases were used to retrieve complaints and prescription data profiles for all physicians within the groups. After evaluation and report generation, findings were communicated to clinics. CPSA facilitators provided feedback within four-to-six weeks following the assessment, and post-visit satisfaction surveys were administered at the study's conclusion. The evaluation measures were as follows:
CPSA SOP compliance
Chart score
Prescribing flags
Complaints per registration year
Follow-up visit with physician facilitator
Post-visit satisfaction survey
SOP Compliance
CPSA's SOP are "the minimum standards of professionalism and ethics expected from physicians practicing medicine in Alberta." 17 The SOP review process included office observations, policy and procedure manual reviews, and legislated privacy document checks. This measure was selected because adherence to SOPs is mandatory for practicing in Alberta. Compliance was assessed by the percentage of SOP criteria met. If some SOPs were not applicable to a given clinic, this percentage was calculated excluding them (for example, if a clinic did not store vaccines or medication requiring refrigeration, temperature control was not applicable).
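The compliance calculation described above can be sketched as follows. This is a minimal illustration only, not CPSA's actual tooling; the criterion names and the "met"/"not_met"/"n/a" encoding are assumptions made for the example.

```python
# Illustrative sketch of the SOP compliance score: percentage of applicable
# criteria met, with not-applicable ("n/a") criteria excluded from the
# denominator, as described in the Methods. Criterion names are hypothetical.
def sop_compliance(criteria: dict[str, str]) -> float:
    """Return the percentage of applicable SOP criteria scored as 'met'."""
    applicable = {k: v for k, v in criteria.items() if v != "n/a"}
    if not applicable:
        return 0.0
    met = sum(1 for v in applicable.values() if v == "met")
    return 100.0 * met / len(applicable)

# Example: vaccine refrigeration is not applicable to this clinic,
# so it is excluded before the percentage is computed.
clinic = {
    "patient_records": "met",
    "drug_storage": "not_met",
    "after_hours_access": "met",
    "vaccine_refrigeration": "n/a",
}
print(round(sop_compliance(clinic), 1))  # 66.7
```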
Chart Score
Chart evaluations were conducted on 10 charts per clinic using a rubric focused on specific information presence, excluding clinical reasoning. Both paper and electronic charts were reviewed to generate a chart score ranging from 0-100 based on met criteria proportions. The evaluation criteria included eight elements that considered a cumulative patient profile within the physician-patient relationship. These elements were: identification details, contact and emergency contact information, current medications and treatments, allergies, medical and social history, health maintenance plans (such as screening and immunization) and the dates of last updates. Criteria were categorized as met, partially met, or not met.
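A chart score of this kind can be sketched as below. Note the paper does not state how "partially met" items were weighted in the 0-100 score; the half-credit weighting here is purely an assumption for illustration.

```python
# Hedged sketch of a 0-100 chart score from met / partially met / not met
# criteria. The half-credit weight for "partially_met" is an assumption made
# for this example, not a weighting reported by the GPR pilot.
WEIGHTS = {"met": 1.0, "partially_met": 0.5, "not_met": 0.0}

def chart_score(items: list[str]) -> float:
    """Score one chart as the weighted proportion of criteria met, 0-100."""
    if not items:
        return 0.0
    return 100.0 * sum(WEIGHTS[i] for i in items) / len(items)

# Example chart with eight audited elements.
chart = ["met", "met", "partially_met", "not_met",
         "met", "met", "partially_met", "met"]
print(chart_score(chart))  # 75.0
```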
Prescribing Flags
This indicator included the mean number of patients on potentially harmful doses of opioids and benzodiazepines. For benzodiazepines, the flag was based on the Defined Daily Dose (DDD) of a medication and counted the number of patients prescribed three times the DDD or more. 18,19 For opioids, doses were converted to Oral Morphine Equivalents (OME), and the flag (OME90+) counted the number of patients prescribed 90 mg OME per day or more. 20
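The two flag counts reduce to simple threshold tallies, sketched below. This assumes per-patient daily doses (DDD multiples for benzodiazepines, mg OME for opioids) have already been computed from prescription records, which is not shown here.

```python
# Illustrative sketch of the two prescribing flags, assuming per-patient
# daily doses are already derived from prescription data. Thresholds follow
# the text: >= 3x the Defined Daily Dose, and >= 90 mg OME per day.
def count_benzo_flags(ddd_multiples: list[float]) -> int:
    """Patients prescribed three times the Defined Daily Dose or more."""
    return sum(1 for d in ddd_multiples if d >= 3.0)

def count_ome90_flags(daily_ome_mg: list[float]) -> int:
    """Patients prescribed 90 mg Oral Morphine Equivalents per day or more."""
    return sum(1 for d in daily_ome_mg if d >= 90.0)

print(count_benzo_flags([0.5, 3.0, 4.2, 1.0]))  # 2
print(count_ome90_flags([45.0, 90.0, 120.0]))   # 2
```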
Complaints Per Registration Year
The complaint data was obtained from CPSA's complaints database, and the mean number of complaints per registration year for the group practice was calculated.
Follow Up Visits With Physician Facilitator
Nurse reviewers and physician facilitators, trained by CPSA, conducted the onsite clinic reviews. Multiple reviewers assessed charts for the GPR pilot, but each clinic had a single dedicated physician facilitator who led a 90-minute facilitation meeting with the group to discuss the results. Although the inter-rater reliability was not calculated between reviewers, they met on a regular basis throughout the GPR pilot to debrief and discuss their assessments.
Post-visit Satisfaction Survey
The questionnaire was designed to collect FM/GP clinic participants' experience and feedback to incorporate in the next iterations. The survey included questions about their experiences with CPSA staff and facilitators, clarity of instructions for the on-site visit/reports, and relevance, fairness, and usefulness of feedback.
Statistical Analysis
Demographic information for clinics including physicians' age, gender, country of medical school training (international or Canadian), qualifications such as Certification in the College of Family Physicians (CCFP), years in practice, number of complaints, and other characteristics such as location and size of the clinic was obtained from CPSA databases. Performance measures and/or outcomes, demographics and costs were reported descriptively. Given the small sample size (n=8 clinics), only correlations were computed to explore connections between factors and outcomes.
Results
Eight clinics volunteered to participate in the GPR pilot. On average each clinic had eight physicians and 46.1% of FM/GPs were female (30/65). Demographic characteristics of clinics are shown in Table 1.
Demographics of Eight Volunteer Clinics in GPR Pilot
Pre-visit questionnaires revealed that 74% of physicians worked full-time, and all clinics held regular team meetings. Seven out of eight clinics self-reported successes such as supportive staff, while six clinics reported challenges including senior partner retirements, large patient panel sizes, and difficulty finding new physicians.
The characteristics of the eight pilot clinics were compared to provincial averages (Figure 2). The pilot clinics displayed variation across several metrics when compared to provincial averages such as average number of complaints and percentage of internationally trained practitioners. Most clinics in the pilot had a higher percentage of CCFP-qualified physicians than the provincial benchmark. Complaint rates were generally lower than the provincial average. The duration physicians spent at their current clinic also varied, with several clinics surpassing the provincial average.
Comparison of Characteristics Between the Eight Pilot Clinics and Province-wide Averages
None of the eight clinics fully met all CPSA SOPs. On average, clinics met 91.7% of SOPs, with six clinics meeting an acceptable number of SOPs (i.e., >90%). Commonly unmet standards related to patient records, with only one quarter of clinics meeting the requirement for appropriate information-sharing agreements. Similarly, only 25% of clinics fully met the infection prevention and control standard for drug storage, while the remaining clinics were partially compliant. One-quarter of the clinics lacked information on after-hours care access and formal agreements. SOP compliance scores are detailed in Figure 3. Following the GPR, these identified deficits were corrected by the clinics.
Percentage of SOP Compliance of Eight Volunteer Clinics in the GPR Pilot
On average, 76% of the chart audit criteria were scored as "met" or "partially met" across the eight clinics, with scores ranging from a minimum of 55% to a maximum of 92%. The most frequently met items in the chart reviews were:
Patient identification;
Current medications and treatments;
Recorded ongoing health conditions and identified health risk factors;
Health maintenance plans.
In contrast, the most frequently unmet items were the history of allergies and drug reactions, as well as the dates of last updates. The overall result of patient chart evaluations is depicted in Figure 4.
Due to operational constraints, charts were inconsistently selected, sometimes chosen randomly by the reviewers and other times provided by clinic directors or physicians. Although 10 charts were expected to be reviewed, fewer were sometimes assessed.
SOP scores were negatively correlated with the number of physicians, the number of complaints, and days worked per week, and positively correlated with gender (% female), time at the current practice, qualifications, and years in practice. Only days worked per week showed a statistically significant correlation (p = 0.05).
The mean number of complaints per registration year of practice ranged from 0.00 to 0.086, with an average of 0.045. This was notably lower than the province-wide average of 0.089.
The average rates of opioid (OME) and benzodiazepine (DDD) prescribing per patient per day were 72 mg and 0.7, respectively, both of which were above the province-wide averages (Table 2).
Evaluation Measures of Eight Volunteer Clinics in the GPR Pilot
Percentage of Chart Audit Criteria Met or Partially Met of Eight Volunteer Clinics in the GPR Pilot
The average cost per physician was approximately $300 Canadian dollars ($225 USD). The post-visit survey provided feedback on the pilot program, such as: ensuring clinics have access to the SOP compliance report for 2-3 weeks prior to facilitation to allow time for digestion and discussion; improving definitions, such as how teaching time is reported, to avoid misunderstanding; and maintaining a process for CPSA to assess the physicians and groups most in need of intervention and support.
Discussion
The GPR pilot succeeded in introducing an innovative approach to evaluate physicians in group practice settings, exploring CPSA's ability to assess and intervene to improve the overall quality and safety of medical groups as a whole and outlining plans for future phases.
Although days worked per week was significantly correlated with SOP scores, the small sample size means no general conclusions can be drawn from the SOP score correlation coefficients. Nevertheless, it is noteworthy that some of the findings are in line with previous work on factors associated with physician performance. 11, 21 The GPR pilot began to address key questions, including the effectiveness of assessment tools in identifying areas for improvement, effective interventions for enhancing group practice, and associated costs. This approach differs from traditional physician assessment methods, which focus solely on individual physician evaluations by MRAs at an average cost of $4,800 Canadian dollars ($3,600 USD; personal communication with CPSA's Director of Continuing Competence, October 2024), whereas the GPR costs only $300 Canadian dollars ($225 USD) per physician, making it more cost-efficient. Notably, the GPR process fostered a proactive environment within clinics while promoting timely corrective measures.
The process sought not only to support individual physicians within the clinic, but also to encourage reflection on practice, promote improvement by considering group dynamics, and align with quality initiatives of partner organizations such as the Health Quality Council of Alberta and the College of Family Physicians of Canada. 22,23 Additionally, the GPR pilot aimed to avoid duplicating existing assessments, ensuring cost-effectiveness, practicality and professional acceptability.
Anecdotal, unpublished experiential knowledge from physician assessors at CPSA highlighted the value of chart reviews in evaluating individual performance, even when charts were self-selected. Instances were observed where patient information was omitted due to deficiencies in the chart template, indicating a systemic issue influencing chart scores. However, little consideration has been given to the role of charts in assessing group performance until now. The findings of this study could highlight the importance of chart reviews as a relatively cost-effective and minimally invasive intervention for evaluating group performance. Other research on chart reviews has been mixed; one study found that doctors in the Netherlands recorded much less in their charts than they actually performed. 24 Others have indicated that visit activity varies widely, correlated with subjective workload. 25
Comparing clinic demographics, patient volume, and service range with province-wide averages provided a contextual overview of how the pilot clinics align with or diverge from broader provincial patterns. This comparison supports understanding of the unique features of the pilot group in relation to the general landscape of FM/GP clinics in Alberta. The clinics sampled in the pilot project had similar demographic characteristics to that of the province overall. This supports the idea that the GPR program could be implemented on a broader scale with similar findings from the pilot.
Future research replicating the GPR pilot findings in larger populations would assist in building the evidence base around clinic assessments and validating the measurement tools within group practices. This method has the potential to create a type of accreditation program for clinics to engage them in the process of evaluation and self-reflection, which has been shown to improve quality of care, organizational performance, and patient safety. 26-28
The GPR pilot successfully engaged with all groups within the specified project timeframe. The SOP review was conducted at the clinics to assess adherence to CPSA standards. Some standards and chart items related to patient record content, such as medical history, drug reactions or allergies were difficult to answer with a simple binary "meets" or "does not meet." As a result, "partially meets" was added as an option for the answers. Additionally, some elements of the SOP or chart audit criteria were not applicable to all clinics.
The post-visit survey demonstrated clinics' interest in participating and provided valuable feedback at the end of the process. According to the survey results, the component of the pilot program that received the most positive feedback was sharing the SOP compliance results with participants. Respondents found the SOP compliance report to be relevant, easy to understand and fair. This feedback underscores the potential for collaborative regulatory assessments to foster transparency, enhance understanding, and support quality improvement across clinical environments. Furthermore, these findings align with conclusions from Foy and colleagues, who emphasize that audit and feedback, especially when combined with actionable steps and relevant compliance data, can improve engagement and receptiveness among healthcare providers.29 This approach helps to shift the perception of audits from punitive measures to supportive improvement tools.29
The GPR pilot was not without limitations. First, mandatory SOP compliance posed a challenge, as CPSA cannot close an intervention or assessment until all SOPs have been met. This requirement may introduce bias in measuring the program's success in effecting practice change, as all groups must achieve 100% compliance. Additionally, some data, such as patients seen per day, were self-reported, which inherently has limitations, including the potential for under- or overestimation of patient volumes, data entry errors, or misinterpretation of questions by respondents. Identifying physician groups accurately was also complex, as not all clinics have a unique registration identifier, and physicians frequently switch clinics or work in multiple clinics.
Another limitation of this study relates to the chart review process. Variability in documentation practices across clinics and potential biases in scoring could influence the consistency and generalizability of the findings.
Moreover, since the clinics involved in this pilot were volunteer participants, they may differ significantly from the average clinic in Alberta, potentially being higher-performing, larger and perhaps better managed. This voluntary nature introduces the possibility of selection bias, as these clinics might already prioritize compliance and quality standards more than non-participating clinics.
Finally, while the pilot focused on systems and processes that support high-quality care, it was not designed to evaluate individual physician performance. As such, it may not effectively identify underperforming physicians within group practices.
Future research should focus on the application of GPR to larger numbers of clinics, exploring and identifying risk and protective factors for clinic performance. It is imperative to ascertain and distinguish risk and protective factors for performance at competence, individual and system-levels to understand the interactions and nuances that impact physician practice, and ultimately patient care. 13
Conclusion
The College of Physicians & Surgeons of Alberta's group practice review pilot demonstrated that implementing a clinic assessment is both feasible and practical, and the process was well-received by participants. This pilot was designed to encourage physicians to embrace new ideas and resources, thereby supporting compliance with the medical regulator's standards of practice and high-quality patient care. The pilot offered a cost-effective approach for evaluating group practice performance in Alberta family medicine clinics. However, further research is warranted to refine practice group selection, streamline assessments, target interventions, and accurately measure outcomes.
Footnotes
Correspondence: Nicole Kain, MPA, PhD, Research and Evaluation Unit, College of Physicians & Surgeons of Alberta, 2700-10020 100 Street NW, Edmonton, AB T5J 0N3; Tel: 780-696-4906; E-mail: [email protected]
Open Access: © 2025 The Authors. Published by the Journal of Medical Regulation. This is an Open Access article under the terms of the Creative Commons Attribution-NonCommercial License (CC BY-NC, https://creativecommons.org/licenses/by-nc/4.0/), which permits use and distribution in any medium, provided the original work is properly cited, and the use is noncommercial.
Funding/support: N/A
Other disclosures: N/A
Author contributions: NA and NK conceptualized and designed the study. NH, HH, IH, and KK collected the data. NH, HH, and KK analyzed and interpreted the data. NA and NK supervised the project. All authors contributed to drafting, reviewing, and editing the manuscript. NK, HH, NH, and KK critically revised the manuscript. All authors reviewed previous versions, read, and approved the final manuscript.
Ethics approval: This study, which is part of a larger overarching body of continuing competence program redesign, was approved by the University of Alberta's Health Research Ethics Board (Pro00065137).
Acknowledgements: The authors thank Ms. Tanya Northfield, Mr. Phong Van, and CPSA's Continuing Competence department for their invaluable support and advice, which were instrumental in completing this project.
- Received November 8, 2024.
- Revision received January 27, 2025.
- Accepted April 23, 2025.
References
- 1↵Fye WB. Presidential address: The origins and evolution of the Mayo Clinic from 1864 to 1939: a Minnesota family practice becomes an international "medical Mecca." Bull Hist Med. 2010;84(3):323-57. doi:10.1353/bhm.2010.0019
- 2↵Bourgueil Y, Marek A, Mousquès J. Medical group practice in primary care in six European countries, and the Canadian provinces of Ontario and Quebec: What are the lessons for France? Issues in Health Economics. 2007;n127:1-8.
- 3↵Damiani G, Silvestrini G, Federico B, et al. A systematic review on the effectiveness of group versus single-handed practice. Health Policy. 2013;113(1-2):180-7. doi:10.1016/j.healthpol.2013.07.008
- 4↵Zwiep TM, Greenberg JA, Balaa F, et al. Impact of group practices on patients, physicians, and healthcare systems: Protocol for a scoping review. BMJ Open. 2018;8(9):e022164. doi:10.1136/bmjopen-2018-022164
- 5Zwiep T, Ahn SH, Brehaut J, et al. Group practice impacts on patients, physicians, and healthcare systems: A scoping review. BMJ Open. 2021;11(1):e041579. doi:10.1136/bmjopen-2020-041579
- 6↵Casalino LP, Devers KJ, Lake TK, Reed M, Stoddard JJ. Benefits of and barriers to large medical group practice in the United States. Arch Intern Med. 2003;163(16):1958-64. doi:10.1001/archinte.163.16.1958
- 7Rodríguez C, Pozzebon M. The implementation evaluation of primary care groups of practice: A focus on organizational identity. BMC Fam Pract. 2010;11:15. doi:10.1186/1471-2296-11-15
- 8Kash B, Tan N. Physician group practice trends: A comprehensive review. J Hosp Med Manag. 2016;2(1:3):1-8. doi:10.4172/2471-9781.10008
- 9↵Burns LR, Goldsmith JC, Sen A. Horizontal and vertical integration of physicians: A tale of two tails. Adv Health Care Manag. 2013;15:39-117. doi:10.1108/s1474-8231(2013)0000015009
- 10↵Liebhaber A, Grossman JM. Physicians moving to mid-sized, single-specialty practices. Track Rep. 2007 Aug;(18):1-5. PMID: 17710764
- 11↵Kain NA, Hodwitz K, Yen W, Ashworth N. Experiential knowledge of risk and support factors for physician performance in Canada: A qualitative study. BMJ Open. 2019;9(2):e023511. doi:10.1136/bmjopen-2018-023511
- 12↵The College of Family Physicians of Canada. Family Physicians: The foundation of Canada's health care system. Published 2020. Accessed October 30, 2025. https://www.cfpc.ca/CFPC/media/PDF/HPGR-VFP-ENG.pdf
- 13↵Rethans JJ, Norcini JJ, Barón-Maldonado M, et al. The relationship between competence and performance: Implications for assessing practice performance. Med Educ. 2002;36(10):901-9. doi:10.1046/j.1365-2923.2002.01316.x
- 14↵General Medical Council. Responding to the case of Dr. Bawa Garba. Published February 2018. Accessed October 30, 2025. https://www.gmc-uk.org/news/news-archive/responding-to-the-case-of-dr-bawa-garba/your-questions-about-the-case
- 15↵Healthy debate. Improvements suggested to the health system that failed Greg Price. Born K, Laupacis A, Pendharkar S. Published March 13, 2014. Accessed October 30, 2025. https://healthydebate.ca/2014/03/topic/quality/hqca-greg-price/
- 16↵Federation of Medical Regulatory Authorities of Canada. Position statement on professional revalidation of physicians (2007). Published July 4, 2007. Accessed October 30, 2025. https://fmrac.ca/professional-revalidation-of-physicians/
- 17↵College of Physicians & Surgeons of Alberta. Standards of Practice. Accessed October 30, 2025. https://cpsa.ca/physicians/standards-of-practice/
- 18↵Royal Australian College of General Practitioners (RACGP). Prescribing drugs of dependence in general practice. Evidence-based guidance for benzodiazepines. Published June 2015. Accessed October 30, 2025. https://www.racgp.org.au/clinical-resources/clinical-guidelines/...
- 19↵Ashworth N, Kain N, Wiebe D, Hernandez-Ceron N, Jess E, Mazurek K. Reducing prescribing of benzodiazepines in older adults: A comparison of four physician-focused interventions by a medical regulatory authority. BMC Fam Pract. 2021;22(1):68. doi:10.1186/s12875-021-01415-x
- 20↵Busse JW, Craigie S, Juurlink DN, et al. Guideline for opioid therapy and chronic noncancer pain. CMAJ. 2017;189(18):E659-E666. doi:10.1503/cmaj.170363
- 21↵Yen W, Thakkar N. State of the science on risk and support factors to physician performance: A report from the pan-Canadian physician factors collaboration. J Med Regul. 2019;105(1):6-21. doi: 10.30770/2572-1852-105.1.6
- 22↵Health Quality Council of Alberta. Resources for improvement. Accessed October 30, 2025. https://hqca.ca/resources-for-improvement/resources-for-improvement-overview/
- 23↵The College of Family Physicians of Canada. The Practice Improvement Initiative (Pii). Accessed October 30, 2025. https://www.cfpc.ca/en/policy-innovation/innovation-in-research-and-quality-improvement/practice-improvement-initiative
- 24↵Rethans JJ, Martin E, Metsemakers J. To what extent do clinical notes by general practitioners reflect actual medical performance? A study using simulated patients. Br J Gen Pract. 1994;44(381):153-6. PMID: 8185988
- 25↵Calvitti A, Hochheiser H, Ashfaq S, et al. Physician activity during outpatient visits and subjective workload. J Biomed Inform. 2017;69:135-149. doi: 10.1016/j.jbi.2017.03.011
- 26↵Flodgren G, Goncalves-Bradley DC, Pomey MP. External inspection of compliance with standards for improved healthcare outcomes. Cochrane Database Syst Rev. 2016;12(12):CD008992. doi: 10.1002/14651858.CD008992.pub3
- 27Ng GK, Leung GK, Johnston JM, Cowling BJ. Factors affecting implementation of accreditation programmes and the impact of the accreditation process on quality improvement in hospitals: A SWOT analysis. Hong Kong Med J. 2013;19(5):434-46. doi: 10.12809/hkmj134063
- 28↵Alkhenizan A, Shaw C. Impact of accreditation on the quality of healthcare services: A systematic review of the literature. Ann Saudi Med. 2011;31(4):407-16. doi: 10.4103/0256-4947.83204
- 29↵Foy R, Skrypak M, Alderson S, et al. Revitalising audit and feedback to improve patient care. BMJ. 2020;368:m213. doi:10.1136/bmj.m213





