Best Practices for Longitudinal Assessment in Continuing Certification: Consensus-Based Guidelines from American Board of Medical Specialties (ABMS) Member Boards

Journal of Medical Regulation. August 2025;111(2):31-40. DOI: https://doi.org/10.30770/2572-1852-111.2.31

ABSTRACT:

Background: After more than 100 years of successful self-regulation of physician specialties in the United States, requirements for maintaining certification have changed from lifetime certification to required demonstration of competence at regular intervals. In addition, the integration of education and professional development into the certification process has become more prevalent and expected by certificants. Over the last decade, most American Board of Medical Specialties (ABMS) Member Boards have transitioned from a traditional point-in-time recertification examination requirement to a longitudinal assessment that blends summative assessment with formative learning tools to aid certificants in staying up to date.

Objective: Given the novelty of this assessment design, measurement experts convened to establish best practices and minimum standards for development of longitudinal assessments for ongoing certification requirements.

Conclusion: Consensus-based standards for longitudinal assessment program design, scoring and reporting, and assessment security are outlined in this article. They are relevant to regulatory bodies and non-medical professions who have or may consider implementing a longitudinal assessment. As the field of medicine evolves, medical regulation must also adapt to the changing landscape and expectations of the public and the profession.


Introduction

Since 1916, the United States has maintained a system of professional self-regulation for medical specialties. After obtaining medical licensure and completing education requirements, physicians can voluntarily choose to obtain a credential (board certification) in a specialty area, such as internal medicine, surgery, or ophthalmology. Although specialty certification is not required for practice, nearly one million physicians have voluntarily obtained certification, and close to 800,000 have maintained it, through the 24 Member Boards of the American Board of Medical Specialties (ABMS), the certifying bodies for physicians and specialists.1 Despite this longevity, medical self-regulation has evolved considerably in the past 100 years.

Historically, most Member Boards issued lifetime certificates to diplomates who completed education prerequisites and passed an exam or a series of exams. Board certification could be revoked for professional misconduct or failure to meet licensure requirements, but there was an assumption that physicians maintained sufficient specialty knowledge over time. The boards gradually transitioned away from lifetime certificates, typically requiring diplomates to pass a periodic secure exam (eg, every ten years) to remain certified, a process usually known as recertification. By the mid-2000s, all 24 Member Boards were issuing only time-limited certificates, which meant that certificates had an end date and participation in “Maintenance of Certification” (MOC) activities was required for recertification. MOC requires board-certified physicians to remain current in medical knowledge and judgment, exhibit professionalism, and engage in systematic efforts to improve the quality, value, and safety of care in the system in which they work.2-4

ABMS diplomates’ perception of the value of MOC has been variable, and that is perhaps an understatement. A particular point of contention was the periodic requirement to pass a high-stakes examination. Some diplomates viewed the periodic testing as impractical, costly, and not relevant to their practice.5,6 Others, citing the perspective of the public, argued that a decennial examination was too infrequent to identify physicians who were not keeping up to date. Additionally, these 10-year examinations did not always provide feedback that allowed diplomates to identify their areas of weakness and tailor their continued learning accordingly. In 2016, Yam et al7 identified 10 emerging trends in medical regulation, noting that regulation is not only for “detection and remediation of poor performance” but also for creating continuous professional development requirements to “enable them to maintain their professional competence” (p. 24) and stay up to date.

Longitudinal Assessment in US Medical Certification

As the medical regulatory community continued to recognize that both assessment and education were needed to meet its mission of protecting the public, Member Boards began to consider alternate models for MOC. A significant change occurred in 2014, when the American Board of Anesthesiology transitioned to a continuous assessment called Maintenance of Certification in Anesthesiology (MOCA) Minute.8 The premise was that instead of taking a single exam with several hundred items (questions) in one sitting, test questions would be presented to the diplomate continuously throughout their certification cycle – perhaps one question a week or a handful of questions every month. Not only was the perception of stakes lessened by the option to answer only a handful of questions at a time, but educational information (including the correct answer, justification for the answer, and additional references) was provided to the diplomate. As other ABMS Member Boards (as well as other certification organizations) began to adopt this model, the term “longitudinal assessment” (LA) came to denote an assessment in which diplomates are expected to interact with the content more frequently than in a traditional point-in-time assessment and over an extended period of time. LAs may be primarily formative (meant to inform learners of their strengths and weaknesses) or primarily summative (meant to indicate whether the individual possesses the knowledge and skills required for certification). Between 2014 and 2023, the remaining Member Boards followed suit: all 24 now offer alternatives to the 10-year exam, and 18 offer LAs either exclusively or as an alternative to the point-in-time exam.

The scientific foundation from learning sciences (eg, cognitive psychology) for LA is strong and a comprehensive review of this was completed in 2023.9 This review noted four core components for the importance and effectiveness of LA: (1) cognitive skills decline over time; (2) self-assessment is not sufficient; (3) testing enhances learning and retention; and (4) goals and consequences motivate. The Member Boards, in their unique position to assess and certify individuals, use these core components in their LA programs to blend assessment for learning with assessment of learning.9-13

Balancing The Purposes of Assessment

Summative assessments are designed to evaluate the level of knowledge or skill that an examinee has achieved at a given point in time (assessment of learning), while formative assessments are designed with the primary purpose of enhancing the examinee’s learning (assessment for learning).14-16 Summative and formative assessments are designed differently to optimize the validity of the intended inferences made from an examinee’s performance. Optimizing the validity of scores from a summative assessment means that examinees take the assessment under similar conditions (eg, the same amount of time and access to resources), security is sufficient, and items have been vetted as fair and representative of the content domain. Additionally, summative assessments must have sufficient reliability to support valid inferences about an examinee’s ability; under repeated conditions, we would expect the same score or outcome from the same examinee. In contrast, formative assessments may allow less standardized delivery, contain repeat questions, be less secure, and provide more feedback to improve the examinee’s knowledge. Because some features of these assessment types are mutually exclusive, it is challenging to design an assessment that supports both formative and summative uses. However, the LAs that the ABMS Member Boards have developed strive to achieve, at least in part, both goals. Not only did spreading the summative assessment over time allow for more flexibility, but the boards also leaned into the opportunity for testing to enhance knowledge retention.17 The addition of the formative (educational) component to LA has been very well received by users and addresses one of the challenges with the 10-year exam.18-23

The first years of the transition to LAs in the ABMS community were exciting but also widely variable. Despite the success of the initial LA programs, many Member Boards created unique assessment programs with different design elements. For the most part, Member Boards were successful in creating programs that the diplomates found valuable and less burdensome; however, unlike the psychometric standards that exist for point-in-time exams, the best practices for ensuring fairness and validity for LA were less clearly defined and made program evaluation challenging. In response, the ABMS Member Boards worked together in a multi-year process to develop guidelines for designing a strong LA program, incorporating formative and summative assessment within the program, and addressing test security concerns.

Establishing Standards for Longitudinal Assessments

Among other roles, ABMS sets standards and fosters consistency among its Member Boards. The ABMS Committee on Continuing Certification is charged with reviewing the continuing certification programs of the Member Boards to ensure they meet the ABMS standards.24-25 As the number of LAs proposed to the Committee on Continuing Certification increased, it was clear that more guidance on LA design was necessary.

The Committee on Continuing Certification charged three measurement experts (psychometricians SS, AJ, BB) of the Member Boards to chair three task forces focused on developing design guidelines for LAs. Each task force was charged with identifying both minimum standards and best practices, when applicable. One task force focused on guidelines for LA program design, the second developed guidelines for scoring and reporting, and the third created guidelines for test security. Each comprised ABMS staff, Member Board staff psychometricians, and members from the seven Member Boards with the most experience with LAs. The task force rosters are in the Appendix. The task forces met about once per month from 2021 through early 2022. The task force chairs met regularly to ensure consistency across the guideline development and to minimize overlap. Task forces approved their final guideline reports by consensus in February 2022, and the reports were accepted by the Committee on Continuing Certification in April 2022.

Each of the task forces relied on several sources. The Standards for Educational and Psychological Testing26 served as the primary reference document when applicable. Given that LA is relatively new for certification testing, expert judgment was used when standards were not available. While the program design and the scoring and reporting task forces identified a lack of guidance in the literature for this type of assessment, the security task force found that many of the best practices and considerations for exam security for traditional assessments also applied to LAs. Although the specific methodological approach used by each task force differed, all task forces discussed recommendations as a group and reached consensus on all guidelines. The guidelines were collated by each task force chair and circulated as a complete document to all task force members, where final edits were made.

Program design

The program design task force developed guidelines in three subdomains: assessment purpose, content specifications, and design and administration. The group recognized that LAs are one component of a more comprehensive continuing certification program. The purposes and the relationship between the complete continuing certification program and the LA should be clear. Once the subdomains were identified, the task force created a list of minimum standards and best practices for each of the subdomains. Then, a final list was constructed through a consensus process based on the assessment design experience of each of the members.

Scoring and reporting

The scoring and reporting task force identified the potentially non-congruent design elements of formative and summative assessment. The task force discussed the tradeoffs between these two purposes. For example, making the most defensible summative decision is optimized by delivering exams in a secure, standardized environment which often limits the opportunity for feedback on specific items. Conversely, optimizing formative assessments relies on an exam platform that provides examinees with the freedom to respond to items in various settings (eg, home, office) and modes (eg, computer, tablet, phone) and receive detailed feedback about an item including rationales for correct and incorrect answers. The task force determined that each LA must be assessed individually to determine whether it is balancing formative and summative elements appropriately given its intended goal(s). The task force identified four domains for optimizing summative value, four domains for optimizing formative value, and the minimum standards and best practices within each domain.

Security

The security task force was unique in that, unlike the other task forces, a good deal of the existing testing literature was applicable to LAs. Nevertheless, the nature of LA makes the considerations and options for ensuring a secure exam quite different from those for exams administered in a testing center. Thus, the task force compiled a list of security measures that might be considered; the chosen combination should be documented, along with a rationale for how it supports the underlying validity argument of the exam. The task force established a framework, based on Foster’s two-dimensional matrix,27 to determine the features that an LA security protocol could have. The task force members determined which of the security measures were “best practices” and reached consensus on ratings for each measure on effectiveness, ease of implementation, user experience, and cost. As an additional step, three external US organizations (Caveon Security, Salt Lake City, UT; Cornerstone Strategies, Massapequa, NY; and Internet Testing Systems, Baltimore, MD) reviewed the general concepts and specific items in the security guidelines as external validation. Based on this feedback, a few items were modified.

Recommendations

The program design task force developed guidelines in three subdomains: assessment purpose framework, content specifications framework, and design and administration specifications (Table 1). An LA program is typically one component of a larger continuing certification program. The purposes of both the continuing certification program and the LA should be provided. The purpose statement of the assessment should include both what is intended to be measured and how the results are intended to be interpreted and used. For example, a purpose statement might state, “Completion of at least 5 years of knowledge-based Quarterly Questions will determine whether an individual possesses the clinical knowledge and judgment necessary to provide board-certified-level care to their patients.”

Table 1

Program Design Minimum Standards and Best Practices for Longitudinal Assessment (LA=longitudinal assessment; n/a=not applicable)

A systematic process for determining the content (eg, assessment topics, knowledge, and skills) to be assessed should include input from a representative group of subject matter experts. To support the underlying meaning of the certificate, some portion of the assessment should be dedicated to core knowledge. If offered, content customization should be supported by a rationale. For example, a board-certified ophthalmologist specializing in pediatric ophthalmology may not need to demonstrate knowledge and judgment related to certain surgeries that are outside of their pediatric population; however, some core ophthalmic knowledge is needed by all board-certified ophthalmologists and should comprise some defined proportion of the assessment. Lastly, all design and administration elements of the LA process should be documented with a rationale that links each to the assessment’s overall purpose. This includes selection of item formats that support the intended construct of measurement or “goal” of the test (eg, use of multiple-choice questions for assessment of knowledge and judgment). Other administration features, such as time limits and frequency of content delivery, should also be determined by the intended interpretation and use of test scores.

The scoring and reporting task force identified minimum standards and best practices for both summative decision-making and formative feedback (Tables 2a and 2b). The subdomains relative to summative decision-making were calculating total scores and key validation (the process by which answer keys are re-checked after reviewing item performance data), the incorporation of time and total scores, standard setting, as well as scaling (transforming raw scores to a reporting scale) and equating (maintaining a consistent passing standard over time) (Table 2a). Best practices included establishing a reliability coefficient of ≥0.80 to support a defensible summative decision and using multiple panels of content experts in standard setting so that there is broad representation from the field. The subdomains pertaining to formative feedback were item-level performance feedback, identification of strengths and weaknesses, promotion of learning and retention, and the use of confidence/relevance ratings (Table 2b).
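The ≥0.80 reliability benchmark is commonly evaluated with an internal consistency coefficient such as Cronbach’s alpha, which for dichotomously scored (correct/incorrect) items is equivalent to KR-20. A minimal sketch in Python, using a small hypothetical response matrix (not data from any Member Board program):

```python
from statistics import variance

def cronbach_alpha(responses):
    """Cronbach's alpha for an examinee-by-item matrix of 0/1 scores.

    alpha = (k / (k - 1)) * (1 - sum(item variances) / total-score variance),
    where k is the number of items. Equivalent to KR-20 for 0/1 items.
    """
    k = len(responses[0])
    item_vars = [variance(col) for col in zip(*responses)]   # per-item variance
    total_var = variance(sum(row) for row in responses)      # total-score variance
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical scored responses: 6 examinees x 4 items (1 = correct)
scores = [
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
]
print(f"alpha = {cronbach_alpha(scores):.2f}")  # alpha = 0.83
```

In practice an operational LA pools responses over many administration windows, but the coefficient itself is computed the same way; a value at or above 0.80 would meet the best practice noted above.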

Table 2a

Summative Decision-Making Minimum Standards and Best Practices for Longitudinal Assessment (LA=longitudinal assessment; n/a=not applicable)

Table 2b

Formative Feedback Minimum Standards and Best Practices for Longitudinal Assessment (LA=longitudinal assessment; n/a=not applicable)

Best practices here focused on continued and ongoing study to determine the best ways to support learning and to determine the utility of confidence/relevance ratings by the examinee.

Security of exam content is a prerequisite for an exam’s validity argument, and the potential for compromise of content is affected by the purpose of the assessment, the assessment stakes, and the item pool size, which vary across programs. The security task force recognized that the testing organization must relinquish some security control over the design, administration, and oversight of the exam to allow for the flexibility and frequency of item delivery. Thirty-two security measures were identified (Table 3). The starred items were deemed “best practices” for LA security.

From the ratings, several of the measures received a level four (highest) rating in their category. The highest effectiveness rating went to having an empowered group or committee to review and act on any security compromises that arise. Seven measures received a level four rating for ease of implementation, including having a code of conduct attestation, educating examinees about expectations, measuring item exposure and drift (ie, how often items are presented to examinees and whether their difficulty level changes over time), randomizing the order of items and distractors (the incorrect answers in a multiple-choice question), limiting the duration of time to answer an item, warning not to re-use passwords, and creating an end user agreement. Seven measures were rated as least costly, including creating a code of conduct attestation, measuring item exposure and drift, randomizing the order of items and distractors, limiting the duration of time to answer an item, warning not to re-use passwords, and creating an end user agreement.
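One way to operationalize the item drift measure above is a two-proportion z-test comparing an item’s difficulty (proportion correct) between an early and a recent administration window; a large shift can signal content exposure. This is one common approach, not necessarily the method any Member Board uses, and the counts below are hypothetical:

```python
from math import sqrt

def flag_drift(early, recent, z_crit=2.58):
    """Flag items whose proportion correct shifted between two windows.

    early, recent: lists of (n_correct, n_attempts) per item, same order.
    Uses a pooled two-proportion z-test; z_crit=2.58 corresponds to
    roughly p < .01 (two-sided). Returns the indices of flagged items.
    """
    flagged = []
    for i, ((c1, n1), (c2, n2)) in enumerate(zip(early, recent)):
        p1, p2 = c1 / n1, c2 / n2
        pooled = (c1 + c2) / (n1 + n2)
        se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        if se > 0 and abs(p1 - p2) / se > z_crit:
            flagged.append(i)
    return flagged

# Hypothetical exposure data: item 1's difficulty rose from .50 to .80
# correct, a pattern consistent with item compromise.
early  = [(70, 100), (50, 100), (62, 100)]
recent = [(68, 100), (80, 100), (60, 100)]
print(flag_drift(early, recent))  # [1]
```

A flagged item would then go to the empowered security committee described above for review, since benign causes (eg, curriculum change in the specialty) can also shift difficulty.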

Conclusion and Application

Assessment is necessary for a certifying board to ensure individuals maintain their knowledge, judgment, and skills in their specialty. After many years of intermittent point-in-time exams, ABMS Member Boards have moved to LA for continuing certification. Longitudinal assessments can address many of the functional arguments against the decennial examination (burden, frequency, lack of formative value, and insufficient feedback), allow the Member Boards to make a summative decision based on performance over time, and provide formative feedback to the examinees.9-13 Development of these minimum standards and best practices for LAs relied on the extensive experience of the ABMS Member Boards and acknowledged the potential for LAs to combine the summative and formative functions of assessment. These core and best practice recommendations (Tables 1-3) provide other certifying organizations with a starting point for LA programs.

Table 3

Security Measures for Prevention, Mitigation, and Detection of Exam Compromise in Longitudinal Assessment (LA = longitudinal assessment; ***= Best Practices; CAT = Computerized Adaptive Testing; LOFT = Linear On-the-fly Testing)

The use of measurement experts from ABMS Member Boards, the consensus-based model for guideline development, and the parsing of the guidelines into three task forces make these recommendations convenient to use. A certifying entity that intends to initiate an LA program needs to start with a clear purpose and a job analysis to identify the core content (Table 1). Balancing and blending the summative and formative purposes of the LA is key to making a certification decision while supporting the learning of the examinee (Tables 2a & 2b). Finally, the administration and security of the exam must be robust enough to support decision making and the integrity of the process (Table 3).

Seven of the Member Boards use a common LA delivery platform (CertLink; Chicago, IL, US) that is managed by ABMS. Over the past five years of the CertLink LA, examinees have answered over 1.5 million questions, and over 95% rate the experience as a “helpful learning tool.” Additionally, many LAs include repeat or “clone” questions—items that an individual answered incorrectly the first time that are re-presented as the same or a similar question—which have been shown to be more likely to be answered correctly on a subsequent attempt.28 Member Boards also report similar or higher internal test reliability for their LA compared to their decennial exam.29

These guidelines should be considered a starting point. While the ABMS Member Boards have breadth and depth of experience with LA, most operational LAs are still evolving. We encourage others to develop and implement LA programs and to assess and refine these recommendations. There is a rich body of evidence linking traditional certification maintenance to improved outcomes in practice.30-35 In theory, the association between maintenance of certification and practice outcomes will persist given the continued summative assessment requirement. Additionally, given the added formative elements and test-enhanced learning design, we theorize that more frequent testing will lead to greater retention and thus better patient care. However, despite the aforementioned early success of LAs, it is imperative that future research investigate whether the relationship between continuing certification and practice outcomes persists. The ABMS Member Boards are committed to continuing to evaluate the optimal way that LA programs can provide both a rigorous assessment process and examinee learning to maintain the public trust in board certification.

Acknowledgements:

The authors wish to acknowledge the contributions of the task force members: Sara Locatelli, PhD; Andrew Dwyer, PhD; Ann Harman, PhD; Mary M. Johnston, PhD; Kevin Joldersma, PhD; Jerome C. Clauser, EdD; Mikaela M. Raddatz, PhD; Linda Althouse, PhD; Cathleen Koenig, MS, CESP; Kyle Miller, and Derek Sauder, PhD. They also acknowledge the editing support from Rachel Knapp.

Footnotes

  • Open Access: © 2025 The Authors. Published by the Journal of Medical Regulation. This is an Open Access article under the terms of the Creative Commons Attribution-NonCommercial License (CC BY-NC, https://creativecommons.org/licenses/by-nc/4.0/), which permits use and distribution in any medium, provided the original work is properly cited, and the use is noncommercial.

  • Funding/support: N/A

  • Other disclosures: The authors are full-time employees at the primary institutions listed and have no other disclosures.

  • Author contributions: GO developed the task forces and selected the chairs (SS, AJ, BB). GO, SS, AJ, and BB developed the longitudinal assessment guidelines with the acknowledged committee members. SS drafted the majority of the first manuscript draft, with critical review and edits by AJ, BB, and GO in subsequent drafts. All authors reviewed and approved the final version.

  • Received August 28, 2024.
  • Revision received October 30, 2024.
  • Accepted November 22, 2024.

References

  1. ABMS Member Boards. American Board of Medical Specialties. Accessed April 3, 2025. https://www.abms.org/member-boards/
  2. Cordovani L, Wong A, Monteiro S. Maintenance of certification for practicing physicians: a review of current challenges and considerations. Can Med Educ J. 2020;11(1):e70-e80. doi:10.36834/cmej.53065
  3. Rosner MH. Maintenance of certification: framing the dialogue. Clin J Am Soc Nephrol. 2018;13(1):161. doi:10.2215/CJN.07950717
  4. Kavic MS. Maintenance of certification. JSLS. 2009;13(1):1-3.
  5. Levinson W, King TE, Goldman L, Goroll AH, Kessler B. Clinical decisions. American Board of Internal Medicine maintenance of certification program. N Engl J Med. 2010;362(10):948-952. doi:10.1056/NEJMclde0911205
  6. Teirstein PS. Boarded to death — why maintenance of certification is bad for doctors and patients. N Engl J Med. 2015;372(2):106-108. doi:10.1056/NEJMp1407422
  7. Yam CHK, Griffiths SM, Liu S, Wong ELY, Chung VCH, Yeoh EK. Medical regulation: ten key trends emerging from an international review. J Med Regul. 2016;102:16-27. doi:10.30770/2572-1852-102.1.16
  8. MOCA Minute. The American Board of Anesthesiology. Accessed April 3, 2025. https://www.theaba.org/maintain-certification/moca-minute/
  9. Rottman BM, Caddick ZA, Nokes-Malach TJ, Fraundorf SH. Cognitive perspectives on maintaining physicians’ medical expertise: I. Reimagining maintenance of certification to promote lifelong learning. Cogn Res Princ Implic. 2023;8(1):46. doi:10.1186/s41235-023-00496-9
  10. Caddick ZA, Fraundorf SH, Rottman BM, Nokes-Malach TJ. Cognitive perspectives on maintaining physicians’ medical expertise: II. Acquiring, maintaining, and updating cognitive skills. Cogn Res Princ Implic. 2023;8(1):47. doi:10.1186/s41235-023-00497-8
  11. Fraundorf SH, Caddick ZA, Nokes-Malach TJ, Rottman BM. Cognitive perspectives on maintaining physicians’ medical expertise: III. Strengths and weaknesses of self-assessment. Cogn Res Princ Implic. 2023;8(1):58. doi:10.1186/s41235-023-00511-z
  12. Fraundorf SH, Caddick ZA, Nokes-Malach TJ, Rottman BM. Cognitive perspectives on maintaining physicians’ medical expertise: IV. Best practices and open questions in using testing to enhance learning and retention. Cogn Res Princ Implic. 2023;8(1):53. doi:10.1186/s41235-023-00508-8
  13. Nokes-Malach TJ, Fraundorf SH, Caddick ZA, Rottman BM. Cognitive perspectives on maintaining physicians’ medical expertise: V. Using a motivational framework to understand the benefits and costs of testing. Cogn Res Princ Implic. 2023;8(1):64. doi:10.1186/s41235-023-00518-6
  14. Ismail SM, Rahul DR, Patra I, Rezvani E. Formative vs. summative assessment: impacts on academic motivation, attitude toward learning, test anxiety, and self-regulation skill. Lang Test Asia. 2022;12(1):40. doi:10.1186/s40468-022-00191-4
  15. Nieminen JH, Asikainen H, Rämö J. Promoting deep approach to learning and self-efficacy by changing the purpose of self-assessment: a comparison of summative and formative models. Stud High Educ. 2021;46(7):1296-1311. doi:10.1080/03075079.2019.1688282
  16. Connors CB. Summative and formative assessments: an educational polarity. Kappa Delta Pi Rec. 2021;57(2):70-74. doi:10.1080/00228958.2021.1890441
  17. Fraundorf SH, Caddick Z, Rottman BM, et al. Conceptual foundations for designing continuing certification assessments for physicians. July 2022. Accessed April 4, 2025. https://www.abms.org/wp-content/uploads/2022/07/conceptual-foundations-continuing-certification-assessments-for-physicians.pdf
  18. O’Neill TR, Newton WP, Brady JE, Spogen D. Using the family medicine certification longitudinal assessment to make summative decisions. J Am Board Fam Med. 2019;32(6):951-953. doi:10.3122/jabfm.2019.06.190345
  19. Horber DT, Flamini J, Gimpel JR, Tsai TH, Shrum K, Hudson K. CATALYST: piloting a longitudinal assessment and learning program for board recertification and continuous professional development. J Osteopath Med. 2020;120(3):190-200. doi:10.7556/jaoa.2020.031
  20. Ward RC, Baker KA, Spence D, Leonard C, Sapp A, Choudhry SA. Longitudinal assessment to evaluate continued certification and lifelong learning in healthcare professionals: a scoping review. Eval Health Prof. 2023;46(3):199-212. doi:10.1177/01632787231164381
  21. Griffis CA, Dishman D, Giron SE, Ward RC, McMullan SP. Concept analysis of longitudinal assessment for professional continued certification. Nurs Forum. 2022;57(2):311-317. doi:10.1111/nuf.12678
  22. Newton WP, Baxley E, O’Neill T, Rode K, Fain R, Stelter K. Family medicine certification longitudinal assessment becomes permanent. J Am Board Fam Med. 2021;34(4):879-881. doi:10.3122/jabfm.2021.04.210242
  23. Colenda CC, Scanlon WJ, Hawkins RE. Vision for the future of continuing board certification. JAMA. 2019;321(23):2279-2280. doi:10.1001/jama.2019.4815
  24. Committees | Board of Directors. American Board of Medical Specialties. Accessed April 3, 2025. https://www.abms.org/inside-abms/governance/committees-of-the-abms-board-of-directors/
  25. ABMS Board Certification Report 2021-2022. Published 2023. Accessed April 5, 2025. https://www.abms.org/wp-content/uploads/2023/04/ABMS-Board-Certification-Report-2021-2022.pdf
  26. American Educational Research Association, American Psychological Association, National Council on Measurement in Education. Standards for Educational and Psychological Testing. American Educational Research Association. Accessed April 3, 2025. https://www.aera.net/publications/books/standards-for-educational-psychological-testing-2014-edition
  27. Foster D. The language of security and test security. Published 2015. Accessed April 4, 2025. https://caveon.com/resources/the-language-of-security-and-test-security/
  28. Locatelli S. Evidence of learning and retention in a longitudinal assessment program. Poster presented at: ABMS Conference 2023; September 2023. Accessed April 3, 2025. https://abmsconference.eventscribe.net/fsPopup.asp?efp=SVZUSFNLREgxOTk1Nw&PosterID=614891&rnd=0.8626193&mode=posterInfo
  29. Schnabel SD. Director of Psychometrics and Assessment, American Board of Ophthalmology. Personal communication with A. Jones, B. Brossman, December 15, 2023.
  30. Buscemi D, Wang H, Phy M, Nugent K. Maintenance of certification in internal medicine: participation rates and patient outcomes. J Community Hosp Intern Med Perspect. 2012;2(4). doi:10.3402/jchimp.v2i4.19753
  31. Capoocia A. Value of the continuing certification modules and challenging the status quo. J Osteopath Med. 2020;120(3):128-132. doi:10.7556/jaoa.2020.025
  32. Norcini JJ, Weng W, Boulet J, McDonald F, Lipner RS. Associations between initial American Board of Internal Medicine certification and maintenance of certification status of attending physicians and in-hospital mortality of patients with acute myocardial infarction or congestive heart failure: a retrospective cohort study of hospitalisations in Pennsylvania, USA. BMJ Open. 2022;12(4):e055558. doi:10.1136/bmjopen-2021-055558
  33. Vandergrift JL, Gray BM. Physician clinical knowledge, practice infrastructure, and quality of care. Am J Manag Care. 2019;25(10):497-503.
  34. Chesluk B, Gray B, Eden A, Hansen E, Lynn L, Peterson L. “That was pretty powerful”: a qualitative study of what physicians learn when preparing for their maintenance-of-certification exams. J Gen Intern Med. 2019;34:1790-1796. doi:10.1007/s11606-019-05118-z
  35. Sheth BP, Schnabel SD, Comber BA, Martin B, McGowan M, Bartley GB. Relationship between the American Board of Ophthalmology Maintenance of Certification Program and actions against the medical license. Am J Ophthalmol. 2023;247:1-8. doi:10.1016/j.ajo.2022.11.001