ABSTRACT
To investigate the practice characteristics of newly licensed physicians for the purpose of identifying the knowledge and skills expected of those holding the general, unrestricted license to practice medicine, a questionnaire was mailed in May 2011 to 8,001 U.S. physicians who had been granted an unrestricted license to practice medicine between 2007 and 2011. The questionnaire requested information on stage of training, moonlighting, and practice setting; it also listed 58 clinical procedures and asked respondents to indicate whether they had ordered, performed, or interpreted the results of each procedure since obtaining their unrestricted license. A strategy was implemented to identify the relevance of each clinical activity for undifferentiated medical practice. The response rate was 37%. More than two-thirds of newly licensed physicians still practiced within a training environment; nearly one-half of those in training reported moonlighting, mostly in inpatient settings or emergency departments. Physicians who had completed training and entered independent practice spent most of their time in outpatient settings. Residents/fellows engaged in a broader range of clinical activities than physicians in independent practice. Several clinical procedures were identified that were specialty-specific and did not appear to be skills expected for general medical practice. The results may help residency programs and licensing authorities identify the knowledge and skills required of newly licensed physicians as they transition from supervised to unsupervised practice. The results are relevant to the topic of moonlighting by identifying the skills and procedures required of physicians who engage in this activity.
While the study identified procedures that have limited utility for licensure decisions because they are not consistent with general medical practice, the inclusion of such procedures in residency may add value by promoting beneficial variation in training experiences.
Introduction
The primary goal of undergraduate and graduate medical education is to prepare physicians for the independent practice of medicine.1,2,3,4 The decision to issue a full, unrestricted license in the United States for independent practice rests with the medical licensing authority in each state or territory. The requirements set by individual licensing authorities typically include verified credentials obtained through accredited undergraduate and graduate medical education programs, criminal background checks, and successful completion of a designated examination. The United States Medical Licensing Examination® (USMLE®) contributes to this process by providing a common assessment for physician licensure. The USMLE is a three-step examination that assesses a physician's ability to apply medical knowledge and to demonstrate fundamental patient-centered skills that are important in health and disease, and that constitute the basis of effective patient care. Given the unrestricted nature of the license, the design of the USMLE has been based on a model intended to reflect general undifferentiated medical practice (GUMP). The GUMP model assumes that there is a common core of knowledge and clinical skill required to provide effective patient care at entry into practice, regardless of specialty. A challenge shared by medical educators, USMLE, and medical licensing authorities is to identify this core knowledge and skill.4,5,6
Various data are available to inform curriculum design and establish licensure requirements, including blue ribbon panel reports, curriculum outlines,1,2,7,8 surveys of practice,4,9,10 and databases such as the National Ambulatory Medical Care Survey (NAMCS).11 These data sources, while valuable, have limited use with respect to the GUMP model.
For example, while large-scale practice questionnaires are recommended in the literature,5,12 most are specialty specific or include both newly licensed and seasoned physicians. The purpose of the present research is to identify the skills required at entry into GUMP by surveying a national sample of recently licensed physicians. For purposes of this study, recently licensed physicians are those who have held an unrestricted license for the independent practice of medicine for four years or less. A practice questionnaire was developed to determine the settings in which entry-level physicians practice and to identify core clinical procedures required of them. While this study was completed to help shape the content of USMLE Step 3 — the last examination required for unsupervised, independent practice — the results may be useful for curriculum development and of interest to medical licensing authorities whose decision to issue a license hinges, in part, on completion of the USMLE sequence.
Methods
A two-page questionnaire was piloted in 2010 and finalized in winter 2011. The first page addressed training and licensure status, reason for obtaining a license, moonlighting, and practice setting. The second page listed 58 diagnostic studies and interventional procedures; respondents indicated whether they ordered, performed, or interpreted the results of each. Previous studies by the National Board of Medical Examiners (NBME) addressed routine clinical activities such as performing physical examinations, interpreting chest x-rays, communicating bad news, and obtaining informed consent; the results of those studies confirmed the importance of such activities to the GUMP model.13 In contrast, the list of clinical activities on the present survey was broad and included procedures that might (or might not) be specific to one or two specialties or performed by only a minority of physicians. The purpose of including such activities was to help define the boundaries of GUMP.
The sample included MDs and DOs representing all specialties and regions of the United States. Contact information was available from a national physician database maintained by the Federation of State Medical Boards (FSMB). Based on public data from the U.S. Census and the AMA,14 sampling was stratified to reflect the distribution of physicians among nine U.S. geographic regions. For example, 6.9% of physicians in the US practice in the New England states, so the sample was constructed to include that same percentage. Questionnaires were mailed during the first week of May 2011 to 8,001 physicians. Recipients were informed that those who responded would be entered into a drawing to receive $1,800 to attend a CME offering of their choice. Approximately two weeks later nonrespondents received a reminder letter and a replacement survey.
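The proportional allocation described above amounts to simple arithmetic. The sketch below illustrates it; every regional share except New England's 6.9% is an invented placeholder, not a figure from the study.

```python
# Hypothetical sketch of proportional (stratified) allocation: given the
# total number of surveys and each region's share of the U.S. physician
# population, compute how many questionnaires to mail per region.
# Only the New England share (6.9%) comes from the text; the rest are
# placeholders. Real implementations also adjust for rounding drift.

def allocate_sample(total, region_shares):
    """Allocate `total` surveys across regions in proportion to
    `region_shares` (a dict of region -> population share, summing to ~1)."""
    return {region: round(total * share)
            for region, share in region_shares.items()}

shares = {
    "New England": 0.069,        # 6.9%, as cited in the text
    "Middle Atlantic": 0.155,    # placeholder
    "All other regions": 0.776,  # placeholder remainder
}
allocation = allocate_sample(8001, shares)
# New England receives roughly 6.9% of the 8,001 mailed surveys
```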
Data analysis focused on the practice settings of newly licensed physicians and the clinical activities they perform, including moonlighting activities. In addition, logistic regression was used to compare clinical activities by stage of training. The final analysis applied a strategy to evaluate the relevance of each clinical activity across specialties. Study procedures were approved by the institutional review board (IRB) administered by the American Institutes for Research (IRB00000436).*
Results
Response Rate and Demographics
Of the 2,832 returned surveys, 2,448 were available for analysis after screening for missing data and other selection criteria (e.g., active practice). After adjusting for ineligible and undeliverable surveys, the final response rate was 37.1%. Responses were weighted such that the distribution of specialties in the sample of respondents would mirror the actual regional distribution of specialties in the US.14,15
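The post-stratification weighting described above can be illustrated with a minimal sketch: each respondent in a specialty receives a weight equal to the specialty's population share divided by its sample share, so that weighted counts mirror the population. The specialty names, counts, and shares below are hypothetical, not the study's data.

```python
# Minimal sketch of post-stratification weighting (illustrative only;
# the specialty names, counts, and population shares are invented).

def poststratification_weights(sample_counts, population_shares):
    """Weight for each specialty = population share / sample share."""
    n = sum(sample_counts.values())
    return {spec: population_shares[spec] / (sample_counts[spec] / n)
            for spec in sample_counts}

sample_counts = {"internal medicine": 600, "surgery": 200, "other": 1648}
population_shares = {"internal medicine": 0.30, "surgery": 0.10, "other": 0.60}
weights = poststratification_weights(sample_counts, population_shares)

# After weighting, each specialty's weighted count matches its
# population share, and the weighted total equals the sample size.
weighted_total = sum(weights[s] * sample_counts[s] for s in sample_counts)
```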
Table 1 presents the number of respondents by specialty. The unweighted values are based on the responding sample, while the weighted values are the estimated percentages for the entire population of recently licensed physicians. Standard errors (SEs) for the specialty population estimates are generally less than 1%. The similarity between most sample values and population estimates indicates that the sample was fairly representative of the population. Other demographics not presented in Table 1 are noteworthy: 71% of respondents were still engaged in training (57% of the full sample were residents and 14% were fellows). Of the 29% who had completed training and were engaged in unsupervised practice, most had finished within the past year. Most respondents reported that they had initially obtained their license to satisfy residency requirements.
Number and Percentage of Respondents by Specialty
Specialties were combined into larger groups to simplify presentation and to achieve adequate sample sizes for subsequent analyses. Based on traditional groupings and the results of empirical cluster analysis, we ultimately decided on five specialty groups: primary care, emergency medicine, obstetrics-gynecology (ob-gyn), surgery, and other. The primary care group includes family practice, internal medicine, and pediatrics, while surgery comprises general surgery and the surgical specialties. The “Other” group is a diverse collection of the remaining specialties from Table 1.
Moonlighting
Forty-two percent of residents/fellows reported moonlighting, while 33% of the entire sample obtained their license specifically to moonlight. However, the amount of time devoted to moonlighting appeared to be limited: about 42% of moonlighters worked four or fewer hours per week, another 30% worked five to eight hours per week, and 28% worked more than eight hours per week, with a portion of this latter group working more than 16 hours per week. Moonlighting rates varied considerably by specialty. The percentage of residents/fellows moonlighting in each specialty was: emergency medicine = 56%; primary care = 47%; surgery = 40%; other = 29%; and ob-gyn = 14%. Overall, moonlighters were slightly more likely than nonmoonlighters to perform or interpret the various clinical activities. However, the small number of respondents for some specialties and limitations in data collection attenuated the differences between moonlighters and nonmoonlighters. Specifically, the responses of those who moonlight also reflected their clinical activities while not moonlighting, which account for most of their practice time. This confounding would have resulted in an underestimate of the effects of moonlighting.
Practice Setting
Respondents indicated the percentage of time spent in each of 12 specific practice settings, which were collapsed into four major settings: inpatient, emergency department, outpatient, and other (e.g., academic). The question on setting was completed only by residents and fellows who moonlighted and by those who had completed training. As Table 2 documents, there are obvious and predictable differences between specialties (e.g., emergency medicine physicians spend the most time in emergency departments). However, a more notable finding is the shift from inpatient to outpatient settings as physicians move from residency/fellowship to independent practice. This trend was evident even for moonlighting surgeons.
Mean Percent of Time in Each Setting by Specialty Group and Stage of Training.*
Clinical Activities and Stage of Training
Table 3 indicates the percentage of all respondents who ordered, performed, or interpreted each procedure. Logistic regression was employed to investigate the relationship between stage of training and the percentage who perform each clinical activity. While the regression models included both stage of training and specialty, the latter was not of particular interest because large differences between specialties are to be expected; both variables were included so that the effect of stage of training could be tested while statistically controlling for specialty. Residents/fellows served as the reference group; thus, an odds ratio (OR) greater than 1.0 indicates that physicians who had completed training had greater odds of engaging in an activity, while an OR less than 1.0 indicates that residents/fellows had greater odds. We computed a 95% confidence interval around each OR to determine whether it differed significantly from 1.0.
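For readers unfamiliar with the arithmetic behind these comparisons, the sketch below shows how an odds ratio and its Wald 95% confidence interval can be computed from a simple 2x2 table. The counts are invented, and the study's actual models were multivariable (controlling for specialty), so this unadjusted example is illustrative only.

```python
# Illustrative (not the study's code): odds ratio and Wald 95% CI for a
# 2x2 table of group membership by whether an activity was performed.
# All counts below are invented.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI for the table [[a, b], [c, d]].

    a/b: group 1 performed / did not perform
    c/d: group 2 performed / did not perform
    """
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, (lo, hi)

# An OR below 1.0 whose confidence interval excludes 1.0 indicates a
# statistically significant difference between the two groups.
or_, (lo, hi) = odds_ratio_ci(60, 140, 150, 90)
```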
Percentage who indicated involvement with the following activities (order, perform, interpret results) since obtaining an independent license. Standard errors range from 0.5% to 1.1%.
Ordered Procedures. Of the 17 procedures that could be ordered, the percentage of physicians with a positive response ranged from a low of 37% for #13 Alcohol detoxification to a high of 77% for #36 CT or MRI of the head (see Table 3). Other commonly ordered procedures included #37 CT of thorax or abdomen, #30 Respiratory medication by nebulizer, and #18 Transfusion of blood products.
While all procedures were ordered to some extent by both groups, residents/fellows were more likely to order procedures than physicians in independent practice, with 14 of the 17 procedures exhibiting significant ORs. The largest effect was for #35 Continuous invasive mechanical ventilation (OR=0.25), which residents and fellows were much more likely to order. Other procedures that residents were far more likely to order included #12 Parenteral infusion of concentrated nutritional substance (OR=0.28); #27 Infusion of vasopressor agent (OR=0.28); #11 Enteral infusion of concentrated nutritional substance (OR=0.32); and #18 Transfusion of blood products (OR=0.34). The remaining nine statistically significant effects were smaller, with ORs ranging from 0.40 to 0.76.
Performed Procedures. As indicated in Table 3, total group percentages ranged from 6% for #26 Cardiovascular stress test and #28 Simple spirometry to 57% for #54 Incision with drainage of skin and subcutaneous tissue. Other commonly performed procedures included #53 Closure of skin and subcutaneous tissue, #22 Arterial puncture, #21 IV central catheter placement, and #7 Lumbar puncture.
Residents/fellows were more likely to perform 24 of the procedures, exhibiting far greater odds for: #51 Biopsy of bone marrow (OR=0.12); #5 Thoracentesis (OR=0.26); #23 Arterial line placement (OR=0.26); #22 Arterial puncture (OR=0.27); #21 IV central catheter placement (OR=0.28); #40 Ultrasound guided procedures (e.g., thoracentesis, paracentesis) (OR=0.28); #7 Lumbar puncture (OR=0.28); #6 Paracentesis (OR=0.29); and #26 Cardiovascular stress test (OR=0.32). There were 17 additional procedures favoring residents, with ORs ranging from 0.36 to 0.72. In contrast, six procedures were more likely to be performed by those who had completed training: #30 Respiratory medication by nebulizer (OR=1.67); #48 Evaluate for corneal ulcer (OR=1.61); #56 Local excision or destruction of skin lesion or tissue (OR=1.58); #54 Incision with drainage of skin and subcutaneous tissue (OR=1.54); #55 Debridement of wound, infection or burn (OR=1.47); and #58 Debridement of nail, nail bed, or nail fold (OR=1.33).
Interpreting Results. The percentage of physicians in the total group who interpreted results of the 20 studies ranged from 7% for #51 Biopsy of bone marrow to 48% for #25 Continuous ECG monitoring, with a mean of 26%. The most frequently interpreted procedures included #25 Continuous ECG monitoring, #7 Lumbar puncture, #29 Peak flow measurement, #36 CT or MRI of head, and #37 CT of thorax or abdomen. The latter two also were among the most commonly ordered procedures (see Table 3).
The trend for broader involvement by residents/fellows was also evident for interpreting the results of procedures. Residents/fellows were more likely than physicians who had completed training to interpret results of #51 Biopsy of bone marrow (OR=0.12); #24 Swan-Ganz catheterization (OR=0.27); #36 CT or MRI of head (OR=0.33); #37 CT of thorax or abdomen (OR=0.33); and #5 Thoracentesis (OR=0.34). An additional 10 procedures exhibited ORs ranging from 0.36 to 0.62. Only one activity had greater odds of being interpreted by physicians who had completed training: #48 Evaluate for corneal ulcer (OR=1.44). Perhaps the decreased rates of involvement in procedures by those who completed training can be explained, in part, by the shift from inpatient to outpatient practice settings.
Core Clinical Activities
This analysis sought to identify clinical activities that may not be consistent with the GUMP model underpinning the USMLE because of their lack of applicability to many specialties. Data for individual specialties were analyzed, rather than larger specialty groups, to the extent that sample sizes permitted. Fourteen specialties were included: anesthesiology; dermatology; emergency medicine; family practice; internal medicine; neurology combined with physical medicine and rehabilitation; obstetrics-gynecology; ophthalmology; pathology; pediatrics; psychiatry; radiology; surgery; and all others. Most specialties included more than 100 physicians; only dermatology (n = 54) and other (n = 21) had fewer than 100 respondents.
A core activity was defined as one that is relevant to at least one-third of specialties, with relevance defined as at least 20% of physicians within the specialty engaging in the activity. Given the 14 specialties included in this analysis, the one-third criterion corresponds to at least five specialties. We acknowledge that while this criterion seems reasonable, it is arbitrary. Table 4 identifies noncore activities, those that did not meet the relevance criterion for ordering, performing, or interpreting. All 17 ordered procedures were relevant to more than five specialties and can be considered core by this method. The ordered procedure that came closest to being considered noncore was #13 Alcohol detoxification, which was relevant to six specialties. Twenty-one of 48 performed procedures did not meet the criterion; these 21 procedures appear in Table 4. Another way to define core is by evaluating endorsement rates for all physicians, as in Table 3, where it can be seen that several of the 21 procedures were performed by a small percentage of the total group. For example, procedures #26, #28, #46, and #51 each were performed by less than 10% of all physicians. Table 4 also indicates that nine interpreted procedures were flagged as noncore. Many of these procedures also have low endorsement rates in Table 3. While the results in Table 3 are strongly influenced by larger specialties, Table 4 has the benefit of being more sensitive to specialties with fewer physicians.
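The core/noncore rule lends itself to a short algorithmic sketch. In the example below, the activities, specialty labels, and endorsement rates are all invented for illustration; only the 20% relevance cutoff and the one-third-of-specialties criterion come from the text.

```python
# A sketch of the core/noncore classification rule described above:
# an activity is "relevant" to a specialty when at least 20% of that
# specialty's physicians engage in it, and "core" when it is relevant
# to at least one-third of specialties (with 14 specialties, at least
# five). All activities and rates below are invented.
import math

def classify_activities(endorsement, relevance_cutoff=0.20, core_fraction=1/3):
    """endorsement: {activity: {specialty: fraction engaging in it}}.
    Returns {activity: "core" or "noncore"}."""
    labels = {}
    for activity, by_specialty in endorsement.items():
        min_needed = math.ceil(len(by_specialty) * core_fraction)
        n_relevant = sum(rate >= relevance_cutoff
                         for rate in by_specialty.values())
        labels[activity] = "core" if n_relevant >= min_needed else "noncore"
    return labels

rates = {
    "lumbar puncture": {f"spec{i}": r for i, r in enumerate(
        [0.60, 0.50, 0.40, 0.30, 0.25, 0.20] + [0.05] * 8)},  # 6 relevant
    "bone marrow biopsy": {f"spec{i}": r for i, r in enumerate(
        [0.90, 0.30] + [0.05] * 12)},                          # 2 relevant
}
labels = classify_activities(rates)
# The first activity meets the five-specialty threshold; the second does not
```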
Clinical Activities Not Meeting the Criteria to be Considered Core.*
Discussion
These results have implications for physician education and licensure. Forty-two percent of residents/fellows reported moonlighting while still in training, an estimate lower than previously reported values.16,17 The prevalence of moonlighting is not surprising given that only two states mandate three years of progressive GME or the completion of a residency program as a requirement for an unrestricted medical license for both U.S. and international medical graduates.18,19 Moonlighting residents will not have the supervision to which they are accustomed and may be called upon to perform procedures with which they have limited experience.16,17 To help provide assurance that moonlighting residents are qualified, it is important for licensure tests to assess the knowledge and skills required to manage the patient conditions that moonlighting residents/fellows are likely to encounter, namely those seen in emergency department and inpatient settings.
Results indicate that as physicians complete training and enter independent practice, their practice setting shifts from inpatient to outpatient and their range of clinical activities narrows. Consequently, residents acquire skills during training that may be of limited use when they enter independent practice, and they may not gain certain outpatient clinical experiences essential for independent practice.20 This raises the question: should training mirror more closely the practice of medicine in the specialty for which the resident is preparing? On one hand, it has been convincingly argued that to facilitate the transition to independent practice, GME should be restructured to ensure that residents gain more experience in the outpatient settings in which they will ultimately spend most of their practice time.3,4,20,21,22 On the other hand, the current emphasis on inpatient settings may have educational value. One reason is that training should include a balance of common, low-impact diseases (e.g., spondylolisthesis) and rare, high-impact diseases (e.g., meningococcemia) of the type encountered in inpatient settings. Breadth of clinical experience, even if it does not carry over into independent practice, provides the opportunity for young physicians to hone their clinical acumen. Another reason is that the diversity of conditions seen by residents in inpatient settings may better prepare them to care for critically ill patients. Furthermore, the emergence of new specialties (e.g., hospitalist) argues for continued exposure to inpatient training. The key is to strike a balance: to structure residency education and physician licensure to allow for beneficial variation while minimizing nonproductive variation.
To help identify the clinical activities that contribute to beneficial variation in training, we developed and applied a method for differentiating core from noncore activities and identified several procedures that may not conform to the GUMP model. Clinical activities that lack broad applicability should be scrutinized for relevance to licensure before they are included on an examination. While it might be argued that any procedure required of any undifferentiated physician is relevant for a licensing exam, as a practical matter the number and types of questions that can be asked are limited. Procedures performed by few specialties, while appropriate for a specialty certifying examination, should be considered for exclusion from a general medical licensing examination. The nature of the physician's responsibility also has implications for training and assessment. For example, ordering a procedure requires knowledge of its indications, contraindications, complications, and underlying principles (e.g., physiology, pathology); interpreting results suggests that the physician also should understand the limitations of the procedure, its clinical utility, and the implications of positive or negative results. If one actually performs the procedure, then training and assessment should address actual performance. These considerations carry significance for the "national faculty" of USMLE volunteers who are responsible for overall examination design and who develop content specific to each Step. For example, to better meet the needs of state licensing authorities, USMLE is currently being revised to better distinguish between the knowledge, skills, and competencies required for supervised practice (residency) and those required for unsupervised practice (unrestricted license).23
A limitation of this study is that while most specialties were represented and the response rate was reasonable, nonresponse is a potential source of bias. For instance, the percentage of respondents who were still in training and reported moonlighting could be underestimated if this population was too busy to respond to a survey. Furthermore, the sample sizes for many specialties were too small to permit certain analyses. In addition, while the sample accurately reflected the distribution of newly licensed physicians, it was dominated by those still in residency or fellowship. Finally, the 58 activities included on the questionnaire were far from exhaustive, and the extent to which the results generalize to other procedures should be verified through additional questionnaires or by confirmation with other data (e.g., NAMCS). These limitations notwithstanding, the results should be useful for curriculum development and the design of licensure tests.
About the Authors
Mark R. Raymond, PhD, is Director of Research and Development, Test Development, National Board of Medical Examiners.
Janet Mee, MA, is Senior Research Measurement Analyst, Measurement Consulting Services, National Board of Medical Examiners.
Steven A. Haist, MD, MS, is Vice President, Test Development, National Board of Medical Examiners.
Aaron Young, PhD, is Assistant Vice President, Research and Data Integration, Federation of State Medical Boards.
Gerard F. Dillon, PhD, is Vice President, Licensure Programs, National Board of Medical Examiners.
Peter J. Katsufrakis, MD, MBA, is Senior Vice President, Assessment Programs, National Board of Medical Examiners.
Suzanne McEllhenney is Director, Program Management, United States Medical Licensing Examination.
David Johnson, MA, is Senior Vice President, Assessment Services, Federation of State Medical Boards.
*A comprehensive report (102 pp.) with standard errors and results for all logistic regression analyses (including specialty comparisons) is available from the first author.
- Copyright 2014 Federation of State Medical Boards. All Rights Reserved.
References:
- 1. Association of American Medical Colleges. Recommendation for Preclerkship Clinical Skills Education for Undergraduate Medical Education. Washington, DC: Association of American Medical Colleges; 2008.
- 2. Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach. 2007;29:648–654.
- 3. Burke MM, Haist SA, Griffith CH, Wilson JF, Mitchell CK. Preparation for practice: A survey of med-peds graduates. Teach Learn Med. 1999;11(2):80–84.
- 4. Morrow G, Johnson N, Burford B, et al. Preparedness for practice: The perceptions of medical graduates and clinical teams. Med Teach. 2012;34(2):123–135.
- 5. Clauser BE, Margolis MJ, Case SM. Testing for licensure and certification in the professions. In: Brennan RL, ed. Educational Measurement. 4th ed. Westport, CT: American Council on Education/Praeger; 2006.
- 6. Tamblyn R. Is the public being protected? Prevention of suboptimal medical practice through training programs and credentialing examinations. Eval Health Prof. 1994;17(2):198–221.
- 7. Surgical Council on Resident Education (SCORE). SCORE Patient Care Curriculum Outline. Philadelphia, PA: American Board of Surgery; 2008.
- 8. Nothnagle M, Sicilia J, Forman S, et al. Required procedural training in family medicine residency: A consensus statement. Fam Med. 2008;40(4):248–252.
- 9. Wong E, Stewart M. Predicting the scope of practice of family physicians. Can Fam Physician. 2010;56(6):e219–e225.
- 10. Probst J, Moore C, Baxley E, Lammie J. Rural-urban differences in visits to primary care physicians. Fam Med. 2002;34(8):609–615.
- 11. LaDuca A. Validation of professional licensure examinations: Professions theory, test design, and construct validity. Eval Health Prof. 1994;17:178–197.
- 12. Patterson F, Ferguson E, Thomas S. Using job analysis to identify core and specific competencies: implications for selection and recruitment. Med Educ. 2010;42:1195–1204.
- 13. Raymond MR, Mee J, King A, Haist SA, Winward ML. What new residents do during their initial months of training. Acad Med. 2011;86(suppl):59–62.
- 14. Smart DR. Physician Characteristics and Distribution in the US, 2010 Edition. Chicago, IL: American Medical Association.
- 15. Heeringa SG, West BT, Berglund PA. Applied Survey Data Analysis. Boca Raton, FL: Chapman & Hall/CRC; 2010.
- 16. Baldwin DC Jr, Daugherty SR. Moonlighting and indebtedness reported by PGY2 residents: It's not just about the money! Acad Med. 2002;77(10)(suppl):36–38.
- 17. Li J, Tabor T, Martinez M. Survey of moonlighting practices and work requirements of emergency medicine residents. Am J Emerg Med. 2000;18(2):147–151.
- 18. Federation of State Medical Boards. Essentials of a Modern State Medical and Osteopathic Act. 13th ed. Euless, TX: Federation of State Medical Boards; 2012.
- 19. Federation of State Medical Boards. State-specific requirements for initial medical licensure. http://www.fsmb.org/usmle_eliinitial.html; 2013. Accessed October 25, 2013.
- 20. Holmboe ES, Bowen JL, Green M, et al. Reforming internal medicine residency training: A report from the Society of General Internal Medicine's task force for residency reform. J Gen Intern Med. 2005;20:1165–1172.
- 21. Lister JR, et al. Evaluation of a transition to practice program for neurosurgery residents: Creating a safe transition from resident to independent practitioner. J Grad Med Educ. 2010;2(3):366–372.
- 22. Prince KJAH, van de Wiel MWJ, van der Vleuten CPM, Boshuizen HPA, Scherpbier AJJA. Junior doctors' opinions about the transition from medical school to clinical practice: A change of environment. Educ Health. 2004;17:323–331.
- 23. Haist SA, Katsufrakis PJ, Dillon GF. The evolution of the United States Medical Licensing Examination (USMLE): Enhancing assessment of practice-related competencies. JAMA. 2013;310(21):2245–2246.




