Completing Self-Assessment Modules During Residency Is Associated With Better Certification Exam Results

Lars E. Peterson, MD, PhD; Brenna Blackburn, MPH; Michael R. King, MD, MPH

Background and Objectives: Family medicine residents were recently required to complete Self-Assessment Modules (SAMs), part of the American Board of Family Medicine’s (ABFM) Maintenance of Certification for Family Physicians (MC-FP). We studied whether completing SAMs was associated with initial certification exam performance.

Methods: We used ABFM administrative data to identify all family medicine residency graduates who took the ABFM certification exam between 2010 and 2012. We used descriptive statistics to characterize resident and residency demographics by SAM participation. We used both multilevel linear and logistic regression to test for differences in score and pass rate controlling for resident and residency characteristics.

Results: A total of 8,348 graduates took the certification exam between 2010 and 2012. The first-time pass rate was 90.4%, and the mean score was 484.2 (SD=80.4). In unadjusted analysis, mean exam score and passing rates were similar regardless of SAM completion (490.7 versus 483.6 and 90.6% versus 90.4%, respectively). Using multilevel logistic and linear regression models, we found that completion of a SAM was associated with 62% increased odds of passing the exam (OR=1.62 [95% CI=1.05, 2.50]) and an 18.76-point score increase. Residents in residencies where greater than 10% of residents fail the examination were less likely to pass (OR=0.63 [CI=0.44, 0.89]), controlling for resident characteristics.

Conclusions: Prior to the new requirements, residents who completed a SAM had higher board scores and exam passing rates. Likelihood of passing initial board certification may be improved by requiring resident participation in MC-FP.

(Fam Med 2014;46(8):597-602.)

Family medicine residency program evaluations by the Accreditation Council for Graduate Medical Education (ACGME) include board certification examination take and pass rates.1,2 Examination pass rates are a readily identifiable and easily understood metric, and passing the examination is a necessary step for new graduates obtaining board certification. Programs and residents are constantly seeking effective, evidence-based study materials and assessments to enhance a resident’s education. However, many educational resources are not focused on family medicine or clinically applicable to primary care practice, and they vary in the level of supporting evidence. Through the Maintenance of Certification for Family Physicians (MC-FP) process, the American Board of Family Medicine (ABFM) produces Self-Assessment Modules (SAMs), which provide an in-depth, evidence-based knowledge assessment of the most relevant and common conditions seen in the specialty of family medicine.3 SAMs not only assess knowledge but also offer links to published studies, systematic reviews, and evidence-based guidelines that may provide an effective educational resource for residents.

Starting in July 2012, all residents entering ACGME-accredited Family Medicine Training Programs are required to complete an MC-FP stage prior to registering for the certification examination.4 This requirement was put in place to familiarize residents with the MC-FP process and with clinical simulation technology in the SAMs that will be introduced to the MC-FP examination in the future. To complete an MC-FP stage a resident must earn 50 MC-FP points that must include at least one SAM and one Part 4 module, usually a quality improvement project.

Since the beginning of MC-FP, resident physicians have been able to complete SAMs without paying MC-FP fees. A minority of residency program directors were already utilizing SAMs in their residencies prior to the new requirements. An unpublished April 2010 survey of program directors by the ABFM found that only 25% of respondents were using MC-FP tools in their residency and that most were unaware residents could participate in MC-FP activities without paying fees.5 Now that MC-FP is a requirement for residents, we sought to study whether participating in MC-FP would lead to any change in certification examination pass rates. Therefore, our objective was to determine whether completing SAMs during residency was associated with higher examination pass rates and higher board scores.


Methods

Study Sample

From the ABFM administrative database, we identified all physicians who completed a family medicine residency between 2010 and 2012 and who took the ABFM certification exam. The sample was restricted to those who completed a residency within the 50 states in the United States or Washington, DC.

Study Variables

From the ABFM database we identified SAMs completed in residency by whether the completion date was prior to the residency graduation date. SAMs have two parts, a 60-item multiple-choice question knowledge assessment and a clinical simulation. We classified only residents who completed both sections as having completed the SAM, which is the standard in the new requirements. Resident demographic information included age in years at residency graduation, gender, medical degree (MD versus DO), current certification status as of February 1, 2013, and International Medical Graduate (IMG) status. Prior work has shown that DOs and IMGs are less likely to pass the certification examination.6 We identified the last In-Training Exam (ITE) score prior to the certification exam to control for baseline performance on standardized tests and because ITE score is associated with certification exam results.7 We also hypothesized that residents with low ITE scores may be more likely to use a SAM as a study aid. Certification examination results included both pass/fail status and the actual scaled score. Scores range from 200 to 800, with 390 representing the passing score in each year of our study.
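As an illustration, the classification rules above can be sketched as two small functions. This is a hedged sketch with hypothetical argument names; the actual ABFM database fields are not published:

```python
from datetime import date

def sam_completed_in_residency(knowledge_done, simulation_done, graduation):
    """A SAM counts only when BOTH parts (the knowledge assessment and
    the clinical simulation) were finished before residency graduation."""
    if knowledge_done is None or simulation_done is None:
        return False  # an unfinished SAM does not count
    return knowledge_done < graduation and simulation_done < graduation

def passed_exam(scaled_score, passing_score=390):
    """Scores are scaled from 200 to 800; 390 was the passing score
    in each year of the study period."""
    return scaled_score >= passing_score
```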

Residency characteristics included class size, urban or rural location, and the percentage of residents failing the examination. Prior work found that training in a larger residency was associated with a higher likelihood of passing the certification examination,8 and we calculated residency class size as the average number of graduates per year from 2010 to 2012. We calculated the percentage of residents in each residency completing a SAM to control for likelihood of participating, as we hypothesized that some program directors may be more likely to encourage their residents to use SAMs.5 Residency location was determined by linking the zip code of the residency to the Rural-Urban Commuting Area Codes (RUCA), version 2.0,9 and classified as urban or rural. Residency quality may be associated with exam score, and the ACGME may cite programs whose graduates have a rolling 3- or 5-year average failure rate greater than 10%.2 We calculated the pass rate for each residency from 2010 to 2012 and created a dichotomous variable indicating whether a residency had a >10% failure rate.
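The residency-level variables described above could be derived from individual graduate records roughly as follows. This is a stdlib sketch; the field names and data layout are hypothetical, not the actual ABFM schema:

```python
from collections import defaultdict

def residency_variables(graduates):
    """graduates: list of dicts with keys 'residency', 'year',
    'passed', and 'completed_sam'. Returns, per residency: average
    class size per graduating year, percent of residents completing
    a SAM, and the dichotomous >10% failure-rate flag."""
    by_res = defaultdict(list)
    for g in graduates:
        by_res[g["residency"]].append(g)
    out = {}
    for res, grads in by_res.items():
        n = len(grads)
        years = {g["year"] for g in grads}
        class_size = n / len(years)  # mean graduates per year observed
        pct_sam = 100.0 * sum(g["completed_sam"] for g in grads) / n
        failure_rate = 100.0 * sum(not g["passed"] for g in grads) / n
        out[res] = {
            "class_size": class_size,
            "pct_sam": pct_sam,
            "high_failure": failure_rate > 10.0,  # >10% failure flag
        }
    return out
```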

Statistical Methods

We used descriptive statistics, t tests, and chi-square tests to characterize the demographics of the sample. Then, we performed bivariate analyses of the demographics and examination results by SAM completion. Prior to regression modeling we assessed correlations and collinearity of variables. Finally, we ran two separate multivariable multilevel regression models testing for associations between examination performance and SAM completion, controlling for the nesting of residents within residencies and adjusting for resident and residency characteristics that may be associated with passing the examination. The first model was a multilevel logistic regression testing whether completing SAMs in residency was associated with a higher passing rate on the first certification exam after residency. The second model was a multilevel linear regression testing whether SAM completion was associated with a higher exam score. Both regressions controlled for resident and residency characteristics. The resident variables were age in years, gender, degree type (MD or DO), IMG status, and ITE score. The residency characteristics were residency location (urban or rural), mean number of resident graduates per year, indicator (dummy) variables for whether the percentage of residents who completed at least one SAM was zero, below the mean among residencies where any resident completed a SAM (21%), or above that mean, and whether residency graduates had a >10% failure rate. After reviewing our initial results, we conducted exploratory analyses to understand our findings: we tested for an interaction between completing a SAM and the percentage of residents completing a SAM, ran the models without the main effect variable included, and performed bivariate analyses of ITE and board scores by residency characteristics. This study was deemed exempt by the American Academy of Family Physicians Institutional Review Board.
All analyses were performed with SAS version 9.3 (SAS Institute, Cary, NC).


Results

The final sample contained 8,348 family physicians who graduated from residencies in the 50 states in the United States and Washington, DC, from 2010 through 2012. Our sample had a mean first board exam score of 484.2 (SD=80.4) and a pass rate of 90.4% (Table 1). Average age at residency graduation was 33.5 years, 44.0% were male, and 39.8% were IMGs. Residencies had an average of 6.2 (SD=2.6) residents per year, and 92.4% were located in urban areas. Among all residents, the mean number of SAMs completed was 0.2. While 1,507 residents began a SAM, only 499 completed one. Among the SAM completers, 198 completed one SAM, 212 completed two to four SAMs, and 89 completed five or more. The mean percentage of residents per residency who completed a SAM was 6.4%, with a range from 0 to 100%. Only 136 residencies had at least one resident complete a SAM during residency; among these residencies, the median percentage was 21.0%. Bivariate analyses found that residents who completed a SAM were more likely than those who did not to be male, to train in a program where more residents completed SAMs (49.4% versus 3.2%), and to be in a residency with a failure rate >10% (50.3% versus 37.4%) (Table 2). Mean certification exam score and pass rate were similar between those who did and did not complete a SAM in bivariate analysis.


Before proceeding to regression analyses, we tested for collinearity between our variables. We were particularly concerned that the average residency examination failure rate would be highly collinear with our outcome. No correlation exceeded 0.50, and all variables were advanced to regression models. Using multilevel logistic regression (Table 3), we found that residents who completed a SAM were 62% more likely to pass their first exam than those who did not (OR=1.62, 95% CI=1.05 to 2.50), controlling for both resident- and residency-level characteristics. In the adjusted model, being an MD rather than a DO was associated with higher odds of passing (OR=1.55 [1.20, 1.99]), while being older at residency graduation (OR=0.96 [0.95, 0.98]) and training in a residency with a high failure rate (OR=0.23 [0.19, 0.28]) or where SAM participation by residents was above the mean (OR=0.23 [0.19, 0.28]) were associated with lower odds of passing the examination.
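To put an odds ratio of 1.62 in perspective, it can be translated into an approximate change in pass probability around a baseline rate. The sketch below uses the overall 90.4% first-time pass rate as an illustrative baseline; this is back-of-envelope arithmetic, not a substitute for the adjusted model's covariate-conditional predictions:

```python
def apply_odds_ratio(baseline_prob, odds_ratio):
    """Convert a probability to odds, scale by the odds ratio,
    and convert back to a probability."""
    odds = baseline_prob / (1.0 - baseline_prob)
    new_odds = odds * odds_ratio
    return new_odds / (1.0 + new_odds)

# Around a 90.4% baseline pass rate, an OR of 1.62 corresponds to
# roughly a 93.8% pass rate: a modest absolute gain despite the
# seemingly large odds ratio.
print(round(apply_odds_ratio(0.904, 1.62), 3))  # → 0.938
```

This arithmetic also shows why a substantial OR can coexist with similar raw pass rates: near a 90% baseline, even a large shift in odds moves the probability only a few points.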

Modeling actual examination score in multilevel linear regression (Table 3), we found that completing a SAM during residency was associated with an 18.76-point increase in examination score. Direction of association and statistical significance of demographic variables were largely similar between the logistic and linear regression models. A notable exception was IMG status, which was statistically significant in both unadjusted models but remained significant only in the adjusted linear model (β=-11.34). We conducted sensitivity and post-hoc exploratory analyses regarding the counterintuitive finding that an individual resident who completed a SAM was more likely to pass, while residents training in residencies where more residents completed SAMs were less likely to pass. An interaction term between these variables increased model instability to the degree that the model would not converge. Removing the resident-level variable did not change the size or significance of the result, indicating a robust finding at the residency level. Average ITE and certification exam results differed by 1.4 and 5.6 points, respectively, among residents in residencies where 0%, 1%–21%, and >21% of residents completed SAMs. When we compared the ITE and certification exam scores of residents who personally completed a SAM and were in residencies where 1%–21% and >21% of residents completed SAMs, we found lower scores among residents in residencies with higher rates of SAM participation (484.9 versus 472.2 for the ITE and 504.1 versus 484.9 for the certification exam).


Discussion

Using data from 3 years of family medicine graduates, we found that completing a SAM during residency was significantly associated with a slightly higher board certification examination score and increased odds of passing the examination. These results are timely as completion of two SAMs in residency, part of completing an MC-FP stage, is now a requirement to sit for the ABFM certification examination. Our findings suggest that as residents who were required to complete an MC-FP stage take the certification examination, scores and passing rates may improve. The robustness of our findings is supported by the consistency of our results using both linear and logistic regression, even among control variables.

We found that, at the resident level, completing SAMs was associated with greater odds of passing the certification examination, despite lower odds at the residency level when more residents completed SAMs. One interpretation of this finding is that certain residency directors are steering their poor-performing residents, as identified by the ITE or other evaluations, to SAMs as a study aid or source of further assessment. Another interpretation is that SAMs are counterproductive to certification examination success when most residents complete them. Yet another interpretation is that the percentage of residents completing a SAM is a marker of residency quality, and the result is a simple case of confounding. This interpretation seems unlikely, as we included an ACGME measure of quality, aggregate passing rate, which had a much stronger association with passing the exam than the percentage of residents taking a SAM at the residency level. We favor the first interpretation, as our post-hoc analysis found that residents who completed SAMs in residencies with higher SAM participation rates had lower ITE and certification examination scores. For the second explanation to hold, SAM completion by as few as one third of residents would somehow have to decrease the medical fund of knowledge of the other two thirds of residents.

While regression analysis found higher passing rates and board scores for residents who completed a SAM in residency, the average board scores and passing rates of these two groups were statistically the same in bivariate analysis. We believe this is simply a case of negative confounding,10 in which the association is attenuated until other associated variables are controlled for, at which point the true relationship between completing a SAM and board scores becomes evident. Usually the opposite, positive confounding, occurs: significant relationships in bivariate analyses are exaggerated and become smaller, and possibly nonsignificant, when other variables are controlled for. An example of positive confounding in our study is the relationship between IMG status and passing. Prior research has shown that IMGs have lower odds of passing the certification exam, with odds ratios ranging from 0.21 to 0.23.6 We found a similar relationship in unadjusted bivariate analysis (odds ratio 0.45), but this relationship was attenuated and became nonsignificant once other variables were accounted for in multivariable regression.

The results of this study indicate that completion of at least one SAM in residency is associated with higher board scores and better odds of passing the ABFM certification exam. SAMs were created as summaries of current evidence on a clinical topic, or on the care of a population, for practicing clinicians, but they may also offer an effective examination study resource. The likely explanation for our main finding is that residents who completed a SAM gained useful and valuable knowledge on a core clinical topic in family medicine, which increased their medical fund of knowledge and was reflected in their certification exam performance. This theory is supported by prior research in which 57.9% of family medicine program directors agreed or strongly agreed that MC-FP activities will be an effective tool for resident education, and 34.1% agreed or strongly agreed that MC-FP activities will help evaluate a resident’s competency.11 The ABFM may be able to facilitate incorporation of SAMs into residency education by assisting residencies in hosting group SAMs, with faculty receiving credit for participation as well. Another way SAMs may be incorporated into residency education is by assigning specific SAMs to corresponding rotations; for example, the maternity care SAM during an obstetrics rotation.

This study has multiple limitations. First, many other unobserved factors are likely to contribute to board examination scores, including the quality of education and clinical experience during residency. We included one measure of educational quality, but the strength of the association between board passing rates and overall residency quality remains unknown. Inclusion of ACGME data on number of citations, probationary status, accreditation length, or residency survey results might improve this assessment, but we did not have access to these data. Second, we did not investigate whether specific SAMs were associated with higher board scores, due to the small number of residents who completed SAMs. Some may argue that certain SAMs (diabetes, hypertension) are more “core” to family medicine and may be more relevant to the exam than others. We plan further research on this issue using calculated ability scores, from the ITE and certification exam, for each disease represented by a SAM to better understand the impact of SAMs on board exam performance. Finally, only 499 residents (6.0%) completed a SAM, which may limit the generalizability of our findings to other residents. However, at the resident level, the characteristics of those who completed a SAM and those who did not were similar except for gender, which suggests this small group of residents may be a representative sample.

Using data on over 8,000 family medicine residency graduates, we found that completing SAMs during residency was associated with higher board scores and higher certification examination passing rates. With residents now required to complete SAMs to sit for the examination, average board scores and passing rates may increase.

Acknowledgements: Robert L. Phillips Jr, MD, MSPH reviewed an early version of the manuscript.

This paper was presented at the 2013 North American Primary Care Research Group Annual Meeting, Ottawa, Ontario.

Corresponding Author: Address correspondence to Dr Peterson, American Board of Family Medicine, 1648 McGrathiana Parkway, Suite 550, Lexington, KY 40511-1247. 859-269-5626. Fax: 859-335-7509.

References

  1. ACGME. ACGME Program Requirements for Graduate Medical Education in Family Medicine. Accreditation Council for Graduate Medical Education (ACGME). 2007. Accessed October 30, 2013.
  2. ACGME. Specialty-specific references for DIOs: board certification requirements and pass rate information. Accreditation Council for Graduate Medical Education (ACGME). Accessed October 30, 2013.
  3. Norris TE, Rovinelli RJ, Puffer JC, Rinaldo J, Price DW. From specialty-based to practice-based: a new blueprint for the American Board of Family Medicine cognitive examination. J Am Board Fam Pract 2005;18:546-54.
  4. ABFM. Maintenance of Certification for Family Physicians (MC-FP) is moving into residency training as of June 1, 2012! ABFM News for Family Medicine Residency Directors, July edition. Lexington, KY: American Board of Family Medicine, 2012.
  5. ABFM. ABFM survey of family medicine residency program directors. Lexington, KY: American Board of Family Medicine, 2010.
  6. Schulte BM, Mannino DM, Royal KD, Walsh S, Peterson LE, Puffer JC. Predictors of successful recertification of family physicians: an empirical investigation of community size and organization of practice. J Am Board Fam Med 2014 May-June;27(3):383-90.
  7. Leigh TM, Johnson TP, Pisacano NJ. Predictive validity of the American Board of Family Practice In-Training Examination. Acad Med 1990;65:454-7.
  8. Falcone JL, Middleton DB. Pass rates on the American Board of Family Medicine Certification Exam by residency location and size. J Am Board Fam Med 2013;26(4):453-9.
  9. Rural-Urban Commuting Area Codes (RUCAs). WWAMI Rural Health Research Center, 2013. Accessed October 30, 2013.
  10. Szklo M, Nieto FJ. Epidemiology: beyond the basics. Sudbury, MA: Jones and Bartlett Publishers, 2004.
  11. Peterson LE, Blackburn B, Phillips RL, Puffer JC. Family medicine residency program director’s plans to incorporate maintenance of certification into residency training: a CERA survey. Fam Med 2014;46(4):299-303.

From the American Board of Family Medicine, Lexington, KY (Dr Peterson and Ms Blackburn); and the Department of Family and Community Medicine, University of Kentucky (Dr King).

Copyright 2018 by Society of Teachers of Family Medicine