Comparison of Textbook to fmCASES on Family Medicine Clerkship Exam Performance

Mario P. DeMarco, MD, MPH; Kent D.W. Bream, MD; Heather A. Klusaritz, PhD, MSW;
Katherine Margo, MD

Background and Objectives: The use of online learning with virtual cases has become commonplace in medical education. A series of online cases, fmCASES, has been developed to support learning for clerkship students in family medicine. It has not been shown whether these cases improve student learning during the clerkship compared to traditional learning modalities.

Methods: We designed an intervention study to replace the traditional family medicine clerkship textbook with the fmCASES curriculum at one medical school. We then compared two consecutive cohorts of family medicine clerkship students by examining their overall performance, and their performance on subsets of questions, on the end-of-clerkship exam.

Results: Data were obtained for 95% of students across the 2-year study. Overall performance on the end-of-clerkship exam was unchanged with the transition to fmCASES. Student performance varied by subject area and by examination question source.

Conclusions: Using a set of online cases to replace a traditional textbook did not change overall performance on the end-of-clerkship assessment. However, our findings suggest that students demonstrated proficiency in answering questions drawn from the sources they studied. This finding should be considered when curricula transition to greater use of online learning resources.

(Fam Med 2014;46(3):174-9.)

Many disciplines, including medicine, are evolving to incorporate online teaching into course curricula to meet student education objectives. Many reasons have been cited for the introduction of online learning, including improved student satisfaction and reduced demands on faculty time and resources.1,2 More specific to medical education, online case curricula during required clerkships have been posited to help clerkship directors comply with the Liaison Committee on Medical Education (LCME) ED-2 and ED-8 standards.3-5 These requirements aim to ensure comparable educational experiences and evaluation methods across all instructional sites (ED-8) and require clerkships to specify the volume and breadth of conditions that students must encounter to achieve the course’s learning objectives (ED-2). Cited advantages of virtual curricula include the ability to update content in real time and the ability of students to access materials remotely, among others. There has been a continuous move toward developing more sophisticated web-based curricular materials for use in family medicine clerkships.3,4 This has included the development and dissemination of fmCASES, a set of online cases designed to teach the national clerkship curriculum recently defined by the Society of Teachers of Family Medicine (STFM).3 Because of its comprehensive nature, fmCASES has been used across many medical schools as a replacement for traditional course textbooks or didactic teaching throughout the clerkship. To the best of our knowledge, prior research in this area has reported on the use of supplemental online learning among medical students and has not examined whether replacing traditional teaching modalities results in improved outcomes.

Previous research has demonstrated that successful integration of a web-based case learning program into family medicine clerkships is possible.4,5 Moreover, Shokar and colleagues report that the addition of case-based learning to the curriculum may result in improved exam performance.6 However, research to date has reported on student feedback about online curricula and on the process of curriculum development and implementation. With the exception of Shokar and colleagues,6 who found that web-based case learning was associated with higher scores on the National Board of Medical Examiners (NBME) subject exam and the standardized patient (SP)-based exam, and Wiecha and colleagues,1 who report that students completing an online module during the family medicine clerkship demonstrated higher scores on a post-clerkship diabetes management assessment, we know of no research evaluating the impact of case-based online curricula on medical student clinical learning.

In our family medicine clerkship, we implemented a curricular change by replacing our previously recommended textbook with a set of 29 fmCASES. We studied the effect of this change by evaluating performance on the end-of-clerkship exam, which was modified to retain specific elements of our historic exam as well as to add questions generated from the fmCASES bank of test questions. We hypothesized that student clerkship performance on standard measures would not change with the shift in teaching methodology from textbook to online cases.





Methods
This study evaluated exam performance on specific test questions by medical students who completed our 4-week required family medicine clerkship. The clerkship is structured to have students work with family medicine preceptors Monday–Thursday and then convene for a series of didactic teaching sessions each Friday. The 4-week experience is a continuous and discrete block that is paired with 2 months of inpatient internal medicine that students complete before or after the family medicine block. The overall family medicine and internal medicine curriculum, as well as the placement of clerkships, did not change during the study period. The didactic sessions consist of a regular series of lecture-style presentations on relevant primary care topics and are given to all students in all blocks, generally by the same presenters each month.


Until 2011, the recommended reading for the family medicine clerkship was Essentials of Family Medicine (Sloan et al). At the start of the 2011 clerkship year, we stopped recommending the textbook reading and instead assigned all clerkship students to complete 29 online cases (27 fmCASES and 2 CLIPP cases from a pediatric set of cases). All other teaching elements of the clerkship, including didactic content, faculty lecturers, and clerkship sites, remained unchanged from the prior year. Although online case completion was recommended, lack of completion did not affect students' grades. Students were advised that exam questions would be drawn from the question bank associated with the specific assigned cases.

Exam Assessment

Prior to 2011, students were given a multiple-choice exam that represented 30% of their course grade. The 75-question exam contained a mix of questions taken from the Essentials of Family Medicine question bank as well as “home-grown” questions developed by faculty members that correlated to didactic content. Midway through the 2010 year, a random pool of fmCASES test questions corresponding to three non-required fmCASES (hypertension, diabetes, and musculoskeletal topics) was added to the student exam in addition to the existing exam questions. We chose these subject areas because they are core components of family medicine and ones that we expected all students to learn throughout the clerkship. Each student received 10–16 additional questions that appeared identical in format to the usual exam (the first month featuring the additional questions included 16 fmCASES questions; in all subsequent months, only 10 questions from the pool of 16 were included to keep the exam from being too long). Although these questions were scored, they did not factor into the student grade. In 2011, we replaced the historic exam with a new exam in which random questions were drawn from the fmCASES test question bank, but only for the cases that had become recommended reading. In addition, we retained 22 items that had been on our previous exam: 11 questions from the Essentials test bank and 11 institutional questions, all of which addressed core clerkship content. All other elements of the exam, including the exam format and percentage of course grade, remained constant.


Data Analysis
All data were uploaded to an Excel spreadsheet along with additional student information, including preceptor evaluation score and score on a musculoskeletal objective structured clinical exam (OSCE) given during the course. The database was organized to ensure that each unique question was tracked across the 2 years of analysis (2010–2011). We then exported the spreadsheet to a statistical program; all analyses were performed using Stata 12.0 (StataCorp, College Station, TX). We generated mean performance scores by calculating the percentage of questions answered correctly by each student on their respective examination and used those values to calculate mean exam score by year. We used a series of t tests to compare means between years. To account for the maturation effect that occurs across the clerkship year, we controlled for block in our analysis. In addition, we performed subanalyses across multiple domains, separately analyzing performance on specific sets of questions. First, we categorized questions by source: the Essentials test bank (AIMS), “home-grown” questions (HOME), or the fmCASES test bank (fmCASES). We also examined performance on subsets of questions by topic, coding questions as relevant to diabetes (DM), hypertension (HTN), or musculoskeletal health (MSK), since these three topic areas were both strongly relevant to the course teaching goals and covered by many questions from multiple sources. Finally, we isolated small subsets of questions that satisfied a narrower domain (eg, diabetes questions from fmCASES). We performed weighted least squares regression to account for imbalances between the two classes and to control for potential confounding by sex, age, and pre-clerkship baseline knowledge.
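For readers wishing to reproduce this type of analysis, the core workflow can be sketched as follows. This is an illustrative sketch only: the authors used Stata 12.0, and the simulated answer data, cohort sizes, and inverse-group-size weighting scheme below are assumptions, not the study's actual data or specification.

```python
# Sketch of the analysis pipeline: per-student percent-correct scores,
# a two-cohort t test, and a weighted least squares regression of score
# on cohort year. All data here are simulated for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Simulated answer matrices: rows = students, columns = exam questions
# (True = answered correctly); cohort means chosen near the reported 78.4/79.7
answers_2010 = rng.random((150, 75)) < 0.784
answers_2011 = rng.random((153, 75)) < 0.797

# Per-student percent-correct scores
scores_2010 = answers_2010.mean(axis=1) * 100
scores_2011 = answers_2011.mean(axis=1) * 100

def welch_t(a, b):
    """Two-sample Welch t statistic (unequal variances)."""
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(va + vb)

t_stat = welch_t(scores_2010, scores_2011)

# Weighted least squares: regress score on a year indicator, weighting each
# observation by an assumed inverse-group-size weight to balance the cohorts
y = np.concatenate([scores_2010, scores_2011])
X = np.column_stack([np.ones(len(y)),
                     np.concatenate([np.zeros(len(scores_2010)),
                                     np.ones(len(scores_2011))])])
w = np.concatenate([np.full(len(scores_2010), 1 / len(scores_2010)),
                    np.full(len(scores_2011), 1 / len(scores_2011))])
W = np.diag(w)
# Solve the weighted normal equations (X' W X) beta = X' W y
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
# beta[0] = weighted 2010 mean; beta[1] = adjusted year effect
```

In the actual study, the regression also included sex, age, pre-clerkship baseline knowledge, and clerkship block as covariates, which would enter as additional columns of `X`.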
The study protocol and specific aims were submitted to the University of Pennsylvania Institutional Review Board, and the study was considered exempt from formal review.




Results
Of 320 students in academic years 2010 and 2011, complete score reporting was available for 303 (95%). Students in these 2 years had similar demographic profiles and pre-clerkship academic achievement (Table 1). Overall average performance on the end-of-course exam did not change with the transition from textbook to online learning (Table 2). Average exam score in 2010 was 78.4, compared to 79.7 in 2011. Similarly, other elements of the course grade, including preceptor evaluation and score on the musculoskeletal OSCE, were not statistically different between the 2 years. Exam reliability was established using a separate comparison (not shown), which revealed that the average exam score in 2010 (the baseline year) was similar to the average exam score in the preceding 5 years of the clerkship, as well as to a national norm for a similar 75-item examination based on the Essentials of Family Medicine, Fifth Edition textbook.7




Analysis of average score by subject yielded mixed results (Table 3). Students' average scores on hypertension questions (81.6 versus 78.3) and diabetes questions (79.5 versus 82.3) were not statistically different from 2010 to 2011. However, students scored worse on musculoskeletal health questions (77.8 versus 66.8) in 2011 (P<.001).


To examine a subset of identical questions and thereby control for the change in teaching method, we created a separate variable incorporating average score on the 34-item constant exam: the 11 AIMS, 11 HOME, and 12 fmCASES questions that were asked in both 2010 and 2011. The average score on this measure (77.7 versus 76.4) was not different between the 2 years (P=.35). Secondary analyses revealed that for this subset of constant questions, subject was not correlated with performance across the 2 years, but question source was (Table 4). Students in 2011 performed statistically worse on AIMS questions (75.0 versus 85.2, P<.001) and HOME questions (70.8 versus 77.1, P<.001). However, they scored better on fmCASES questions (86.4 versus 67.5, P<.001). When all exam questions were organized by question source, differences were noted from 1 year to the next for all three question types. Average scores on textbook (AIMS) questions (79.8 versus 75.0) and home-grown (HOME) questions (79.3 versus 70.7) declined from 2010 to 2011 (P<.001). Conversely, students in 2011 performed better on fmCASES questions than students in 2010. This difference remained statistically significant even after controlling for student block.






Discussion
In this study of 303 medical students rotating through the family medicine clerkship at our institution, we found that implementing an online teaching curriculum of fmCASES did not change overall content knowledge as measured by an end-of-clerkship multiple-choice examination. Students who were exposed to an assigned fmCASES curriculum scored equally as well on their respective end-of-clerkship exams as the students who were assigned textbook reading in the previous year. This performance, however, varied by subject. Although overall performance on diabetes and hypertension questions remained unchanged, students in the 2011 intervention group performed worse on musculoskeletal questions. One potential explanation is that students did not learn as much from the fmCASES covering musculoskeletal topics as they had learned from reading a textbook, which may indicate that online learning is better or more efficient at teaching some subjects than others. Interestingly, however, performance on a musculoskeletal OSCE was unchanged from prior years. This finding may suggest that although knowledge acquisition in this area did occur, the specific musculoskeletal questions associated with fmCASES were not able to detect it. It may also be the case that the musculoskeletal questions in the fmCASES test bank are more difficult than the previously used test questions.

We also noted that students in 2010 performed better on questions drawn from the AIMS or HOME question banks and worse on questions that came from the fmCASES question bank. Not surprisingly, students in 2011, who learned using fmCASES, had the opposite results. This finding highlights the fact that, regardless of other learning tools, students appear to do better on the type of question that corresponds to their test preparation. Thus, students who read the textbook performed better on AIMS questions, and students who read fmCASES did better on fmCASES questions. However, this pattern does not explain why students in 2011 did worse on home-grown (HOME) questions that matched up with didactic teaching and other core content but not necessarily the assigned course reading. This finding may reflect that student test preparation was directed more toward fmCASES than toward other learning components of the clerkship.

This intervention study is not the first to explore the effect of online learning cases on medical student education. Previous literature has supported the use of virtual cases or online curricula to enhance clinical teaching.8-11 Our findings, however, are unique in that we examined the effect on learning when the traditional reading material for the course is replaced, not merely supplemented, by online learning cases. These findings extend the previous research in this field by demonstrating that in a family medicine clerkship, transitioning to an online case curriculum does not impact overall knowledge as measured by an end-of-clerkship examination. More specifically, our constant exam questions demonstrated that despite a change in teaching method, knowledge acquisition stayed the same.

A notable strength of our study is that we relied on a controlled intervention study design. In doing so, we were able to isolate curricular change to the required learning modality only (textbook versus fmCASES) while maintaining all other elements of the clerkship essentially unchanged. Although we did not restrict students' extracurricular reading, students in both years had equal access to textbooks, the biomedical library, online resources, and the entire set of fmCASES. Our analysis also takes into account the students' location within the clerkship year when they took the exam, to better control for maturation effect.12 Further, while this study was not designed to validate individual examination questions, our study design, which included retention of a subset of questions from year to year, reduces the likelihood that variation in scores was caused by a changing mix of easy or difficult questions.

There are also several limitations to our study. Our intervention was performed at a single university with very competitive entrance standards, so our findings may not be applicable to students at all universities. However, these fmCASES were designed to reflect a national curriculum, and the examination was developed to be taken by students in all medical schools. Another limitation is that we did not begin to incorporate fmCASES until the second half of the 2010 year. Because of this design, in our subanalysis of this question type, we needed to control for the block in which students performed the clerkship, as students in later blocks have generally acquired more knowledge across their clerkship year. Additionally, although we changed the assigned reading curriculum for students in the 2 years, we did not survey, quantify, or restrict the amount of reading that students may have done outside of those assignments. Thus, variations in exam-based knowledge may have been at least partially attributable to student selection of other textbooks, study guides, or review materials. Finally, we recognize that the control group had been taking an examination that had been largely unchanged for many years, while the experimental group was told they would be given a new fmCASES exam. Although we cannot quantify this effect, we suspect that there may have been some sharing of exam content from class to class in prior years that became muted with the switch to a new test bank of questions. In other words, students could have remembered which concepts from didactic lectures were tested on the examination for prior groups, but they may have paid less attention to this in 2011 upon hearing that the exam was new and based mostly on fmCASES.

Our findings provide insight into student learning that has direct applications to all family medicine clerkships and to other health professions programs that rely upon online or virtual cases. In addition to prior evidence that using these tools to supplement clinical learning can be advantageous, our results suggest that these resources can effectively replace more traditional textbook readings without significant detriment to student knowledge. However, our findings also suggest that not all online cases may be created equal, and while students can learn many clinical concepts through virtual cases, the degree of learning can vary by topic. Departments that are implementing or plan to implement online cases as part of their curriculum should be aware of the possibility of compensatory learning, as students may focus preferentially on some course material instead of other material. Future studies are needed to examine the full degree to which compensatory learning occurs as a result of a change to a clerkship curriculum based primarily on a set of online cases, as well as to establish the optimum balance of virtual case-based learning and other teaching modalities.

Acknowledgments: This project was supported by a HRSA grant (HRSA-D56HP15243).

This manuscript was presented at the 2012 Society of Teachers of Family Medicine Conference on Medical Student Education, Long Beach, CA.

The authors acknowledge Ms Diana Brown for her assistance with construction of our database and Dr Mara McAdams DeMarco for statistical analysis support.

Corresponding Author: Address correspondence to Dr DeMarco, University of Pennsylvania, Department of Family Medicine and Community Health, 51 N. 39th Street, Philadelphia PA 19104. 215-662-8941. Fax: 215-243-3290.



  1. Wiecha JM, Vanderschmidt H, Schilling K. HEAL: an instructional design model applied to an online clerkship in family medicine. Acad Med 2002;77(9):925-6.
  2. Morrow JB, Sepdham D, Snell L, Lindeman C, Dobbie A. Evaluation of a web-based family medicine case library for self-directed learning in a third-year clerkship. Fam Med 2010;42(7):496-500.
  3. Leong SL. fmCASES: collaborative development of online cases to address educational needs. Ann Fam Med 2009;7(4):374-5.
  4. Leong SL, Baldwin CD, Adelman AM. Integrating web-based computer cases into a required clerkship: development and evaluation. Acad Med 2003;78(3):295-301.
  5. Shokar GS, Bulik RJ, Baldwin CD. Student perspectives on the integration of interactive web-based cases into a family medicine clerkship. Teach Learn Med 2005;17(1):74-9.
  6. Shokar GS, Burdine RL, Callaway M, Bulik RJ. Relating student performance on a family medicine clerkship with completion of web cases. Fam Med 2005;37(9):620-2.
  7. Slatt LM, Steiner BD, Hollar DW, et al. Creating a multi-institutional family medicine clerkship examination: lessons learned. Fam Med 2011;43(4):235-9.
  8. Chao SH, Brett B, Wiecha JM, Norton LE, Levine SA. Use of an online curriculum to teach delirium to fourth-year medical students: a comparison with lecture format. J Am Geriatr Soc 2012;60(7):1328-32.
  9. Fall LH, Berman NB, Smith S, White CB, Woodhead JC, Olson AL. Multi-institutional development and utilization of a computer-assisted learning program for the pediatrics clerkship: the CLIPP project. Acad Med 2005; 80(9):847-55.
  10. Kalet AL, Coady SH, Hopkins MA, Hochberg MS, Riles T. Preliminary evaluation of the web initiative for surgical education (WISE-MD). Am J Surg 2007;194:89-93.
  11. Chorney ET, Lewis P. Integrating a radiology curriculum into clinical clerkships using case- oriented radiology education. J Am Coll Radiol 2011;8(1):58-64.
  12. Campos-Outcalt D, Witzke DB, Fulginiti JV. Correlations of family medicine clerkship evaluations with scores on standard measures of academic achievement. Fam Med 1994; 26(2):85-8.

From the Department of Family Medicine and Community Health, University of Pennsylvania.

Copyright 2017 by Society of Teachers of Family Medicine