title: Case-based competency assessments
authors: Gardner, Amity L.; Halpin, Richard; Saeed, Sophia G.
date: 2020-07-26
journal: J Dent Educ
DOI: 10.1002/jdd.12324

The coronavirus disease 2019 (COVID-19) pandemic brought clinical care, and therefore clinical education and assessment, to a sudden standstill. Fourth-year students had been counting on the last few months of the spring semester to demonstrate competence in a number of clinical skills.[1] On average, each student had 5 outstanding competency assessments, spanning multiple clinical courses and disciplines. A stay-at-home order was in place, students were to remain remote, and alternate methods for assessing competence needed to be developed.

With 17 fourth-year clinical courses, each with 1 or more competency assessments tied to the final grade, a decision was made to create an orchestrated plan with a consistent end-user experience. This reduced the number of emails sent to students, as the Director of Clinical Education communicated with students on behalf of the clinical course directors. Courses that already required case presentations for competency assessment continued in that manner in a virtual setting. For all other courses, course directors were provided templates and examples for building case-based exams.

For each clinical competency assessment that needed an alternate format, course directors created 2 to 4 cases, each with images, a standardized patient history format, and 3 to 4 questions. Each question required a short-answer response focused on critical thinking, clinical reasoning, and/or synthesis. The questions were designed to mimic those asked during a clinical competency assessment and to hold real value for students to think about as they move toward becoming safe, independent, novice dentists. Generic rubrics using a 5-4-3-2-0 scale were provided to each course director, and a "critical error" with a score of "0" was defined for each question; a sketch of this scoring rule appears at the end of this article. All cases were reviewed by the Director of Clinical Education and the Associate Dean for Patient Care for consistency of voice and format.

The exams were then built in ExamSoft, and students were notified which exams to take. A schedule was created to administer the exams during specific time slots over a 3-day period. Clinical course directors were trained in how to use ExamSoft for grading.

During the creation and execution of these exams, a few key observations were made. First, several "competencies for the new general dentist," which often take a back seat to technical skill during traditional clinical competency assessments, were highlighted: critical thinking, communication, health promotion, assessment, diagnosis, treatment planning, and evaluation of treatment outcomes. Second, the case-based format allowed for standardized assessment based on real patients seen in the student clinics; traditional clinical competency assessments involve numerous patient, student, and examiner variables that make it challenging to reduce inter-rater variability. Third, all responses from all students for a given question or case were graded by the same examiner, eliminating inter-rater variability. An additional benefit was that clinical instructors, who otherwise provide most of their teaching and assessment in a clinical environment, learned the exam software used by the school. This broadens their skill set and will allow them to contribute to teaching and learning in new ways after the pandemic.
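To make the grading scheme concrete, here is a minimal Python sketch of the rule described above: each short-answer question is scored on the 5-4-3-2-0 rubric unless the response contains that question's defined critical error, in which case it scores 0. The function names, the input format, and the averaging of question scores into a case score are illustrative assumptions, not details from the article, which does not describe how scores were aggregated or any software beyond ExamSoft.

```python
# Illustrative sketch only; names and aggregation are assumptions.
RUBRIC_LEVELS = {5, 4, 3, 2, 0}  # note: the scale skips a score of 1


def score_question(rubric_score: int, critical_error: bool) -> int:
    """Score one short-answer question.

    A critical error, as defined per question by the course director,
    overrides the rubric and yields a score of 0.
    """
    if rubric_score not in RUBRIC_LEVELS:
        raise ValueError(f"invalid rubric score: {rubric_score}")
    return 0 if critical_error else rubric_score


def score_case(question_results: list[tuple[int, bool]]) -> float:
    """Average the question scores for one case (an assumed aggregation)."""
    scores = [score_question(s, err) for s, err in question_results]
    return sum(scores) / len(scores)


# Example: a 3-question case where the second answer contains a critical error.
if __name__ == "__main__":
    case = [(5, False), (4, True), (3, False)]
    print(score_case(case))  # (5 + 0 + 3) / 3 ≈ 2.67
```

One property of this rule worth noting: because a critical error forces a 0 rather than merely lowering the rubric level, a single unsafe answer pulls the case average down sharply, which aligns with the article's emphasis on producing safe, independent, novice dentists.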
References
[1] ADEA competencies for the new general dentist.