title: Creating a Proficiency-Based Remote Laparoscopic Skills Curriculum for the COVID-19 Era
authors: Nagaraj, Madhuri B.; AbdelFattah, Kareem R.; Scott, Daniel J.; Farr, Deborah E.
date: 2021-07-03
journal: J Surg Educ
DOI: 10.1016/j.jsurg.2021.06.020

OBJECTIVE: Social distancing restrictions due to COVID-19 challenged our ability to educate incoming surgery interns, who depend on early simulation training for basic skill acquisition. This study aimed to create a proficiency-based laparoscopic skills curriculum using remote learning.

DESIGN: Content experts designed five surgical tasks to address hand-eye coordination, depth perception, and precision cutting. A scoring formula was used to measure performance: score = cutoff time - completion time - (K × errors); the constant K was determined for each task. As a benchmark for proficiency, a fellowship-trained laparoscopic surgeon performed three consecutive repetitions of each task; proficiency was defined as the surgeon's mean score minus two standard deviations. To train remotely, PGY1 surgery residents (n = 29) were each issued a donated portable laparoscopic training box, task explanations, and score sheets. Remote training included submitting a pre-test video, self-training to proficiency, and submitting a post-test video. Construct validity (expert vs. trainee pre-tests) and skill acquisition (trainee pre-tests vs. post-tests) were compared using a Wilcoxon test (median [IQR] reported).

SETTING: The University of Texas Southwestern Medical Center, Dallas, Texas.

PARTICIPANTS: Surgery interns.

RESULTS: Expert and trainee pre-test performance was significantly different for all tasks, supporting construct validity. One trainee was proficient at pre-test.
After 1 month of self-training, 7 additional residents achieved proficiency on all five tasks after 2-18 repetitions; trainee post-test scores were significantly improved vs. pre-test on all tasks (p = .01).

CONCLUSIONS: This proficiency-based curriculum demonstrated construct validity, was feasible as a remote teaching option, and resulted in significant skill acquisition. The remote format, including video-based performance assessment, facilitates effective at-home learning and may allow additional innovations such as video-based coaching for more advanced curricula.

New surgical interns seek early simulation-based skills training in order to gain the requisite abilities for operative practice. Access to this training was challenged when the COVID-19 pandemic resulted in social distancing orders affecting multiple aspects of surgical training, including access to our simulation center. In response, we identified a need for at-home laparoscopic skills training to provide access to, and practice of, basic laparoscopic skills. Multiple publications report creative attempts to maintain skills training via remote methods. These curricula have focused on at-home or easily accessible products, have ranged from low- to high-fidelity, and have spanned many surgical disciplines [1-5]. Support for at-home laparoscopic training has been growing in the recent literature, first with evidence of skill acquisition and construct validity for various developed tasks [6-11]. Randomized trials have also demonstrated that skill acquisition with at-home trainers is equivalent to that with onsite trainers, establishing them as a reasonable training alternative [12].
Despite perceptions that competing interests may limit trainee engagement with at-home trainers, data support that residents want to use them and are more likely to perform deliberate practice with them in shorter, more frequent sessions [10, 13-15]. Furthermore, at-home training gives trainees more control over practice scheduling, allowing them to avoid practicing when fatigued or frustrated [15].

Proficiency-based simulation training has garnered support since the early 2000s, when data showed that learner skill acquisition varied widely when measured by traditional metrics such as time spent or number of repetitions; performance was better described by proficiency-based metrics and established training goals [16]. Proficiency-based training has also demonstrated transfer of skills to the operating room and decreased surgical skill decay [17-19]. Despite the availability of at-home laparoscopic curricula, few programs have used a proficiency-based training model, and none to our knowledge has focused on skill acquisition specifically in laparoscopy-naïve learners. The COVID-19 pandemic highlighted the need for remote proficiency-based training that supports skill acquisition for novice learners. Our institution has previous experience and success developing similar curricula to train naïve interns in basic skills, such as hand-eye coordination and depth perception, that are later applicable to the more complex tasks of the Fundamentals of Laparoscopic Surgery (FLS) program (www.flsprogram.org). We have not, however, previously implemented remote laparoscopic training [20]. Using a methodology similar to our previous curriculum, we aimed to develop a cost-effective and feasible proficiency-based remote laparoscopic skills curriculum to teach basic skills to matriculating general surgery interns.
This study was reviewed and approved by the University of Texas Southwestern Institutional Review Board and performed in collaboration with the UT Southwestern Simulation Center. We assembled at-home laparoscopic box trainers using previously donated laparoscopic instruments (graspers, scissors, and needle drivers) and a newly donated Ethicon kit consisting of a compressible box trainer, a USB digital camera, and various supplies for task development (e.g., saucers, beads, pegs, rubber bands, and clips). Two content experts first determined the essential laparoscopic skills represented in our traditional in-person basic laparoscopic skills curriculum [17, 19] and then developed five representative tasks that could be performed with the available equipment (Figure 1, Table 1). Before finalizing the tasks, performance details and errors were defined, and several faculty surgeons and a senior surgical resident tested the tasks.

Task 1: Bead saucer transfer requires the transfer of five beads from one saucer to another with the dominant hand and then back to the original saucer with the non-dominant hand. One dropped bead is allowed; any subsequent drops are errors. Task 5: Needle bead transfer requires interns to use a needle grasped in a needle driver to pick up and transfer five beads from one container to another. One bead drop is allowed; subsequent drops are errors (Figure 2, Table 2).

After developing the tasks, we obtained data on expert performance. Metrics measured included total time to task completion (completion time) and errors. To maintain high training standards, we used three unpracticed repetitions performed by a fellowship-trained minimally invasive surgeon. Expert-level performance data were then used to determine proficiency-based training benchmarks using previously reported methods: benchmark time (rounded to the nearest second) = expert mean raw time, or expert mean raw time + 1-2 standard deviations (SD) [21].
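The benchmark derivation above amounts to a small calculation. As a sketch, assuming the benchmark is the expert's mean completion time plus 1-2 sample standard deviations, rounded to the nearest second (the function name and all times below are our own illustrative choices, not the authors'):

```python
import statistics

def benchmark_time(expert_times_s, n_sd=2):
    """Benchmark = expert mean completion time + n_sd standard deviations,
    rounded to the nearest second (per the reported method)."""
    mean = statistics.mean(expert_times_s)
    sd = statistics.stdev(expert_times_s)  # sample SD over the repetitions
    return round(mean + n_sd * sd)

# Hypothetical expert repetitions (seconds) for one task:
print(benchmark_time([40, 45, 50], n_sd=1))  # mean 45, SD 5 -> 50
```

With only three repetitions the sample SD is a coarse estimate, which is one reason the authors also checked benchmark suitability against a senior resident's unpracticed attempts.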
Suitability of the benchmarks was determined by measuring the unpracticed performance of a senior resident; a benchmark was considered suitable if the resident achieved near it on at least one of three unpracticed attempts. Scoring formulas were created for each task: score = cutoff time - completion time - penalty score [22]. Cutoff time was baseline intern performance plus one SD. The penalty score was a constant (K) multiplied by the number of errors; the constant was determined by the number of errors allowed before task failure, with additional allowance for errors informed by expert unpracticed performance. Higher scores represented better performance, and negative scores were assigned a score of zero. An expert-derived proficiency score was determined for each task by calculating the score at the benchmark time (Table 3). Finally, a composite score across all five tasks was created by summing each task score normalized to its proficiency score; a composite score of 500 is equivalent to overall expert-level performance.

The study was designed as a single-arm, unblinded observational cohort study. All general surgery interns who matriculated in 2020 participated, including both categorical and preliminary residents. All interns received the training equipment concurrently and underwent a virtual group orientation to the curriculum at the beginning of their intern year, which included an overview of the curriculum structure and an introduction to the supplies, task setup, and proficiency benchmarks. Learners were also provided a curriculum summary document covering the orientation material. Interns were then asked to submit individual videos of pre-test performance on each task to the simulation faculty and then to practice, on a self-determined schedule, to reach proficiency based on the previously derived benchmarks.
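A minimal sketch of the scoring arithmetic described above. The function names and most numbers are hypothetical (the proficiency scores 98 and 255 are the reported values for tasks 3 and 5), and normalizing each task to 100 points is our interpretation of "normalized to the proficiency score" with 500 corresponding to expert-level performance:

```python
def task_score(cutoff_s, completion_s, errors, k):
    """score = cutoff time - completion time - (K x errors);
    negative scores are assigned zero."""
    return max(0, cutoff_s - completion_s - k * errors)

def composite_score(task_scores, proficiency_scores):
    """Sum of each task score normalized to its proficiency score,
    so hitting proficiency on every task yields a composite of ~500."""
    return sum(100 * s / p for s, p in zip(task_scores, proficiency_scores))

# Hypothetical single-task example: cutoff 200 s, completed in 150 s, 2 errors, K = 10
print(task_score(200, 150, 2, 10))  # 30

# Five tasks scored exactly at their proficiency scores -> composite 500
prof = [150, 120, 98, 180, 255]
print(composite_score(prof, prof))  # 500.0
```

Flooring negative scores at zero matches the stated rule that a trainee slower than the cutoff (after penalties) simply scores zero rather than negative.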
After achieving the proficiency benchmarks on all five tasks, interns were asked to submit individual videos of proficient post-test performance to one simulation faculty member, along with the number of repetitions they had completed to achieve proficiency. A single faculty member (D.E.F.) graded all pre-test and post-test videos. Data recorded included total time to task completion, number of errors (errors), and number of repetitions to proficiency (repetitions). Pre-test scores and composite scores were calculated for those who achieved proficiency. Overall descriptive statistics are reported as median [IQR]. Construct validity was evaluated by comparing expert scores to intern pre-test scores using a non-parametric Wilcoxon test. An additional paired non-parametric analysis compared the pre-test and post-test scores of those who completed the curriculum, in order to assess skill acquisition. All data were analyzed with RStudio (version 1.3.959), and a p value of <.05 was considered significant.

One expert performed three unpracticed repetitions of each task. A comparison of the expert's performance with the senior resident's unpracticed attempts showed no significant difference, supporting the suitability of the expert levels (Table 4). The expert's benchmark times ranged from 45 seconds for task 5 to 133 seconds for task 2. Using the scoring formula, expert-derived proficiency scores were calculated, ranging from 98 on task 3 to 255 on task 5. Unlike the other tasks, task 4 suitability was determined at the expert mean alone, given the senior resident's ability to reach near-expert level without practice (Table 3). A total of 29 interns (13 categorical, 16 preliminary) submitted pre-test performance videos. Of those, 26 interns (13 categorical, 13 preliminary) submitted post-test performance videos and thereby completed the curriculum.
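The paired pre-/post-test comparison described above used a non-parametric Wilcoxon test run in RStudio. As a rough pure-Python illustration of the underlying signed-rank statistic only (no p-value; all scores below are hypothetical), under the usual definition of the test:

```python
def wilcoxon_signed_rank_stat(pre, post):
    """W statistic of the paired Wilcoxon signed-rank test:
    rank the nonzero |differences| (averaging ranks for ties),
    then W = min(sum of positive ranks, sum of negative ranks)."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    ordered = sorted(abs(d) for d in diffs)

    def avg_rank(value):
        # average 1-based rank of `value` among the ordered |differences|
        positions = [i + 1 for i, x in enumerate(ordered) if x == value]
        return sum(positions) / len(positions)

    w_plus = sum(avg_rank(abs(d)) for d in diffs if d > 0)
    w_minus = sum(avg_rank(abs(d)) for d in diffs if d < 0)
    return min(w_plus, w_minus)

# Hypothetical pre/post task scores for 5 trainees: uniform improvement -> W = 0
print(wilcoxon_signed_rank_stat([100, 120, 90, 110, 105],
                                [150, 160, 140, 155, 150]))  # 0.0
```

A W of zero means every pair moved in the same direction, the strongest possible signal for this test; in practice one would use an established implementation (e.g., R's `wilcox.test` or SciPy's `scipy.stats.wilcoxon`) to obtain the p-value.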
Three residents did not complete the curriculum due to other clinical priorities, as they were not matriculating into a general surgery residency. Baseline performance data (median [IQR]) for each task are presented in Table 5. After one month of training, 8/29 interns (28%; 6 categorical, 2 preliminary) achieved proficiency on all tasks. By 8 weeks into the curriculum, just over half of the residents (15/29, 52%) had achieved proficiency, and by the end of the curricular year, all but one resident who completed the curriculum were verified as achieving proficiency (Figure 3). The one resident who did not achieve proficiency had achieved it on all but one task, on which he was one point below the proficiency cutoff. Each task took from 2 to 18 repetitions during training to reach proficiency. The rubber band task took the fewest attempts, with a median of 2, while the bead-to-peg and precision cutting tasks took a median of 4.5 and 5 attempts, respectively. The median composite score for all residents rose from 225 [160-329] at pre-test to 499 [489-512] at post-test (p = .01; Figure 4); a composite score of 477 was equivalent to achieving proficiency. Significant differences were identified in both the pre-test (median 329 vs. 190, p < .01) and post-test (median 507 vs. 489, p < .01) composite scores of categorical versus preliminary interns, respectively, among those who completed the curriculum.

Our study aimed to create a feasible and effective remote laparoscopic skills curriculum to support skill acquisition in novice trainees who matriculated during the COVID-19 pandemic. Interns were provided with at-home training equipment and then self-practiced to reach expert-derived proficiency benchmarks for five novel tasks reflecting basic laparoscopic skills. Given the previous literature, we were not surprised by our residents' ability to meet proficiency-based benchmarks.
We were, however, impressed that some met the benchmarks relatively rapidly despite ongoing normal clinical duties. Although the comparison is to a different curriculum, the ability to self-train at home and to submit self-recorded videos for video-based assessment allowed earlier confirmation of proficiency, if not earlier attainment of proficiency itself. This further supports the benefits of accessible, self-scheduled practice with remote training resources and the decreased burden on faculty when video-based assessment replaces proctored in-person post-tests. Furthermore, after achieving proficiency, 4 interns registered for the FLS examination and 1 has taken and passed it, despite no additional structured suturing or endoloop training. This may be due in part to familiarity with other FLS tasks, as supplies such as the peg board and circle cut were also included in the kits for self-practice. This is the first time our institution has had interns register for and take the examination, reflecting their comfort and skill after completing our curriculum. Overall, our curriculum fills the aforementioned gap by demonstrating the feasibility of at-home training focused on skill acquisition for novice learners, using an expert-derived proficiency-based curriculum supported by construct validity evidence. Since its roll-out, more senior residents have also requested access to the training equipment and curriculum.

We acknowledge certain limitations in our study design, including its lack of randomization and small sample size. Proficiency benchmarks, construct validity, and baseline feasibility analysis were based on a single minimally invasive trained expert and a single senior resident, given time and social distancing constraints that prevented broader recruitment.
Additionally, a confounding factor that may have affected our data is the grouping of preliminary non-departmental interns (e.g., radiology or anesthesia interns) with categorical and preliminary general surgery interns during data analysis. Despite the lack of a control group, which we omitted in order to provide fair and equal training to all novices during the pandemic, a strength of our study is that it shows that remote practice can provide the necessary training using low-cost equipment, over a reasonable amount of time, while bringing skill acquisition to the level of an expert. Our curriculum further demonstrates an easily achievable proficiency-based framework that other institutions and specialties can apply to task development across a variety of low-fidelity at-home trainers. This framework can also remain consistent while its task complexity or difficulty is adapted for different levels of trainees. In the future, as we anticipate resuming normal practice after the pandemic, we may continue using this remote curriculum given its advantages. We may also evaluate whether a remote proficiency-based curriculum produces skill retention and transferability to the operating room, further supporting its role in ongoing surgical education. Remote learning with video capture also facilitates remote coaching, learning curve assessment, and many other video-based technologies.

This study shows that at-home laparoscopic skills training for novice learners is not only feasible but also efficient, requiring a reasonable amount of time and number of repetitions to obtain skills without being too easy or too frustrating for learners. Additionally, we provide construct validity evidence for all five tasks, which supports the use of proficiency-based training. We conclude that our at-home laparoscopic training curriculum, with its remote teaching and video-based assessment, is effective and relevant to our ongoing needs.
Skills represented by the tasks: depth perception; hand-eye coordination; fine object manipulation; bimanual manipulation/ambidexterity; tension/tactile perception; precision cutting; needle handling.

References
1. A 3-Dimensional-Printed Hand Model for Home-Based Acquisition of Fracture Fixation Skills Without Fluoroscopy.
2. Home Program for Acquisition and Maintenance of Microsurgical Skills During the Coronavirus Disease.
3. Minimal Access Gardening: Laparoscopic Techniques during Coronavirus Disease Lockdown.
4. Surgical training during COVID-19: a validated solution to keep on practicing.
5. Impact of the COVID-19 pandemic on core surgical training.
6. "Take-home" box trainers are an effective alternative to virtual reality simulators.
7. Bringing the skills laboratory home: an affordable webcam-based personal trainer for developing laparoscopic skills.
8. Validation Study of a Portable Home Trainer Using a Pad for Laparoscopic Practice.
9. Face, content, and construct validity of a novel portable ergonomic simulator for basic laparoscopic skills.
10. A randomised clinical trial of take-home laparoscopic training.
11. Improved laparoscopic skills in gynaecology trainees following a simulation-training program using take-home box trainers.
12. Randomized comparison of standard laparoscopic trainer to novel, at-home, low-cost, camera-less laparoscopic trainer.
13. Barriers and facilitators to deliberate practice using take-home laparoscopic simulators.
14. Face- and Content Validity of a New Portable Tablet Box Trainer for Training Laparoscopic Skills at Home.
15. Effective home laparoscopic simulation training: a preliminary evaluation of an improved training paradigm.
16. Determining standards for laparoscopic proficiency using virtual reality.
17. Developing and Testing Competency Levels for Laparoscopic Skills Training.
18. Skill retention following proficiency-based laparoscopic simulator training.
19. Laparoscopic training on bench models: better and more cost effective than operating room experience.
20. Design of a proficiency-based skills training curriculum for the fundamentals of laparoscopic surgery.
21. Determining standards for laparoscopic proficiency using virtual reality.
22. Evaluating laparoscopic skills: setting the pass/fail score for the MISTELS system.

We thank the Department of Surgery at UT Southwestern Medical Center and the UT Southwestern Simulation Center for their support. We also thank all the residents who participated and provided feedback on the curriculum. The authors report no proprietary or commercial interest in any product mentioned or concept discussed in this article.