title: Connecting the dots: Bridging virtual to in-person physical assessment
authors: Elisa Danielle Perez, Lea Keesee, Courtnie Nichole Moore, Belinda Ann Gallegos, Heather Ann Guest, Hollis Heather Franco, Cynthia Ann Hoffman
date: 2021-09-14
journal: Teach Learn Nurs
DOI: 10.1016/j.teln.2021.08.002

abstract: When restrictions imposed by COVID-19 prevented prelicensure nursing students from practicing skills in the simulation center, the faculty designed a plan to bridge the gap from virtual to in-person skill performance for physical assessments. The faculty anticipated an inadequacy of head-to-toe assessment skills related to the lack of face-to-face clinical. Therefore, the faculty used the Plan-Do-Study-Act (PDSA) model (Agency for Healthcare Research and Quality, 2020) for quality improvement to address the skill performance issue. The plan used colored adhesive "garage sale" dots to identify anatomical landmarks and help students with correct stethoscope placement during their first in-person simulation after a virtual semester. Pre- and post-tests were administered to assess confidence in assessment skills. Using this method on standardized patients in simulation, nursing students reported increased confidence in stethoscope placement. Students of all healthcare disciplines could benefit from this economical dotting process for learning correct stethoscope placement.

During the COVID-19 pandemic, traditional prelicensure nursing programs rapidly converted to delivering clinical and didactic education in a virtual format. The transition to an online didactic model was much easier to navigate than the conversion to virtual clinical education. As a result of government and university mandates, first-semester prelicensure nursing students at one large, multi-campus Southwestern public university completed the entire summer 2020 semester online. Because close contact with others was not possible, the students practiced foundational nursing assessment skills at home on student-made makeshift manikins, such as rolled-up towels, pillows, and stuffed animals, while faculty members evaluated student performance over teleconferencing. Students learned the physical assessment process but had limited hands-on experience with manikins or patients, which left them lacking confidence in assessment skills and accuracy of stethoscope placement.

Faculty members teaching the second-semester clinical courses faced numerous uncertainties when planning the fall semester due to COVID-19 restrictions and limited access to simulation space following an entirely virtual semester. COVID-19 regulations required the students to maintain social distancing and limit contact with other students and faculty. The faculty designed the dotting process to reinforce stethoscope placement efficiently within these constraints: (a) a large number of students, (b) limited faculty, (c) limited time in the skills lab, and (d) numerous skills to bridge from the previous semester. In the second semester, simulations are traditionally higher level and focus on critical thinking and application of knowledge rather than skills-based learning. Nursing faculty recognize that the bedside physical assessment is a crucial clinical skill; however, research indicates that many health care providers and students have weak assessment skills (McKinney et al., 2013; Montinari & Minelli, 2019).
Due to the limitations students faced during a completely virtual semester and the lack of hands-on skill acquisition, faculty felt it necessary to include a station where students could practice stethoscope landmark placement. Faculty predicted that reviewing stethoscope placement and landmarks would cut into simulation time and anticipated a lack of time for one-on-one instruction. The innovative dotting method provides each student with an opportunity to practice the physical assessment on an actor trained to consistently portray a patient, also known as a standardized patient (SP) (INACSL Standards Committee, 2016). Friederichs et al. (2014) suggested that realistic training simplified teaching cardiac auscultation in the clinical context and was highly accepted by students, tutors, and SPs. Effective and innovative teaching strategies are essential to fostering learning and physical assessment skills (Weaver & Jones, 2020). The faculty anticipated that providing each student with time to practice on an SP marked with adhesive color-coded dots would improve aggregate confidence six weeks after initial instruction. Faculty planned for SPs to provide students with feedback in the head-to-toe assessment station (Lewis et al., 2017) because SPs fill a critical educational role and offer a unique perspective when giving feedback on a learner's clinical skills (Berenson et al., 2012). Faculty predicted that using the dotted SP would allow more time for students to practice hands-on and less time lecturing in simulation. Simulation with SPs increases skill mastery in an active and safe learning environment, whereas passive lecture and faculty-led skill demonstration do not (Weaver & Jones, 2020). The use of simulation-based learning strategies promotes the transference of theoretical knowledge to nursing practice (Culyer et al., 2018).

There is a lack of literature on using adhesive dots to aid students with stethoscope placement during auscultation. The closest match found in the literature was Friederichs et al. (2014), who removed the hardware from an auscultation torso, glued it to an SP's chest, and hid it under a flesh-colored shirt to create a realistic training situation. The dotting process incorporates SP methodology, peer and faculty feedback, and visual aids (dots) marking correct stethoscope placement for cardiac, respiratory, and abdominal assessments. The dotting approach incorporates the three essential underpinnings of student engagement during simulations identified by Battista (2017): (a) use of an SP with the nursing student (social interaction), (b) use of a stethoscope and other vital sign equipment (physical clinical tools), and (c) performance of a physical assessment (structured interventions). Our PDSA model (Agency for Healthcare Research and Quality, 2020) to bridge the gaps and barriers included surveying the students to assess their confidence in stethoscope placement after using the innovative teaching method implemented in this study.

One hundred thirty junior-level students participated in the 4-hour simulation lab.
The students were placed in groups, and each group rotated through three stations: a subcutaneous and intramuscular injection station with practice retrieving medications from the automated medication dispensing system; a faculty-guided blood pressure demonstration with student return demonstration and repeated practice on peers and a vital sign simulator; and a head-to-toe assessment with medication administration station. In the head-to-toe assessment with medication administration station, faculty used adhesive color-coded dots to mark and label the standardized patients' anatomical landmarks. The dotted landmarks included the SPs' cardiac, lung, and abdominal auscultation points as well as the radial and pedal pulses. Setup took approximately five minutes per SP. Dots were color-coded by body system and numbered to reinforce the assessment sequence. As shown in Fig. 1, faculty used red dots to signify landmarks for cardiac assessment. The dots were labeled for each heart sound: "A" for aortic, "P" for pulmonic, "E" for Erb's point, "T" for tricuspid, and "M" for mitral. The faculty used blue dots to mark anterior and posterior lung auscultation points and numbered them to indicate the correct auscultation order. As shown in Fig. 2, green dots marked the abdominal quadrants and were numbered for the correct listening sequence. The faculty used yellow dots marked with X's to indicate the radial, dorsalis pedis, and posterior tibial pulses.

During the head-to-toe and medication administration scenario, paired students worked with SPs, who assisted them with correct stethoscope placement. Each student was given 25 minutes to complete a head-to-toe assessment and administer one medication to the SP. The students used the dots to assess the radial and apical pulse rates and then determined whether medication administration was safe. The scenario included two medications, and students were expected to use critical thinking and assessment findings to administer medications safely. Similar to Weaver and Jones (2020), while one student performed the head-to-toe assessment on the standardized patient, the partner used head-to-toe and medication administration grading rubrics to give feedback at the end of the 25 minutes. Faculty observed students and were available to (a) provide immediate, corrective feedback, (b) answer students' questions during the experience, (c) clarify safety issues during the debriefing, and (d) correct student misconceptions.

The faculty obtained Quality-Improvement Review Board approval to review student test responses. Prior to COVID-19, students were evaluated on their head-to-toe physical assessments of standardized patients during formative and summative Objective Structured Clinical Examinations (OSCEs) (McWilliam & Botwinski, 2010). Faculty originally planned to evaluate the effectiveness of the "dotting" educational strategy by comparing pre-COVID-19 (no dotting) to post-COVID-19 (dotting) student performance data. During the semester, students were evaluated face-to-face for the formative OSCE. A surge of COVID-19 cases caused a shutdown of the simulation facility before the summative evaluation, so faculty were unable to collect face-to-face summative OSCE data. However, many students commented that the "dotting" experience improved their performance in completing a head-to-toe assessment in the formative OSCE, which was consistent with the post-simulation and end-of-semester surveys.
Because students were unable to demonstrate their performance ability in a summative OSCE evaluation, faculty assessed students' confidence in stethoscope placement with a survey. The faculty surveyed the students three times: pre- and post-encounter for the formative experience and at the end of the semester, using the learning management system. Each survey consisted of four questions with the following response choices: Not confident (25%), Somewhat confident (50%), Confident (75%), and Very confident (100%). The questions included: How confident are you at taking vital signs? How confident are you at stethoscope placement for auscultation of lung, heart, and bowel sounds on a patient? The end-of-semester survey also contained two open-ended questions asking whether the dotting process increased confidence.

Post-test data indicate an increase in student self-confidence regarding stethoscope placement after participating in the scenario. On the pretest (n = 127), few students (18.9%) were very confident with stethoscope placement; refer to Table 1. On the post-test (n = 125), more students (32.8%) were very confident with stethoscope placement. On the pretest, 42.5% of students selected confident, and on the post-test, 44% indicated they were confident. On the end-of-semester survey (n = 130), the distribution of responses across not confident, somewhat confident, confident, and very confident remained consistent: 32.3% of students indicated they were very confident and 41.5% indicated they were confident. Therefore, students maintained confidence throughout the semester after the use of the "dotting" method. Student groups did not have the opportunity to talk with each other between stations due to strict COVID-19 social distancing policies.

During the study segment of the PDSA, faculty examined the impact of the innovative dotting teaching method. Aggregate results matched faculty predictions of an increase in confidence in stethoscope placement. However, faculty acknowledge these limitations: (a) lack of a control group for comparison, (b) the observational nature of the activity, and (c) the possible influence that completing the head-to-toe station first versus last may have had on performance and confidence scores. These conditions limit how definitively we can attribute improvements in stethoscope placement skill to the dotting process. During the study component of the PDSA cycle, the faculty also learned that the dots do not stick well to dry skin or hair. Students agreed; student surveys included the recommendation of "putting new dots on or just making sure they are still there" because the dots "were falling off or slightly misplaced for students completing the simulation later in the day." Another student suggested, "maybe mark with a marker instead of the stickers," which aligned with the faculty's thoughts of using non-toxic markers such as "bingo daubers." One student commented, "When I first went into the simulation, I was extremely nervous. I felt like I didn't know where to place my stethoscope or even where anything was. The dots placed on the SP were the best thing and helped my confidence!" When clinical was converted back to online at mid-semester, the faculty shared pictures of the "dotted" SPs via video teleconferencing simulation and encouraged students to place their stethoscopes on the computer screen to mimic a face-to-face assessment with attention to landmarks.
Another student commented, "The dots on the Zoom simulations when they were in photographs were great." Faculty continue to improve the dotting process based on lessons learned and plan to use a more permanent dot, because one limitation was that the dots fell off. Additional plans are to (a) ask foundational instructors to use the dotting process during the first semester of clinical skills; (b) continue to use the dotting process during second-semester clinical simulations; (c) use the dotted SP when assigning independent practice to students requiring clinical skills remediation; (d) post pictures of a dotted SP in the learning management system and suggest students print and use these pictures whenever practicing physical assessment; and (e) continue to use the photos in videotelephony simulations. Another lesson learned was the need to provide the images before and during the simulation experience. This lesson was reinforced by student comments: "The use of the dots was very helpful in understanding the correct placement. Before this, I was unable to locate the correct spots. In the future, I would suggest posting a picture of a patient with the dots on their chest to help remind students if they forget."

Although the traditional undergraduate faculty initiated the dot method during the second semester of the nursing program, the method is best suited for first-semester prelicensure nursing students. Further study or an additional PDSA cycle is necessary to understand how using the dotting method in simulation translates to in-person clinical. Faculty should use the dot method during the second semester to reinforce proper assessment techniques. As one student reflected, "The dots were very helpful, especially coming in after an odd semester of not dealing with any real patients. The placement of our stethoscopes are very different from a pillow patient to a real person."

The challenges of social distancing and limited simulation space created a need to provide students with hands-on practice of foundational skills in a semester where learning typically involves higher-level simulations. This innovative teaching method provided nursing students with the skills necessary to perform an accurate and thorough assessment confidently. Mastery of physical assessment skills lays the foundation of quality nursing practice and safe patient care (American Nurses Association, 2015; Weaver & Jones, 2020). Even if auscultation is eventually replaced by advancing technology such as ultrasound (Montinari & Minelli, 2019), practitioners will still need confidence in using landmarks. McKinney et al. (2013) identified that time spent engaged in simulation-based cardiac physical examination led to better learning outcomes. The dotting process is sustainable for programs that already use SPs and is feasible for programs that do not. The multicolored dots cost less than ten dollars, and bingo daubers cost less than fifteen dollars, making this teaching method suitable for any healthcare discipline. The dotting process described in this article is one strategy that may be effective in increasing student confidence in accurate stethoscope placement and in the use of landmarks to guide assessment. Future plans include measuring competence after teaching with the dotting process and incorporating ink or paint, such as bingo daubers, to ensure the dots remain consistently placed throughout the clinical experience.
Agency for Healthcare Research and Quality (2020). Plan-do-study-act (PDSA): Directions and examples.
American Nurses Association (2015). Nursing scope and standards of practice.
Battista (2017). An activity theory perspective of how scenario-based simulations support learning: A descriptive analysis.
Berenson et al. (2012). Standardized patient feedback: Making it work across disciplines.
Culyer et al. (2018). Evidenced-based teaching strategies that facilitate transfer of knowledge between theory and practice: What are nursing faculty using? Teaching and Learning in Nursing.
Friederichs et al. (2014). Combining simulated patients and simulators: Pilot study of hybrid simulation in teaching cardiac auscultation.
INACSL Standards Committee (2016). INACSL standards of best practice: Simulation glossary.
Lewis et al. (2017). The Association of Standardized Patient Educators (ASPE) Standards of Best Practice (SOBP).
McKinney et al. (2013). Simulation-based training for cardiac auscultation skills: Systematic review and meta-analysis.
McWilliam & Botwinski (2010). Developing a successful nursing objective structured clinical examination.
Montinari & Minelli (2019). The first 200 years of cardiac auscultation and future perspectives.
Weaver & Jones (2020). An innovative educational trio for physical assessment in an undergraduate nursing course.