Documenting the Value of Librarians in the Classroom: Results from a Mixed-Methods Research Collaboration with Campus Partners

M. Sara Lowe, Abby Currier, and Steven Graunke*

* M. Sara Lowe is Associate Dean for Educational Services at Indiana University-Purdue University Indianapolis, e-mail: mlowe@iupui.edu; Abby Currier is an MLS graduate student at Indiana University-Purdue University Indianapolis, e-mail: abbycurr@iu.edu; Steven Graunke is Director of Institutional Research and Assessment at Indiana University-Purdue University Indianapolis, e-mail: sgraunke@iupui.edu. The research was made possible through a $33,000 grant from the Indianapolis Foundation Library Fund, an affiliate of the Central Indiana Community Foundation.1 The grant allowed for the hiring of a GA to help with data collection as well as incentives for paper readers. ©2020 M. Sara Lowe, Abby Currier, and Steven Graunke, Attribution-NonCommercial (https://creativecommons.org/licenses/by-nc/4.0/) CC BY-NC

This paper details the results of a mixed-methods study of first-year and upper-division students' information literacy (IL) competencies. The study used a rubric and a survey, seeking to answer two research questions: 1) Is there a correlation between National Survey of Student Engagement (NSSE) IL survey responses and IL rubric scores? 2) Are there any indicators that correlate to improved IL performance in first-year students? Results demonstrated that first-year students reported greater engagement with IL and also indicated that instructors placed greater emphasis on IL competencies than students in upper-division courses. Results also show a statistically significant impact on first-year students' rubric scores when a librarian is in the class. This finding held even when controlling for other variables. Results provide an evidence-based foundation to spur conversations with faculty and university administration on the value of IL and the role of librarians in undergraduate student success.

Introduction
The importance of information literacy (IL) competencies while in college and after graduation is recognized both within and outside of librarianship and higher education.2 Project Information Literacy has conducted numerous studies highlighting students' struggles with information and their continued challenges upon entering the workforce.3 While in school, inadequate IL skills can be a barrier to performance.4 After graduation, while employers value critical thinking skills, many feel that their new hires are not adequately prepared in these areas.5

IUPUI (Indiana University-Purdue University Indianapolis) is a large, urban research institution with an FTE enrollment of more than 25,000 undergraduate and 8,000 graduate students.6 IUPUI has a relatively diverse student population, with about 22 percent identifying as non-white;7 29 percent are first-generation college students and 41 percent are Pell Grant recipients.8 Of the goals in the IUPUI strategic plan, the first is "Promote Undergraduate Student Learning and Success."9 The importance of this priority is reflected in the campus attitude toward assessment.
IUPUI was an early adopter of campuswide student learning outcomes (SLOs).10 The Principles of Undergraduate Learning (PULs)11 were adopted in 1997 and recognized by the Association of American Colleges and Universities (AAC&U).12 IL is referenced in PUL 2: Critical Thinking. In 2018, the university revamped the PULs and created a more holistic framework of student learning outcomes that includes co-curricular outcomes. Called the Profiles of Learning for Undergraduate Success: IUPUI+ (Profiles), they include four quadrants titled Communicator, Innovator, Problem Solver, and Community Contributor.13 The Profiles encompass all the Frames from the ACRL Framework for Information Literacy for Higher Education14 and allow for more holistic integration of IL.15 This shift makes room for subject librarians to advocate for more comprehensive, systematic integration of IL through the curriculum and, one can hope, better makes the case to the campus for the importance of teaching librarians to student success.

While IL is represented in the campuswide SLOs, there had never been a systematic assessment of students' IL competencies. This gap left subject librarians, faculty, and campus administration without a clear picture of what IL skills students have when entering the university or what IL skills they have acquired at graduation. Additionally, with the diverse student population at IUPUI, the authors did not know if there were variables (such as first-generation status or Pell Grant receipt) that impacted students' IL performance. Under these circumstances, it can be challenging to know how best to scaffold IL through the curriculum. Without a clear baseline of students' IL competencies, teaching librarians don't know where to start, and they can't accurately map increasingly complex competencies through students' years in college. Similarly, without this baseline it was also challenging for some faculty and campus administration to recognize the value of librarians teaching these IL competencies.

As a product of collaboration with the campus office of Institutional Research and Decision Support (IRDS), this paper details the results of a mixed-methods study of first-year and upper-division students' IL competencies at our large, urban research institution. This project incorporated direct, authentic assessment of end-of-semester student products using the AAC&U Information Literacy VALUE rubric16 (N = 772) and indirect assessment through the National Survey of Student Engagement (NSSE) Experiences with Information Literacy module17 (N = 630). As IL had never been systematically assessed at IUPUI, the assessment helped establish a baseline of IL competencies for students at the first-year and upper-division levels. For the paper collection, upper division included students in 300- and 400-level classes. For coding in data analysis, upper division was defined as students who had attended IUPUI in a prior semester. The assessment would allow teaching librarians to better tailor their instruction as well as foster discussions among university administration, faculty, and librarians. Additionally, the project sought to answer two research questions: 1) Is there a correlation between NSSE IL survey responses and IL rubric scores? 2) Are there any indicators that correlate to improved IL performance in first-year students (such as a librarian in the classroom, high school GPA, first-generation status, and so on)?
Because the principal investigator (PI) partnered with IRDS, rubric and survey results could be combined with extensive student demographic data (like first-generation status, Pell Grant receipt, and others) to determine if there were any student indicators that correlated to IL competencies.

Literature Review
Multiple research projects have sought to demonstrate the value of academic libraries to their institutions. Numerous big data projects (for example, at the University of Minnesota,18 University of Huddersfield,19 University of Wollongong,20 University of Nebraska-Lincoln,21 Georgia State University,22 and Curtin University23) link improved student outcomes (like GPA or retention) to library usage (such as book loans or use of e-resources). These studies rely on passive metrics routinely collected by libraries (examples: computer login information, book checkout statistics). Library instruction has also been studied using big data. Melissa Bowles-Terry combined transcript analysis and focus groups and found a statistically significant difference in GPA between seniors who did, and did not, have library instruction in upper-division classes.24 However, Priscilla Coulter, Susan Clarke, and Carol Scamman found no correlation between course grades and the presence or absence of an IL session in a course.25

Other studies examine student work, often through authentic assessment via rubric evaluation and often using a mixed-methods approach, to assess students' IL competencies. Many of these studies examine correlations between IL performance and library instruction. Sara Lowe, Char Booth, Sean Stone, and Natalie Tagge evaluated student papers in first-year seminars at a group of small liberal arts colleges and correlated rubric scores with level of librarian involvement in the classroom.26 They found that the higher the level of librarian involvement in the class, the better students' IL rubric scores. Meagan Bowler and Kori Street assessed students' papers at a mid-sized public university from courses with librarians embedded at various levels.27 In some cases, the librarian was identified as a subject specialist in information literacy; in others, the librarian was not identified explicitly as a specialist. They found students' IL skills improved when IL was explained as its own entity and when the librarian was identified as a subject specialist. To determine the IL competencies of graduating students at their community college, Brandy Whitlock and Nassim Ebrahimi conducted a campuswide assessment of all grade levels using faculty surveys, a curriculum map, and authentic rubric assessment of student artifacts.28 Rubric scores demonstrated that students, upon graduation, did not have adequate IL competencies, specifically in ethical use of sources. Faculty survey results found that faculty teaching in courses that had IL outcomes generally felt confident that they could teach and assess IL competencies, and all faculty surveyed agreed that IL should be a core competency at the institution.29 In first-year seminars at a small liberal arts college, Char Booth, Sara Lowe, Natalie Tagge, and Sean Stone found a correlation between the level of librarian intervention and students' IL skills in authentic rubric evaluation of student papers.30 However, the indirect assessment, which examined student course evaluations, found no correlation between students' perceptions of the class and their rubric scores.
Erin Rinto and Elisa Cogbill-Seiders found that students in an introductory English program who attended an IL session performed better on an annotated bibliography.31 Philip Smith employed an IL test, rubric, focus groups, and a survey, finding higher IL proficiency at the end of curriculum integration.32 Conversely, Veronica Douglas and Celia Rabinowitz used surveys, interviews, and authentic rubric assessment to research the correlation between faculty-librarian collaboration and students' IL abilities.33 They found none.

Even without a mixed-methods component, authentic, rubric-based assessment of student work is a common method of assessing students' IL competencies. Amanda Shannon and Vaughn Shannon examined students' papers in Political Science classes; they found that repeated librarian visits correlated to better IL in papers.34 Wendy Holliday et al. assessed student papers from four different stages in the undergraduate curriculum at a large state university using a revised version of the AAC&U IL VALUE rubric.35 Although not a longitudinal study, they found that students' rubric scores improved between the first-year and junior and senior levels. Holly Luetkenhaus, Steve Borrelli, and Corey Johnson assessed first-year student writing to evaluate a writing program and the program's success in teaching IL competencies.36 Overall, they found students did learn IL skills but struggled in areas such as thesis development. Brianne Markowski, Lyna Fontes McCartin, and Stephanie Evers used the results of a rubric assessment to modify their instruction curriculum to address student deficiencies.37

Methods
This project used a mixed-methods approach incorporating direct assessment of end-of-semester student products (like papers and presentations) using the AAC&U Information Literacy VALUE rubric38 (N = 772) and indirect assessment through the NSSE Experiences with Information Literacy module39 (N = 630). For the purposes of this analysis, the Experiences with Information Literacy module was divided into two subscales. "Use of Information" refers to specific self-reported behaviors related to obtaining information appropriately or improving IL skills. "Institutional Emphasis" items ask about the extent to which instructors encouraged students to engage in appropriate use of information. Scores on each scale were calculated as the mean of all responses to each item in the subscale. Internal reliability for both scales was strong (Cronbach's α = 0.85 for Use of Information; α = 0.84 for Institutional Emphasis). A complete list of items included in the Experiences with Information Literacy module is included in the appendix. The AAC&U IL VALUE rubric was chosen because the campus has used other VALUE rubrics. Additionally, the rubric has been shown to be effective in assessing IL skills.40
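For readers who want to see how this kind of scoring works mechanically, the following is a minimal sketch of computing a subscale mean and Cronbach's alpha. It is illustrative only; the column names and responses are invented, not the study's data.

```python
# Minimal sketch of subscale scoring and Cronbach's alpha (illustrative only;
# column names and responses are hypothetical, not the study's actual variables).
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for item-level responses (rows = respondents)."""
    items = items.dropna()
    k = items.shape[1]                         # number of items in the subscale
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses coded 1 = "Never" ... 4 = "Very Often"
survey = pd.DataFrame({
    "use_info_1": [4, 3, 2, 4, 3],
    "use_info_2": [3, 3, 2, 4, 2],
    "use_info_3": [4, 2, 1, 3, 3],
})

# Subscale score = mean of all responses to the items in the subscale
items = ["use_info_1", "use_info_2", "use_info_3"]
survey["use_of_information"] = survey[items].mean(axis=1)
print(cronbach_alpha(survey[items]))
```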
Student data merged with the Experiences with IL survey module results and VALUE rubric scores were obtained from the university data warehouse. High school GPA was obtained from transcripts at the point of admission. Nontraditional GPAs were rescaled to a 4.0 scale. Scores on the Scholastic Aptitude Test (SAT) were also obtained, specifically students' scores from the version developed and released by the College Board in 2017.41 Scores for students who had taken the SAT in prior years or who submitted an ACT score instead were rescaled using norms from the 2017 administration by the College Board.

Two variables were also obtained to serve as proxies for students' socioeconomic status. First-generation status was a dichotomous variable obtained from university records, which was generated using a combination of Free Application for Federal Student Aid (FAFSA) and admissions application information.42 Students who indicated their parents' highest level of education was middle school or high school were coded as 1, while those who indicated either parent had completed a higher level of education were coded as 0. Receipt of a Pell Grant was also coded as a dichotomous variable, with Pell recipients coded as 1 and those not receiving a Pell Grant coded as 0. For the ordinary least squares (OLS) regression, a fifth dichotomous variable was also included to signify which students had been in classes in which a librarian was embedded. Students in classes in which there was a librarian were coded as 1, while students not in courses with an embedded librarian were coded as 0.
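As a concrete illustration of this coding scheme, the sketch below builds the dichotomous variables and the first-generation by Pell interaction used in the regression reported later. The field names and records are hypothetical, not the institution's actual data schema.

```python
# Hypothetical illustration of the dichotomous coding described above.
import pandas as pd

records = pd.DataFrame({
    "parent_highest_education": ["high school", "bachelor's", "middle school", "master's"],
    "pell_recipient": [True, False, True, False],
    "course_had_embedded_librarian": [True, True, False, False],
})

# First-generation: 1 if parents' highest education was middle or high school, else 0
records["first_gen"] = records["parent_highest_education"].isin(
    ["middle school", "high school"]).astype(int)

# Pell Grant receipt and embedded-librarian presence coded 1/0
records["pell"] = records["pell_recipient"].astype(int)
records["librarian"] = records["course_had_embedded_librarian"].astype(int)

# Interaction term used in the OLS model (first-generation x Pell)
records["first_gen_x_pell"] = records["first_gen"] * records["pell"]
print(records[["first_gen", "pell", "librarian", "first_gen_x_pell"]])
```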
Through a collaboration with the campus first-year experience coordinator and the business librarian, the NSSE survey was administered in IUPUI-wide first-year seminars as well as an introductory business class. To collect as representative a sample of papers as possible at the first year, the PI partnered with several large-enrollment, multisection first-year courses (6 courses, 72 sections), such as English and business. Upper-division courses (5 courses, 8 sections) were limited to those that had a librarian working with them. Initially, the project sought to include all capstone classes; however, the project ran in the fall semester, and most capstone classes at the university are offered in the spring semester.

While the study was deemed exempt by the university IRB, the paper collection required active student consent.43 This meant that the PI or graduate assistant (GA) had to visit every class to distribute and collect signed consent forms. The IRB requirements, combined with the first-year survey and paper arrangement, mean that not all students who took the NSSE survey also submitted a paper and vice versa. Overall, 434 first-year students and 103 upper-division students completed the NSSE IL survey; 39 of the upper-division students completed the survey as part of a first-year class. For the rubric evaluation, 707 first-year and 65 upper-division final projects were collected and read.

The NSSE questionnaire was administered using the online survey software Qualtrics. Only the PI and a member of IRDS had access to the data set. Papers were downloaded by the PI from Canvas course sites and stored in a secure folder available only to the PI and the GA. Papers were anonymized by removing any identifying information in the text of the document, the document title, and the document metadata. After identifying information was removed, the GA assigned each paper a number, which is how readers identified which papers to read. De-identified papers were housed in a secure Box folder to allow readers access to them. Librarians at IUPUI (N = 23) were recruited to read papers. Before papers were read, three norming sessions were held using a sample paper and the IL VALUE rubric. After the norming sessions, each paper was read by two different librarians and scored using the IL VALUE rubric. Scores were entered into a Qualtrics form available only to the PI, GA, and the member of IRDS. Where scores between the two raters differed by more than one point, a third librarian read the paper.

After the papers had been read, an Excel file containing the student name and corresponding paper number was provided to the member of IRDS via a secure file transfer method for critical data.44 Per IRB requirements, only the member of IRDS had access to student data beyond the rubric scores (for example, first-generation status, Pell Grant receipt, or other identifying data). All data analysis was performed by IRDS and then returned to the PI in the aggregate.

Results
AAC&U IL VALUE Rubric Score
Students received a score on each dimension of the VALUE rubric from 1 ("Below Benchmark") to 5 ("Capstone"). Differences between students in first-year courses (100-level) and upper-division courses (300- and 400-level) are displayed in table 1. An overall score was also calculated consisting of the average of each of the five dimensions. Cronbach's alpha revealed very strong internal consistency between each dimension of the VALUE rubric (α = 0.91).

Differences between upper-division and first-year students were assessed using an independent samples t-test. As expected, results suggest that upper-division students scored significantly higher on the VALUE rubric than students in first-year courses (t(682) = 17.15; α < 0.05). Specifically, the average score for upper-division students was 3.95, consistent with the expected performance of students at the junior level, but shy of what was expected at the "Capstone" level.45 The average score for first-year students was 2.81, between the "Benchmark" and "Milestone 2" levels. These results suggest first-year students generally scored consistently with ratings expected based on their level of development. While a t-test can describe the probability that two means are different, an effect size calculation is needed to determine the magnitude of the difference between two groups. Hedges' g is recommended for analyses such as these, in which the sizes of the two groups are very different.46 The value for Hedges' g (2.34) suggested that this was a large effect.

An OLS regression was conducted to determine which factors had an effect on rubric scores. Only scores from first-year students were included in this analysis, as the relatively small number of upper-division students (n = 65) would have yielded a limited model with fewer independent variables.47 Furthermore, using only first-year students meant that participants had fewer experiences with college-level IL instruction than upper-division students. This allowed for an assessment of the intervention (that is to say, experiences with the IL module) without the potential confound of prior IL experiences. High school GPA and SAT scores were included to control for academic ability and prior academic achievement. First-generation status and receipt of a Pell Grant were included to control for socioeconomic status.
An interaction term between first-generation and Pell status was also included in the model to isolate the specific effects of first-generation status, which may be related to culture and ability to navigate a higher education institution,48 and Pell status, which is determined more directly by expected family contribution.49 A dichotomous variable indicating which students were in courses with an embedded librarian was also included in the model.

TABLE 1
AAC&U VALUE Rubric Means for Lower and Upper Division*
Dimension | First-Year Mean** | Upper-Division Mean***
Determine the Extent of Information Needed | 2.98 | 4.12
Access the Needed Information | 2.93 | 4.11
Evaluate Information and Its Sources Critically | 2.60 | 4.06
Use Information Effectively to Accomplish a Specific Purpose | 2.78 | 4.02
Access and Use Information Ethically and Legally | 2.76 | 3.43
Overall | 2.81 | 3.95
* 5-point scale: 1 = "Below Benchmark"; 2 = "Benchmark"; 3 = "Milestone 2"; 4 = "Milestone 3"; 5 = "Capstone."
** First-year (n = 619). *** Upper-division (n = 65).

TABLE 2
Independent Samples T-Test Results for Upper-Division and First-Year Students*
Group | N | Mean | Standard Deviation | DF | t
First-year | 619 | 2.81 | 0.51 | 682 | 17.15**
Upper-division | 65 | 3.95 | 0.54 |  |
* First-year: n = 619, mean = 2.81, s.d. = 0.51.
** Statistically significant at α < 0.05. Hedges' g = 2.24.

Results of the OLS regression can be found in table 3. The model was statistically significant (F = 23.0, p < 0.05), though the R2 suggested only a small portion of the variance was accounted for by this combination of variables (R2 = 0.11). High school GPA and SAT score both had a statistically significant effect on VALUE rubric score, suggesting that prior academic achievement and academic ability were unsurprisingly related to performance on the assignment. However, it should be noted that having a librarian embedded in the course also had a small but statistically significant effect on VALUE rubric score. Specifically, being in a course with an embedded librarian was associated with an increase of 0.07 points in a student's VALUE rubric score (B = 0.07, std. error = 0.03). This was statistically significant even net the effect of high school GPA, SAT score, and variables associated with socioeconomic status.

TABLE 3
OLS Regression Results on VALUE Rubric*
Variable | B | β | Std. Error
Embedded Librarian† | 0.07 | 0.06 | 0.03
High School GPA†‡ | 0.37 | 0.25 | 0.05
SAT Score†§ | 0.28 | 0.10 | 0.09
First-Generation Status | –0.02 | –0.02 | 0.05
Received Pell Grant | –0.05 | –0.04 | 0.04
First-Gen*Pell Grant | <0.01 | <0.01 | 0.07
Intercept | 1.19 | — | 0.15
* F = 23.94; R2 = 0.11.
† Effect was statistically significant net the effect of the other independent variables (p < 0.05).
‡ 4.0 scale.
§ 2017 SAT scores. Students completing the SAT at an earlier date or completing the ACT were recalibrated using 2017 SAT norms. SAT score divided by 1,000.
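To make the analyses in tables 2 and 3 easier to follow, here is a minimal, self-contained sketch of an independent samples t-test with Hedges' g and an OLS model that includes the librarian dummy and the first-generation by Pell interaction. The data are simulated and the variable names invented, so it mirrors the general approach only, not the authors' code or results.

```python
# Illustrative sketch (not the authors' code) of a t-test with Hedges' g and an
# OLS regression with an interaction term. All data below are simulated.
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_first, n_upper = 619, 65
first_year = rng.normal(2.81, 0.51, n_first)   # hypothetical rubric scores
upper_div = rng.normal(3.95, 0.54, n_upper)

# Independent samples t-test comparing the two groups
t, p = stats.ttest_ind(upper_div, first_year)

# Hedges' g: Cohen's d (pooled SD) with a small-sample correction factor
pooled_sd = np.sqrt(((n_upper - 1) * upper_div.var(ddof=1) +
                     (n_first - 1) * first_year.var(ddof=1)) /
                    (n_upper + n_first - 2))
d = (upper_div.mean() - first_year.mean()) / pooled_sd
g = d * (1 - 3 / (4 * (n_upper + n_first) - 9))
print(f"t = {t:.2f}, p = {p:.4f}, Hedges' g = {g:.2f}")

# OLS regression on first-year scores, with a first_gen:pell interaction
df = pd.DataFrame({
    "rubric": first_year,
    "librarian": rng.integers(0, 2, n_first),
    "hs_gpa": rng.normal(3.4, 0.4, n_first),
    "sat": rng.normal(1.05, 0.1, n_first),     # SAT score divided by 1,000
    "first_gen": rng.integers(0, 2, n_first),
    "pell": rng.integers(0, 2, n_first),
})
model = smf.ols(
    "rubric ~ librarian + hs_gpa + sat + first_gen + pell + first_gen:pell",
    data=df).fit()
print(model.summary())
```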
Experiences with Information Literacy Results
Students' responses to the NSSE Experiences with Information Literacy survey items are described in tables 4 and 5. The results are divided between the two Use of Information and Institutional Emphasis subscales that appear in the information literacy module. First-year students were compared with students not in their first year using a chi-square test for independence. For the purposes of this analysis, first-year students were defined as those in their first semester at the university, while upper-division students were defined as those who had attended in a prior semester. In general, first-year students were significantly more likely to report engaging in activities associated with the development of IL. For example, first-year students were more likely to indicate that they very often worked on a paper or project that had multiple smaller assignments or changed the focus of a paper or project based on information they found while researching the topic (see table 4).

TABLE 4
Experiences with IL Survey Use of Information Item Responses by First-Year and Upper-Division Students
During the current school year, how often have you done the following? (Percentages are Never / Sometimes / Often / Very Often.)
Completed an assignment that used an information source (book, article, website, or the like) other than required course readings
  First-year: N = 434; 5.3% / 24.7% / 37.8% / 32.3%
  Upper-division: N = 103; 2.9% / 27.2% / 32.0% / 37.9%
Worked on a paper or project that had multiple smaller assignments such as an outline, annotated bibliography, or rough draft*
  First-year: N = 434; 3.5% / 20.5% / 44.0% / 32.0%
  Upper-division: N = 103; 9.7% / 36.9% / 36.9% / 16.5%
Received feedback from an instructor that improved your use of information resources (source selection, proper citation, and such)*
  First-year: N = 434; 6.5% / 23.0% / 40.3% / 30.2%
  Upper-division: N = 103; 19.4% / 33.0% / 31.1% / 16.5%
Completed an assignment that used the library's electronic collection of articles, books, and journals (examples: JSTOR, EBSCO, LexisNexis, ProQuest)
  First-year: N = 433; 33.3% / 32.3% / 21.0% / 13.4%
  Upper-division: N = 103; 32.0% / 28.2% / 21.4% / 18.4%
Decided not to use an information source in a course assignment due to its questionable quality
  First-year: N = 434; 17.7% / 34.8% / 31.1% / 16.4%
  Upper-division: N = 103; 19.4% / 35.0% / 26.2% / 19.4%
Changed the focus of a paper or project based on information you found while researching the topic*
  First-year: N = 434; 9.4% / 36.6% / 36.6% / 17.3%
  Upper-division: N = 103; 14.6% / 46.6% / 26.2% / 12.6%
Looked for a reference that was cited in something you read
  First-year: N = 432; 9.0% / 28.9% / 42.1% / 19.9%
  Upper-division: N = 103; 9.7% / 34.0% / 38.8% / 17.5%
Identified how a book, article, or creative work has contributed to a field of study*
  First-year: N = 433; 13.6% / 34.9% / 35.6% / 15.9%
  Upper-division: N = 102; 22.5% / 35.3% / 34.3% / 7.8%
* Chi-square test revealed statistically significant differences in response patterns between first-year and upper-division students (p < 0.05).
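As a companion to the significance notes in tables 4 and 5, the following small sketch shows how a chi-square test for independence compares two groups' response distributions on a single item. The counts are hypothetical, not the survey data.

```python
# Illustrative chi-square test for independence on one survey item
# (hypothetical counts; not the study's data).
import numpy as np
from scipy.stats import chi2_contingency

# Rows: first-year, upper-division; columns: Never, Sometimes, Often, Very Often
observed = np.array([
    [15, 89, 191, 139],   # hypothetical first-year counts
    [10, 38, 38, 17],     # hypothetical upper-division counts
])

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
# p < 0.05 would indicate the two groups' response patterns differ significantly.
```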
Similar to their Use of Information responses, for Institutional Emphasis first-year students also reported that their instructors emphasized practices associated with the development of IL skills. First-year students were especially more likely to indicate that their instructors placed very much emphasis on appropriately citing the sources used in a paper or project or using scholarly or peer-reviewed sources in their course assignments (see table 5).

TABLE 5
Experiences with IL Survey Institutional Emphasis Item Responses by First-Year and Upper-Division Students
During the current school year, how much have your instructors emphasized the following? (Percentages are Very Little / Some / Quite a Bit / Very Much.)
Not plagiarizing another author's work
  First-year: N = 433; 2.3% / 3.5% / 13.6% / 80.6%
  Upper-division: N = 103; 1.0% / 5.8% / 13.6% / 79.6%
Appropriately citing the sources used in a paper or project*
  First-year: N = 433; 1.4% / 4.8% / 22.4% / 71.4%
  Upper-division: N = 103; 2.9% / 14.6% / 27.2% / 55.3%
Using scholarly or peer-reviewed sources in your course assignments*
  First-year: N = 433; 3.5% / 14.1% / 30.9% / 51.5%
  Upper-division: N = 103; 9.7% / 21.4% / 29.1% / 39.8%
Questioning the quality of information sources
  First-year: N = 431; 3.5% / 18.6% / 29.7% / 48.3%
  Upper-division: N = 103; 6.8% / 24.3% / 31.1% / 37.9%
Using practices (terminology, methods, writing style, and other practices) of a specific major or field of study
  First-year: N = 433; 8.8% / 21.9% / 30.3% / 39.0%
  Upper-division: N = 103; 9.7% / 19.4% / 33.0% / 37.9%
* Chi-square test revealed statistically significant differences in response patterns between first-year and upper-division students (p < 0.05).

Correlation between Rubric and Survey Results
A total of 110 students completed both the survey and an assignment scored using the VALUE rubric. Of those who completed both, 31 were upper-division students and 79 were first-year students. Because of the different patterns of survey responses between upper-division and first-year students, correlations between rubric scores and survey responses were calculated separately. For first-year students, there were no significant correlations between the two survey scales and rubric score. For upper-division students, reported use of information was significantly correlated with rubric score (r = 0.38, p < 0.05), but institutional emphasis was not correlated with rubric score. These results should be interpreted with caution, as low N's may increase the chances of a false positive. Correlations can be seen in tables 6 and 7 below.

TABLE 6
Correlations between VALUE Rubric Score and Survey Results for Students in First-Year Courses (N = 79)
 | Rubric Score | Use of Information | Institutional Emphasis
Use of Information | 0.04 | — | —
Institutional Emphasis | –0.04 | 0.58* | —
* Correlation statistically significant at p < 0.05.

TABLE 7
Correlations between VALUE Rubric Score and Survey Results for Students in Upper-Division Courses (N = 31)
 | Rubric Score | Use of Information | Institutional Emphasis
Use of Information | 0.38* | — | —
Institutional Emphasis | 0.16 | 0.49* | —
* Correlation statistically significant at p < 0.05.

Discussion
The results have given teaching librarians a baseline of students' IL competencies at both the first-year and upper-division levels. Students in 300- and 400-level courses scored better on the IL VALUE rubric than those in 100-level courses (3.95 versus 2.81 out of 5). This provides evidence that students are developing their IL competencies as they progress through their undergraduate degrees. However, while first-year students were where they were expected to be on the rubric at the "Benchmark" level, upper-division students fell short of the "Capstone" level.

The evaluation also identified gaps at the upper division, specifically in the rubric category "Access and Use Information Ethically and Legally," which was the lowest scoring category for upper-division students. While upper-division students still outperformed first-year students, it was by a much smaller margin than the other rubric criteria (3.43 versus 2.76 out of 5). This finding has led to discussions with teaching librarians and faculty about how better to integrate citation and ethical use concepts into instruction and the wider IUPUI curriculum.
The lower average may also be related to issues such as students' struggles with citation and understanding what plagiarism is and how to avoid it,50 as well as the broader conversation in higher education on the prevalence of plagiarism or unethical use of information.51

Another interesting finding from the rubric evaluation was at the first year, where students scored lowest (2.60 out of 5) in the criterion "Evaluate Information and Its Sources Critically." This criterion assesses whether students used a variety of sources and whether the sources were the best evidence for the claims they were making. Research has shown that high school and first-year students struggle with evaluating information.52 It seems our students struggle with this as well.

Regarding the NSSE IL module, it is interesting that students in 100-level courses were more likely than those in upper-division courses to report engaging in IL behaviors and to indicate that their faculty were emphasizing IL in their courses. There may be multiple explanations for this. A majority (75%) of first-year papers (534 of 707 total first-year papers) were collected from an introductory English class that includes curricular elements specifically mentioned in the NSSE survey, for example, "worked on a paper or project that had multiple smaller assignments such as an outline, annotated bibliography, or rough draft," and "received feedback from an instructor that improved your use of information resources (source selection, proper citation, and such)." Perhaps in the upper-division classes, faculty are emphasizing or articulating these things less because they already expect students to have these skills.53 On the other hand, first-year students could simply be overestimating how often they have performed these tasks and how often faculty have emphasized IL concepts. Research has shown that students, especially first-year students, overestimate their skills.54

One troubling finding is the relatively low percentage of students who reported "completing an assignment that used the library's electronic collection of articles, books, and journals." Only 34 percent of first-year students and slightly more upper-division students (39%) indicated they had done this often or very often. While in first-year courses this could be due to instructor-provided readings, at the upper division we would hope that students would be doing more external research for projects to build those IL competencies.

Returning to our original research questions: First, is there a correlation between NSSE IL survey responses and IL rubric scores? There was a correlation between students' self-reported use of information and rubric scores at the upper division. However, because of the small number of students who completed the Experiences with IL module and received a score from the IL rubric, it is difficult to draw any conclusions about the relationship between self-reported engagement and IL performance at this time.55 Authentic rubric evaluation is a time- and labor-intensive process; if a survey could be administered that allowed institutions to make connections between student responses and their IL proficiency, it could be administered at a larger scale and used regularly to assess students' skills.
However, the lack of correlation despite the larger first-year sample size leads the authors to conclude that such a survey is not a viable, or valid, substitute for authentic assessment of student work to gauge IL competencies.

Second, are there any indicators that correlate to IL performance in first-year students (such as a librarian in the classroom, high school GPA, or first-generation status)? Thanks to the collaboration with IRDS, the authors were able to determine that the only factors with a statistically significant effect on students' IL performance at the first year were students' prior academic ability and performance (as measured by high school GPA and SAT score) and the presence of a librarian in the classroom. The first connection is, perhaps, not surprising. The finding that having a librarian in the classroom had a significant and positive effect on students' IL performance, even with the effects of GPA, SAT score, and socioeconomic status included in the model, provides further evidence of the positive impact librarians may have on the development of IL skills. These results support other studies, which also showed a librarian impact on students' IL competencies.56 That said, the relatively low R2 (0.11) suggests there may still be other factors associated with students' development of IL that have not been explored. Researchers should continue to use student data and characteristics along with authentic assessment to better understand the factors that may contribute to IL development.

Limitations
There are limitations to the study. First, due to logistics already discussed, the upper-division student sample was not as representative as hoped. Second, because the IRB required active student consent for paper collection, there is the possibility of selection bias. Because papers were collected from Canvas course sites, the PI had detailed course information, and IRDS was able to determine which students from a class consented. A comparison of students who did and did not consent found that there was a difference in academic qualifications. Specifically, students who consented to have their papers read had a higher mean high school GPA and higher mean SAT scores; they also earned a higher GPA in the semester in which the paper was submitted. Students who consented also had a significantly lower level of unmet financial need and were more likely to be female (see table 8).

TABLE 8
Population Differences by Consent
Group | N | Female | African American | Latinx | First Generation | Admitted as a Transfer | Best SAT Score (mean) | High School GPA (mean) | Received Pell | Fall Unmet Financial Need (mean) | Fall Semester GPA (mean)
Did not consent | 807 | 48% | 9% | 11% | 31% | 8% | 1034 | 3.38 | 40% | $5,180.44 | 2.65
Consented to participate | 975 | 56% | 7% | 9% | 30% | 5% | 1062 | 3.46 | 39% | $4,337.10 | 3.04
All | 1,782 | 52% | 8% | 10% | 31% | 6% | 1049 | 3.42 | 40% | $4,712.88 | 2.86

Conclusion
Through a relatively comprehensive assessment of first-year and upper-division students, the authors now have data on students' IL competencies that can inform multiple layers of discussions: librarian-faculty, librarian-department, librarian-program, and library-campus administration. These data have reinvigorated discussions with campus administration. As the data provide evidence of the impact of librarians, the library is able to position itself in discussions that previously may have excluded librarians from the conversation.
For example, the collaboration with IRDS led to greater awareness in their office of the role of librarians in teaching and learning, and the IRDS coauthor has been instrumental in communicating the results of the impact of librarians on students' IL skills to the campus. Because the research can be linked to a campus strategic goal, "Promote Undergraduate Student Learning and Success," library administration included the results in the annual report to the campus. The timing is especially fortuitous as the campus undergoes efforts to rethink the first-year student experience and launch the new undergraduate student learning outcomes (Profiles).57 Librarians have actionable, institution-specific data on which to build IL instruction and scaffold IL through the curriculum.

Results have been communicated to stakeholders in the library and on campus by both the PI and the member of IRDS. Inside the library, results have been shared not only with the teaching librarians but with the library as a whole through multiple all-staff presentations. Sharing the data internally has enhanced the reputation of the library educational services unit and bolstered the spirits of teaching librarians by demonstrating the impact of librarians' teaching efforts. Anecdotally, teaching librarians at IUPUI sometimes feel their teaching labor is not valued or appreciated because it is work done regularly rather than an exciting new development, and the latter is easier to publicize. Data demonstrating the impact of teaching librarians on student IL competencies add value to their work and justify the existence of library educational services.

At the campus level, the PI and member of IRDS gave a well-received presentation to the campuswide assessment committee. The presentation generated deeper discussions about the importance of IL to the campus and the role of librarians in teaching IL competencies. The discipline-library collaboration not only provided actionable data for teaching librarians but also informed programs. The PI wrote and disseminated individual reports for each discipline that participated in the research. These reports have been used in departmental assessment reports and planning. The coauthors are also in the process of writing a research brief to distribute to campus. Currently the campus is rethinking first-year experience programs. Conversations are just beginning on the role of librarians in the first year. Study data have been shared with the first-year coordinators and were instrumental in getting librarians a seat at the table as those discussions start. There are many strong supporters of teaching librarians on campus, but these cheerleaders can only take their advocacy so far. When faced with those on campus who are not sure whether classroom time is well spent with a librarian, strong data can open doors that advocacy sometimes cannot.

Importantly, teaching librarians have begun to make curricular changes. Librarians have been strongly encouraged to revisit and revise their curriculum maps. The data have given librarians talking points when discussing IL integration with faculty. Librarians and faculty need to ensure that IL concepts are properly scaffolded or sequenced through the curriculum, which has been shown to increase students' IL proficiencies.58 Progress on both curriculum mapping and scaffolding will be assessed in upcoming annual reviews.
Librarians have also instituted changes to their teaching and learning objectives. For example, after seeing the rubric scores for upper-division students in the ethical use rubric criterion, the business librarian added information to a LibGuide on citing in presentations and emphasizes this more strongly in her teaching.59

Beyond our institutional context, the results of this study add to the ACRL Academic Library Impact report.60 This was seen especially in the research area "Quantify the Library's Impact on Student Success," where study results help to provide evidence for multiple research questions for further study, for example: How do library resources and programs (such as courses and events) impact indicators of student success? and What effects do libraries have on success outcomes for different types of students? The present study builds on previous research providing evidence demonstrating the impact of librarians on student success.

APPENDIX. NSSE Experiences with Information Literacy
1. During the current school year, about how often have you done the following? (Very Often, Often, Sometimes, Never)
a. Completed an assignment that used an information source (book, article, website, or the like) other than required course readings
b. Worked on a paper or project that had multiple smaller assignments such as an outline, annotated bibliography, or rough draft
c. Received feedback from an instructor that improved your use of information resources (source selection, proper citation, and such)
d. Completed an assignment that used the library's electronic collection of articles, books, and journals (examples: JSTOR, EBSCO, LexisNexis, ProQuest)
e. Decided not to use an information source in a course assignment due to its questionable quality
f. Changed the focus of a paper or project based on information you found while researching the topic
g. Looked for a reference that was cited in something you read
h. Identified how a book, article, or creative work has contributed to a field of study
2. During the current school year, how much have your instructors emphasized the following? (Very Much, Quite a Bit, Some, Very Little)
a. Not plagiarizing another author's work
b. Appropriately citing the sources used in a paper or project
c. Using scholarly or peer-reviewed sources in your course assignments
d. Questioning the quality of information sources
e. Using practices (terminology, methods, writing style, and other practices) of a specific major or field of study
3. How much has your experience at this institution contributed to your knowledge, skills, and personal development in using information effectively? (Very Much, Quite a Bit, Some, Very Little)

Notes
1. Central Indiana Community Foundation, "The Indianapolis Foundation Library Fund," available online at https://www.cicf.org/funds-and-foundations/special-focus-funds/the-indianapolis-foundation-library-fund/ [accessed 26 June 2019].
2. Alison J. Head, "Learning the Ropes: How Freshmen Conduct Course Research Once They Enter College," Project Information Literacy (Dec. 5, 2013), available online at https://www.projectinfolit.org/uploads/2/7/5/4/27541717/pil_2013_freshmenstudy_fullreportv2.pdf [accessed 11 July 2019]; Alison J. Head, "Learning Curve: How College Graduates Solve Information Problems Once They Join the Workplace," Project Information Literacy (Oct. 16, 2012), available online at https://www.projectinfolit.org/uploads/2/7/5/4/27541717/pil_fall2012_workplacestudy_fullreport_revised.pdf [accessed 11 July 2019]; Alison J. Head, "Staying Smart: How Today's Graduates Continue to Learn Once They Complete College," Project Information Literacy (Jan. 5, 2016), available online at www.projectinfolit.org/uploads/2/7/5/4/27541717/staying_smart_pil_1_5_2016b_fullreport.pdf [accessed 11 July 2019]; Tina Klomsri and Matti Tedre, "Poor Information Literacy Skills and Practices as Barriers to Academic Performance: A Mixed Methods Study of the University of Dar Es Salaam," Reference & User Services Quarterly 55, no. 4 (2016): 293-305, doi:10.5860/rusq.55n4.293; Hart Research Associates, "Falling Short? College Learning and Career Success" (survey conducted on behalf of the Association of American Colleges & Universities, Jan. 20, 2015), available online at https://www.aacu.org/sites/default/files/files/LEAP/2015employerstudentsurvey.pdf [accessed 11 July 2019].
3. Head, "Learning the Ropes"; Head, "Learning Curve"; Head, "Staying Smart."
4. Klomsri and Tedre, "Poor Information Literacy Skills and Practices as Barriers to Academic Performance."
5. Hart Research Associates, "Falling Short?"
6. Institutional Research and Decision Support, "Indiana University Purdue University Indianapolis. Indianapolis/Columbus/Ft. Wayne Campus Enrollment Summary," IUPUI (Aug. 27, 2018), available online at https://irds.iupui.edu/_documents/enrollment-management/census-report/2018%20Indianapolis+Columbus+Fort%20Wayne%20Campus%20Enrollment%20-%20Summary.pdf [accessed 11 July 2019].
7. "IUPUI Annual Diversity Report," IUPUI (2018), available online at https://diversity.iupui.edu/publications/diversity-report-2018.pdf [accessed 11 July 2019].
8. Michele J. Hansen, "Update on IUPUI Students: Retention/Graduation Rates," IUPUI Office of Institutional Research and Decision Support, available online at https://irds.iupui.edu/_documents/students/retention-and-graduation/2018%20IUPUI%20Students%20Retention%20IFC.pdf [accessed 11 July 2019].
9. "Goals: IUPUI Campus Strategy," IUPUI Strategic Plan, available online at https://strategicplan.iupui.edu/goals [accessed 11 July 2019].
10. "IUPUI Named to Inaugural Class of Excellence in Assessment Designees," IUPUI Newsroom (Aug. 22, 2016), available online at http://archive.news.iupui.edu/releases/2016/08/excellence-in-assessment-inaugural-class-iupui.shtml [accessed 11 July 2019].
11. Center for Teaching and Learning, "PULs," IUPUI, available online at https://ctl.iupui.edu/Resources/Preparing-to-Teach/PULs [accessed 11 July 2019].
12. Office of Academic Affairs, "IUPUI+," IUPUI, available online at https://academicaffairs.iupui.edu/Strategic-Initiatives/IUPUI-Plus [accessed 11 July 2019].
13. Office of Academic Affairs, "IUPUI+."
14. ACRL Board, "Framework for Information Literacy for Higher Education," Association for College and Research Libraries (Jan. 11, 2016), available online at www.ala.org/acrl/standards/ilframework [accessed 11 July 2019].
15. University Library, "IL LOs with Profiles," IUPUI, available online at http://iupui.libguides.com/ld.php?content_id=15346306 [accessed 11 July 2019]; University Library, "UL IL LOs aligned with Profiles," IUPUI, available online at http://iupui.libguides.com/ld.php?content_id=15346306 [accessed 11 July 2019].
16. Association of American Colleges & Universities (AAC&U), "Information Literacy VALUE Rubric," available online at https://www.aacu.org/value/rubrics/information-literacy [accessed 23 July 2019].
17. National Survey of Student Engagement (NSSE), "Experiences with Information Literacy Module," available online at http://nsse.indiana.edu/pdf/modules/2017/NSSE_2017_Experiences_with_Information_Literacy_Module.pdf (used with permission from NSSE) [accessed 23 July 2019].
18. Shane Nackerud et al., "Analyzing Demographics: Assessing Library Use Across the Institution," portal: Libraries and the Academy 13, no. 2 (2013): 131-45, doi:10.1353/pla.2013.0017; Krista M. Soria, Jan Fransen, and Shane Nackerud, "Library Use and Undergraduate Student Outcomes: New Evidence for Students' Retention and Academic Success," portal: Libraries and the Academy 13, no. 2 (2013): 147-64, doi:10.1353/pla.2013.0010.
19. Graham Stone et al., "Increasing the Impact: Building on the Library Impact Data Project," Journal of Academic Librarianship 41, no. 4 (2015): 517-20, doi:10.1016/j.acalib.2015.06.003.
20. Brian L. Cox and Margie Jantti, "Capturing Business Intelligence Required for Targeted Marketing, Demonstrating Value, and Driving Process Improvement," Library & Information Science Research 34, no. 4 (2012): 308-16, doi:10.1016/j.lisr.2012.06.002.
21. DeeAnn Allison, "Measuring the Academic Impact of Libraries," portal: Libraries and the Academy 15, no. 1 (2015): 29-40, doi:10.1353/pla.2015.0001.
22. Felly Chiteng Kot and Jennifer L. Jones, "The Impact of Library Resource Utilization on Undergraduate Students' Academic Performance: A Propensity Score Matching Design," College & Research Libraries 76, no. 5 (2015): 566-86, doi:10.5860/crl.76.5.566.
23. Gaby Haddow and Jayanthi Joseph, "Loans, Logins, and Lasting the Course: Academic Library Use and Student Retention," Australian Academic & Research Libraries 41, no. 4 (2010): 233-44, doi:10.1080/00048623.2010.10721478.
24. Melissa Bowles-Terry, "Library Instruction and Academic Success: A Mixed-Methods Assessment of a Library Instruction Program," Evidence Based Library and Information Practice 7, no. 1 (2012): 82-95, doi:10.18438/B8PS4D.
25. Priscilla Coulter, Susan Clarke, and Carol Scamman, "Course Grade as a Measure of the Effectiveness of One-Shot Information Literacy Instruction," Public Services Quarterly 3, no. 1/2 (2007): 147-63, doi:10.1300/J295v03n01_08.
26. M. Sara Lowe et al., "Impacting Information Literacy Learning in First-Year Seminars: A Rubric-Based Evaluation," portal: Libraries and the Academy 15, no. 3 (2015): 489-512, doi:10.1353/pla.2015.0030.
27. Meagan Bowler and Kori Street, "Investigating the Efficacy of Embedment: Experiments in Information Literacy Integration," Reference Services Review 36, no. 4 (2008): 438-49, doi:10.1108/00907320810920397.
28. Brandy Whitlock and Nassim Ebrahimi, "Beyond the Library: Using Multiple, Mixed Measures Simultaneously in a College-Wide Assessment of Information Literacy," College & Research Libraries 77, no. 2 (2016): 236-62, doi:10.5860/crl.77.2.236.
29. Whitlock and Ebrahimi, "Beyond the Library."
30. Char Booth et al., "Degrees of Impact: Analyzing the Effects of Progressive Librarian Course Collaborations on Student Performance," College & Research Libraries 76, no. 5 (2015): 623-51, doi:10.5860/crl.76.5.623.
31. Erin E. Rinto and Elisa I. Cogbill-Seiders, "Library Instruction and Themed Composition Courses: An Investigation of Factors that Impact Student Learning," Journal of Academic Librarianship 41, no. 1 (2015): 14-20, doi:10.1016/j.acalib.2014.11.010.
32. Philip A. Smith, "Information Literacy Integration as Quality Enhancement of Undergraduate Curriculum," Communications in Information Literacy 10, no. 2 (2016): 214-44, doi:10.7548/cil.v10i2.340.
33. Veronica A. Douglas and Celia E. Rabinowitz, "Examining the Relationship Between Faculty-Librarian Collaboration and First-Year Students' Information Literacy Abilities," College & Research Libraries 77, no. 2 (2016): 144-63, doi:10.5860/crl.77.2.144.
34. Amanda Shannon and Vaughn Shannon, "Librarians in the Midst: Improving Student Research Through Collaborative Instruction," Journal of Political Science Education 12, no. 4 (2016): 457-70, doi:10.1080/15512169.2016.1157486.
35. Wendy Holliday et al., "An Information Literacy Snapshot: Authentic Assessment Across the Curriculum," College & Research Libraries 76, no. 2 (2015): 170-87, doi:10.5860/crl.76.2.170.
36. Holly Luetkenhaus, Steve Borrelli, and Corey Johnson, "First Year Course Programmatic Assessment: Final Essay Information Literacy Analysis," Reference & User Services Quarterly 55, no. 1 (2015): 49-60, doi:10.5860/rusq.55n1.49.
37. Brianne Markowski, Lyna Fontes McCartin, and Stephanie Evers, "Meeting Students Where They Are: Using Rubric-based Assessment to Modify an Information Literacy Curriculum," Communications in Information Literacy 12, no. 2 (2018): 128-49, doi:10.15760/comminfolit.2018.12.2.5.
38. AAC&U, "Information Literacy VALUE Rubric."
39. NSSE, "Experiences with Information Literacy Module."
40. David J. Turbow and Julie Evener, "Norming a VALUE Rubric to Assess Graduate Information Literacy Skills," Journal of the Medical Library Association 104, no. 3 (2016): 209-14, doi:10.3163/1536-5050.104.3.005; Gloria F. Creed-Dikeogu, "Can Smaller Colleges Use the AAC&U Rubrics?" Kansas Library Association College & University Libraries Section Proceedings 6, no. 1 (2016): 1-7, doi:10.4148/2160-942X.1061; Markowski, McCartin, and Evers, "Meeting Students Where They Are."
41. "The New SAT. We're ON IT," Princeton Review, available online at https://www.princetonreview.com/college/sat-changes [accessed 23 July 2019].
42. University Institutional Research and Reporting, "IR First Generation Indicator," IR Data Guides (2017), available online at https://uirr.iu.edu/resources/ir-data-guides/index.html [accessed 25 July 2019].
43. The methodology attempts to address data management issues mentioned by Kristin A. Briney, "Data Management Practices in Academic Library Learning Analytics: A Critical Review," Journal of Librarianship and Scholarly Communication 7, no. 1 (2019): eP2268, doi:10.7710/2162-3309.2268.
44. "About Slashtmp at IU," IU Knowledgebase, available online at https://kb.iu.edu/d/angt [accessed 26 June 2019].
45. AAC&U, "Information Literacy VALUE Rubric."
46. Paul D. Ellis, "Introduction to Effect Sizes," in The Essential Guide to Effect Sizes: Statistical Power, Meta-analysis, and the Interpretation of Research Results (New York, NY: Cambridge University Press, 2010), 3-30.
48. Louise Archer, “Social Class in Higher Education,” in Higher Education and Social Class: Issues of Exclusion and Inclusion, eds. Louise Archer, Merryn Hutchings, and Alistair Ross (New York, NY: Routledge Falmer, 2003), 5–20.
49. “Quick Reference Guide to Evaluating Financial Aid Award Letters,” Fastweb, available online at www.finaid.org/fafsa/FinancialAidAwardLetters.pdf [accessed 22 July 2019].
50. Ali R. Abasi and Barbara Graves, “Academic Literacy and Plagiarism: Conversations with International Graduate Students and Disciplinary Professors,” Journal of English for Academic Purposes 7, no. 4 (2008): 221–33, doi:10.1016/j.jeap.2008.10.010; James Elander et al., “Evaluation of an Intervention to Help Students Avoid Unintentional Plagiarism by Improving their Authorial Identity,” Assessment & Evaluation in Higher Education 35, no. 2 (2010): 157–71, doi:10.1080/02602930802687745; Rebecca M. Howard and Laura J. Davies, “Plagiarism in the Internet Age,” Educational Leadership 66, no. 6 (2009): 64–67; John Abdul Kargbo, “Undergraduate Students’ Problems with Citing References,” Reference Librarian 51, no. 3 (2010): 222–36, doi:10.1080/02763871003769673; Diane Pecorari, “Good and Original: Plagiarism and Patchwriting in Academic Second-Language Writing,” Journal of Second Language Writing 12, no. 4 (2003): 317–45, doi:10.1016/j.jslw.2003.08.004; Lori G. Power, “University Students’ Perceptions of Plagiarism,” Journal of Higher Education 80, no. 6 (2009): 643–62, doi:10.1080/00221546.2009.11779038; Ling Shi, “Textual Appropriation and Citing Behaviors of University Undergraduates,” Applied Linguistics 31, no. 1 (2010): 1–24, doi:10.1093/applin/amn045.
51. Donald L. McCabe, Linda K. Trevino, and Kenneth D. Butterfield, “Cheating in Academic Institutions: A Decade of Research,” Ethics & Behavior 11, no. 3 (2001): 219–32, doi:10.1207/S15327019EB1103_2; Donald L. McCabe, Kenneth D. Butterfield, and Linda K. Trevino, Cheating in College: Why Students Do It and What Educators Can Do about It (Baltimore, MD: Johns Hopkins University Press, 2012); Chris Park, “In Other (People’s) Words: Plagiarism by University Students—Literature and Lessons,” Assessment & Evaluation in Higher Education 28, no. 5 (2003): 471–88, doi:10.1080/02602930301677.
52. Head, “Learning the Ropes”; Alison J. Head and Michael B. Eisenberg, “Truth Be Told: How College Students Evaluate and Use Information in the Digital Age,” Project Information Literacy (2010), available online at https://www.projectinfolit.org/uploads/2/7/5/4/27541717/pil_fall2010_survey_fullreport1.pdf [accessed 11 July 2019]; Sam Wineburg et al., “Evaluating Information: The Cornerstone of Civic Online Reasoning,” Stanford Graduate School of Education (2016), available online at https://purl.stanford.edu/fv751yt5934 [accessed 11 July 2019].
53. Steve Kolowich, “What Students Don’t Know,” Inside Higher Ed (Aug. 22, 2011), available online at https://www.insidehighered.com/news/2011/08/22/what-students-dont-know [accessed 26 July 2019].
54. Melissa Gross and Don Latham, “What’s Skill Got to Do with It? Information Literacy Skills and Self-Views of Ability among First-Year College Students,” Journal of the Association for Information Science and Technology 63, no. 3 (2012): 574–83, doi:10.1002/asi.21681; Katherine Schilling and Rachel Applegate, “Best Methods for Evaluating Educational Impact: A Comparison of the Efficacy of Commonly Used Measures of Library Instruction,” Journal of the Medical Library Association 100, no. 4 (2012): 258–69, doi:10.3163/1536-5050.100.4.007.
55. Katherine S. Button et al., “Power Failure: Why Small Sample Size Undermines the Reliability of Neuroscience,” Nature Reviews Neuroscience 14 (2013): 365–76, doi:10.1038/nrn3475.
56. Booth et al., “Degrees of Impact”; Bowler and Street, “Investigating the Efficacy of Embedment”; Lowe et al., “Impacting Information Literacy Learning in First-Year Seminars.”
57. “IUPUI Profiles of Learning for Undergraduate Success,” Division of Undergraduate Education, available online at https://due.iupui.edu/undergraduate-curricula/general-education/profiles/index.html [accessed 17 July 2019].
58. Kacy Lundstrom, Pamela Martin, and Dory Cochran, “Making Strategic Decisions: Conducting and Using Research on the Impact of Sequenced Library Instruction,” College & Research Libraries 77, no. 2 (2016): 212–26, doi:10.5860/crl.77.2.212.
59. Katharine Macy, “Business Guide,” IUPUI University Library, available online at https://iupui.libguides.com/business/citation#s-lg-box-19181681 [accessed 30 September 2019].
60. Association of College and Research Libraries (ACRL), Academic Library Impact: Improving Practice and Essential Areas to Research, prepared by Lynn Silipigni Connaway, William Harvey, Vanessa Kitzie, and Stephanie Mikitish of OCLC Research (Chicago, IL: ACRL, 2017), available online at www.ala.org/acrl/sites/ala.org.acrl/files/content/publications/whitepapers/academiclib.pdf [accessed 26 July 2019].