Fortifying the Pipeline: A Quantitative Exploration of High School Factors Impacting the Information Literacy of First-Year College Students

Jennifer L. Fabbi

Jennifer Fabbi is Dean of the Library at California State University, San Marcos, and was previously Associate Dean for Research and Education at the University of Nevada, Las Vegas Libraries. She has published and presented more than 80 scholarly works, and her current research is on factors impacting the information literacy of first-year college students. This paper represents the culmination of her dissertation work for her earned PhD in Higher Education Leadership. © 2015 Jennifer L. Fabbi, Attribution-NonCommercial (http://creativecommons.org/licenses/by-nc/3.0/) CC BY-NC. doi:10.5860/crl.76.1.31

The purpose of this study is to explore the relationship between a sample of first-time college freshmen's high school experiences that are developmentally related to information literacy competency and their scores on the iSkills assessment. iSkills is an online evaluation developed by the Educational Testing Service (ETS) that tests the range of Information and Communications Technology literacy (ICT literacy) skills aligned with the nationally recognized Association of College & Research Libraries (ACRL) standards. Through hierarchical multiple regression analysis, four variables predictive of a significantly higher score on the iSkills assessment at the p < .05 level were identified.

Information literacy—the ability "to recognize when information is needed and have the ability to locate, evaluate, and use effectively the needed information"—has been widely and increasingly cited as an essential competency for college success, for the workplace, and for life.1 Several related competencies have been identified in the last decade, specifically Information and Communications Technology literacy (ICT literacy), a term used globally for information literacy and one that recognizes the development of information literacy within a technology-rich environment.2

Based upon national information literacy standards, students should begin to develop information literacy competency during their K–12 education, and information literacy has now infiltrated the K–12 educational standards for a number of states.3 Additionally, the emerging Common Core Standards for K–12 education, an initiative of the National Governors Association Center for Best Practices and the Council of Chief State School Officers, address information literacy competency by weaving expectations for students into the English Language Arts content area standards.4 However, at the present time, there are no high school graduates who have experienced a mandatory curricular implementation of this important set of competencies.
Many times, teachers and school librarians attempt to teach this competency to high school students through library skills instruction, a method that is often a stand-alone lesson, independent of the curriculum, and that focuses on lower-order thinking skills and recall of information.5 In addition, there are several models for teaching information literacy as a process of research and inquiry, and best practice asserts that the competency is most effectively taught when integrated into the existing curriculum.6 In these latter models, information literacy competency is developed through the assignment of research and inquiry projects that require students to use the library and/or the school librarian as resources.7 Broadly stated, the purpose of this study is to explore selected high school experiences that may influence students' development of information literacy competency.

The researcher framed this study using constructivist learning theory as the theoretical lens. Constructivism was developed in response to perceived shortcomings in previous learning theories and has resulted in significant learner-centered changes over the past several decades.8 Constructivists sought to shift the focus of instruction from passive to active and from educator to learner. Consistent with this model, existing information literacy standards state the importance of "furthering the influence and impact of such student-centered teaching methods as problem-based learning, evidence-based learning, and inquiry learning," all of which emphasize the development of higher-order thinking skills.9

In a related area of research informed by learning theory, a number of studies report that teachers in classes of high-achieving students are substantially more likely to emphasize higher-order thinking processes than are teachers in classes of low-achieving students.10 These studies assert that students who have been placed into a higher-level academic track are more likely to be exposed to teaching activities that demand critical thinking and higher-order skills, while those placed into a lower academic track will experience rote-learning teaching styles. Examples of cognitive activities classified as higher order include analyzing, synthesizing, and evaluating information, constructing arguments, making comparisons, asking research questions, dealing with controversies, and identifying hidden assumptions, all of which draw directly from constructivist learning theory and relate directly to information literacy best practice.11

Several methods have been developed to assess information literacy competency, and they have been classified into the categories of fixed-choice tests, performance assessments, and rubrics.12 Most recently, the Educational Testing Service developed the iSkills assessment to measure ICT literacy.
This performance-based exam is an hour-long, real-time, simulated, scenario-based test, and it has been noted that it was created based on constructivist theories of learning, often associated with higher-order information literacy skills and problem solving.13 While a number of researchers have measured the information literacy of certain populations using the iSkills assessment and attempted to relate specific factors to student success on the measure, none have attempted to explore a relationship between students' iSkills test scores and specific high school academic experiences asserted in the literature to impact the development of information literacy competency.14 In addition, no research has been reported on the measurement of the information literacy competency of special populations of students who may require additional academic support in the transition to college, such as students admitted to college on probationary status (henceforth referred to as "alternate admits") or those who matriculate without a declared major (henceforth referred to as "exploring majors"). Finally, while a few studies exist on how the high school library, or library skills instruction, relates to students' information literacy competency as measured in college, these studies are primarily based on lower-order skills instruction and fixed-choice tests as measures.15

Conceptual Framework
In creating the conceptual framework of this study, the researcher explored the articulated professional best practice in the realm of information literacy. Information literacy best practice and standards state that students optimally develop this skill set through immersion in the research process—often and over time—and this proposition is also supported in the scholarly literature (see figure 1).16 Additionally, best practice emphasizes that students further develop these skills through exposure to problem solving and higher-order thinking activities—a teaching style that best matches that of constructivist learning theory.17 To explore this premise, the researcher also investigated teacher beliefs about student ability, based on their curricular track, and the resulting pedagogies the teachers employ. The measure for curricular track is students' enrollment in honors courses, which include Advanced Placement, International Baccalaureate, and College Preparatory courses. Finally, the researcher chose to measure students' information literacy abilities using an assessment that was created with students' critical thinking and problem-solving abilities—rather than information recall—as its basis, again consistent with constructivist theory. This study also both explores and controls for effects of students' background characteristics and cumulative core high school GPAs.

Specifically, the author addressed the following research questions through this study:
1. How much of the variance in students' iSkills assessment scores is predicted by the theoretically important variables of a) the number of research assignments they completed in high school, b) students' high school curricular tracks, and c) students' cumulative core high school GPAs?
2. How much of the variance in students' iSkills assessment scores is predicted by additional background variables—gender, race, best language, and type of admission (alternate admit or exploring major)?

This study relies on two major assumptions, which serve to link professional best practice and theory with these research questions.
The first is that, with regard to the development of information literacy competency, constructivist approaches to teaching and learning are superior to behaviorist approaches, which rely on rote learning and memorization. Second, it assumes that teachers of students in advanced curricular tracks believe them to be capable of learning higher-order skills; consequently, they will employ constructivist pedagogies more frequently with these students.

FIGURE 1. Visual Model of Conceptual Framework

Methodology
In this study, the researcher explored the high school factors potentially impacting the information literacy competency of those 2011–12 academic year University of Nevada, Las Vegas (UNLV) freshmen who, as alternate admits and/or exploring majors, may require additional academic support in the transition to college. Approximately 46 percent of the students in this study's sample were unable to meet the minimum requirement of a 3.0 GPA in their core academic high school courses and were admitted to the university using alternate criteria and placed on probationary academic status. Approximately 18 percent of the 2011–12 UNLV first-time college freshmen are alternate admits, and, in previous cohorts, this group of students has consistently had a lower first-year retention rate than students admitted on regular status.18 In addition, approximately 61 percent of the students in this sample did not declare a major upon entry and have been termed "exploring majors." Approximately 25 percent of the 2011–12 UNLV first-time freshmen overall are exploring majors. Students who are both alternate admits and exploring majors comprise 22 percent of the sample. The students in this sample self-selected into a suite of services offered by the university's Academic Success Center, including an optional first-year course and academic coaching.

Participants
Participants in the study were drawn from a cohort of first-time college freshmen who attended and graduated from high school (not home schooled) in 2011 and enrolled at the University of Nevada, Las Vegas for the fall 2011 semester. These students self-selected into a program designed to foster academic success. Ninety-three students were surveyed, took the iSkills assessment, and agreed to provide access to background data. Hierarchical multiple regression analysis was used to predict how much of the variance in iSkills scores (dependent variable) can be explained by theoretical variables (independent variables of core GPA, number of honors classes, and number of research projects or assignments in high school), while controlling for demographic and other subject variables (that is, gender, best language, ethnicity, and type of admission—alternate admit/exploring major).

This study sampled from a population of 1,333 students who were first-time freshmen and alternate admits/exploring majors. Ninety-five students both self-selected into the Academic Success Center's programs and took the iSkills assessment. Ninety-three of those students agreed to participate in this study. This sample represents 7 percent of the population being studied. After testing the normality of the sample against the incoming core GPA of the population of UNLV alternate admits/exploring majors, the researcher found the sample to be a reasonable subset.

Limitations
While there are several limitations that may impact the findings and conclusions of this study, the researcher identified four major limitations.
First, this study relies in part on students' self-report of high school experiences, which may not reflect the actual quantity or quality of those experiences. For example, although the researcher included number of research assignments in the conceptual framework and data collection, she was unable to confirm a quantitative link between the number of research projects or assignments students completed in high school and information literacy competency. One reason for this was the unreliability of student self-report data, as confirmed in subsequent focus groups. Second, there are a number of stages at which the UNLV alternate admits/exploring majors self-selected into the study sample, leading to a degree of self-selection bias. Third, findings from this study are limited to associations between variables and cannot be used to establish causation. Finally, the study is based on a relatively small sample size that, while appropriate for the statistical methods used, may inhibit the significance of quantitative results; this issue is addressed in the discussion below.

Analysis and Findings
The purpose of the quantitative analysis is to examine predictors of first-time freshmen's information literacy competency as measured by the iSkills assessment. Although learning theory and information literacy standards assert best practice in students' development of this competency, few previous studies have focused efforts on identifying variables related to the development of higher-order skills with a performance-based measure as their basis. In addition, research on the information literacy competency of special populations (such as alternate admits and exploring majors) is needed, as it holds promise for the improvement of programs aimed at enhancing student success. Version 19 of the SPSS statistical package was used to analyze the quantitative data.

In this sample, 56 percent of participants were female, and 44 percent were male. In terms of "best language," 84 percent responded that "English only" was their best language, while 16 percent responded that "English and another language" or "another language" was their best language. The largest racial group in the sample was Caucasian (39 percent), with 22 percent African American, 15 percent Asian, and 25 percent identifying as Hispanic and "other." Alternate admits comprised 46 percent of the sample, while 61 percent were exploring majors. Students who were both alternate admits and exploring majors made up 22 percent of participants.

Table 1 presents descriptive statistics for and correlations among all study variables. Scores on the iSkills assessment range from 0 to 500 and are allocated in 10-point intervals, with the cut score (in other words, the minimum exam score needed to determine whether students meet ICT literacy standards) at 260. Overall, the mean of students' reported iSkills scores was below the cut score (M = 207.85, SD = 58.18). The group's mean cumulative core high school GPA exceeded a 3.0, which is the GPA needed for regular admission to UNLV (M = 3.15, SD = .58). Finally, the mean number of honors courses taken by the group in high school was 6.20, with a standard deviation of 6.11. Additional means and standard deviations provided in table 1 are based on dummy-coded variables (gender, best language, race, and admission type).

There were significant correlations between iSkills assessment scores and almost all predictor variables (see table 1).
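As an illustration of how correlations like those in table 1 can be computed, the following Python sketch builds a lower-triangle correlation matrix with significance flags. The DataFrame df and its column names are hypothetical stand-ins for illustration; this is not the study's actual SPSS data file or syntax.

    # Sketch: lower-triangle Pearson correlations with significance flags,
    # in the style of table 1. `df` and its columns are assumed, not actual data.
    import pandas as pd
    from scipy import stats

    def corr_table(df: pd.DataFrame, alpha: float = 0.05) -> pd.DataFrame:
        cols = list(df.columns)
        out = pd.DataFrame("", index=cols, columns=cols)
        for i, a in enumerate(cols):
            for b in cols[:i]:
                # Pearson r; for a dummy-coded variable this equals
                # the point-biserial correlation.
                r, p = stats.pearsonr(df[a], df[b])
                out.loc[a, b] = f"{r:.2f}{'*' if p < alpha else ''}"
        return out

    # Example call with hypothetical column names:
    # print(corr_table(df[["gender", "english_not_best", "alt_admit",
    #                      "core_gpa", "iskills_score"]]))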
iSkills scores correlated positively at the p < .05 level with exploring major status, core high school GPA, and having taken five to twelve honors courses. iSkills scores correlated negatively at the p < .05 level with a language other than English being "best," Asian race, alternate admission status, and having taken one to four honors courses. There were no significant correlations between iSkills scores and gender, African American race, or Hispanic and other races.

TABLE 1
Descriptive Statistics and Correlations among Study Variables

Variable                        1      2      3      4      5      6      7      8      9      10     11
1. Gender                       —
2. English Not Best Language   –.05    —
Race:
3. African American            –.01   –.22*   —
4. Asian                       –.17*   .34*  –.22*   —
5. Hispanic and Other           .11    .31*  –.30*  –.24*   —
Admission Type:
6. Alt Admit                    .13   –.02    .14   –.03    .02    —
7. Exploring Major             –.04   –.04   –.28*   .03    .05   –.28*   —
8. Core GPA (Mean Deviated)    –.05   –.04   –.15   –.05    .01   –.80*   .32*   —
Number of Honors Courses:
9. 1–4                         –.24    .00    .00   –.03    .02    .07   –.01   –.03    —
10. 5–12                        .16   –.01    .01    .02    .10   –.07    .11    .01   –.28*   —
11. 13+                         .01    .00   –.06   –.17*   .02   –.38*   .10    .53*  –.33*  –.28*   —
12. iSkills Score               .07   –.28*  –.13   –.22*  –.04   –.22*   .21*   .37*  –.30*   .22*   .36*
    (M = 207.85, SD = 58.18)
M                               .56   1.17    .22    .15    .25    .46    .61    .00    .25    .19    .25
SD                              .50    .41    .41    .36    .43    .50    .49    .58    .43    .40    .43
*p < .05

The researcher performed a hierarchical multiple regression analysis to test the variance in iSkills assessment scores explained by student background characteristics and theoretically important variables. Regression analysis is used to predict a continuous dependent variable (such as iSkills assessment score) from a number of independent variables. Hierarchical multiple regression allowed the researcher to specify a fixed order of entry for variables, to measure the effects of several different independent variables on the dependent variable at one time while controlling for the effects of covariates, and to test the effects of certain predictors independent of the influence of others. Student gender was entered into the first block, best language into the second block, and student race into the third block. The researcher entered admission type, including alternate admit or exploring major status, into the fourth block. Core cumulative high school GPA was entered into the fifth block and curricular track (number of honors courses) into the sixth block. The researcher performed the analysis using SPSS REGRESSION. Tables 2 and 3 show the results of the model.

In hierarchical multiple regression analysis, each dummy-coded variable must have a reference group to which it is compared.19 In this analysis, the reference group characteristics are: male, English as best language, Caucasian, and both alternate admit and exploring major. Students in this group, on average, scored a 207.85 on the iSkills assessment and provide a basis to which other students are compared, reflected as CONSTANT in table 3.

As shown in table 2, three of the variables explained unique variance in the dependent variable, iSkills assessment score: best language (R2 change = .08, p < .01), cumulative core GPA (R2 change = .06, p = .01), and high school curricular track (R2 change = .14, p < .001). Race (R2 change = .07, p = .07) and type of admission (R2 change = .05, p = .07) contributed smaller effects. Gender was not a significant predictor of iSkills score (R2 change = .01, p = .52).
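To make the blockwise procedure concrete, here is a minimal sketch of the same style of hierarchical regression in Python with statsmodels, entering predictor blocks cumulatively and reporting the R2 change and its F test at each step (the quantities shown in table 2). The DataFrame df and all column names are assumptions for illustration, not the study's SPSS syntax; dummy variables are assumed to be coded against the reference group described above.

    # Sketch: blockwise (hierarchical) OLS with R2-change F tests, as in table 2.
    # `df` and its column names are illustrative assumptions.
    import pandas as pd
    import statsmodels.api as sm
    from scipy import stats

    BLOCKS = [
        ["female"],                                      # step 1: gender
        ["english_not_best"],                            # step 2: best language
        ["african_american", "asian", "hispanic_other"], # step 3: race (ref: Caucasian)
        ["alt_admit", "exploring_major"],                # step 4: admission type
        ["core_gpa_centered"],                           # step 5: mean-deviated core GPA
        ["honors_1_4", "honors_5_12", "honors_13plus"],  # step 6: track (ref: 0 honors)
    ]

    def hierarchical_ols(df, outcome, blocks):
        entered, prev_r2 = [], 0.0
        for step, block in enumerate(blocks, start=1):
            prev_k = len(entered)
            entered = entered + block
            model = sm.OLS(df[outcome], sm.add_constant(df[entered])).fit()
            r2_change = model.rsquared - prev_r2
            df1 = len(entered) - prev_k            # predictors added at this step
            df2 = model.df_resid                   # n - k - 1 for the current model
            f_change = (r2_change / df1) / ((1 - model.rsquared) / df2)
            p = stats.f.sf(f_change, df1, df2)
            print(f"Step {step}: R2 = {model.rsquared:.2f}, "
                  f"R2 change = {r2_change:.2f}, "
                  f"F({df1}, {df2:.0f}) = {f_change:.2f}, p = {p:.3f}")
            prev_r2 = model.rsquared
        return model                               # step-6 coefficients, cf. table 3

    # final_model = hierarchical_ols(df, "iskills_score", BLOCKS)

With the study's 93 cases, the residual degrees of freedom produced at each step would correspond to the df Change column of table 2 (1, 91 at step 1 through 3, 81 at step 6).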
Results in table 3 show the effect of the independent variables across the regression model as they were entered in blocks, as explained above. As shown in table 3, the effect of not having English as a preferred language was a consistently negative predictor of iSkills assessment score across the regression model (β = –29.50, p = .05 at step 6); in practical terms, the final model predicts a score roughly 30 points lower on the 0–500 iSkills scale for an otherwise-identical student whose best language is not English. For race, the two predictors that still had a moderate effect in step 6, both in a negative direction, were African American, significantly predicting iSkills score (β = –28.72, p = .05), and Asian, a moderate predictor (β = –29.75, p = .09). Cumulative core GPA remained a positive predictor of iSkills score through step 6 (β = 32.26, p = .05). When added in the final step, the categories of 5–12 honors courses (β = 39.67, p = .01) and, at a moderate level, 13+ honors courses (β = 32.85, p = .06) were positive predictors of iSkills score above and beyond GPA.

TABLE 2
Hierarchical Regression Analyses Predicting Performance on iSkills

            Step 1    Step 2         Step 3    Step 4     Step 5     Step 6
            Gender    Eng Not Best   Race      Adm Type   Core GPA   # Honors Courses
R2          .01       .08            .15       .20        .27        .40
Adj R2      –.01      .06            .10       .14        .19        .32
R2 Change   .01       .08            .07       .05        .06        .14
F Change    .42       7.39**         2.48+     2.70+      6.73*      6.45***
df Change   1, 91     1, 90          3, 87     2, 85      1, 84      3, 81
+p < .10; *p < .05; **p < .01; ***p < .001

TABLE 3
Hierarchical Regression Coefficients, β (SE β)

Predictor                   Step 1          Step 2           Step 3           Step 4           Step 5           Step 6
CONSTANT                    203.42 (9.11)   250.42 (19.40)   260.70 (20.32)   256.58 (23.55)   239.46 (23.73)   235.41 (23.07)
Gender                      7.93 (12.19)    6.31 (11.80)     3.13 (11.71)     6.50 (11.59)     4.56 (11.24)     –6.54 (10.78)
English Not Best Language                   –39.33** (14.47) –32.25+ (16.63)  –31.13+ (16.40)  –28.51+ (15.90)  –29.50* (14.68)
Race:
  African American                                           –36.67* (15.36)  –27.40+ (15.80)  –25.48+ (15.31)  –28.72* (14.08)
  Asian                                                      –36.30+ (19.39)  –34.26+ (19.05)  –30.56 (18.49)   –29.75+ (17.50)
  Hispanic and Other                                         –13.96 (16.10)   –11.87 (15.83)   –12.26 (15.33)   –15.04 (14.18)
Admission Type:
  Alt Admit                                                                   –20.97+ (11.86)  15.95 (18.29)    23.47 (16.98)
  Exploring Major                                                             12.73 (12.48)    8.14 (12.21)     6.04 (11.27)
Core GPA (mean deviated)                                                                       41.05* (15.83)   32.26* (16.41)
# of Honors Courses:
  1–4                                                                                                           –21.46 (14.54)
  5–12                                                                                                          39.67* (15.36)
  13+                                                                                                           32.85+ (17.39)
+p < .10; *p < .05; **p < .01; ***p < .001

To summarize, through the hierarchical multiple regression analysis, four variables predictive of a significantly higher score on the iSkills assessment at the p < .05 level were identified. Among background variables, a student's best language and, to some extent, race are significant predictors of his or her iSkills assessment score. After controlling for these differences, among the theoretically important variables, students' cumulative core high school GPAs, as well as their curricular tracks (number of honors and other advanced-placement classes taken), explained a significant amount of the variance in students' iSkills assessment scores.

Discussion of Findings
The results of these exploratory quantitative analyses supported the researcher's conceptual framework. With this framework as a guide, the researcher expected that students who experienced a higher curricular track in high school—students who had taken greater numbers of honors courses—could be predicted to score significantly higher on the iSkills assessment than their peers who had experienced a lower curricular track in high school. While this expectation was confirmed in the quantitative results, there could be many explanations for it. However, even when controlling for factors such as gender, language, race, admission type, and GPA, curricular track continued to be a significant and strong predictor of iSkills score.

One interesting element of the findings arises when examining results by curricular track breakdown. To normally distribute the curricular track variable, the researcher categorized students into groups: those who took no honors classes in high school, those who took 1–4 honors classes, those who took 5–12 honors classes, and those who took 13 or more honors classes. While the 5–12 honors classes category was significant at the p = .01 level, the 13+ honors classes category was only significant at the p = .08 level. When examining the 13+ group further, the researcher noted a large amount of variance in the iSkills scores of that group, perhaps causing the predictive result to be less statistically significant.

In subsequent focus groups, the researcher asked students how well they felt the iSkills assessment measured the critical-thinking and problem-solving skills they developed in high school. This question resulted in comments about iSkills that may shed light on the phenomenon described above. While non-honors-track students talked about test anxiety, honors-track students discussed two aspects of the assessment that non-honors students did not: the desire to get the right answer and the fact that the test bored them. In regard to their desire to get the right answer, some honors-track students mentioned that this caused them to run out of time on the test (which, in turn, would have lowered their scores). For example:

Student 4: I never got to it 'cause I was just trying to get that one task right and—I was just focusing on it so much. I ran out of—you know I didn't finish 'em. But it was like, if, 'cause I only had a little bit of time I just wanted to finish it. So like maybe the last two questions, I was like, okay. Whatever. Just clicking. Whatever.

and

Student 4: I felt like if I just had more time I could have gotten a way better score. I suck at, I suck at timed anything. Like SAT's. Like since they're all timed, like I definitely got more problems wrong than I would have if I, I just had enough time.

In addition, two honors students discussed being bored by the assessment, especially toward the conclusion:

Student 6: I really got bored at the end. I was just like feeling whatever.
Student 9: Yeah, I don't think it was accurate.
Student 6: Just to get it done. In the end. 'Cause it was long.

Thus, while there is no definitive answer on why students taking 13 or more honors classes did not consistently score higher than those taking 5–12, the discussion in focus groups did bring to light some interesting factors that could well have influenced scores.

Ultimately, this study is informed by constructivist theory and further informs the literature on constructivist practices. Kuhlthau states:

Constructivist type of learning is transferable to situations in the real world. Students learn to think through issues that do not have prescribed responses or preset solutions.
Students learn to identify what is important to them, to construct new meanings, and to explain their new understanding to others in some way that is authentic to the topic.20

Through quantitative explorations, the researcher has shown that, even after controlling for background characteristics and GPA, students who have experienced an honors-intense curriculum in high school are more likely to score significantly higher on the iSkills assessment, a measure that, consistent with constructivist theory, was created with students' critical thinking and problem-solving abilities as its basis.

Implications for Further Research
This study examines educational practices that have been in use for over fifty years. One of the study's major assumptions is that, with regard to the transfer of information literacy competency, constructivist approaches to teaching and learning are superior to behaviorist approaches, which rely on rote learning and memorization. However, within the realm of information literacy, the study serves to validate recommendations in standards and best practices that have not previously been empirically tested. In that regard, there is still a long way to go when it comes to understanding how students best develop information literacy competency.

The second assumption of this study is that teachers of students in advanced curricular tracks believe them to be capable of learning higher-order skills and consequently employ constructivist pedagogies more frequently. While this assumption was confirmed in subsequent focus groups, there are many other ways that this phenomenon could be studied and confirmed, both generally and specifically within the realm of information literacy.

While the research design employed was appropriate for the exploratory nature of the study, there is reason to believe that a larger, randomly drawn sample could produce different results. That being said, one quantitative result deserves further exploration: whether there is, in fact, a real phenomenon behind the finding that students who took 13+ honors courses in high school are less likely to score as high on the iSkills assessment as those who took 5–12 honors courses. Simply stated, is there a "sweet spot" for honors students—a point after which additional courses make no difference or may in fact be detrimental when it comes to information literacy competency? Alternatively, does iSkills do a good job of measuring ICT skills for those who are moderately competent but not for those who are very accomplished?

The population of the study warrants further examination in terms of granularity. For example, while honors track is defined broadly in this study, a more specific study of students in Advanced Placement or International Baccalaureate curricula may yield different results. Additionally, with regard to the variable of "race," even when controlling for English language abilities (which is a known issue with the iSkills assessment), African American and Asian race still predicted a moderately significant decrease in iSkills score.

At the time of this study, there is no mandatory information literacy curriculum, regardless of existing standards. However, as the emerging Common Core Standards for K–12 education are implemented, they may further impact students' development of information literacy competency. Therefore, a repetition of this study in five years is warranted.
Finally, although outside the scope of this study, there is evidence to suggest that the same pedagogies and educational practices recommended for high school students and their development of information literacy competency would continue to benefit them into the college years. This study has sparked a very real interest in a longitudinal cohort study to examine these same students' college curricular experiences and whether they might predict score increases in further iSkills testing at exit.

Conclusion
Through this study, the researcher has contributed to the knowledge base of information literacy competency and the factors that impact its development. Studies that have previously attempted to explore how high school factors relate to students' information literacy competency, measured once students matriculate to college, have been based on lower-order skills instruction using fixed-choice tests as measures.21 In this study, the researcher explored the development of information literacy competency using higher-order skills and a performance-based measure and was able to show that curricular track is a significant predictor of information literacy competency, regardless of student gender, language, race, admission type, or GPA.

Although the researcher was unable to confirm a quantitative link between the number of research projects or assignments students completed in high school and information literacy competency, she was able to further pursue this thread through qualitative methods to better understand what "research" meant to these students. Those results will be reported in a subsequent article.

By exploring the relationship between high school experiences asserted to impact the development of information literacy competency and students' iSkills test scores, the researcher was able to validate key aspects of information literacy standards and best practice. Finally, this study adds to the literature of information literacy generally and provides a basis for further study of the information literacy competency of special college populations known to be at higher risk of not succeeding academically.

Notes
1. American Library Association, Presidential Committee on Information Literacy: Final Report (1989), para. 3, available online at www.ala.org/acrl/publications/whitepapers/presidential [accessed 6 March 2011]; Christine Bruce, The Seven Faces of Information Literacy (Adelaide: Auslib Press, 1997); Michael Eisenberg, "Information Literacy: Essential Skills for the Information Age," DESIDOC Journal of Library & Information Technology 28 (2008): 39–47; Mary Ann Fitzgerald, "Making the Leap from High School to College," Knowledge Quest 32, no. 4 (Mar. 2004): 19–24; Bill Johnston and Sheila Webber, "Information Literacy in Higher Education: A Review and Case Study," Studies in Higher Education 28, no. 3 (2003): 335–52; National Leadership Council for Liberal Education and America's Promise, College Learning for the New Global Century (Washington, D.C.: Association of American Colleges and Universities, 2007), available online at www.aacu.org/leap/documents/GlobalCentury_final.pdf [accessed 24 November 2014]; Barack Obama, National Information Literacy Awareness Month: A Proclamation (2009), available online at www.whitehouse.gov/the_press_office/Presidential-Proclamation-National-Information-Literacy-Awareness-Month/ [accessed 21 September 2010]; Hannelore B. Rader, "Information Literacy 1973–2002: A Selected Literature Review," Library Trends 51, no. 2 (2002): 242–59.
2. Susan M. Allen, "Information Literacy, ICT, High School, and College Expectations," Knowledge Quest 35, no. 5 (May 2007): 18–24; International ICT Literacy Panel, Digital Transformation: A Framework for ICT Literacy (2007), available online at http://www.ets.org/Media/Tests/Information_and_Communication_Technology_Literacy/ictreport.pdf [accessed 24 November 2014].
3. American Association of School Librarians (AASL) [and] Association for Educational Communications and Technology (AECT), Information Power: Building Partnerships for Learning (Chicago: American Library Association, 1998). Some examples are described in Colet Bartow, "How One State Established School Library/Technology Standards," School Library Monthly 26, no. 3 (Nov. 2009): 19–21; and Nevada Department of Education, Nevada Information Literacy Standards (2003), available online at http://doe.nv.gov/Standards/IL/infolit.pdf [accessed 15 July 2013].
4. National Governors Association Center for Best Practices (NGA Center) [and] Council of Chief State School Officers (CCSSO), Common Core State Standards Initiative (2010), available online at www.corestandards.org/ [accessed 24 February 2012].
5. Ramona L. Islam and Lisa Anne Murno, "From Perceptions to Connections: Informing Information Literacy Program Planning in Academic Libraries through Examination of High School Library Media Center Curricula," College & Research Libraries 67, no. 6 (2006): 492–514; Patricia Owen, "A Transition Checklist for High School Seniors," School Library Monthly 26, no. 8 (2010): 20–23.
6. Michael B. Eisenberg and Robert E. Berkowitz, Information Problem Solving: The Big Six Skills Approach to Library & Information Skills Instruction (Norwood, N.J.: Ablex Publishing, 1990); Amy Irving, Study and Information Skills across the Curriculum (Portsmouth, N.H.: Heinemann, 1985); Carol C. Kuhlthau, Rutgers, The State University of New Jersey School of Communication, Information, and Library Studies and Others, "Facilitating Information Seeking through Cognitive Modeling of the Search Process: A Library Studies Research Project" (1986), available online at www.eric.ed.gov/PDFS/ED328268.pdf [accessed 24 November 2014]; Barbara K. Stripling and Judy M. Pitts, Brainstorms and Blueprints: Teaching Library Research as a Thinking Process (Englewood, Colo.: Libraries Unlimited, 1988); AASL and AECT, Information Power.
7. Allen, "Information Literacy"; Carol A. Gordon, "Students as Authentic Researchers: A New Prescription for the High School Research Assignment," School Library Media Research 2 (Jan. 2, 1999), available from Education FullText; Islam and Murno, "From Perceptions to Connections"; Carol C. Kuhlthau, Seeking Meaning: A Process Approach to Library and Information Services, 2nd ed. (Westport, Conn.: Libraries Unlimited, 2004); Rader, "Information Literacy 1973–2002."
8. Char Booth, Reflective Teaching, Effective Learning: Instructional Literacy for Library Educators (Chicago: American Library Association, 2011).
9. Association of College and Research Libraries (ACRL), Information Literacy Competency Standards for Higher Education (2000), para. 10, available online at www.ala.org/acrl/standards/informationliteracycompetency [accessed 1 November 2010].
10. David H. Hargreaves, Social Relations in a Secondary School (London: Routledge, 1998); Mary Haywood Metz, Classrooms and Corridors: The Crisis of Authority in Desegregated Secondary Schools (Berkeley: University of California Press, 1978); Jeannie Oakes, Keeping Track: How Schools Structure Inequality (New Haven: Yale University Press, 1985); Jeannie Oakes, Multiplying Inequalities: The Effects of Race, Social Class, and Tracking on Opportunities to Learn Mathematics and Science (Santa Monica, Calif.: Rand Corporation, 1990); Reba N. Page, "Games of Chance: The Lower-Track Curriculum in a College-Preparatory High School," Curriculum Inquiry 20, no. 3 (1990): 249–81; Stephen W. Raudenbush, Brian Rowan, and Yuk Fai Cheong, "Higher Order Instructional Goals in Secondary Schools: Class, Teacher, and School Influences," American Educational Research Journal 30, no. 3 (1993): 523–53; Bruce Torff, "Expert Teachers' Beliefs about Use of Critical-Thinking Activities with High- and Low-Advantage Learners," Teacher Education Quarterly 33, no. 2 (2006): 37–52; Bruce Torff, "Using the Critical Thinking Belief Appraisal to Assess the Rigor Gap," Learning Inquiry 2, no. 1 (2008): 29–52; Edward Warburton and Bruce Torff, "The Effect of Perceived Learner Advantages on Teachers' Beliefs about Critical-Thinking Activities," Journal of Teacher Education 56, no. 1 (2005): 24–33; Anat Zohar, Adi Degani, and Einav Vaaknin, "Teachers' Beliefs about Low-Achieving Students and Higher Order Thinking," Teaching and Teacher Education 17, no. 4 (2001): 469–85; Anat Zohar and Yehudit J. Dori, "Higher Order Thinking Skills and Low-Achieving Students: Are They Mutually Exclusive?" Journal of the Learning Sciences 12, no. 2 (2003): 145–81.
11. Benjamin S. Bloom and David R. Krathwohl, Taxonomy of Educational Objectives: The Classification of Educational Goals, Handbook I: Cognitive Domain (New York: Longmans, 1956).
12. Megan J. Oakleaf, "Dangers and Opportunities: A Conceptual Map of Information Literacy Assessment Approaches," portal: Libraries and the Academy 8, no. 3 (2008): 233–54.
13. Ibid.
14. Allen, "Information Literacy"; Teresa Egan and Irvin R. Katz, "Thinking beyond Technology," Knowledge Quest 35, no. 5 (2007): 36–42; Irvin R. Katz and Alexius Smith Macklin, "Information and Communication Technology (ICT) Literacy: Integration and Assessment in Higher Education," Journal of Systemics, Cybernetics and Informatics 5, no. 4 (2007): 50–55.
15. For example, Topsy N. Smalley, "College Success: High School Librarians Make the Difference," Journal of Academic Librarianship 30, no. 3 (2004): 193–98.
16. ACRL, Information Literacy Competency Standards; AASL and AECT, Information Power; Eisenberg and Berkowitz, Information Problem Solving; Kuhlthau et al., "Facilitating Information Seeking"; Irving, Study and Information Skills; Stripling and Pitts, Brainstorms and Blueprints.
17. ACRL, Information Literacy Competency Standards.
18. University of Nevada, Las Vegas, Office of Institutional Analysis and Planning, Retention Summary Points (Feb. 2009), available online at https://ir.unlv.edu/IAP/Files/Retention_Summary_Points.aspx [accessed 24 November 2014].
19. Jacob Cohen, Patricia Cohen, Stephen G. West, and Leona S. Aiken, Applied Multiple Regression/Correlation Analysis for the Behavioral Sciences, 3rd ed. (Mahwah, N.J.: Lawrence Erlbaum Associates, 2003).
20. Carol C. Kuhlthau, "Learning in Digital Libraries: An Information Search Process Approach," Library Trends 45, no. 4 (1997): 711.
21. For example, Smalley, "College Success."