A Study of the Residual Impact of the Texas Information Literacy Tutorial on the Information-Seeking Ability of First-Year College Students

William A. Orme

William A. Orme is Associate Librarian at Indiana University-Purdue University Indianapolis; e-mail: orme@iupui.edu.

The study discussed in this paper investigated the residual impact of the Web-based tutorial known as the Texas Information Literacy Tutorial (TILT) on first-year college students and their ability to perform tasks related to information research. Unique to this study is the investigation of ability beyond the semester in which instruction was provided. The study examined four groups of students, each of which received a different type of information skills instruction. Results and implications are discussed at the end of the article.

Throughout the decade of the 1990s, colleges and universities adopted a variety of innovations in the spirit of continuous improvement. For library instruction programs, one of the most important of these innovations was the development of the "first-year experience" program for incoming freshmen. The National Resource Center for the First-Year Experience reported in 2000 that "over 70 per cent of U.S. colleges and universities offer special first year seminars."1 Library instruction programs have traditionally worked most closely with freshman populations, and it has long been an aspiration of many programs on large campuses to have the opportunity to reach the entire freshman cohort.

In 1996, Indiana University-Purdue University Indianapolis (IUPUI) began offering a first-year experience program through the newly formed academic unit, University College. IUPUI's program teamed faculty with student mentors, advisors, and librarians so that they could share expertise and jointly fashion a program that would prove meaningful for incoming students. In response to this new initiative, IUPUI's University Library formed a team of instructional librarians who were charged with developing and providing the library component of this new course.

Prior to this new initiative, most freshman library instruction occurred in multisection survey courses such as speech communication or elementary composition. Student research in these courses covered a broad spectrum of topics, and librarians were typically given a single opportunity to provide research guidance in an instructional setting. The so-called one-shot instructional session has been falling out of favor for some time. As John E. Cooper noted:

    At large universities a common problem has been finding an effective way of handling instruction for beginning students. Orientation tours of the library, used successfully with small classes, become unwieldy when they involve hundreds of students. The one-hour bibliographic instruction session, another frequently used format, may be perceived by students as irrelevant to their specific information needs. In either format students are too often passive recipients of instructions, retaining little of what is presented in their orientation.2

The notion of retaining information is of particular importance.
Where freshman bibliographic instruction (BI) sessions once were simply pragmatic exercises in negotiating the expectations of a particular course, today they are often intended to form the foundation for information literacy skills that students can build on as they proceed through their academic careers and beyond.

In what has become a landmark article, John Bransford discussed Alfred North Whitehead's concept of inert knowledge.3 Inert knowledge is defined as "knowledge that is accessed only in a restricted set of contexts even though it is applicable to a wide variety of domains." Bransford related a situation in which college freshmen were asked, "Try to remember what you learned about the concept of logarithms. Can you think of any way that they might make problem-solving simpler than it would be if they did not exist?" Bransford indicated that college students who were asked this were able to recall something about logarithms, but most viewed them as math exercises rather than useful inventions that simplify problem-solving. It appears that instructional librarians and math professors share common problems.

William Graves Perry's research into the intellectual development of college students revealed that there is a predictable pattern of growth and that this pattern is related not to specific intellectual content but, rather, to conceptual hierarchies that shape how students perceive knowledge and are prepared to accumulate knowledge as they proceed through their academic careers.4 If these two notions are combined and applied to the freshman learning experience, the stakes for instructional programs become somewhat higher than they have been traditionally, and our responsibility as instructional librarians expands beyond the confines of the course: our contribution to student success is not only to help students succeed in a particular course, but also to help them develop conceptual hierarchies that foster their continuing intellectual development. This expanded notion of contribution is in line with our professional concerns surrounding the development of information literacy skills. It also has importance for how and when we assess whether students have achieved our educational objectives.

As stated previously, first-year experience seminars offer a unique opportunity to provide baseline instruction to the entire cohort of incoming students. The instructional team approach offered the opportunity to customize instructional approaches for different disciplines and different first-year populations (first-generation students, adult returning students, and so on). Unfortunately, libraries are typically ill equipped to take advantage of this opportunity. The title of a recent article sums it up succinctly: "Too Many Students, Too Little Time."5 Of course, there also is the issue of too few librarians for the task at hand. At IUPUI, the rapid growth of the first-year program meant that it was not long before the number of librarians available to work with this program was far outstripped by the number of course sections being offered. The IUPUI program is successful. It has increased student retention (academic persistence) and increased freshman grade-point averages, particularly among students admitted conditionally to the university.6 Success has bred growth. The first semester of the new first-year program (fall 1996) included twenty-three course sections.
By fall 1999, more than one hundred sections were included in the program.

A variety of articles in the library literature have stressed the challenge of meeting large-scale demands for library instruction, and in many cases alternative instructional methods were cited as a possible solution.7 In fact, Stephanie Michel specifically stated in her article that "the CAI (computer-assisted instruction) phenomenon resulted from libraries' inability to keep up with the tremendous demand for instruction."8 This was true at IUPUI, and in order to address the problem, the library's team of instructional librarians investigated the possibility of developing a Web-based tutorial to alleviate these demands. The workgroup assigned to this task reported that the time and resources needed for the development of such a tutorial precluded rapid development and that other venues should be explored. The workgroup discovered TILT, the Texas Information Literacy Tutorial.

TILT, as most instructional librarians know by now, was developed at the University of Texas-Austin (UTA). Consisting of three modules, each with a quiz at the end, TILT is a Web-based tutorial that covers a wide range of relatively basic information research skills. The first module is intended primarily as a review of what students should already know when they arrive at college. The second acquaints students with search strategies, sources, and techniques. And the third covers, among other things, issues of source evaluation and plagiarism. The three modules were carefully designed to follow the contours of Bloom's cognitive taxonomy.9 The quizzes, each containing eight questions, provided immediate feedback to students. The quizzes can be taken repeatedly until students achieve a desired or specified score. UTA made TILT available to IUPUI and offered to handle the e-mail traffic that the quizzes would generate. (Quiz results are typically e-mailed to faculty or librarians as proof that modules have been completed.) The IUPUI workgroup arranged for TILT to be linked to University Library's Web site, and IUPUI's instructional librarians agreed to begin using TILT in the fall 2000 semester.

This researcher used TILT in conjunction with classroom-based instruction during the fall 2000 semester. To ensure that students were familiar with the TILT material, TILT was assigned prior to classroom instruction and students were required to pass each module quiz with a minimum score of 90 percent. TILT had no noticeable impact on the preparedness level of students. Moreover, students showed no evidence of grasping terminology or concepts covered in the TILT modules. The research study described in this paper was designed and carried out to investigate whether TILT was a useful instructional tool for this type of program.

Methodology

The working hypothesis for this study was the null hypothesis that TILT had no residual impact on the improvement of first-year students' information-seeking skills. The notion of residual impact hearkens back to Whitehead's concept of inert knowledge and to a professional desire to provide fundamental information literacy skills that students can call upon later in their academic careers. Perry's research shows that, intellectually, first-year students are motivated to please authority figures (faculty) by providing answers that they think are expected.
One student interviewed by Perry is quoted as saying, "Well, the only thing I could say to a prospective student is just say, 'If you come here and you do everything you're supposed to do, you'll be all right.' That's just about all."10 Perry's findings serve as a warning that perhaps traditional assessment schedules, which attempt to measure learning within the same semester before and after instruction, might not be the best approach to investigating whether persistent learning is taking place because students are still motivated to "please the instructor." As a result of this concern, this study was constructed so that instructional information would be provided during the fall semester and assessment would be conducted in the spring semester. The spring semester immediately following the fall semester has no formal library instruction component. Thus, student responses would not be affected by any subsequent learning opportunities and would be free from the psychological forces mentioned by Perry.

In order to fairly assess its impact, it is important to consider how TILT was intended to be used. Elizabeth Dupuis and Clara Fowler, the persons most responsible for the development of TILT, clearly indicated that it was not intended as a replacement for classroom instruction but, rather, as a replacement for the more mundane aspects of classroom instruction so that librarians could transcend fundamental concepts and go beyond what they had been doing before.11

Four student cohorts were established for the fall 2001 semester, with each cohort representing a different instructional environment. The four cohorts were:

• no instruction;
• TILT alone;
• classroom instruction;
• TILT plus classroom instruction.

Although Dupuis and Fowler emphasized that TILT was not intended as a replacement for classroom instruction, it was important to discover what impact, if any, TILT might have on its own. This study also provided a unique opportunity to examine a control group of students that received no information skills instruction so that their results might be compared with those of students who did receive traditional classroom instruction.

A large population was needed for this study. The Kelley School of Business on the IUPUI campus requires a first-year seminar for entrance into the school. Consequently, this is the largest first-semester population in a single course on campus, and multiple sections of this course are offered each fall. Approval for including sections of this course in the study was received from the course coordinator, faculty members teaching individual course sections, and instructional librarians responsible for providing information skills instruction to this population. The researcher was not responsible for providing instruction for any of the course sections involved in the study.

TABLE 1
Cohorts/No. of Students in Cohorts (N = 128)

Cohort                  No. of Students in Cohort
No instruction          34
TILT alone              32
Classroom instruction   31
TILT + classroom        31

Each cohort consisted of four course sections. Course sections were selected so that each cohort contained one morning class, one midmorning or early afternoon class, one later afternoon class, and one evening class. This was done to eliminate time of day as a variable that might affect the results of the study. The original intent was to draw forty students from each cohort for the study.
Not all sections filled to maximum enrollment (25 students). Some students who enrolled did not complete the course, and some students who completed the course did not complete the TILT modules successfully. Some students could not be reached by telephone or e-mail, and some declined to be part of the study. The study was conducted successfully with at least thirty students from each cohort and a total of 128 students, as shown in table 1.

Protocol

During the fall 2001 semester, students were not told that their course section was to be part of a research study nor that there was anything distinctive about their class in terms of how or whether they received information skills instruction. During the spring 2002 semester, the researcher contacted students by e-mail or telephone and asked them to participate in a study of the effectiveness of teaching tools used in their fall 2001 first-year seminar course. Students were offered a stipend of $20, which could be applied to existing bursar fees or placed on a student debit card, and asked to come to the library, where they would participate in a thirty-minute study that included a questionnaire and some brief library research tasks. Students in cohorts where TILT was used were included in the study only if they passed the TILT quizzes at the designated threshold (a score of 90% or higher after multiple quiz attempts), and students from non-TILT cohorts were included in the study if they completed the seminar course successfully. In all cases, students included in the study would typically have been identified as "successful students."

Students were scheduled individually for participation in the study. They received an explanation of its general purpose and requirements, and information on how the stipend would be provided. They then were asked to read and sign a consent form. The first part of the protocol consisted of a forty-item questionnaire. (The questionnaire is available upon request from the researcher.) The initial portion of the questionnaire focused on demographics (gender, age range, credit hours completed prior to fall 2001), confidence (level of enjoyment in using computers for class work, level of confidence in using various computerized library services), and usage patterns (frequency of use of library resources, method of access to library resources, most commonly used library resources, and most common reason for using a library computer). The remainder of the questionnaire was drawn directly from the TILT quizzes. One question from TILT was omitted because all available answers were correct.

Upon completion of the questionnaire, each student was shown the library's Web site and asked to find and launch a specific database (Ingenta). When this was accomplished, the student was given an index card bearing the phrase "information literacy" and asked to conduct a search for that term in the database. Students then were guided to a specific result (citation) and given a transcription task: to write on a second blank index card whatever information appeared in their result that would be needed to discover whether the library could provide the item in question.
The researcher was careful not to use terms such as periodical, index, or citation that students might have encountered during instructional sessions. Upon completion of the transcription task, students again were shown the library's Web site and asked to discover whether the library could provide the item described on the card. Again, the researcher was careful not to use terms such as catalog, periodical, or article. All tasks were recorded and charted, including the search paths that each student attempted and whether the attempts were successful. Checklists used to record attempts included information about Web navigation strategies, number of attempts needed to find the designated database and item, success or failure in finding the designated database and item, and amount of time required for each task.

Students received minimal assistance with the protocol and none that would influence results. They were made aware when questionnaire items might require more than one answer and shown text boxes in which to type database searches. No assistance was provided with transcription tasks or database navigation (because search paths were of interest to the researcher).

The protocol was piloted in January 2002 with twenty-eight students from first-year seminars who were not part of the Kelley School of Business. These students were drawn from sections that the researcher had worked with during the fall 2001 semester. As a result of the pilot, minor changes were made in instrumentation and in the conduct of the protocol. Most especially, the pilot allowed the researcher to discover how much time was needed for the protocol and to make useful adjustments to observational checklists used to track student progress with tasks. Pilot students were awarded the same stipends that students would receive for participation in the actual study. Finally, the study was launched in February 2002.

Results

TILT questions were part of the protocol questionnaire. The first aspect of the study examined for results was the mean TILT score difference between cohorts. Two cohorts were not exposed to these questions during the fall semester as part of instruction. This would suggest that there should be noticeable differences between the scores of students in cohorts with and without TILT as part of their instructional environment. Analysis revealed that differences lie where they might be expected to lie, but these differences were not significant. (See table 2.)

TABLE 2
Mean Scores (TILT-based Questions)

Cohort                  No. in Cohort   Mean TILT Score
No instruction          34              24.47
TILT                    32              26.34
Classroom instruction   31              25.42
TILT + classroom        31              26.61

The only statistically significant differences (p value < .05) appeared between TILT-exposed cohorts and the cohort that received no information skills instruction. (See table 3.) This can be taken as evidence that information skills instruction, whether conducted live or via tutorial, has some impact on students' library information skills.

TABLE 3
Comparison of Mean TILT Scores by Cohort

Cohorts                                      p-value
No instruction : TILT                        .0201
No instruction : Classroom instruction      .2226
No instruction : TILT + classroom           .0071
TILT : Classroom instruction                .2779
TILT : TILT + classroom                     .7510
Classroom instruction : TILT + classroom    .1551
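The article reports these pairwise p-values without naming the underlying statistical test or publishing per-student scores. Purely as an illustration, the Python sketch below shows how such a pairwise cohort comparison could be computed, assuming Welch's two-sample t-test and assuming per-student score lists were available; the cohort names mirror table 2, but the score values are hypothetical placeholders, not the study's data.

    from itertools import combinations
    from scipy.stats import ttest_ind

    # Hypothetical per-student TILT questionnaire scores, one list per cohort.
    # The study publishes only cohort means and p-values; these values are
    # placeholders chosen to roughly echo the reported means, not real data.
    cohorts = {
        "No instruction": [22, 25, 24, 27, 23, 26, 21, 28],
        "TILT": [27, 25, 28, 26, 24, 29, 27, 25],
        "Classroom instruction": [24, 26, 27, 23, 25, 28, 24, 26],
        "TILT + classroom": [28, 26, 27, 25, 29, 26, 27, 25],
    }

    # Welch's two-sample t-test (unequal variances) for every pair of cohorts,
    # producing a pairwise p-value listing analogous to table 3.
    for (name_a, scores_a), (name_b, scores_b) in combinations(cohorts.items(), 2):
        t_stat, p_value = ttest_ind(scores_a, scores_b, equal_var=False)
        flag = "significant" if p_value < 0.05 else "not significant"
        print(f"{name_a} vs. {name_b}: p = {p_value:.4f} ({flag})")

With the study's actual score lists substituted, the loop would yield one p-value per cohort pair, which is the comparison structure underlying table 3.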
More interestingly, TILT scores appeared to be related to the ability to perform specific tasks. The first example concerns the ability to find or not find the library's catalog while looking at the library's Web site. (See table 4.) Results are not significant at the p < .05 level, but they approach significance.

TABLE 4
Ability to Locate Library Catalog

                        Did Not Find Online Catalog   Found Online Catalog
Mean TILT-based score   24.43                         25.93
Standard deviation      3.34                          3.25
No.                     21                            107

A more dramatic result can be seen in the ability of students to find a periodical title in an online catalog. (See table 5.) Here, there is a clear relationship between mean TILT scores and the ability to perform an essential information research skill.

TABLE 5
Ability to Locate a Periodical Title in an Online Catalog

                        Did Not Find Title   Found Title
Mean TILT-based score   24.90                27.04
Standard deviation      3.37                 2.71
No.                     81                   47

An even more interesting result concerns TILT experience and the ability to locate the library's online catalog. Table 6 focuses on the number of attempts required to locate the catalog. Here, it can be seen that librarian-trained students (and those who had no instruction) needed fewer attempts to locate the catalog than students who were exposed to TILT (including those who had TILT training in conjunction with classroom training by a librarian). However, table 7 reveals that although this is true, TILT-exposed students were able to locate the catalog in less time.

TABLE 6
No. of Attempts Required to Locate Online Catalog

Cohort                  Mean Attempts   Standard Deviation   No.
No instruction          1.8             1.22                 34
TILT                    2.0             1.89                 32
Classroom instruction   1.6             1.34                 31
TILT + classroom        2.0             1.96                 31
Total population        1.8             1.62                 128

TABLE 7
Time Required to Locate Online Catalog

Cohort                  Mean (Minutes)   Standard Deviation   No.
No instruction          6.3              3.78                 34
TILT                    5.3              3.93                 32
Classroom instruction   6.5              4.73                 31
TILT + classroom        5.9              5.88                 31
Total population        6.0              4.60                 128

Frequency of library resource usage revealed that students with the highest TILT scores reported using library resources on average once a week, whereas students with the lowest TILT scores reported using library resources on average more than once a week. A comparison of ability functions (TILT scores, number of attempts needed to locate the online catalog, amount of time required to locate the online catalog) with confidence measurements shows no measurable correlation between the two. Prior education, measured in previous credit hours completed, had a minimal impact on TILT scores (p value = .1511), and age had no impact.

Gender differences were not appreciable. There were sixty-one males and sixty-seven females in the study, and the p value for differences in TILT scores was .5596, implying no statistically significant gender-based differences. There was no correlation between age and TILT scores (p value = .6329). However, when confidence scores and TILT scores were analyzed across age brackets, the two older age groups (22–24 and 25+) appear to have positive correlations between these two variables, indicating that older students have more realistic views of their abilities. When age, credit hours, TILT scores, and confidence were cross-tabulated, it was found that younger students with higher numbers of credit hours already taken reported confidence levels that tended not to correlate to their TILT scores, whereas older students with fewer credit hours reported confidence levels that tended to correlate more to their TILT scores.

The transcription task provided the most compelling results of the study. It required students to view an article citation and transcribe the information needed to discover the availability of that periodical in a specific library collection. Six pieces of information can be discovered within a citation that might be useful: article author, article title, periodical title, volume, issue, and date of publication.
The most critical of these is the periodical title; without it a student would not be able to locate the item. Table 8 reveals that librarian-trained students fared no better at transcribing critical citation elements than students who received no instruction, whereas TILT-exposed students fared much better at this task regardless of whether they had received instruction from a librarian in a classroom setting.

TABLE 8
Instances of Citation Elements Being Transcribed from a Periodical Index

Cohort                  Article Author   Article Title   Periodical Title   Volume   Issue   Year of Publication   No.
No instruction          30               18              18                 30       28      17                    34
TILT                    26               16              28                 28       24      21                    32
Classroom instruction   22               16              18                 26       25      22                    31
TILT + classroom        24               14              28                 29       24      23                    31

Conclusions/Discussion

This study appears to reveal that TILT, as an example of Web-based instruction, can be at least as effective as face-to-face instruction for teaching first-year students fundamental information research skills. Web-based instruction in this context "does not include syllabi, bibliographies or other information sheets placed on the Web, but only sites that are interactive, that is, those that request thoughtful action or feedback by the learner."12 The desire to use technology to provide large-scale training has long been a goal of the profession. More than a decade ago, before the advent of the World Wide Web, computer-assisted instruction (CAI) was envisioned as an answer to the demands of large-scale training programs.13 Unfortunately, there has not been a wide range of end-user studies regarding the educational impact of online tutorials.14 Studies that have been conducted show conflicting and varying results. Some studies seem to indicate that students are receptive to CAI because "CAI works to increase the motivation of the learner primarily because of its ability to provide immediate and appropriate feedback."15

In a 1991 article, Gavriel Salomon, David N. Perkins, and Tamar Globerson made a distinction between effects with technology and effects of technology. They posited that effects with technology are "obtained during intellectual partnership with the technology; effects of technology are thought of in terms of the transferable cognitive residue that the partnership leaves behind in the form of better mastery of skills and strategies."16 Their concept of cognitive residue is particularly germane to the results of this study, and it speaks in a way to the notion of inert knowledge discussed by Bransford. The ability to locate a particular information source from a particular Web site is not part of the TILT "curriculum," yet students who were exposed to TILT were able to complete this task successfully in less time than their non-TILT counterparts, although they required more attempts to do so. Is it possible that technology teaches students to somehow be more nimble, more willing to explore, more willing to take risks?
This question is of particular interest in relation to this study because the ability of librarian-trained students to locate a library catalog on a Web site is the sole area in which it was shown that librarian instruction in a classroom adds to student proficiency in searching for information. Instructional librarians might wish to consider whether they wish to train students to focus on doing things correctly or whether they might wish to train students more generally to navigate an information landscape and recognize what is correct when they encounter it. Conversely, are there effects of technology that run counter to what we wish students to learn? Would we prefer to teach students to take more time to accomplish a task but accomplish it in fewer attempts (essentially, more accurately)? These are questions that can only begin to be considered if we understand the effects of technology, if in fact such effects are real.

If information skills instruction is a component of information literacy, our efforts need to have some lasting impact. Knowledge should be retained and, more important, transferable to other situations and contexts. Salomon and colleagues go on to say that:

    the possibility of a cognitive residue rests on an important assumption. The assumption is that higher order thinking skills that are either activated during an activity with an intellectual tool or are explicitly modeled by it can develop and become transferred to other dissimilar, or at least similar situations. This expectation for transferable cognitive skills rests itself on a more basic assumption that, contrary to some views, cognitive skills of the kind one would want to cultivate in school are not necessarily context-bound or "situated." The question is, does any evidence support the existence of such side effects of technology use?17

This study appears to provide at least some evidence for this proposition. The statement that "cognitive skills of the kind one would want to cultivate in school are not necessarily context-bound" lies beneath the entire ethos of information literacy. The BI movement has been transformed by this ethos from a product- (and place-) bound endeavor to a process-bound endeavor primarily because of technological shifts that have undermined the solid foundation that once existed in a predictable tool-based print environment.

Even with this paradigm shift firmly in place, there is one other aspect of the librarian-versus-machine instructional question that needs to be considered, and that is which mode of interaction serves best to allow students to activate knowledge when it is needed. Bransford offers advice on this issue. Distinguishing among "blind training," "informed training," and "informed training plus self-control," he argues that students need not only to understand when and why to use various strategies, but also the opportunity to practice strategies and monitor their effects. This was a major reason that Dupuis and Fowler were so adamant that TILT not stand alone as an instructional tool. Ironically, it is this most critical aspect of activating knowledge that librarians so rarely have at their disposal. More common is the situation in which librarians teach when and why to use various strategies and then turn students back over to professors whose research assignments allow them the opportunity to monitor students' practice and the effects of their attempts to utilize these strategies.
The two pieces are separated physically because two separate individuals are responsible for them. Unfortunately, these exercises are typically not done to hone research skills but, rather, to allow students to prepare to hone other skills, most commonly, writing and public speaking.

Bransford argues that "different types of teaching environments have strong effects on transfer." He goes on to say, "By placing more emphasis on the systematic development of well-organized knowledge in addition to executive processes, it may be possible to increase considerably the speed with which people can become able to think effectively in a variety of knowledge-rich domains."18 The opportunity for "executive processes" that promote skill enhancement is commonly lacking in first-year courses. Students are typically provided a choice of a range of topics, a required number of background sources, and a specified length for the completed work. Again, the research aspect of these assignments is commonly a by-product, rather than the focus, of the assignment.

This study may not provide results that are significant enough or pronounced enough to validate Bransford's position, but it does seem clear that a well-designed teaching tool such as TILT provides benefits that are not being realized from more traditional interactions between students and librarians. TILT appears to be a sound technological approach to offering students an opportunity to enhance their "executive processes." This study provides impetus to conduct further research into the utility of technological approaches to information research instruction. Much of TILT's apparent success can be attributed to its rich interactive nature. It may be that other, nontechnological approaches, particularly problem-based approaches that have research skills as an integral aspect of the experience, might be useful as well, although these would lack the economy of scale that a technological tool such as TILT provides. It appears that it may be time to reconsider how information research instruction is provided, particularly to first-year students.

Notes

1. Betsy O. Barefoot, "The First-Year Experience: Are We Making It Any Better?" About Campus 4, no. 6 (Jan.–Feb. 2000): 12–18.
2. John E. Cooper, "Using CAI to Teach Library Skills," College and Research Libraries News 54, no. 2 (Feb. 1993): 75–78.
3. John Bransford, Robert Sherwood, Nancy Vye, and John Reiser, "Teaching Thinking and Problem Solving: Research Foundations," American Psychologist 41, no. 10 (Oct. 1986): 1078–89.
4. William G. Perry Jr., Forms of Intellectual and Ethical Development in the College Years: A Scheme (New York: Holt, Rinehart and Winston, 1970), 14.
5. Anna Marie Johnson and Phil Sager, "Too Many Students, Too Little Time: Creating and Implementing a Self-paced Interactive Computer Tutorial for the Libraries' Online Catalog," Research Strategies 16, no. 4 (1998): 271–84.
6. IUPUI Office of Information Management and Institutional Research, "Fall 1999 Enrollment: The Impact of Learning Community Participation in Freshman Retention Rates and Fall Semester Grades by Type of Admit," Enrollment Reports and Analysis 5, no. 2 (Dec. 1999).
7. Ann Margaret Scholz, Richard Cary Kerr, and Samuel Keith Brown, "PLUTO: Interactive Instruction on the Web," College and Research Libraries 57, no. 6 (June 1996): 346–49.
8. Stephanie Michel, "What Do They Really Think? Assessing Student and Faculty Perspectives of a Web-based Tutorial to Library Research," College and Research Libraries 62, no. 4 (July 2001): 317–32.
9. Elizabeth Dupuis and Clara Fowler, taped interview by Bill Orme, University of Texas-Austin, May 25, 2001.
10. Perry, Forms of Intellectual and Ethical Development in the College Years, 62.
11. Dupuis and Fowler, interview.
12. Nancy Dewald, "Web-based Library Instruction: What Is Good Pedagogy?" Information Technology and Libraries 18, no. 1 (Mar. 1999): 26–31.
13. Ann Turner, "Computer-assisted Instruction in Academic Libraries," Journal of Academic Librarianship 15, no. 6 (Jan. 1990): 352–54.
14. Tess Tobin and Martin Kesselman, "Evaluation of Web-based Library Instruction Programs," INSPEL 34, no. 2 (2000): 67–75.
15. Steven G. Lesh and Larry C. Rampp, "Effectiveness of Computer-based Education Technology in Distance Learning: A Review of the Literature," ERIC document ED440628 (2000).
16. Gavriel Salomon, David N. Perkins, and Tamar Globerson, "Partners in Cognition: Extending Human Intelligence with Intelligent Technologies," Educational Researcher 20, no. 3 (Apr. 1991): 2–9.
17. Ibid.
18. Bransford, Sherwood, Vye, and Reiser, "Teaching Thinking and Problem Solving."