Testing the Competition: Usability of Commercial Information Sites Compared with Academic Library Web Sites

Tiffini Anne Travis and Elaina Norlin

Tiffini Anne Travis is the Psychology & Communication Studies Librarian at California State University at Long Beach; e-mail: ttravis@csulb.edu. Elaina Norlin is an Assistant Librarian at the University of Arizona Library; e-mail: norline@u.library.arizona.edu.

As academic library Web sites grow, constant updating, authentication issues, and organizational demands make it increasingly difficult for libraries to maintain user-friendly sites. This usability study examines how students use commercial electronic research libraries such as Questia, which are designed to replace traditional libraries, and compares them with large university library Web sites. Students were asked to perform tasks at two electronic research library sites and then at two large university library Web sites. A major implication of this study is that the design features a Web site incorporates can drastically affect the success of students doing research.

Anyone who has worked in an academic library over the past few years has agonized over the library home page and how best to present library resources to students. Professional Web designers have developed guidelines that work well for companies with resources robust enough to implement extensive user studies and to hire specialists. For most librarians, however, Web design principles and access to digital resources still follow the "librarians know best" model, in which users are expected to know how information is organized and to understand library terminology.1 This model works only for a library with a limited number of digital resources that can be attractively organized, clearly displayed, and easily understood. Yet libraries keep acquiring more electronic databases and digital products, which makes constant updating, authentication, and organization, all while keeping things user friendly, a constant challenge.

To answer this challenge, a new generation of one-stop electronic research libraries is cropping up, with direct marketing designed to cut out libraries completely and target university students. Do these for-profit research centers give students easier access to online materials? Do these sites entice students to use their resources more readily than library home pages do? Those are the questions this article addresses.

Blackboard.com has developed an Academic Resource Center that includes full-text journal articles. The Academic Resource Center2 is designed as a companion to the popular Blackboard course software produced by the same company. The welcome page states: "the goal of the Resource Center is to produce a world-class, online academic destination, through which instructors and students can access high-quality supplemental information and resources to enhance teaching and learning."3

Questia's marketing campaign targets students who procrastinate and wait until the last moment to write their papers. Questia's strategy is to bet that these students will pay a small monthly fee for access to electronic resources.
Its information page states that it is the "revolutionary online library" and provides "unlimited access to an extensive collection of books and journal articles in humanities and social sciences."4

Elibrary is also a service marketed directly to the student. The Elibrary information page states that Elibrary "is not a search engine … but partner[s] with top publications to provide you with current information AND years of archives in one convenient location."5

Although it is disturbing to see the trend of closing the gate on the gatekeepers, perhaps we can learn from our competition and adopt some of their best Web design features.

Literature Review

Usability Testing: Background and Applications

Usability testing has been applied routinely to commercial Web sites. Jakob Nielsen, one of the foremost usability experts, and Donald Norman noted: "Most web sites are tough to use. Usability studies typically find a success rate of less than 50%. When the average person is asked to accomplish a simple task on the average web site, the outcome all too often is failure."6 The business model of Web site design is that "when people have a positive user experience, they are apt to return, and you get useful exposure if not revenue, from your ad dollar."7

Nielsen and Norman have found that Web users have "a low tolerance for difficult designs or slow sites. People have to be able to grasp the functioning of the site immediately after scanning the home page—for a few seconds at most."8 Furthermore, Web designer Jeff Walsh noted that "even if you think it's easy, it's most likely still too hard for the average user out there."9

Library Web Site Usability Testing

Testing library Web sites for usability is a relatively new phenomenon. Whereas access issues have been studied for online catalogs and CD-ROM products for a number of years, libraries have largely overlooked the relationship between library Web site design and user success. With the explosion of competing online databases and the huge expanse of library Web sites, it has become more difficult to organize the flood of information in a way that makes sense to users who never step into the library for instruction. A key component of evaluating the usability of a system is determining how library patrons will approach a library Web site.

Some of the most relevant research on usability in the library setting has actually been conducted on students' use of Web-based information. From these studies, it has been possible to note key functionality problems or enhancements that affect information-seeking behavior. Dania Bilal conducted a two-part study on how elementary school students search the Internet using the Yahooligans! search engine. She found that "most children sought information by employing browsing strategies."10 Weaknesses discovered in the design of Yahooligans! were categories within categories of sites, lack of descriptions returned with links, limited help features, and limited database size.11 Furthermore, she found that 64 percent of students used search terms to find sites, whereas only 36 percent browsed under subject headings. Searching and browsing techniques were used interchangeably as a search strategy.12

Language studies also have implications for usability testing.
The findings from Rachel Naismith and Joan Stein's study of library terminology take on new importance in a Web environment. Naismith and Stein identified a set of terms that students routinely misunderstood, noting that in the reference interview transaction, "patrons only understand 50 percent of what librarians say or write."13 What their findings could not reflect is the growth of the library's presence on the Web, where frequently no librarian is available to define terms for students. Another study examined the use of terminology in a Web environment: in an examination of language on academic library Web pages, Mark Spivey noted that "Since successful navigation of a large library web site depends on the clarity of the home page, its vocabulary deserves scrutiny by managers of these Internet sites."14

A few published studies have employed usability testing on library Web sites. The University of Arizona Library was one of the first to perform a systematic usability study of its Web site; it found that library terminology frequently hindered students' success at completing a task.15 Another study found that terms such as policy, reference, borrower, and online were confusing to test subjects.16 A study at Memorial University of Newfoundland found that participants were more successful when they used annotations to determine where a link would take them: "[Annotations] were the most effective aid in assisting participants to navigate the menus because they provided hints about what might be found on the next menu."17 The researchers also found that despite success at completing tasks, "[users] experienced difficulties in knowing where to start and with the site's information architecture—in particular, with interpreting categories and their labels."18 Likewise, the Arizona State University West Library found that "users think differently than librarians about the organization of information."19

Studies of library site usability also have been published on the Internet. Yale University, Roger Williams University, the University of Wisconsin-Madison, and the MIT Libraries have all published various data and outlines for conducting usability testing.20–23 The results of that testing revealed usability problems consistent with the "librarians know best" model of design.

Only one empirical study regarding Questia has been published. It examined the usability and content of Questia as well as the marketing the company employed. Participants were asked to follow a set of tasks and to use all the bibliographic tools available on the site. The participants found Questia easy to use; however, a key problem in the retrieval of information was the small number of resources available in Questia.24

The present study takes the examination one step further by comparing the usability of Questia with that of traditional library Web sites as well as with another portal offered on the Web. This study was designed to test students' ability to use Questia for research rather than the added bibliographic features provided on the site. Its purpose is to determine whether design features used by Questia can be adopted to enhance the usability of a traditional library Web site.
Usability Testing: University of Arizona, Questia, California State University-Long Beach, and Blackboard25

Methodology

This study was conducted at the University of Arizona Library during a one-week period. Announcements were posted around campus as well as on several student listservs, and students were paid a small stipend for their participation. According to Jeffrey Rubin's Handbook of Usability Testing: How to Plan, Design and Conduct Effective Tests, "The latest research indicates that testing four to five participants will expose the vast majority of usability problems."26 To increase the validity of the study that is the subject of this article, nine students were selected. Nielsen's mathematical model of usability problems shows that up to nine users will demonstrate 90 percent of a Web site's usability problems.27
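These sample-size claims rest on a simple problem-discovery curve. The short sketch below is an illustration only, not part of the study: it assumes the 31 percent average per-user discovery rate Nielsen reports, and actual rates vary by site and task.

```python
# Problem-discovery curve behind Nielsen's "Test with 5 Users" argument
# (note 27). The expected share of usability problems exposed by n testers
# is 1 - (1 - L)**n, where L is the probability that one user hits a given
# problem. L = 0.31 is the average Nielsen reports; it is an assumption here.
def problems_found(n_users: int, discovery_rate: float = 0.31) -> float:
    return 1 - (1 - discovery_rate) ** n_users

for n in (1, 5, 9):
    print(f"{n} users: {problems_found(n):.0%}")
# Prints 31%, 84%, and 96%. The article's "nine users demonstrate 90
# percent" corresponds to a more conservative per-user rate of about 0.23.
```

Nine participants therefore sit well past the knee of the curve, where each additional tester uncovers little that is new.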
As indicated in the demographic forms completed by the participants, seven of the nine students had received a library orientation at the University of Arizona (UA) and had used the library site more than four times. Because these students were familiar with the site, it was expected that they would perform the tasks better there than at the other sites. All of the participants were familiar and comfortable with the Internet. Two were undergraduates and seven were graduate students.

All of the sites were bookmarked, and students were observed using all sites. Students were asked between three and five questions per site, with question groups designed specifically for each site. The questions were of two kinds: simple task-oriented questions ("How much does it cost to subscribe to Questia for one year?") and complex questions that required students to find information for a hypothetical research assignment ("Find a journal or magazine article about eating disorders and males"). The more difficult questions required students to formulate their own search terms and were designed to mimic the typical topics students bring when they begin research in a true library setting. The questions required students to use links from the home page as well as links from what was determined to be a "gateway page" (the main page of links to electronic resources). Students were asked to pursue the questions only to the point of finding resources. An observer recorded the paths they took while a second individual asked the questions. After completing the tasks for a site, students filled out a Likert-type scale measuring their attitudes toward the site.

Findings

Of the four Web sites tested, Questia and the UA site fared the best. As expected, students consistently rated the university site highest. Questia and the California State University-Long Beach (CSULB) Library site followed closely, and Blackboard repeatedly ranked lowest (table 1).

TABLE 1
Attitudes of Students (on a scale of 1-7, where 1 = worst and 7 = best)

Question                               UA    Questia  CSULB  Blackboard
Rate this site                         5     4.2      3.9    3.25
Your overall productivity on the site  5.1   4.1      4.2    3.1
Logic of navigation                    4.9   4.1      4      2.9
Ease of finding specific information   4.6   4.1      4.2    2.8

Students were able to complete 67 percent of the tasks overall, with success varying by site. The least successful student answered 56 percent of the questions correctly, compared with 93 percent for the most successful student. The worst site in terms of usability was Blackboard, where only 52 percent of the questions were answered correctly. CSULB did slightly better with 59 percent, whereas Questia and UA did best with 73 and 78 percent, respectively (table 2).

TABLE 2
Student Performance (questions answered correctly/questions asked)

Student   Questia   UA      CSULB   Blackboard   % Success Rate
1         2/5       3/5     4/5     1/2          59
2         3/4       4/5     2/4     1/3          63
3         4/4       2/4     3/5     0/2          60
4         3/5       4/4     2/4     2/3          69
5         3/4       2/3     1/4     2/3          57
6         3/4       4/4     4/4     3/3          93
7         3/4       3/5     2/4     1/3          56
8         4/4       3/4     3/4     2/3          80
9         2/3       4/4     1/3     n/a          70
Total     27/37     29/38   22/37   12/23        90/135
Percent   73%       78%     59%     52%          67%
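As a quick arithmetic check, the site-level percentages can be recomputed from the raw counts printed in table 2. The snippet below is illustrative only; note that the UA column computes to 76 percent from these counts, slightly below the published 78 percent, which suggests a small rounding or transcription artifact in the printed table.

```python
# Recomputing the per-site and overall success rates from the raw counts
# printed in table 2. "n/a" (student 9 on Blackboard) is simply omitted
# from the Blackboard totals.
correct = {"Questia": 27, "UA": 29, "CSULB": 22, "Blackboard": 12}
asked = {"Questia": 37, "UA": 38, "CSULB": 37, "Blackboard": 23}

for site, c in correct.items():
    print(f"{site}: {c}/{asked[site]} = {c / asked[site]:.0%}")

total_correct, total_asked = sum(correct.values()), sum(asked.values())
print(f"Overall: {total_correct}/{total_asked} = {total_correct / total_asked:.0%}")
# Questia 73%, UA 76%, CSULB 59%, Blackboard 52%; overall 67%.
```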
While observing students' information-seeking behavior, several patterns emerged; depending on the site design, students' search patterns were hindered or enhanced.

Student Observations

Reading

Many observations emerged about what students look at when reading a page. Students do not always read the entire page; instead, they look for keywords, hyperlinks, or search boxes. This suggests that the terminology libraries use on their pages is extremely important for the first-time user.

This was illustrated most clearly in searches conducted on the CSULB library Web site. Students had no problem determining when to click on the "electronic resources" link but were stumped when asked to find books and journal titles because the terminology was unfamiliar. The CSULB library catalog is called COAST. Because COAST meant nothing to UA students, they did not stop to read the description. One student said after failing to find the correct answer, "Oh, I probably would have clicked here [on COAST] if I had read the text underneath, but since it was not familiar to me, I didn't read the text because I didn't think the option was relevant." Likewise, the annotation beneath the link to COAST did not mention the word "books." In fact, nothing on the home page or the secondary electronic resources page ever said the word "book"; instead, ambiguous terms such as "items" and "resources" were used. By contrast, students did read the descriptions under the Research Databases link and noticed the words "articles" and "research." As a result, students did not encounter the same difficulties finding articles as they did finding books on this Web site.

Students had problems distinguishing journal article records from book records in Questia, especially when asked whether particular journal titles existed in the database. Seven out of nine students could not tell that an article from the Journal of Educational Research was listed. The latest version of Questia has added icons and wording that indicate what type of item a student is viewing.

A unique finding of this study is that almost all students scrolled down to the bottom of each page they clicked on, regardless of where they were in the search. This is contrary to many earlier studies, which observed that searchers do not scroll to the bottom of the page.28 It may be a characteristic of the scan-searching method used by the subjects of this study. Even though they scrolled down the length of the pages, most ended up using information located in the first part of the page. The only exceptions were on the CSULB and Questia sites, where they used the toolbars at the bottom of the page (the "search" and "electronic resources" buttons for CSULB and the "topic" button for Questia). Students resorted to the buttons in the bottom banner only when the body of the page did not provide satisfactory links.

Different students focused on different parts of a Web page. Some used the options at the top of the screen, others the options in the middle. Students rarely looked at the menu items located in the left-hand column of some pages. The exception was Questia: participants did look at the menu in Questia after a search had been completed, routinely clicking on the links listed there and using the search box at the top of the panel labeled "search within results."

Searching

Students did not modify their searches in any other consistent way; when they did, it was usually by accident or out of desperation. Although Questia offered a variety of complex search options (with terminology similar to that of search engines) that were familiar to students, the students did not regularly or methodically make changes. They always assumed the error was in the terminology they used and not in the part of the database they were searching. The same was true for the other sites. For example, one student looking for articles on the legalization of marijuana on the CSULB library page was searching the library catalog. He recognized that the records he retrieved were books but concluded that "the terminology is not correct. I need to have more information about the topic." Regardless of the site they were searching, participants assumed the answer was somewhere and blamed their methods rather than the database or the Web site. This may be a byproduct of Internet searching, where virtually anything typed into a search engine retrieves a result.

Navigational Style

Two distinct patterns of searching emerged: topical searching and direct searching. Topical searching can be characterized as following a predetermined navigational structure provided by the site: given a question, students fit it into a framework and start their search by clicking on topical options. Direct searching is most accurately described as "searching for the box." These students immediately looked for a box to type a term into, almost always as a first option. When no box was available, they were confused about how to proceed; many searches did not end until a box was found, regardless of whether it was the correct box for the search.

Students noticed both types of search options in Questia, the only site of the four tested that offered both from the main search page. Most of the time, students preferred to type in the search box first and selected topical searches only when the text box search was unsuccessful.

Another feature of Questia was the additional options for searching by topic or keyword after the initial search. Students noticed the subject headings listed at the left of the secondary search screen and the box entitled "search within results." This feature was subsequently eliminated in later versions of the Questia search interface.
Blackboard was the hardest and most frustrating site for students because it forced them to use a topical search rather than offering a keyword option from the first search page. Every search using the Blackboard interface required students to select categories twice before reaching a search box. The main problem with this design was that students would invariably select an inappropriate category for a topic and end up with erroneous or zero results. This may stem from an assumption that students have the correct level of domain knowledge about subjects; earlier studies have indicated that subject browsing is influenced by an individual's domain and topic knowledge.29 Figures 1, 2, and 3 show Blackboard's home, second, and third pages, respectively.

FIGURE 1. Blackboard Home Page
FIGURE 2. Blackboard Second Page
FIGURE 3. Blackboard Third Page

Students who used direct searching and those who browsed had different results across the databases. User #6, who answered 93 percent of the questions correctly, used the browsing method for most questions; user #8, who answered 80 percent correctly, primarily used direct searching. However, when forced to use only one type of searching (browsing, in Blackboard), students as a group succeeded only 52 percent of the time.

Two-thirds of the students using the CSULB site ended up at the "index to the library" page. One possible reason is the prominent link at eye level labeled Search. Many students expected a search box when clicking on this link; what they found instead was an intermediary page offering a link to search engines or a link to the index to the library page.

When students found a route that worked, they tended to repeat it even when they were unsure it would help complete a task. One student successfully found the pathway to a task via the "research databases" link and said he knew the link would not tell him whether CSULB owned a particular journal, but he still chose that route and hoped he would stumble across the answer. Likewise, students would take a convoluted route to a section of a Web site because the route was familiar, even when a direct link to the same page was available from the home page.

Most students did not realize that an electronic journal could be found by looking in the library catalog; because it was electronic, they separated it from the other titles held by a library. The University of Arizona offers a specific search box labeled "E-Journals" on its Indexes to Articles page, which is really just a disguised search of the library catalog limited to the journal collection.

Another significant result of the testing was that not one student used the info, help, or tips screens on any of the Web sites. Even when they were having difficulty finding an answer, they never thought they were searching incorrectly, only that their terminology was incorrect.

Conclusion

When creating a library Web site, it is important to make it intuitive enough for first-time users. After students have gone through it once, they become familiar with a path.
Adding key terms not just on the first page but also on subsequent pages triggers memory for the second-, third-, and fourth-time user, even if it has been a summer or a semester since last use. Using participants who are unfamiliar with a site to test for usability is a useful way to determine how undergraduates at your university, especially first-year students, will use your site.

A key feature of a usable interface is offering students a way out. Because one of the prevalent search characteristics on the Web is looping, or using the back button, it is important to account for the possibility of a wrong turn on the library Web page. The University of Arizona site design combats this by creating multiple links that often lead to the same place: if a student is looking for articles, there is a link to them, and if a student clicks on the Articles link, he or she is offered a prominent link to finding a journal title. Likewise, a search box to narrow or modify an existing search is prevalent in the Questia interface. Students rarely used the back button when navigating through Questia; back-button use was very prevalent on the CSULB site, however. It also is important to anticipate task switching, which Questia does by offering both topic links and search options on the search screen.

Visual attraction and graphic design do not help usage (and can sometimes hinder it). Students were attracted to the visual beauty of the CSULB site as well as the Questia site, and sometimes that beauty overrode the fact that they could not easily find information on the site. Two students offered suggestions for visual improvement of the Blackboard site, and one was immediately turned off by the graphics on the UA site, hating everything about it. She was so affected that she wanted to stop searching the site altogether and eventually switched to another UA library Web site to answer a question. Figure 4 shows the CSULB home page, and figure 5 shows the UA home page.

FIGURE 4. CSULB Library Home Page
FIGURE 5. Home Page of University of Arizona

Although a pretty site does not help navigation, it does follow the business model of Web design by increasing a student's likelihood of using a site because it is appealing. One study found that in site usability, intangibles such as colors, images, font, text size, and placement can be as important as actual content.30

Questia was designed to attract the Internet-astute college student. It follows the information-seeking behavior of undergraduate students by appealing to the principle of least effort and the "right here, right now" attitude of today's youth. What is unique about Questia's approach is that it combines the information needs of its target market with the structure and language of the most popular search engines.
Placing subject headings in the left-hand column of the results list is a revolutionary approach to introducing students to Library of Congress subject terms for narrowing a search, and it should be incorporated by library catalog vendors, especially because Library of Congress subject headings are not intuitive. Figures 6, 7a and 7b, and 8 show examples of Questia pages.

FIGURE 6. Questia Home Page
FIGURE 7A. Questia Search Page
FIGURE 7B. Questia Search by Topic Page (available as a stand-alone page or by scrolling down the quick search page)
FIGURE 8. Questia Results Page

Students were not impressed with the lack of relevant results from their searches or with the structure of the titles. For many students, the currency of information indicated its quality. A book written in 1956 about World War Two may be an excellent source, but students were reluctant to select items they judged "too old." Two students were dissatisfied with results in Questia solely on the basis of publication date.

In addition to attitudes toward Questia's content, the lack of holdings seems to have affected students' search success: small database size adversely affected retrieval in both Questia.com and Blackboard. Likewise, Blackboard never indicates what type of search is being performed, so students were puzzled when a search for gun control returned the entire set of articles, none of which seemed relevant. Searchers assumed their terms would be searched as a phrase; in actuality, Blackboard searched all fields, combining the terms with a Boolean AND.
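The gap between those two behaviors is easy to see in miniature. The records and matching functions below are hypothetical, sketched only to illustrate why an implicit AND across all fields surprises a searcher expecting phrase matching; they are not Blackboard's actual code.

```python
# Hypothetical records and matchers, for illustration only.
records = [
    "Gun ownership and crime control policy",   # AND match, not a phrase match
    "The gun control debate in the 1990s",      # matches both ways
    "Remote control of machine gun turrets",    # AND match, not a phrase match
]

def boolean_and_match(query: str, text: str) -> bool:
    """True if every query word appears somewhere in the text (any field)."""
    words = text.lower().split()
    return all(term in words for term in query.lower().split())

def phrase_match(query: str, text: str) -> bool:
    """True only if the query words appear together, in order."""
    return query.lower() in text.lower()

for r in records:
    print(f"AND={boolean_and_match('gun control', r)} "
          f"phrase={phrase_match('gun control', r)}  {r}")
```

Under the implicit AND, all three records come back for "gun control," whereas a phrase search would return only the second; a result list built the first way looks arbitrary to a student who mentally searched the second way.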
What libraries can learn from Questia and other one-stop shopping sites is twofold. The first point is that if librarians want to market their services to today's youth, they must get the word out about the library's services. A consistent pattern among the students in this study was that they did not identify the library with online full-text items; instead, the library was where they could come and make copies of articles. This indicates that students do not equate libraries with cutting-edge technology but see them, rather, as physical places to do research and not as the gateways to all types of information they have truly become.

The second point concerns Web design: modifying library Web sites to appeal to students' information-seeking behavior. Library terminology and the "librarians know best" model of design should be avoided; today, it is important to think from the student's point of view. The University Library at CSULB used the results of this study to modify both the terminology and the arrangement of its home page and its electronic resources pages. A complete redesign will have to be implemented, and more usability testing conducted, to make the site more intuitive for users. Multiple paths should be created to the pages the librarian wants students to find. "Focus on people's needs in plain language in a layout that's easy to scan."31 Students should always be given an "out" through repetitive links and anticipation of wrong turns, and the site should be appealing and comfortable to use. Above all, whether using a student employee from the stacks or getting funding to conduct testing, the library should always do some sort of usability test on its pages. "Usability isn't a luxury on the Internet; it's essential to survival."32 Figures 9 and 10 show before and after versions of the CSULB E-resources page.

FIGURE 9. CSULB E-Resources Page (before)
FIGURE 10. CSULB E-Resources Page (after)

Would Students Pay for Questia?

The bottom line is, would students pay for Questia? A survey of students conducted by Questia found that "Nearly 75% of students said 24-hour access to information resources is important to them and, as a result, 66% are interested in using an Internet service to help with research."33 When the students participating in this study were asked whether they would subscribe to Questia, they gave a variety of answers. One student said she would not pay for the site unless it had more help screens or tips, such as "help with paper ideas." Another student, who did not successfully find a single answer using Questia, still said, "Yes, I would purchase this product if I was a social sciences major." Yet another student said she would not purchase Questia because it was too limited in subjects and not descriptive enough, although she stated two separate times that she really thought the page "was appealing and really catches your attention." One user would purchase Questia if it offered per-day access; he suggested a $1.00 per-day charge billed automatically to a credit card that Questia would keep on file. Finally, one student said no; students would just use the Internet for free or come to the library: "Why pay when you can get the information for free?"

Further Research

Limitations of this study include the small number of participants. Although the testing surfaced the majority of problems with each site, a larger sample might have uncovered more covert problems with the sites tested. In addition, most participants were foreign born, and all were students at the University of Arizona. Eight of the nine students had a strong command of the English language; therefore, comprehension of the Web sites was not affected. Almost all were familiar with the University of Arizona Library site but unfamiliar with the other sites tested. Any future replication of this study would benefit from a larger number of participants drawn from various patron groups, including undergraduates and faculty from both universities.

More research on the search styles of students also would be beneficial. The participants in this study displayed search styles divided by gender: all of the males used direct searching, whereas all but one of the females used topical search methods. This topic has not been sufficiently covered in the library science literature. Another study suggested that there likely are gender and cultural differences in the ways people interact with online information, although that publication presented no supporting data.34 The relationship among search styles, gender, and learning styles should be examined on a larger scale.

Notes

1. Ruth Dickstein and Victoria A. Mills, "Usability Testing at the University of Arizona Library: How to Let Users in on the Design," Information Technology and Libraries 19 (Sept. 2000): 144–51.
2. Academic Resource Center (may be restricted to Blackboard subscribers). Available online from http://resources.blackboard.com/scholar/general/main.jsp.
3. Ibid.
4. About Questia. Available online from http://www.questia.com/aboutQuestia/about.html.
5. Elibrary. Available online from http://ask.elibrary.com.
6. Jakob Nielsen and Donald Norman, "Usability on the Web Isn't a Luxury," InformationWeek 773 (Feb. 2000): 65–73.
7. Ibid.
8. Ibid.
9. Jeff Walsh, "Is Your Site Really Working?" InfoWorld 20 (Mar. 1999): 53–56.
10. Dania Bilal, "Children's Use of the Yahooligans! Web Search Engine: I. Cognitive, Physical, and Affective Behaviors on Fact-based Search Tasks," Journal of the American Society for Information Science 51 (May 2000): 646–65.
11. Ibid.
12. Ibid.
13. Rachel Naismith and Joan Stein, "Library Jargon: Student Comprehension of Technical Language Used by Librarians," College & Research Libraries 50 (Sept. 1989): 543–52.
14. Mark Spivey, "The Vocabulary of Library Home Pages: An Influence on Diverse and Remote End-Users," Information Technology and Libraries 19 (Sept. 2000): 151–56.
15. Dickstein and Mills, "Usability Testing at the University of Arizona Library."
16. David King, "Redesigning the Information Playground: A Usability Study of the Kansas City Public Library's Web Site," in Usability Assessment of Library-related Web Sites: Methods and Case Studies (Chicago: LITA, a division of the ALA, 2001), 77–87.
17. Louise McGillis and Elaine G. Toms, "Usability of the Academic Library Web Site: Implications for Design," College & Research Libraries 62 (July 2001): 355–67.
18. Ibid.
19. Kathleen Collins and José Aguiñaga, "Learning as We Go: Arizona State University West Library's Usability Experience," in Usability Assessment of Library-related Web Sites: Methods and Case Studies (Chicago: LITA, a division of the ALA, 2001), 16–29.
20. Yale University. Available online from http://www.library.yale.edu/~prowns/nebic/nebictalk.html.
21. Roger Williams University. Available online from http://gamma.rwu.edu/users/smcmullen/usable.html.
22. University of Wisconsin-Madison. Available online from http://www.library.wisc.edu/libraries/News/Design.html.
23. MIT Libraries. Available online from http://macfadden.mit.edu:9500/webgroup/usability/results/index.html.
24. Nicholas G. Tomaiuolo, "Deconstructing Questia: The Usability of a Subscription Digital Library," Searcher 9 (July/Aug. 2001): 32–39.
25. University of Arizona. Available online from http://www.library.arizona.edu/; California State University Library, Long Beach. Available online from http://www.csulb.edu/library; Questia. Available online from http://www.questia.com; Blackboard Academic Resources. Available online from http://resources.blackboard.com/scholar/general/main.jsp.
26. Jeffrey Rubin, Handbook of Usability Testing: How to Plan, Design and Conduct Effective Tests (New York: Wiley, 1994).
27. Jakob Nielsen, "Test with 5 Users," Alertbox. Available online from http://www.useit.com/alertbox/20000319.html.
28. King, "Redesigning the Information Playground."
29. Bilal, "Children's Use of the Yahooligans! Web Search Engine."
30. Collins and Aguiñaga, "Learning as We Go."
31. Nielsen and Norman, "Usability on the Web Isn't a Luxury."
32. Ibid.
33. Questia Media, "Study of Undergraduate's Attitudes toward Research." Available online from http://www.questia.com/aboutQuestia/librariansWhy.html.
34. Collins and Aguiñaga, "Learning as We Go."