User-Centered Design of a Web Site for Library and Information Science Students: Heuristic Evaluation and Usability Testing

Laura Manzari and Jeremiah Trinidad-Christensen

Laura Manzari (manzari@liu.edu) is an Associate Professor and Library and Information Science Librarian at the C. W. Post Campus of Long Island University, Brookville, N.Y. Jeremiah Trinidad-Christensen (jt2118@columbia.edu) is a GIS/Map Librarian at Columbia University, New York, N.Y.

This study describes the life cycle of a library Web site created with a user-centered design process to serve a graduate school of library and information science (LIS). Findings based on a heuristic evaluation and usability study were applied in an iterative redesign of the site to better serve the needs of this special academic library population. Recommendations for the design of Web-based services for library patrons from LIS programs are discussed, as well as implications for Web sites for special libraries within larger academic library settings.

User-centered design principles were applied to the creation of a Web site for the Library and Information Science (LIS) Library at the C. W. Post campus of Long Island University. This Web site was designed for use by master’s degree and doctoral students in the Palmer School of Library and Information Science. The prototype was subjected to a usability study consisting of a heuristic evaluation and usability testing. The results were employed in an iterative redesign of the Web site to better accommodate users’ needs. This was the first usability study of a Web site at the C. W. Post library.

Human-computer interaction, the study of the interaction of human performance with computers, imposes a rigorous methodology on the process of user-interface design. More than an intuitive determination of user-friendliness, a successful interactive product is developed through careful design, testing, and redesign based on the testing outcomes. Testing the product several times as it is being developed, or iterative testing, allows the users’ needs to be incorporated into the design. The interface should be designed for a specific community of users and a specific set of tasks, with the goal of creating a consistent, usable product.

The LIS Library had a Web site that was simply a description of the collection and did not provide access to specialized online resources. A new Web site was designed for the LIS Library by the incoming LIS librarian, who determined what content might be useful for LIS students and faculty. The goal was to have such content readily accessible in a Web site separate from the main library Web site. The Web site for the LIS library includes:

- access to all online databases and journals related to LIS;
- a general overview of the LIS library and its resources, as well as contact information, hours, and staff;
- a list of all print and online LIS library journal subscriptions, grouped by both title and subject, with links to access the online journals;
- links to other Web sites in the LIS field;
- links to other university Web pages, including the main library’s home page, library catalog, and instructions for remote database access, as well as to the LIS school Web site;
- a link to JAKE (Jointly Administered Knowledge Environment), a Yale University project that allows users to search for periodical titles within online databases, since the library did not have this type of access through its own software.

This information was arranged in four top-level pages with sublevels (one possible arrangement is sketched below). Design considerations included making the site both easy to learn and efficient once users were familiar with it. Since classes are taught at four locations in the metropolitan area, the site needed to be flexible enough to serve students at the C. W. Post campus library as well as remotely.
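As a rough illustration of this arrangement, the content listed above could map onto four top-level pages along the following lines. This is a hypothetical sketch: the page names are taken from pages mentioned elsewhere in this article, but the grouping under each top-level page is an assumption, not the site’s documented structure.

```python
# Hypothetical outline of the LIS site's four top-level pages with sublevels.
# Page names come from pages named in this article; the grouping is assumed.
SITE_MAP: dict[str, list[str]] = {
    "LIS Home": ["About the LIS Library", "Hours", "Staff", "Contact"],
    "Databases": [
        "Library and Information Sciences Databases",
        "Full-Text Databases",
        "All Databases (located in the C. W. Post Library Web site)",
    ],
    "Journals": ["Print Journals", "Electronic Journals",
                 "Journals by Subject", "JAKE"],
    "Other Web Sites": ["LIS Field Web Sites", "Palmer School",
                        "Main Library", "LIUCAT (library catalog)"],
}

def print_outline(site_map: dict[str, list[str]]) -> None:
    """Print the two-level hierarchy as an indented text outline."""
    for top_level, sublevels in site_map.items():
        print(top_level)
        for page in sublevels:
            print(f"  - {page}")

if __name__ == "__main__":
    print_outline(SITE_MAP)
```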
The layout of the information was designed to make the Web site uncluttered and attractive. Different color schemes were tried and informally polled among users. A version with white text on a black background prompted strong likes or dislikes when shown to users. Although this combination is easy to read, it was rejected because of the strong negative reactions from several users. Photographs of the LIS library and students were included. The pages were designed with a menu on the left side; fly-out menus were used to access submenus.

Where main library pages already existed for information to be included in the LIS Web site, such as LIS hours and staff, links to those pages were made instead of re-creating the information in the LIS Web site. An attempt was made to render the site accessible to users with disabilities, and pages were made compliant with World Wide Web Consortium (W3C) standards by using its HTML validator and its cascading style sheet validator.[1]

Literature review

Usability is a term with many definitions, varying by field.[2] The fields of industrial engineering, product research and development, computer systems, and library science all share the study of human-machine interaction, as well as a commitment to users. Dumas and Redish explain it simply: “Usability means that the people who use the product can do so quickly and easily to accomplish their own tasks.”[3]

User-centered design incorporates usability principles into product design and places the focus on the user during project development. Gould and Lewis cite three principles of user-centered design: an early focus on users and tasks, empirical measurement of product usage, and iterative design to include user input into product design and modification.[4]

Jakob Nielsen, an often-cited usability engineering specialist, emphasizes that usability engineering principles should apply to Web design, which should be treated as a software development project. He advocates incorporating user evaluation into the design process first through a heuristic evaluation, followed by usability testing, with a redesign of the product after each phase of evaluation.[5] Usability principles have been applied to library Web-site design; however, library Web-site usability studies often do not include the additional heuristic evaluation recommended by Nielsen.[6]

In addition to usability, consideration should also be given during the design process to making the Web site accessible to people with disabilities. Federal agencies are now required by the Rehabilitation Act to make their Web sites accessible to the disabled. Section 508, part 1194.22, of the act enumerates sixteen rules for Internet applications to help ensure Web-site access for people with various disabilities.[7] Similarly, the Web Accessibility Initiative hosted by the W3C works to ensure that accessibility practices are considered in Web-site design; it developed the Web Content Accessibility Guidelines for making Web sites accessible to people with disabilities.[8]
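The W3C validators cited above were used through their Web interfaces. The same kind of check can also be scripted; the sketch below assumes the W3C Nu HTML Checker’s JSON interface at validator.w3.org/nu/ (a later service than the one the authors used) and the third-party requests library, and the page URL is a placeholder.

```python
# Sketch: ask the W3C Nu HTML Checker to validate a page and list its errors.
# Assumes the public endpoint and its documented JSON response shape.
import requests

def html_errors(page_url: str) -> list[str]:
    """Return the checker's error messages for the page at page_url."""
    resp = requests.get(
        "https://validator.w3.org/nu/",
        params={"doc": page_url, "out": "json"},
        headers={"User-Agent": "site-check-sketch/0.1"},  # courtesy UA string
        timeout=30,
    )
    resp.raise_for_status()
    messages = resp.json().get("messages", [])
    return [m["message"] for m in messages if m.get("type") == "error"]

if __name__ == "__main__":
    for problem in html_errors("https://example.edu/lislibrary/"):  # placeholder
        print(problem)
```

Note that guidelines such as Section 508 and WCAG go beyond markup validity, so a clean validator report is a necessary but not sufficient condition for an accessible page.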
Although articles have been written about usability testing of academic library Web sites, very little has been written about usability testing of special-collection Web sites for distinct user populations within larger academic settings.[9]

Heuristic evaluation methodology

Heuristic evaluation is a usability engineering method in which a small set of expert evaluators examines a user interface for design problems by judging its compliance with a set of recognized usability principles, or heuristics. Nielsen developed a set of ten widely adopted usability heuristics, listed below.

1. Visibility of system status: The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.
2. Match between system and the real world: The system should speak the user’s language, with words, phrases, and concepts familiar to the user rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.
3. User control and freedom: Users often choose system functions by mistake and will need a clearly marked “emergency exit” to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.
4. Consistency and standards: Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.
5. Error prevention: Even better than good error messages is a careful design that prevents problems from occurring in the first place.
6. Recognition rather than recall: Make objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.
7. Flexibility and efficiency of use: Accelerators, unseen by the novice user, may often speed up the interaction for the expert user, such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.
8. Aesthetic and minimalist design: Dialogues should not contain information that is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.
9. Help users recognize, diagnose, and recover from errors: Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.
10. Help and documentation: Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user’s task, list concrete steps to be carried out, and not be too large.[12]

After studying the use of individual evaluators as well as groups of varying sizes, Nielsen and Molich recommend using three to five evaluators for a heuristic evaluation.[10] Multiple experts will catch more flaws than a single expert, but using more than five experts does not produce proportionally greater results. In comparisons of heuristic evaluation and usability testing, heuristic evaluation uncovered more of the minor problems, while usability testing uncovered more major, global problems.[11] Since each method tends to uncover different usability problems, it is recommended that both methods be used complementarily, particularly with an iterative design change between the heuristic evaluation and the usability testing.
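Both the three-to-five-evaluator recommendation and the later claim that four or five test subjects reveal 80 percent of usability problems (note 13) rest on the same problem-discovery model, which Nielsen developed with Tom Landauer: the share of problems found by i participants is 1 − (1 − L)^i, where L is the average probability that one participant uncovers any given problem. Nielsen reports L ≈ 0.31 averaged across published studies; for any particular site, L is an assumption. A worked computation:

```python
# Expected share of usability problems found by i evaluators or test users,
# per the Nielsen-Landauer problem-discovery model: 1 - (1 - L)**i.
# L = 0.31 is the cross-project average Nielsen reports; the value for any
# specific study (including this one) is unknown and assumed here.

def share_found(i: int, L: float = 0.31) -> float:
    """Expected proportion of all problems found by i participants."""
    return 1.0 - (1.0 - L) ** i

if __name__ == "__main__":
    for i in range(1, 11):
        print(f"{i:2d} participants: {share_found(i):5.1%}")
    # With L = 0.31, five participants find about 84 percent of problems,
    # and each additional participant adds less; hence the advice that
    # going beyond five evaluators or subjects buys little.
```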
For the heuristic evaluation, four people with expertise in Web-site design and human-computer interaction were approached from the Palmer School faculty and Ph.D. program. Three agreed to participate. They were asked to familiarize themselves with the Web site and evaluate it according to Nielsen’s ten heuristics, which were provided to them.

Heuristic evaluation results

The evaluators all agreed that the language was appropriate for LIS students. One evaluator said that if new students were not familiar with some of the terms, they soon would be. Another thought JAKE, the tool for accessing full text, might not be clear to students at first, but that the LIS Web-site explanation was fine the way it was.

They also agreed that the Web site was well designed. Comments included: “the purpose and description of each page is short and to the point, and there is a good, clean, viewable page for the users”; “the site was well designed and not over designed”; “very clear and user friendly”; “excellent example of limiting unnecessary irrelevant information.” The only page to receive a “poor layout” comment was the lengthy subject list of journals, though no suggestions for improvement were made.

Concern was expressed about links to other Web sites on campus. One evaluator thought new students might be confused about the relationship between Long Island University, C. W. Post, and the Palmer School. Two evaluators thought links to the main library’s Web site could cause confusion because of the different design and layout. A preference for the design of the LIS library Web site over the main library and Palmer School Web sites was expressed. To eliminate some confusion, the menu options for other campus Web sites were moved to a separate menu directly below the menu of LIS Web pages. For additional clarity, some of the main library pages were re-created in the style of the LIS pages instead of linking to the original pages.

The evaluators made several concrete suggestions for menu changes, which were included in the redesign. Several menu options were judged unclear, so additional text was added for clarity at the expense of brevity. Long Island University’s online catalog is named LIUCAT and was listed that way on the menu. Because new students might not be familiar with this name, the menu label was changed to “LIUCAT (library catalog).” For the link to JAKE, a description, “Find periodicals in online databases,” was added for clarification. It was also suggested that the link to the main library Web page for All Databases could cause confusion, since the layout and design of that page are different. The wording was changed to “All Databases (located in the C. W. Post Library Web site).”

Menu options were originally arranged in order of anticipated use (see figure 1). Thus, the order of menu options from the LIS home page was databases, journals, library catalog, other Web sites, Palmer School, and main library. Evaluators suggested that putting the option for the LIS home page first would give users an easy “emergency exit” to return to the home page if they were lost. The original menu options also varied from page to page. For example, menu options on the database page referred only to pages that users might need while doing database searches. At the suggestion of the evaluators, the menu options were changed to be consistent on every page (see figure 2).

[Figure 1. Original menu]
[Figure 2. Revised menu]

A redesign based on these results was completed and posted to the Internet for public use (see figure 3).

[Figure 3. Final home page]

Usability testing methodology

Usability testing is an empirical method for improving design. Test subjects are gathered from the population who will use the product and are asked to perform real tasks with the prototype while their performance and reactions are observed and recorded by an interviewer. This observation and recording of behavior distinguishes usability testing from focus groups. Observation allows the tester to see when and where users become frustrated or confused. The goal is to uncover usability problems with the product, not to test the participants themselves. The data gathered are then analyzed to recommend changes that fix the usability problems.

In addition to recording empirical data, such as the number of errors made or the time taken to complete tasks, active intervention allows the interviewer to question participants about the reasons for their actions as well as about their opinions of the product. In fact, subjects are asked to verbalize their thought processes as they complete the tasks using the interface.
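The measures this paragraph names (task completion, error counts, time on task, and think-aloud remarks) were recorded by hand in this study, but they amount to a small, regular record per subject and task. The sketch below is purely illustrative; the class names and fields are assumptions about how one might log such sessions, not an instrument the authors describe.

```python
# Hypothetical bookkeeping for usability-test sessions: one TaskRecord per
# scripted task per subject, logged by the interviewer during testing.
import time
from dataclasses import dataclass, field

@dataclass
class TaskRecord:
    task: str                                       # short task description
    completed: bool                                 # finished without help?
    errors: int                                     # wrong turns, dead ends
    seconds: float                                  # time on task
    notes: list[str] = field(default_factory=list)  # think-aloud remarks

class Session:
    """One subject's scripted session."""

    def __init__(self, subject_id: str):
        self.subject_id = subject_id
        self.records: list[TaskRecord] = []
        self._started = 0.0

    def start_task(self) -> None:
        self._started = time.monotonic()

    def finish_task(self, task: str, completed: bool, errors: int = 0,
                    notes: list[str] | None = None) -> None:
        elapsed = time.monotonic() - self._started
        self.records.append(
            TaskRecord(task, completed, errors, elapsed, notes or []))

def success_rate(sessions: list[Session], task: str) -> float:
    """Share of subjects who completed the given task."""
    outcomes = [r.completed for s in sessions
                for r in s.records if r.task == task]
    return sum(outcomes) / len(outcomes) if outcomes else 0.0
```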
Test subjects are usually interviewed individually and are all given the same pretest briefing from a script, with a list of instructions followed by tasks representing actual use. Test subjects are also asked questions about their likes and dislikes. In most situations, payment or other incentives are offered to help recruit subjects. Four or five subjects will reveal 80 percent of usability problems.[13]

Messages were sent to students via the Palmer School’s mailing lists requesting volunteers, with a ten-dollar gift certificate to a bookstore offered as an inducement. Input was desired from both master’s degree and doctoral students. The first nine volunteers to respond, all master’s degree students, were accepted. This group included students from both the main and satellite campuses. No Ph.D. students volunteered to participate at first, citing busy schedules, but eventually a doctoral student was recruited. Testing was conducted in computer labs at the library, at the Palmer School, and at the Manhattan satellite campus.

Demographic information was gathered regarding users’ gender, age range, university status, and familiarity with computers, the Internet, and the LIS library, as well as the type of Internet connection and browser usually used. The subjects were then given eight tasks to complete using the Web site. The tasks reflected both the type of assignment a student might receive in class and the type of information students might seek on the LIS Web site on their own. The questions were designed to test the usability of different parts of the Web site.

Usability testing results

The first task tested the Print Journals page and asked whether the LIS library subscribes to a specific journal and whether that journal is refereed. (The Web site uses an asterisk next to a journal title to indicate that it is refereed.) All subjects were able to easily find that the LIS library does hold the journal title. Although it was not initially obvious that the asterisk indicated a refereed journal, most of the subjects eventually found the explanatory note. Many of the subjects did not know what a refereed journal was, and some asked if a definition could be provided on the site.
For the second task, subjects needed to use JAKE to find the full text of an article. None of the students was familiar with JAKE, but all were able to use the LIS Web site to gain an understanding of its purpose and to access it.

The third task asked subjects to find a library association, which required using the Other Web Sites page. All subjects demonstrated an understanding of how to use this page and found the information.

The fourth task tested the Full-Text Databases page. Only one subject actually used this page to complete the task. The rest used the All Databases link to the main library’s database list. That link appears above the link to Full-Text Databases, and most subjects chose it without looking at the next menu option. Several subjects became confused when they were taken to the main library’s page, just as the evaluators had predicted. Even though wording had been added warning users that they were leaving the LIS Web site, most subjects did not read it and wondered why the page layout changed and was not as clear. They also had trouble navigating back to the LIS Web site from the main library Web site.

The fifth task tested the Journals by Subject page. This task took longer for most of the subjects, but all were able to use the page successfully to find a journal on a given subject. The sixth task required using the LIS home page, and everyone easily used it to find the operating hours. The seventh task required subjects to find an online journal title that could be accessed from the Electronic Journals page. All subjects navigated this page easily.

The final task asked subjects to find a book review. Most subjects did not look at the Library and Information Sciences Databases page to access the Books in Print database, saying they did not think it would be included there. Instead, they used the link to the main library’s database page. One subject was not able to complete this task.

Problems primarily occurred during testing when subjects left the LIS pages to use a non-library-science database located on the main Web site. Subjects had trouble getting back to the LIS site from the main library site. While performing tasks, some subjects would scroll up and down long lists instead of using the toolbars provided to bring the user to an exact location on the page. Some preferred using the back button instead of the LIS Web-site menu to navigate. These seemed to be individual styles of using the Web and not usability problems with the site. Several people consistently used the menu to return to the LIS home page before starting each new task, even though they could have navigated directly to the page they needed, making a return to the home page unnecessary. This validated the recommendation from the heuristic study that the link to the home page always be the first menu option, giving users a comfortable safety valve when they get lost.

The final questions asked subjects for their opinions on what they did and did not like about the Web site, as well as any suggestions for improving it. All subjects responded that they liked the layout of the pages, calling them uncluttered, clean, attractive, and logical. There were very few suggestions for improving the site. One person asked that contact information be included in the menu options in addition to its location directly below the menu on the LIS home page. Another participant suggested adding class syllabi to the Web site each semester, listing required texts along with a link to an online bookstore. Some of the novice users asked for explanations of unfamiliar terms such as “refereed journals.” One participant suggested including a search engine instead of relying on links to navigate the site. This was considered during the initial site design but was not included, since the site did not have a large number of pages; however, a search engine may be worth including.
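A search engine was ultimately left out, but for a site with a small number of pages even a naive approach would suffice; nothing heavier than an in-memory inverted index over page text is needed. The sketch below illustrates that idea under that assumption; the sample pages are placeholders, and no such feature existed on the published site.

```python
# Illustrative only: a tiny inverted index giving AND-semantics word search
# over a handful of static pages.
from collections import defaultdict

def build_index(pages: dict[str, str]) -> dict[str, set[str]]:
    """Map each lowercased word to the set of page names containing it."""
    index: dict[str, set[str]] = defaultdict(set)
    for name, text in pages.items():
        for word in text.lower().split():
            index[word.strip(".,;:()*")].add(name)
    return index

def search(index: dict[str, set[str]], query: str) -> set[str]:
    """Return pages containing every word in the query."""
    words = [w for w in query.lower().split() if w]
    if not words:
        return set()
    hits = set(index.get(words[0], set()))
    for word in words[1:]:
        hits &= index.get(word, set())
    return hits

if __name__ == "__main__":
    pages = {  # placeholder content
        "Print Journals": "refereed journals listed by title",
        "Electronic Journals": "online journals with full text",
    }
    index = build_index(pages)
    print(search(index, "journals"))  # both pages
    print(search(index, "refereed"))  # {'Print Journals'}
```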
The one doctoral student had previously used only the main library’s Web page to access databases. Originally, he said he did not see the advantage of a site devoted to information science sources for doctoral candidates, since that program is more multidisciplinary. However, after completing the usability study, the student concluded that the LIS Web site was useful. He suggested that it be publicized more to doctoral candidates and highlighted more prominently on the main library Web site. Though the questions asked were about the LIS Web site, several subjects complained about the layout of the main library Web site and suggested that it link to the LIS Web site more visibly so the site could be reached more easily.

Conclusions

Iterative testing and user-centered design resulted in a product that testing showed to be easy to learn and efficient to use, and with which subjects expressed satisfaction. Based on findings that some students had not even been aware of the existence of the LIS Web site, greater emphasis is now given to the Web site and its features during new-student orientations.

The biggest problem users had was navigating from the Web pages of the main library back to the LIS site. It was suggested that the LIS site be highlighted more prominently on the main library Web site. Some users were confused by the different layouts of the two sites, but no one expressed a preference for the design used by the main library Web site. Despite this confusion, subjects overwhelmingly expressed positive feedback about having a specialized library site serving their specific needs.

Issues regarding Web-site design can be problematic for smaller specialized libraries within larger institutions. In this case, some of the problems navigating between the sites could be resolved by changes to the main library site. The design of the LIS Web site was preferred over the main campus Web site by both the heuristic evaluators and the students in the usability test. However, designers of a main library Web site might not be receptive to suggestions from a specialized or branch library. Although consistency in design would eliminate confusion, requiring the special collection’s Web site to follow a design set by the main institution could be a loss for users. In this instance, the main site was designed without user input, whereas the specialized library serving a smaller population was able to be more dynamic and responsive to its users.

Finding an appropriate balance for a site used by students new to the field as well as advanced students is a challenge. Although the students in the study were all experienced computer and Web users, their familiarity with basic library concepts varied greatly.
A few novice users expressed some confusion about the difference between journals and index databases. There was in fact a description of each of these sources on the site, but it was not read. The subjects barely read any of the site’s text, so it can be difficult to clarify such points when users want to navigate quickly without reading instructions. Several subjects who did not bother to read text on the site still suggested adding more notes to explain unfamiliar terms. However, if the site becomes overloaded with explanations of library concepts, it could become annoying for more advanced users. A separate page with a glossary is a possibility, though based on the study it would probably not be read. Another possibility is a handout for students that could offer more text for new users without cluttering the Web site; such a handout would also serve to publicize the site.

There was some concern prior to the study that offering more advanced features, such as providing access to JAKE or indicating which journals are refereed, might be off-putting for new students; therefore, test questions were designed to gauge reactions to these features. Most students in the study did express some intimidation at not being familiar with these concepts. However, all the subjects eventually figured out how to use JAKE and, once they tried it, thought it was a good idea to include it. Even the new students who had the most difficulty were still able to navigate and learn from the site well enough to use it efficiently.

An online survey was added to the final design to allow continuous user input. The site consistently receives positive feedback through these surveys. It was planned that responses could be used to continually assess the site and keep it responsive and up-to-date; however, specific suggestions have not yet been forthcoming.

How valuable was usability testing to the Web-site design? Several good suggestions were made and implemented, and the process confirmed that the site was well designed. It also provided insight, not anticipated by the designers, into how subjects used the Web site. Since usability studies are fairly easy and inexpensive to conduct, they are probably worth undertaking during the Web-site design process even if they result in only minor changes to the design.

References and notes

1. W3C, “The W3C Markup Validation Service,” validator.w3.org (accessed Nov. 1, 2005); W3C, “The W3C CSS Validation Service,” jigsaw.w3.org/css-validator (accessed Nov. 1, 2005).

2. See Carol M. Barnum, Usability Testing and Research (New York: Longman International, 2002); Alison J. Head, “Web Redemption and the Promise of Usability,” Online 23, no. 6 (1999): 20–29; International Organization for Standardization, Ergonomic Requirements for Office Work with Visual Display Terminals. Part 11: Guidance on Usability—ISO 9241-11 (Geneva: International Organization for Standardization, 1998); Judy Jeng, “What is Usability in the Context of the Digital Library and How Can it be Measured?” Information Technology and Libraries 24, no. 2 (2005): 47–52; Jakob Nielsen, Usability Engineering (Boston: Academic, 1993); Ruth Ann Palmquist, “An Overview of Usability for the Study of Users’ Web-based Information Retrieval Behavior,” Journal of Education for Library and Information Science 42, no. 2 (2001): 123–36.

3. Joseph S. Dumas and Janice C. Redish, A Practical Guide to Usability Testing (Portland: Intellect Books, 1999), 4.
4. John D. Gould and Clayton H. Lewis, “Designing for Usability: Key Principles and What Designers Think,” Communications of the ACM 28, no. 3 (1985): 300–11.

5. Jakob Nielsen, “Heuristic Evaluation,” in Jakob Nielsen and Robert L. Mack, eds., Usability Inspection Methods (New York: Wiley, 1994), 25–62.

6. See Denise T. Covey, Usage and Usability Assessment: Library Practices and Concerns (Washington, D.C.: Digital Library Federation, 2002); Nicole Campbell, Usability Assessment of Library-related Web Sites (Chicago: ALA, 2001); Kristen L. Garlock and Sherry Piontek, Designing Web Interfaces to Library Services and Resources (Chicago: ALA, 1999); Anna Noakes Schulze, “User-Centered Design for Information Professionals,” Journal of Education for Library and Information Science 42, no. 2 (2001): 116–22; Susan M. Thompson, “Remote Observation Strategies for Usability Testing,” Information Technology and Libraries 22, no. 3 (2003): 22–32.

7. General Services Administration, “Section 508: Section 508 Standards,” www.Section508.gov/index.cfm?FuseAction=Content&ID=12#Web (accessed Nov. 1, 2005).

8. W3C, “Web Content Accessibility Guidelines 2.0,” www.w3.org/TR/WCAG20 (accessed Nov. 1, 2005).

9. See Susan Augustine and Courtney Greene, “Discovering How Students Search a Library Web Site: A Usability Case Study,” College and Research Libraries 63, no. 4 (2002): 354–65; Brenda Battleson, Austin Booth, and Jane Weintrop, “Usability Testing of an Academic Library Web Site: A Case Study,” Journal of Academic Librarianship 27, no. 3 (2001): 188–98; Janice Krueger, Ron L. Ray, and Lorrie Knight, “Applying Web Usability Techniques to Assess Student Awareness of Library Web Resources,” Journal of Academic Librarianship 30, no. 4 (2004): 285–93; Thura Mack et al., “Designing for Experts: How Scholars Approach an Academic Library Web Site,” Information Technology and Libraries 23, no. 1 (2004): 16–22; Mark Shelstad, “Content Matters: Analysis of a Web Site Redesign,” OCLC Systems & Services 21, no. 3 (2005): 209–25; Robert L. Tolliver et al., “Web Site Redesign and Testing with a Usability Consultant: Lessons Learned,” OCLC Systems & Services 21, no. 3 (2005): 156–67; Dominique Turnbow et al., “Usability Testing for Web Redesign: A UCLA Case Study,” OCLC Systems & Services 21, no. 3 (2005): 226–34; Leanne M. VandeCreek, “Usability Analysis of Northern Illinois University Libraries’ Web Site: A Case Study,” OCLC Systems & Services 21, no. 3 (2005): 181–92.

10. Jakob Nielsen and Rolf Molich, “Heuristic Evaluation of User Interfaces,” in Proceedings of the ACM CHI ’90 (New York: Association for Computing Machinery, 1990), 249–56.

11. Robin Jeffries et al., “User Interface Evaluation in the Real World: A Comparison of Four Techniques,” in Proceedings of the ACM CHI ’91 (New York: Association for Computing Machinery, 1991), 119–24; Jakob Nielsen, “Finding Usability Problems through Heuristic Evaluation,” in Proceedings of the ACM CHI ’92 (New York: Association for Computing Machinery, 1992), 373–86.

12. Jakob Nielsen, “Heuristic Evaluation,” 25–62.

13. Jeffrey Rubin, Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests (New York: Wiley, 1994); Jakob Nielsen, “Why You Only Need to Test with Five Users,” Alertbox, Mar. 19, 2000, www.useit.com/alertbox/20000319.html (accessed Nov. 1, 2005).