The Role of Information Architecture in Designing a Third-Generation Library Web Site

Jennifer Duncan and Wendy Holliday

Jennifer Duncan is Electronic Resources Librarian and Wendy Holliday is Coordinator of Library Instruction in the Merrill-Cazier Library at Utah State University; e-mail: jendun@library.lib.usu.edu and wendy.holliday@usu.edu.

Library Web sites have evolved over the past decade, from simple pages with a few links to complex sites that provide direct access to hundreds of different resources. In many cases, this evolution occurs with little overall planning, often resulting in Web sites that are hard to manage and difficult for users to navigate. This article outlines the process of using Information Architecture (IA) to redesign a third-generation library Web site from the ground up. The result was a much more usable and cohesive library Web site that meets the needs of a broad range of users.

In 2003, the Utah State University (USU) Library anticipated the third major redesign of their Web site. The original design of the site simply provided basic information about library resources and services. Like many library Web sites, it had grown over the years in both size and scope. By 2003, the site included several hundred pages and provided access to hundreds of electronic resources. It had grown without overall planning, and it included several different graphic looks, with "legacy" pages from previous designs existing alongside newer content. Between 2000 and 2003, the homepage underwent two major overhauls, yet neither redesign was quite satisfactory. Graphic elements, layout, and some labels changed, but testing showed that users found the site confusing.

The major problem was the underlying architecture of the site. While the library had graphically redesigned the Web site a few times, the underlying structure remained intact. The first and second levels received a graphic makeover but remained mapped to years of accumulated pages that were not organized coherently. As Louis Rosenfeld, a pioneer in the field of Information Architecture, suggests, this is a common problem in the current electronic information environment:

Increased scope, volume, and format types result in great content ambiguity, muddier information retrieval performance, and therefore, place additional pressures on system design.1

USU looked to the emerging field of Information Architecture (IA) to address the muddiness of their third-generation Web site. For the purposes of the design project, we used Andrew Dillon's broad definition of information architecture: "[The] process of designing, implementing and evaluating information spaces that are humanly and socially acceptable to their intended stakeholders."2 Information Architecture is part of the larger user-centered design movement, but it focuses more specifically on the underlying structure and navigational elements of information spaces. Pioneers in the field of IA recognized that the World Wide Web required a new approach to organization and structure to help users intuitively navigate complex information environments. According to Rosenfeld and Peter Morville, in their classic text on Information Architecture, IA is:

1. The combination of organization, labeling, and navigation schemes within an information system.
2. The structural design of an information space to facilitate task completion and intuitive access to content.
3. The art and science of structuring and classifying Web sites and intranets to help people find and manage information.
4. An emerging discipline and community of practice focused on bringing principles of design and architecture to the digital landscape.3

Focusing on elements of organization, labeling, and structure, we applied principles and methods of IA to the design process for a completely new Web site launched in 2006. The USU Library recognized that, to many users, the Web site is the library, and wanted to apply the same care and attention involved in planning a new library building to the design of the library's Web space.

Related Literature
Many libraries have applied usability principles and methods to the design and redesign of library Web sites. These projects tend to focus on top-level menu items, labels, and graphical layout.4 The importance of Information Architecture has just begun to emerge in the library literature. Troy Swanson described the importance of sound IA to the redesign of the Moraine Valley Community College (MVCC) Library Web site. Like USU's site, the MVCC site grew from its original scope and size without much planning, suffering from unnecessary menu pages and confusing labels and wording. MVCC began their redesign process by identifying potential users of the site and getting user impressions of the existing site. They then mapped out a general organizational scheme and menu hierarchy. Swanson did not elaborate, however, on the design processes and methodologies used to reach these decisions.5

Several library and information science researchers have also provided assessments of the information architectures of Web sites. Shelley Gullikson et al. conducted user tests to analyze the effectiveness of the architecture of a university Web site.6 David Robins and Sigrid Kelsey conducted usability tests and a user survey to assess an academic library Web site.7 Louise McGillis and Elaine Toms also used task-based user testing to assess a university library Web site.8 In all of these cases, the assessments noted problems in labeling and categorization, all of which are central to a site's information architecture. These studies provide important cautionary notes for library Web site designers. As McGillis and Toms suggest, however, there are no simple checklists or universal solutions. Web designers should apply user-centered design principles to specific cases to create the most usable sites for various user populations.

Theoretical discussions of IA support this idea of user-centered processes rather than universal guidelines. Toms notes that IA is a central component of information interaction, or the ways in which users interact with the content of a Web site.9 A sound site blueprint helps communicate content to users, increasing the site's effectiveness and promoting successful user interaction. Marsha Haverty argues that IA is an inductive process because, as a relatively new field, it "supports emergent phenomenon."10 The IA design process is one of "Constructive Induction." Designers create solutions to meet the overall goals or functional requirements of the system by using individual building blocks of structure, navigation, and interaction. IA then takes these individual design solutions to build the overall architecture.
When evaluating solutions from a user-centered perspective, ease of use and findability define success.11

The notion of induction is important to the application of IA principles and methods to the design of library Web sites. Many library Web sites have grown, even metastasized, into large and complex collections of information and search applications, as was the case at USU. IA is central to managing these increasingly complex information spaces, some of which need reorganization from the ground up. Instead of redesigning existing pages, USU decided that the best solution was to start from scratch. We applied both top-down and bottom-up approaches. We developed a program requirement document to outline what the Web site needed to do for its users. We then used inductive methods, such as card sorting, to try to discover how users approached the information and applications we hoped to provide via the library Web site. This comprehensive approach helped address persistent usability problems in earlier iterations of our site.

The Program Requirement Document
In fall 2003, the Library formed a Web Architecture Task Force to design the information architecture for the new library Web site.12 The Task Force's first goal was to produce a program requirement document. Borrowing from the field of computer science and software development, the Task Force wanted to create a clear picture of the required functionality of the site.13 Bob Wiggins notes that many software design projects fail because of poorly defined requirements and because of disagreement on the priorities for the system.14 We wanted to document and prioritize the tasks the Web site should support and the information it should convey, as defined by all library stakeholders. To borrow from Barry Mahon and Alan Gilchrist, we wanted to design "for purpose."15

The program requirement document was important for several reasons. Web site design is often a political process, involving the competing interests of several different departments. The USU Library Web site also suffered from legacy issues; a number of different individuals created the existing content, and there was no consistent updating schedule or maintenance. Web site design also involves trade-offs.16 There is no way to meet every user or stakeholder need with any single design. The Task Force needed a way to address the political and legacy issues and develop a list of priorities to help achieve a commonly held vision for the site. A program requirement document helps communicate and hold site designers accountable to a common purpose. It also helps make design decisions more transparent. According to Julie Rowbotham, Web design projects can be traumatic because of competing needs and narrower departmental perspectives.17

We used several methods to assess stakeholder needs in developing the program requirement document. First, between November 24, 2003, and January 12, 2004, library Web site users had the option to click on a Web-based survey with one question: "What are you trying to do on the library Web site today?" In total, 132 individuals responded. Twenty-one respondents left the question blank, and ten respondents replied that they were just surfing, killing time, or that the library Web site was set as the default homepage. Ultimately, we coded 101 responses as usable. This was a self-selected sample and targeted only interested library users. We coded survey statements into tasks and categories and ranked them in order of frequency.
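This coding-and-ranking step is simple enough to script. The following is a minimal, hypothetical sketch (our coding was actually done by hand, and the task codes below are invented for illustration): each usable free-text response is assigned a task code, and the codes are then ranked by frequency.

from collections import Counter

# Each usable free-text response is assigned a task code by a human coder.
# These codes are invented for the example; they are not our actual scheme.
coded_responses = [
    "topical_research", "find_article", "find_book", "topical_research",
    "course_reserves", "renew_books", "find_article", "topical_research",
]

# Counter.most_common() yields the frequency ranking used to prioritize tasks.
for task, count in Counter(coded_responses).most_common():
    print(f"{task}: {count}")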
We also recorded Reference Desk statistics to determine typical user tasks in the library. Over the course of a week, librarians recorded a brief statement about every reference question asked during selected desk shifts. We covered each two-hour shift once. We coded these statements into tasks and ranked them in order of frequency. Like the Web survey, this was not a large, random sample. It captured information from a specific sector of our user population: people motivated to ask for help in the library.

Our final data collection method was library staff interviews. Task Force members interviewed staff in each library department and asked them to describe what the Web site ideally needed to be effective. A complete list of questions is available in Appendix A. We compiled the results of each interview in a spreadsheet and organized them into general categories. We then circulated the spreadsheet to the entire library and asked staff members to rank each requirement on a scale of one to five, one being essential and five optional. Twenty-one staff members (of approximately seventy-five) provided rankings. We calculated the mean ranking for each item.

We used these three methods to gather information for the program requirement document because we wanted to get multiple perspectives. It was not possible, because of time and resources, to randomly sample library users about their use of the library Web site. We selected methods that could provide a quick and efficient glimpse of user tasks and goals. This picture was enhanced by the library department staff interviews. Library staff know what users should be able to do when they visit the library Web site, while many users are likely unaware of all the possibilities.

From the surveys, reference transactions, and stakeholder interviews, the Task Force developed a list of program requirements, collapsing the rankings into three categories: Absolutes, Recommended, and Extras. In most cases, our judgment matched the mean rankings. In a few cases, we reranked items as Absolute, even though the collective rankings would have placed them at a lower priority. In some cases, this was because of additional information provided by the user survey and reference transactions. In others, it was a judgment call. The most prominent example was information on services for distance learners, including remote access to library resources. Because the Web site is the only way distance learners can access the library, we ranked their needs as Absolute. We also felt that access to our government publications program, as a regional depository library, was a top requirement. We then divided the program requirements into four broad categories: Collection Access, Information about the Library, Services, and Help.

Collection Access
The three data collection methods confirmed that the top priority for the USU Library's Web site is to provide access to collections. This is the core mission of the library, and both stakeholder interviews and users confirmed this. From the user survey, 76 of 101 respondents were trying to access library resources in some way.
Their tasks were broken down more specifically as follows:

General or topical research: 27
Finding a book: 17
Finding an article: 16
Accessing course reserves: 14
Finding an audio book: 1
Looking for a specific reference source: 1

The Reference Desk statistics reflect a similar breakdown. Librarians recorded 94 transactions. We discarded nine because they were related to physically locating a person or place (such as the bathroom) in the building. We considered eighty-five responses related to broader library tasks. Of these, sixty related to finding books, articles, or course reserves, or getting started on researching a topic. Six of the questions in the "other" category related to finding or using a specific resource, such as a master's thesis or phone book. The staff interviews confirmed that access to resources is the top priority for the Web site. Access to the online catalog, article databases, e-journals, the digital library, and Special Collections all rated highly, between 1 and 1.65. See tables 1 and 2.

TABLE 1
Absolute Requirements for Collection Access
Requirement | Mean Score
Access to online catalog | 1
Access to electronic resources, including indexing and abstracting databases, full-text databases, specialized reference sources, and electronic books | 1.05
List of all e-journals | 1.57
Reserves | 1.57
Special Collections | 1.65
Access to Digital Library | 1.71
Access to government document program, including general description of collection and how to find and locate government documents | no score*
*Note: We consolidated many elements for this item.

TABLE 2
Recommended Requirements for Collection Access
Requirement | Mean Score
Ready reference sources, including free Web resources, such as style manuals and online dictionaries | 2
New databases | 2.05
Trial databases | 2.25
Web sites for other libraries, including a link to Utah's Catalog | 2.62
Description of Art Book Collection, including art books, CDs and music, and Beat Collection | 2.79
Specific links to government publication sources, including government metasites and portals, direct links to federal Web sites, and maps | 3

Information about the Library
The user survey and stakeholder interviews showed that users need to find information about the library. Specific information about services was a common theme that we present separately below. From the user survey, four respondents were looking for library hours, six were looking for news about the library or the building project, and one person was looking for the name of an employee. From the Reference Desk statistics, one person wanted to know about library hours, six were trying to locate a library department, the computer lab, or a physical resource in the library, and one patron had a question about journal circulation policies.

Information about library operations was also a high priority among library staff. Library hours, directions, and contact information all ranked highly. Library hours, mailing address, and a general telephone number all ranked between 1.1 and 1.3, and some type of staff directory ranked at 1.8. Staff members highly ranked policies as a separate category.
Policies on borrowing and patron privileges ranked highest (1.62–1.95), while more specific policies on food, e-resource use, and computer use ranked between 2 and 3.

Library staff ranked development/fundraising information highly (around 2.2). Stakeholders also thought it was important that the library promote itself and tell its story on the Web site to communicate what we do and why it is important at USU. Information on our mission, goals, and staff accomplishments ranked between 2 and 3. In general, there were "marketing" and development components to much of the suggested "about the library" content. See tables 3 and 4.

TABLE 3
Absolute Requirements for Information about the Library
Requirement | Mean Score
Library and department hours (for all libraries and departments with public service hours) | 1.1
Mailing address, general phone number, and general e-mail for entire library | 1.24
Circulation information, including how to check out a book, patron privileges by category, and policies | 1.65
Directory of personnel with contact information, listed by name and department; include staff expertise | 1.8
Maps, directions, and parking tips for visitors | 1.86
List of subject selectors | 1.89
Calendar for library activities | 2.19
Development | 2.2

TABLE 4
Recommended Requirements for Information about the Library
Requirement | Mean Score
Map of building and stacks guide | 2
Serials cut information | 2.05
Mission statement and why we are important and relevant on campus | 2.24
Policies and guidelines for acceptance of gifts | 2.33
Gifts | 2.5
Friends of the Library | 2.52
Information on the building project | 2.52
Collection Development Policy | 2.74
Policies and procedures, including policies on food, computer use, appropriate use of e-resources, etc. | 2.79
Current issues, such as copyright, USA PATRIOT Act, and scholarly communication | 2.9
Employment information, general and for students | 3.1

Services
Accessing library services was another prominent category that emerged from the data. The most frequently requested service in the Web survey was circulation: six patrons wanted to renew books or get information on what books they had checked out. Three in-person reference transactions also related to circulation questions. We recorded printing and copying questions in both the user survey and reference transactions, in addition to interlibrary loan and remote access questions.

From the staff interviews and rankings, it was clear that the Web site should play a vital role in providing access to particular services. Requesting interlibrary loan materials (1.33) and distance education materials (1.6) both ranked highly. E-mail reference service, renewing books, and contacting a librarian for help also ranked between 1 and 2. Information about library services was also a high priority.
Information about interlibrary loan services ranked highest, while information on the instruction program, printing and copying, and troubleshooting e-resources ranked slightly lower. See tables 5 and 6.

TABLE 5
Absolute Requirements for Service Access
Requirement | Mean Score
Order Interlibrary Loan materials | 1.33
Contact someone for help | 1.48
Renew books online | 1.48
E-mail Reference Service | 1.57
Order distance education materials | 1.6
Provide feedback: suggestion box | 1.9
Book purchase request | 2.1
Link to WebCT | 2.61
Troubleshooting information about database problems | 2.63

TABLE 6
Absolute Requirements for Information about Library Services
Requirement | Mean Score
Interlibrary Loan form entry instructions | 1.81
Interlibrary Loan—how to order materials | 1.9
E-mail, phone contacts for Interlibrary Loan office regarding questions | 1.93
Interlibrary Loan policies to answer patron questions | 1.95
Interlibrary Loan notification of materials arrival—how notification comes and how long materials can be kept | 2.05
Instruction program overview (what we offer and contacts) | 2.33

Help Using the Library
None of the user survey respondents said that they were coming to the library Web site for help. The reference transactions, however, suggest areas in which patrons were seeking help. Most of these questions related to finding and accessing library resources. The stakeholder interviews also suggest that the library Web site should provide some help and instructions on how to use library resources and services and how to do library research more generally. In the help category, getting assistance with remote access was the most highly ranked item (1.38). Information on how to get help from a librarian ranked second (1.57), with the related task of contacting a subject librarian close behind (1.9). See tables 7 and 8.

TABLE 7
Absolute Requirements for Help Using the Library
Requirement | Mean Score
How to access databases from home; use proxy services | 1.38
Contacting subject specialists for research help | 1.9
Distance learners—policies, procedures, and instructions for all library services and resources available to them | 2.1

TABLE 8
Recommended Requirements for Help Using the Library
Requirement | Mean Score
Finding books | 2
Finding government documents | 2
Finding information on a subject | 2.05
Does the USU libraries own something (any format)? How to find out | 2.2
LC call numbers (how to locate and use) | 2.29
How to locate a copy of an article, in print or electronic format | 2.33
Library location explanation and what goes where—stacks guide | 2.35
Online tutorials | 2.38
How to do Boolean searches | 2.43
How to cite sources | 2.52
How to evaluate search results | 2.57
How to pick a topic | 2.62
How to read SuDOC numbers | 2.69
Information on resources for distance ed. teachers | 2.9

The Information Architecture Task Force used these rankings to develop a Program Requirement document.18 We used this document to guide us through the development of the site architecture.

Determining the Site Architecture
Once the Task Force had developed the program requirements, we used a series of iterative methods to design the site's structure, allowing us to test, revise, and retest.

Card Sorting
The Task Force began by conducting card sorts to see how we ourselves might group the 129 program requirements. We printed brief descriptions of each requirement on cards and conducted an initial sort of all of the cards to generate basic ideas about groupings and to identify problem areas to test more rigorously with actual users. Because 129 cards are difficult to sort quickly in a test environment, we narrowed the list to 52 cards, representing key categories as well as cards that the Task Force had a hard time placing in a group. Ten students and two faculty members sorted the 52 cards.19 We asked them to place the cards in four to six groups and said that they could create a small "problem" group for items that were hard to categorize. When they were finished sorting, we asked the testers to label each group.
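Tallying such results is easy to script once each tester's groups have been normed to consistent category names (a step described next). The sketch below is a hypothetical illustration, not the spreadsheet template we actually used (cited in the notes): given each tester's placement of each card, it counts placements per card and reports the share of testers who agreed on the most common category.

from collections import Counter, defaultdict

# Hypothetical sort data: (tester, card, normed category). Real data would
# hold 12 testers and 52 cards.
placements = [
    ("t1", "Online catalog", "Access Collections"),
    ("t2", "Online catalog", "Access Collections"),
    ("t3", "Online catalog", "Services"),
    ("t1", "Renew books", "Services"),
    ("t2", "Renew books", "Help/How Do I"),
    ("t3", "Renew books", "Services"),
]

by_card = defaultdict(Counter)
for tester, card, category in placements:
    by_card[card][category] += 1

# Agreement: share of testers placing a card in its most common category,
# plus the number of distinct categories the card landed in.
for card, cats in by_card.items():
    top_category, top_count = cats.most_common(1)[0]
    agreement = top_count / sum(cats.values())
    print(f"{card}: {top_category} ({agreement:.0%} agreement, {len(cats)} categories)")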
We normed the testers' categories by taking their labels and placing them in similar categories with a consistent name. Using a card sort analysis template,20 we calculated the total number of cards in each category, how many times the same card appeared in the same category, and the levels of agreement for card placement in a category. For example, most testers had a group of items related to accessing online resources that we called Access Collections. We also created a problem category for idiosyncratic groups. The final normed categories were: Access Collections; Help/How Do I; General Information; Policies; Development/Fundraising; Services; Special Materials; and Problems.

There was a surprising lack of agreement for most cards, even for items that librarians might consider easy to categorize, such as the online catalog. While most testers (73%) placed the catalog in the Access Collections category, two placed it in other categories. Testers placed only six of the 52 cards in just two categories; testers failed to place any card unanimously in a single category. Testers placed seven items in six different categories, suggesting that they did not have a consistent approach to these items.

We then conducted a closed card sort with 39 low-agreement items. We asked testers to place cards into three predetermined categories (About the Library, Services, and Help) and told them that they could refine these broad categories by placing cards in labeled subcategories. Three students and a librarian participated in this sort. The results of the closed sort suggested that when users choose from broad but specific categories, grouping is more consistent. Testers placed 22 of the 39 cards in the same group, while placing only one item in three different categories. This test confirmed that context is a key factor in enabling users to recognize what a label might mean.

Task Force members independently created possible organizational schemes based on the results of the sorts and selected three schemes to present for public comment. The first was a task- and topic-based scheme that was narrow and deep, with only four broad top-level categories requiring the user to drill for content (figure 1). The second model was also task- and topic-based, but it was wide and shallow, with two more top-level categories. Additionally, we divided the Help category in two (figure 2). The third and final scheme added an audience approach to the second model, thus broadening it further (figure 3).

FIGURE 1. Model 1: Narrow and Deep [figure not reproduced]
FIGURE 2. Model 2: Wide and Shallow [figure not reproduced]
FIGURE 3. Model 3: Task and Audience [figure not reproduced]

The Task Force presented these outlines to library staff at a town hall meeting in March 2004 and asked for feedback.
The meeting participants favored the first narrow and deep model, but with audience elements added and Help subdivided. Therefore, the following categories became the foundation for creating and testing labels and conducting preliminary usability tests on the final site architecture: Find Resources, About USU Libraries/General Information, Help (subdivided), and Services for… (subdivided by audience).

Label Development and Testing
Once we determined the top-level divisions, we began the label development process. We first held a brainstorming session with members of the library staff to get fresh ideas. This group considered the four general categories above, as well as some problematic labels from sublevels of the site. We then asked them to suggest different names for each area. The group, unasked, also recommended a change to the categories themselves, suggesting that truly unique collections at USU, such as Special Collections and the Art Book Collection, remain grouped together but separate from the Find Resources area. Although this was a deviation from the planned structure, we agreed that this separation might address some problem categories and highlight what is truly special about our library.

The Task Force then tested the list of label recommendations with a survey. The survey included paragraph descriptions of what a label would represent, with a list of three to four label suggestions below each description. We asked respondents to circle the label they thought best represented the description or to make their own recommendations. We distributed surveys at each Reference Desk, and 29 of 50 people returned usable survey results. These participants were already library users. Based on the results, the Task Force chose the following top-level labels to begin the final round of usability tests:

Find Research Resources & Tools
Unique Collections
About USU Libraries
Services for… (subdivided by audience)
Get Assistance (formerly Help; subdivided)

Site Architecture Development and Testing
Finally, based on feedback from the town hall meeting and label testing, the Task Force was ready to create a blueprint of the complete site architecture. We took all 129 program requirements and resorted them into the proposed organizational scheme, producing a comprehensive outline including every program requirement. We then tested this model with library users through rapid paper prototyping. This method was neither cost nor resource intensive and allowed us to test without distracting layout or graphic elements, so that we could focus on structure and labels. We printed label outlines from the top two or three levels on one sheet of paper for each level and created a series of tasks for testers to complete using our paper-only site (see Appendix B).
Starting with the top level, we asked testers to point to the label, or "link," that they would choose to complete each task. If a second link was required, the facilitator presented the next level of the hierarchy. Users could request to go "back" or simply give up if they were unable to complete the task.

We tested two different approaches. In the first model (Model A, depicted in figure 4), we listed only the four main categories but included a brief paragraph describing each. The second model (Model B, depicted in figure 5) listed the same categories but instead displayed links to all content included at the second level.

FIGURE 4. Testing Model A [figure not reproduced]

FIGURE 5. Testing Model B
FIND RESEARCH RESOURCES AND TOOLS: Online Resources; Library Catalog; Special Collections; Art Book Room; Digital Library; Government Documents; Reserves
ABOUT USU LIBRARIES: Who We Are; Policies and Procedures; Employment Opportunities; Information About our Collections; Visiting the Library; Contact Us; News, Current Issues, and Events; Supporting the Library; FAQs
GET ASSISTANCE: Contact Someone for Help; Help with the Research Process; Help Using the Library; Technical Troubleshooting
SERVICES FOR…: Undergraduates; Graduate Students; Faculty; Distance Learners; Community Members

Test results showed that users performed the tasks more quickly and successfully using Model B. For example, with Model A, only four of six students located the link for Special Collections. All six were successful when it was prominently displayed as a sublink in Model B. The tests also suggested that we needed to create additional access points to many of the general information elements (for example, circulation information or group study rooms) because users had no clear navigation patterns to this information, splitting fairly evenly between About and Services. In addition, the distinction between Help with the Research Process and Help Using the Library was not clear to most testers, suggesting that we should collapse these categories. In testing with faculty, there was concern about placing Interlibrary Loan (ILL) only under the Services links. Two faculty members failed to see Interlibrary Loan under Services and looked for it under Find instead. A single access point for ILL might hide this service from its largest constituency.

The Task Force decided to proceed with Model B, with some modifications. The label testing suggested that Services was ineffective and did not really mean much to testers—interesting because many libraries persist in using this bit of jargon, as had we. We replaced Services with Quick Links, which the survey indicated was a clear favorite.

The Task Force then created a low-fidelity Web prototype.21 We tested it with 13 students (the tasks appear in Appendix C). Usability tests confirmed many of the previous findings from card sorts and user tests. Namely, users did not consistently choose the same link but followed two general paths for informational questions. For example, students selected Quick Links for Students, FAQs (under About), or Policies and Procedures (also under About) for questions about reserving a study room or finding out about circulation periods. Most students successfully completed the tasks via one of the multiple avenues we provided. The final round of student testing confirmed the Task Force's decision to build upon Model B and to build in some redundancy by providing multiple access points for information.

Proposed Information Architecture
In June 2004, the Task Force prepared a Design and Implementation Report for the library's Executive Council. The report proposed an organizational structure for a new Web site slated for construction during the 2005–2006 academic year, ready for deployment when our new building opened.22 The report recommended organizing the site into five primary content areas as described in the revised Model B.
The Task Force acknowledged that the addition of layout and graphic elements might eventually necessitate revision but proposed the following top-level subdivisions:

• Find Resources and Search Our Collections: A central point from which to connect to information resources such as catalogs, databases, and e-journals.
• USU Unique Collections: A showcase highlighting Special Collections and Archives, the Digital Library, Government Documents, and our Art Book Room.
• General Information: Information about the library as an organization. While this heading appeared to be a catchall, open card sort results frequently indicated that library users look for this category.
• Get Help: A jumping-off point for those who have hit an impasse, providing access to a wide array of contact information, tutorials, technical help, and information about library instruction.
• Quick Links for…: An audience-driven area providing space for communicating information frequently requested by a specific demographic of our community.

Implementation and Follow-Up
The Web Architecture Task Force finally delivered draft schematics23 of the proposed Information Architecture to the Web Steering Committee, which was responsible for the design phase of the project. Library staff also received the schematics for comment. Comments were almost universally favorable, perhaps because the process had been so participatory and transparent. Many of the political landmines typical of such a redesign seemed to have been averted. The Task Force agreed to continue revising the schematics based on further library staff feedback and requests from the Web Steering Committee, especially because the Committee felt it might need more detailed outlines during implementation.

The Task Force also made several final implementation recommendations. First, final authority to make decisions on homepage real estate priorities, as well as on the commitment of resources toward the redesign, should vest in the Library Executive Committee, following recommendations from the Task Force and the Web Steering Committee. Neither Web group had sufficient authority to determine organizational priorities to negotiate link placement between departments. Nor did either Web group have the fiscal authority to determine the allocation of resources toward this project. Both of these issues were substantially political in nature and best left to the library administration.

The IA Task Force strongly recommended that the Web Steering Committee receive adequate resources to implement the proposed site architecture. The original recommendation was to use a database-driven model for content management. This proposal would cost more up front but save money and time in the long run, as well as making it easier to maintain a more current Web site. It was also critical for the Web site to have a consistent look and feel throughout and be easy to update. A database model would have facilitated this by using a single graphic design to create a "template" incorporating cascading style sheets populated by the databases, rather than having several departments create their own static pages with a different look. In the end, the committee did not fully develop the database model because of staffing and budget issues. Databases do populate some sections of the site; however, for the majority of content, the library uses static templates, and staff add and edit content using an HTML editor such as Dreamweaver.
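To make the recommended model concrete, the following is a minimal sketch of a database-driven approach of the kind described above. The schema, template, and sample content are hypothetical and are not drawn from the library's actual implementation; the point is that every page is rendered from database records through one shared template and stylesheet, so updating content never touches markup.

import sqlite3
from string import Template

# One shared page template referencing a single stylesheet: changing either
# restyles the entire site at once.
PAGE = Template("""<html>
<head><title>$title</title>
<link rel="stylesheet" href="/css/library.css"></head>
<body><h1>$title</h1>$body</body>
</html>""")

# Hypothetical content table; a production system would use a persistent database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (slug TEXT PRIMARY KEY, title TEXT, body TEXT)")
conn.execute("INSERT INTO pages VALUES (?, ?, ?)",
             ("hours", "Library Hours", "<p>Mon-Fri: 7 a.m. to 11 p.m.</p>"))

def render(slug):
    # One query, one template: every department's content gets the same look.
    title, body = conn.execute(
        "SELECT title, body FROM pages WHERE slug = ?", (slug,)).fetchone()
    return PAGE.substitute(title=title, body=body)

print(render("hours"))

With static templates, by contrast, each department hand-edits its own copies of the markup, which is how inconsistent looks accumulate over time.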
The IA Task Force remained involved during the testing phase of the site. Testing should be continual as any site develops, and the Task Force served as a resource for the Web Steering Committee, conducting usability studies throughout the design and implementation. The Task Force had substantial insight into whether a usability issue was a problem with architecture or some other design element.24

Conclusion
Information Architecture is often a forgotten element in Web site redesign. By detailing the step-by-step process that one library took to develop and test the architecture of its Web site, we hope to elucidate the importance of including IA as part of any library Web site redesign project. As the literature originally indicated, there is no clear and simple path to follow to arrive at a fully developed Web site. This article attempts to describe the exact processes—developing a program requirement document, grouping the requirements through card sorts with several types of users, label brainstorming and testing, rapid paper prototype testing of multiple model sites, low-fidelity Web tests, and proposing implementation recommendations—that we undertook to come up with an architectural blueprint. While individual libraries must consider their own user populations and how they conduct and respond to usability tests, this project suggests specific methods to employ when designing and testing the underlying structure of a Web site.

Ultimately, continual usability testing of the proposed architecture is central to ensuring that a design is, in fact, user-centered and not simply appealing to Web designers or librarians. The usability tests conducted during the IA phase of the Web design process were easy and low-tech but provided sufficient information to continue to move the process forward. More rigorous usability testing took place once graphic artists and Web developers began designing and programming. Because the underlying site structure was already solid, however, we were not distracted by graphics or technical bells and whistles when putting content to page.

Perhaps the most useful thing we learned from this process was the importance of multiple redundancies in link placement. When the Task Force initially met, we thought it would be best to have a "clean" site with each bit of information neatly compartmentalized in a single location. As testing progressed, however, we discovered that there was no such thing as a "typical" user following consistent paths to specific information items—card sorting, rapid paper prototyping, and live Web tests all validated this finding. Therefore, we altered our original presuppositions in favor of a design that included multiple pathways to many content areas.

The design or redesign of an organizational Web site is often fraught with dissension and rancor. A rigorous IA process with usability testing helps eliminate this friction because the design is based on evidence rather than individual or committee preference. At Utah State University, the IA process minimized internal conflict within our organization. Additionally, because we continually requested input and feedback on the architecture process from all the library stakeholders, the level of buy-in and approval was quite high.
The work of the Information Architecture Task Force took just over six months to complete; however, the newly designed Web site did not go live until more than two years after this process started.25 Nonetheless, the final implementation essentially followed the outlines recommended by the Task Force. There were a few changes in the planned site architecture. The most significant change was the loss of the Quick Links for… subdivision. Final usability testing indicated that students, faculty, and staff made little use of the audience component of the site, and, when they did, it was difficult to predict what users actually expected to appear there.

Library Web committees should realize that IA will slow site development; however, when the site launches, the payoff is enormous. Ultimately, our IA process was invaluable in building a Web site that was clearly and logically organized, easily navigable, and favorably received by a wide range of library stakeholders.

Appendix A: Stakeholder Interview Questions
Please do not feel confined to what is currently available on the library Web site. Give us your ideal wish list. Focus on tasks to be supported and information that needs to be provided in a general way, rather than on specific links.

1. In an optimal world, how would staff use the Web site? What tasks does the Web site need to support so that library staff can do their jobs? (example: Finding library policies to answer patron questions)
2. In an optimal world, how would users use the Web site? What tasks does the Web site need to support so that patrons can use your department's services and products? (example: Finding citations to articles or ordering a book from ILL)
3. What information does the Web site need to convey to users? (Hint: Think about questions that you get at service desks or via the telephone) (example: The library hours and information about fines)
4. What information does the library need to convey to other stakeholders? (example: Marketing library services or attracting donors)
5. Do you think you have discrete audiences for the Web site? What are they?
6. How much content do you provide via the Web and in what format? (example: How many Web pages of information? How many products? Can the information be placed in a database for more efficient content management?)
7. Who creates and maintains this content? How often does content need to be updated or deleted?
8. Do you have other content management concerns?
9. What are your top three priorities for the design of the Web site?
10. What didn't we ask that we need to ask?

Appendix B: Tasks from Paper Prototype Testing

Student Paper Prototype Testing
1. How many books can an undergraduate student check out and for how long?
2. How do you reserve a study room in the library?
3. How can you learn how to read a call number?
4. Find an item that your instructor has placed on reserve.
5. What can you do if you are having trouble connecting to one of the library resources or databases?
6. Contact a librarian for assistance with your business class project.
7. Locate an article for your paper on steroids and baseball.
8. When is the library open on Saturday?
9. How can I find some information on what is available in Special Collections?

Faculty Paper Prototype Testing
1. Find Web of Science.
2. Request an item from interlibrary loan.
3. Contact a subject librarian for help.
4. How can you get more information about how to place an item on reserve?
5. Schedule a library session for your class.
6. How can I suggest that the library buy a book for the collection?
7. How can I find some information on what is available in Special Collections?

Appendix C: Web Prototype Testing Tasks
1. How many books can an undergraduate check out and for how long?
2. How do you reserve a study room in the library?
3. Find an item that your instructor has placed on Reserve.
4. Locate an article for your psychology paper on gender stereotypes.
5. Find a definition of the word "ontology."
6. What can you do if you are having trouble connecting to one of the library resources or databases?
7. Contact a librarian to help you with your English 1010 assignment.

Notes
1. Louis Rosenfeld, "Information Architecture: Looking Ahead," Journal of the American Society for Information Science and Technology 53, no. 10 (2002): 875.
2. Andrew Dillon, "Information Architecture in JASIST: Just Where Did We Come From?" Journal of the American Society for Information Science and Technology 53, no. 10 (2002): 821.
3. Louis Rosenfeld and Peter Morville, Information Architecture for the World Wide Web, 2nd ed. (Cambridge, Mass.: O'Reilly, 2002): 4.
4. See, for example, Brenda Battleson, Austin Booth, and Jane Weintrop, "Usability Testing of an Academic Library Web Site: A Case Study," The Journal of Academic Librarianship 27, no. 3 (2001): 188–98; Galina Letnikova, "Usability Testing of Academic Library Web Sites: A Selective Annotated Bibliography," Internet Reference Services Quarterly 8, no. 4 (2003): 53–68; Louise McGillis and Elaine G. Toms, "Usability of the Academic Library Web Site: Implications for Design," College & Research Libraries 62, no. 4 (2001): 355–67; Susan McMullen, "Usability Testing in a Library Web Site Redesign Project," Reference Services Review 29, no. 1 (2001): 7–22; Tiffini Anne Travis and Elaina Norlin, "Testing the Competition: Usability of Commercial Information Sites Compared with Academic Library Web Sites," College & Research Libraries 63, no. 5 (2002): 433–48.
5. Troy Swanson, "From Creating Web Pages to Creating Web Sites: The Use of Information Architecture for Library Web Site Redesign," Internet Reference Services Quarterly 61, no. 1 (2001): 1–12.
6. Shelley Gullikson et al., "The Impact of Information Architecture on Academic Web Site Usability," Electronic Library 17, no. 5 (1999): 293–304.
7. David Robins and Sigrid Kelsey, "Analysis of Web-Based Information Architecture in a University Library: Navigating for Known Items," Information Technology and Libraries 21, no. 4 (2002): 158–69.
8. McGillis and Toms, "Usability of the Academic Library Web Site," 355–67.
9. Elaine G. Toms, "Information Interaction: Providing a Framework for Information Architecture," Journal of the American Society for Information Science and Technology 53, no. 10 (2002): 855–62.
10. Marsha Haverty, "Information Architecture without Internal Theory: An Inductive Design Process," Journal of the American Society for Information Science and Technology 53, no. 10 (2002): 839.
11. Ibid., 839–45.
12. Members included Jennifer Duncan, Wendy Holliday, Rob Morrison, Daren Olson, and Sandra Weingart.
13. For an example of how program requirement documents have been applied to IA, see Steve Toub, Evaluating Information Architecture (Argus Center for Information Architecture, 2000).
Available online from http://argus-acia.com/white_papers/evaluating_ia.pdf. [Accessed 18 October 2004].
14. Bob Wiggins, "Specifying and Procuring Software," in Information Architecture: Designing Environments for Purpose, ed. Alan Gilchrist and Barry Mahon (New York: Neal-Schuman, 2004), 69–85.
15. Alan Gilchrist and Barry Mahon, Information Architecture: Designing Information Environments for Purpose (New York: Neal-Schuman, 2004).
16. For an overview of design trade-offs and Web usability, see Jakob Nielsen, Usability Engineering (San Diego: Academic Press, 1993).
17. Julie Rowbotham, "Librarians—Architects of the Future?" Aslib Proceedings 51, no. 2 (1999): 59–63.
18. Jennifer Duncan and Wendy Holliday, "USU Web Architecture Task Force Program Requirement Document" (Feb. 2004). Available online from http://library.usu.edu/elecres/architecture/web-program-requirements.pdf. [Accessed 28 November 2006].
19. For a discussion of the validity of relying on a small number of test subjects for usability tests, see Jakob Nielsen, "Why You Only Need to Test with 5 Users," Alertbox (Mar. 19, 2000). Available online from www.useit.com/alertbox/20000319.html. [Accessed 27 September 2006].
20. Joe Lamantia, "Analyzing Card Sort Results with a Spreadsheet Template" (Aug. 26, 2003). Available online from www.boxesandarrows.com/archives/analyzing_card_sort_results_with_a_spreadsheet_template.php. [Accessed 27 November 2006].
21. Available online from http://library.usu.edu/elecres/architecture/Information_Architecture/index.html. [Accessed 27 November 2006].
22. Because our new building was slated to open in October 2005, we initially intended the Web site to go live in September. Due to overwhelming demands associated with the move, the Web site did not go live until May 2006.
23. The complete blueprint is available online from http://library.usu.edu/elecres/architecture/Architecture_Blueprint.ppt. [Accessed 27 November 2006].
24. A professor in the Technical Writing program who planned to teach a class on Web site usability approached the Task Force looking for a major project for the students to help test and develop in spring 2005. The Task Force strongly encouraged the Steering Committee to take advantage of this opportunity, which they did. Almost every major organizational change to the final site was a direct result of recommendations suggested by this class. The reports from this major usability study are available online from http://library.usu.edu/elecres/architecture/First%20Year%20Students%20Test%20Report.pdf; http://library.usu.edu/elecres/architecture/Advanced%20Undergraduates%20Test%20Report.pdf; and http://library.usu.edu/elecres/architecture/Graduate%20Students%20Usability%20Test%20Report.pdf. [Accessed 27 November 2006].
25. The initial graphic design phase lasted through December 2004; usability testing occurred during spring 2005; a second graphic design phase began in summer 2005 with the redesign of the university Web site; and our newly designed page went live in spring 2006.