Finding Information in a New Landscape: Developing New Service and Staffing Models for Mediated Information Services

Marianne Stowell Bracke, Michael Brewer, Robyn Huff-Eibl, Daniel R. Lee, Robert Mitchell, and Michael Ray

In response to changing user behavior and decreased funding, the University of Arizona Library recognized a need to reevaluate how it provided information and referral services. A project team conducted action gap surveys to determine customer satisfaction, logged questions actually asked to establish appropriate staffing needs, and calculated the cost of providing these services. As a result of the data gathered, new service and staffing models were implemented that reduced both the number of service points and reliance on professional staff without a reduction in perceived quality.

Marianne Stowell Bracke is an Agricultural Sciences Information Specialist & Associate Professor of Library Science at Purdue University; e-mail: mbracke@purdue.edu. Michael Brewer is Humanities Librarian at the University of Arizona; e-mail: brewerm@u.library.arizona.edu. Robyn Huff-Eibl is Team Leader for the Materials Access Team at the University of Arizona; e-mail: huffr@u.library.arizona.edu. Daniel R. Lee is Interim Team Leader for the Undergraduate Services Team at the University of Arizona; e-mail: leed@u.library.arizona.edu. Robert Mitchell is Undergraduate Services Librarian at the University of Arizona; e-mail: mitchellr@u.library.arizona.edu. Michael Ray is Assistant to the Dean at the University of Arizona; e-mail: raym@u.library.arizona.edu.

The University of Arizona (UA) Library serves a large, diverse campus population of over 37,000 students and 1,500 faculty members. Like those at other colleges and universities, these students, instructors, and researchers have found new tools and opportunities, many provided by the library, for meeting their information needs. Library use patterns have thus shifted, and customer need for assistance from reference and circulation desks has declined. At the UA Library, these changes included 88% fewer ARL-defined reference transactions at reference desks from 1991 to 2004, dropping from 385,215 transactions to 44,856.

By the summer of 2003, the campus climate was also changing. Because of decreases in state funding, the campus was forced to operate on a significantly smaller budget. All departments, including the library, were competing for a share of limited resources. The state and university budget situation resulted in the loss of 6 positions (4 librarians, 2 paraprofessionals) in the library in 2003 and the potential for further cuts. The library staff had to rethink how it provided services, including the information and referral help provided at service desks, to stay within its budget while still meeting customer need. It was clearly time to change operations to meet changing needs.

To determine what changes were needed, a process improvement project team was formed in fall 2003 through the library's strategic planning process. This project team comprised 3 librarians involved in providing information and referral service, the team leader from the Materials Access team, and the interim Team Leader of the Undergraduate Services team.
This last team also manages the Information Commons. In addition, the team was staffed with an organization development and human resources (OD/HR) professional with extensive experience in process improvement. The team named itself "Finding Information in a New Landscape," or FINL. It was charged with analyzing and improving the effectiveness and cost-effectiveness of the library's information and referral processes wherever they took place throughout the library (including the Main Library, Science-Engineering Library, Fine Arts Library, and Special Collections). These processes included those at the large Information Commons in the Main Library, branch library reference desks, and the information and referral service provided at circulation service points, as well as chat reference, e-mail reference, referrals, and in-depth consultation. The team would also develop criteria for measuring and assessing information and referral services and a system for ongoing assessment of these processes. In addition, the project team was charged with assessing the current model against the criteria developed, reformulating the current model to make it more effective and less costly, and developing a plan for an ongoing comprehensive model for information and referral services.

Many questions needed answers. To determine the value of services provided to customers, measures of accuracy, availability, approachability, and customer satisfaction were needed, as well as the level of customer self-sufficiency resulting from the services. Other unknowns included the appropriateness of referrals from the service desks to subject and service specialists and the timeliness of responses to referrals. Finally, the library had not assessed the existing model of information and referral to determine if it made the most cost-effective use of staff. Or were there cost savings to be gleaned by improving the existing processes? The latest budget cuts had resulted in the loss of librarian and staff positions. The library needed to allocate its staff judiciously and find staff savings wherever possible while providing the quality of service needed by customers.

Background

In response to customer need, the library opened the Information Commons (IC) in the Main Library in January 2002. The IC is a spacious, mixed-use area containing over 250 computers with access to the Internet, productivity software, and course-specific software, as well as 27 group study rooms, nonlibrary student services, and a multimedia computer area. The IC is staffed 24 hours a day during the semester with both library staff and student workers to provide information, reference, and software assistance. The staffing model for the IC Desk was intentionally formulated to provide a range of expertise. At the time FINL began its work, 33 librarians and paraprofessionals covered daytime and evening hours, including weekends. Paraprofessionals covered late-night and overnight hours (which are referred to as Extended Hours). A thorough training program was instituted for all library staff who worked at this desk when the IC opened.1 Circulation services in the Main Library were available one floor above the IC, with staff and student assistants available 119 hours per week. Information and referral hours were scheduled at service desks in all branches.
The circulation staff at these sites were increasingly relied upon to provide information and referral services, as they were available more hours and were often the first staff members customers encountered when they entered any library site. Subject and service specialists provided in-depth consultations on request. Staff at all service desks were expected to refer difficult questions to these specialists. Additionally, chat reference had been in place since October 2002, and e-mail reference continued to be offered to growing demand.

Having so many different individuals staffing reference desks and reference services such as e-mail and chat reference offered some benefits, such as providing a large and diverse pool of trained individuals. However, the disadvantages to this model were becoming more pronounced. The total number of hours any individual worked at a service site was very small, as it was spread across the pool (an average of 3–4 hours/week). Staff complained that this was not enough to keep needed skills sharp, especially in new or constantly changing areas like chat reference or software troubleshooting. It was difficult to provide and maintain training for that many individuals. Additionally, the decrease in hours also led to decreasing feelings of attachment to, and identity with, reference work. Trading these hours with colleagues was frequently the first choice in gaining some flexibility in busy schedules. The library was also asking librarians to take on new work (such as grant writing, fundraising, digitization projects, and increasing faculty collaborations) identified through the strategic planning process. These factors, combined with changing user behaviors, declines in reference questions, circulation statistics, and gate counts, and decreases in staffing and resources, led the library staff to reevaluate a service that may or may not have been meeting the needs of customers in the new millennium.

Methodology

The team knew from experience that unstructured, trial-and-error approaches often lead to the first available solution or fad, rather than finding the best solution. These approaches had contributed to the library's inability to measure success. Additionally, there was no good baseline from which to measure information and referral performance from a customer perspective. FINL did not want to optimize the solving of a local problem at the expense of the larger system (pushing the problem to someone else).

The team agreed to use a structured problem-solving approach. A variety of terms are used to describe structured problem solving: "Plan, Do, Check, Act" and "Systems Requirement Analysis" are a couple of common examples. The FINL team selected a method based on the Six Sigma approach to eliminating defects in processes that is increasingly applied in service industries.2 The approach identifies five phases to address in improving an existing process: Define, Measure, Analyze, Implement, and Control, also known as DMAIC (pronounced "Duh-MAY-ick").

The first three phases—Define, Measure, and Analyze—depend on the ability of the organization to monitor its processes and provide characterization of the process "as it is." In the Define phase, the scope of the process is agreed upon and a clear and compelling reason for improvement is made based on the voice of the customer. Data are collected that gauge the performance and capabilities of the current process in the Measure phase.
In the Analyze phase, it is important to make sense of all the data gathered and to use those data to discover areas of delay, waste, and poor quality. The final two phases—Implement and Control—provide the ability to optimize the process through control systems that reflect the ability of the process to meet the requirements of the customer. The goal of the Implement phase is to pilot solutions that improve quality, manage or reduce costs, and ensure improvements are acceptable to the larger organization. In the Control phase, procedures are put in place to ensure the improvements can be sustained. This includes continued monitoring of process capabilities compared with customer requirements.

Support for Change

The University of Arizona Library began applying process improvement methods to its operations in the early 1990s. The ability of FINL to make use of DMAIC as a structured problem-solving tool reflected the infrastructure of support provided by the organization. This infrastructure is described by Phipps as including a "Leadership System," a "Team System," a "Planning System," a "Process Improvement System," and a "Performance Management System."3 These system elements are what Sveiby describes as the "Internal Structural Assets" of the organization.4 Each system comprises a part of the "Internal Structure," which, along with "Employee Competence" and the relationship with customers and stakeholders in the "External Structure," comprise three types of intangible (vs. tangible physical and financial) assets of an organization.

• The "Leadership System" involved development of a shared leadership ethic and practice. Facilitation and structured problem-solving skills are a key leadership competency required for this shared leadership approach to work.
• The "Team System" reinforced the evolution of the team leader's facilitative role, but its primary purpose is to increase organizational performance. Creating the team system involved the establishment of performance goals and standards, as well as the redesign of work units around processes and related customer services. Teams were asked to focus on what customers wanted. Assessment competencies are key to understanding customer needs and establishing team performance goals.
• The "Planning System" was based on the practice of "Hoshin Kanri," a Japanese term describing planning that focuses and aligns the organization to achieve the highest quality for customers by conducting continuous improvement and discovering innovative breakthroughs.5 It was this planning process that identified the information and referral process as a focus for assessment.
• Central to continuous improvement to achieve quality as the customer sees it is the "Process Improvement System." A process improvement (PI) system contains research and action capabilities that support an organization's ability to assess what customers value, to analyze performance of services and products, and to use the analysis to make "value-added" improvements that would exceed customers' expectations. The existence of this PI system supported the library Cabinet's allocation of staff time and resources to learning the DMAIC approach. To strengthen the knowledge of PI methods and tools on the FINL team, a member of the Human Resources and Organization Effectiveness team was assigned specifically to provide PI knowledge and support.
• The library had appointed a Process Improvement Resource (PIR) Group to support teams in learning and utilizing a structured approach to problem solving and provided funding to train the group in this methodology.
• The "Performance Effectiveness Management System" (PEMS) served to support all the other system elements by ensuring direction and opportunities for individual growth, team learning, feedback, reward, and recognition. It also emphasizes setting measurable standards for individuals and processes to provide the means for assessment of success or need for improvement. Each member of the FINL team knew going into the project that the learning and results of their work would find support in the PEMS process through individual developmental reviews, promotion and continuing status reviews, and team reviews.

It was in the context of these organizational assets that the FINL team pursued its work. The DMAIC phases describe the activities and results gained as the team proceeded to study the information and referral process.

Using Data to Make Decisions about Reference Processes

As noted above, FINL was charged with investigating the information and referral processes wherever these activities took place. When FINL began its study of these processes, the only real data the team had were the annual totals of reference and directional questions that were collected for, and submitted to, the Association of Research Libraries (ARL) each year. This was sufficient, as noted in the introduction, to indicate gross trends. The library was experiencing a marked decline in the volume of questions—but these data did not give FINL much detail from which to work. The library did not set institutional standards for the success or quality of the information and referral process. Furthermore, any data gathered about reference and referral were often incomplete and questionable due to lack of consistent understanding or application across the library.

To define and measure the use of the information and referral service, one of the first tasks faced by the FINL team was the creation of a process map from which specific service transactions were identified. From these transactions, a focus emerged for gathering customer perceptions that would provide actionable data. After considering a variety of data points that might bear on these issues, FINL ultimately determined that what was needed most was more information about the nature of the questions customers were asking at service sites, along with more information about what users thought about the quality of the specific services the library service desks were providing. At the time, information and referral data were not gathered consistently across areas within the library, and the data gathered were somewhat suspect. FINL realized that it needed to improve the data infrastructure to be able to make quality, data-driven decisions. This led the team to update and consolidate the data collection infrastructure to complete the measurement phase. Additionally, since cutting costs was a significant part of the team charge, FINL also needed a fundamental understanding of the major drivers of the cost of providing reference service. Thus, FINL used three approaches—question logging, surveying the customer perspective, and analyzing the cost of the services—to bring together an understanding of the whole context.
[FIGURE 1. Current State I&R Process Map: a flowchart of the information and referral process, from initiating the interaction and conducting the interview, through framing the question (understanding the root request), handling technical questions and troubleshooting, selecting sources, running queries, employing and modeling search strategies, explaining the process of evaluation, and evaluating sources (relevance, accessibility), to answering the question, making a direct referral or follow-up contact after a failed or incomplete search, and providing instruction for self-sufficiency.]

Logging and Analyzing Questions

FINL chose to log every question asked at each service site to learn more about the nature of the questions being asked. Ultimately, three rounds of question logging were conducted at each site. Each round of question logging lasted two weeks and was scheduled during representative periods of the fall and spring semesters. The first round of logging took place in February 2004 to get a firmer understanding of the types of questions actually being asked at each site and the skill level needed to answer them.

FINL asked the service providers at each desk to record each question. This included relatively easy (or even trivial) questions such as "where is the bathroom?" and "can I use the stapler?" as well as more complex questions. Additionally, since the team was aware that a skillful reference interaction can turn a relatively general initial question into something more specific to the customer's actual needs, the team asked staff to record the question they actually answered, not the question as it was initially posed. Reference staff also recorded the length of time it took them to answer the question (in increments of less than a minute, 1–5 minutes, 5–10 minutes, and over 10 minutes); whether the question came in by phone or in person; and whether a referral to a subject specialist was made.

The first round of question logging used handwritten forms. This proved cumbersome in the extreme, given the variations in legibility of individual handwriting styles, as well as the need to type each entry into an Access database. For subsequent rounds of logging, team members developed a Web-based question logging form, which eliminated the data entry work and also (after some initial apprehension on the part of some reference providers) proved popular with desk staff. FINL quickly learned to capitalize on any success in this difficult process by introducing other online data collection tools.

With two weeks' worth of question logs completed, the next step was to analyze the data. A separate Access database file was created for each service site log. Two people reviewed each of the logs—either two FINL members, or, if the site in question was one at which no FINL member worked, a FINL member and someone who worked at that site. The team was looking at two variables in this analysis—the kind of question and the level of employee needed to answer it based upon existing training. All questions were placed into one of the following six categories:

• Reference (as defined by ARL)
• Directional (as defined by ARL)
• Technology use (e.g., how to use Excel, how to burn a CD)
• Technology problems (e.g., paper jams in printers, copy card stuck in card reader)
• Circulation questions
• Questions about nonlibrary campus services (e.g., how to drop/add a class at the beginning of a semester)

The relative proportion of each type of question varied by site, but reference questions made up no more than a third of the questions at any site. When FINL looked at the level of skill needed to answer the questions, the team found that students and trained generalists could answer over 95% of the questions asked at each service site. This had a great influence on the changes FINL recommended.

TABLE 1
Log Data: Percentage of Questions by Staff Level Needed, by Site (number of responses in parentheses)

Site              Student        Generalist     Specialist
Info Commons      70% (2072)     25% (747)      4% (121)
Science Ref.      65% (383)      30% (175)      5% (28)
Special Coll.     57% (146)      39% (99)       4% (10)
Fine Arts Ref.    28% (13)       41% (19)       30% (14)
Avg. Ref.         68% (2614)     27% (1040)     5% (173)
Main Info         70% (1348)     27% (527)      2% (44)
Circ. Main        74% (500)      24% (163)      2% (14)
Circ. Science     81% (266)      17% (57)       1% (4)
Circ. Fine Arts   67% (263)      28% (110)      5% (22)
Avg. Circ/Info    72% (2377)     26% (857)      3% (84)
Overall Average   70% (4991)     27% (1897)     4% (257)
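To illustrate the analysis step, here is a minimal sketch of how logged questions of this kind could be tallied by site into the staff-level percentages shown in table 1. It is not the team's actual Access database or Web form, which are not documented in this article; the field names and sample records are hypothetical.

```python
from collections import Counter

# Hypothetical question-log records; the real data lived in Access databases
# and, later, a Web-based logging form (field names here are assumptions).
log = [
    {"site": "Info Commons", "category": "Directional",         "level": "Student",    "minutes": "<1",   "mode": "in person", "referred": False},
    {"site": "Info Commons", "category": "Technology problems", "level": "Student",    "minutes": "1-5",  "mode": "in person", "referred": False},
    {"site": "Science Ref.", "category": "Reference",           "level": "Generalist", "minutes": "5-10", "mode": "phone",     "referred": False},
    {"site": "Science Ref.", "category": "Reference",           "level": "Specialist", "minutes": ">10",  "mode": "in person", "referred": True},
]

def share_by(log, site, field):
    """Count and percentage breakdown of one field (e.g., staff level) at one site."""
    counts = Counter(rec[field] for rec in log if rec["site"] == site)
    total = sum(counts.values())
    return {value: (n, round(100 * n / total)) for value, n in counts.items()}

for site in sorted({rec["site"] for rec in log}):
    print(site, share_by(log, site, "level"))
```

The same tally, run with `field="category"`, would give the proportion of reference, directional, technology, circulation, and campus-services questions at each site.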
The Customer Perspective

One important piece of information missing from the question logs was how customers perceived the quality and importance of the services that the library provided. For example, FINL did not ask staff to record the answers to the questions they logged. Nor did the team know, except in a hit-or-miss anecdotal way, what customers felt about the full spectrum of information and referral services provided. What did the library staff do well? What did customers want most from this service? And, conversely, what did the library staff do poorly and which services did customers value least?

The traditional way of getting at answers to these questions is to administer a user satisfaction survey, and the library had conducted several of these over the years. But traditional satisfaction surveys ("On a scale of 1–5, with 1 being unsatisfactory and 5 being outstanding, how would you rate X?") do not indicate what the customer thinks is important. Though a library may do a lot of things well, the library may still have a customer relations problem that it does not even recognize if those are not the things that customers value highly. FINL opted to utilize a survey technique called the Customer-driven Action Process (CAP),6 which provided respondents with a list of service components available. They were then asked to indicate which items on that list they considered to be the five most important; which five items they believed the organization did best; and which five items they considered most in need of improvement. Responses might range from having the same five items in all three categories (e.g., "giving accurate answers is the most important thing you do; it's also the thing you do best; but it's also in need of improvement") to having five different services rise to the top in each category.

FINL's first step was to develop a list of 20–25 service components that described the services offered at each site. A draft list was developed and shared with the rest of the library. Modifications were made to produce the final lists. Certain items (e.g., "publicizing our services," "providing accurate answers to factual questions") were common to all sites. But mission differentiation among the three reference sites, the four circulation/information desks, and the Special Collections unit led to specialized surveys for these areas. The survey also asked respondents to tell us whether they were undergraduates, graduate students, faculty, staff, or other. The survey questions asked at the Information Commons were identical to those used at the Science-Engineering Library and the Fine Arts Library.
(See Appendix A – Help Us Improve Our Service in the IC.)

FINL administered the CAP tool for the first time in February 2004. The team used both paper surveys and, at the three reference sites, online versions of the surveys via an icon on the desktop of each public computer at the three sites. Paper surveys were handed out to customers who approached the desks and were also placed near the computers in the Information Commons. Paper surveys proved problematic in two ways. Unlike their online equivalents, they required some data entry. But, more important, while FINL could program the online survey form to accept only five responses in each category, there was nothing to stop users from checking more (or fewer) than five choices on paper forms. FINL made a decision to reject paper forms where more than 10 items were checked in any one category.

Response rates varied, ranging from the Main Library Information Commons—the largest and busiest site—collecting over 200 responses, to Special Collections, which collected fewer than 30 responses. Since the nature of the responses was similar (allowing for mission differentiation) at all sites, FINL was comfortable drawing conclusions based on the results from each site, including those for which there was a low rate of return.

Once the results were in, FINL used Excel spreadsheets to create graphic displays of the survey responses for each site. The team created charts for Importance responses; a Net Performance chart (Does Best responses minus Needs Improvement responses); and finally an Action Gap Analysis chart, which displays all three response types together. (See Appendix B for examples of the Importance, Net Performance, and Action Gap Analysis charts from the 2004 Information Commons survey.)

The Action Gap Analysis chart displays all the results in one place:

• Service elements are placed in the order of their importance to customers, as indicated by the number of respondents who selected the service component as one of their top five choices on the survey.
• The service elements are arrayed from most important at the top to least important at the bottom.
• The light grey bars indicate the number of respondents who listed that service element as one that the service desk does best.
• The dark grey bars indicate the number of respondents who listed that item as one the service desk needs to improve most.
• The black line plots the net performance, which results from subtracting "needs improvement" from "does best."
• The dotted line indicates the expected performance. This value is not derived from customer responses—it is a symbolic indication that the library should be performing best on those items that customers value most. In this case, the upper limit was placed at about the same level as the element gaining the highest response to "does best" (in this example, the highest-rated service element is the fifth from the top, "Showing an interest"). This "symbolic" method for setting the performance expectation was used because neither a benchmark from prior years nor any clear preexisting quality standard for these items was available.

Service elements third and sixth from the top were the two items with the lowest net performance that also rated high on importance, suggesting these items required attention first.
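The arithmetic behind these charts is simple, and the following minimal sketch shows it: service elements are ordered by the number of "most important" votes, and net performance is computed as "does best" votes minus "needs improvement" votes. The element names and vote counts below are invented for the example; FINL's actual charts were built in Excel from the survey data.

```python
# Hypothetical CAP survey tallies for a few service elements (invented numbers):
# votes for "most important", "does best", and "needs improvement".
tallies = {
    "Help identifying articles": {"important": 62, "best": 18, "improve": 30},
    "Hardware/software help":    {"important": 55, "best": 25, "improve": 20},
    "Showing an interest":       {"important": 40, "best": 45, "improve": 5},
    "Providing brochures":       {"important": 8,  "best": 20, "improve": 3},
}

# Action gap view: order by importance (most important first) and report
# net performance = "does best" minus "needs improvement".
for name, t in sorted(tallies.items(), key=lambda kv: kv[1]["important"], reverse=True):
    net = t["best"] - t["improve"]
    print(f"{name:28s} importance={t['important']:3d}  net performance={net:+d}")
```

An element with high importance and low (or negative) net performance, such as the first one in this toy example, is exactly the kind of "action gap" the chart is designed to surface.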
Coincidentally, other groups in the library were already addressing aspects of service that were identified. For instance, a group from the Circulation department was making help available for providing alternative access to items the library does not own or are not currently available on the shelf. FINL shared the results with these groups but mostly left it to these groups to suggest improvements in areas where they had expertise. The third item from the top, "Assistance finding an open computer when the Information Commons is full," was a more difficult problem. IC staff could help students look around the facility for an open computer, but behind this rating was a call for more computers. This could not be accomplished in the IC without negatively affecting the flexibility that makes it a popular place for students to work. FINL was also unwilling to limit customers' time on machines or to institute a sign-on process, both of which would conflict with library policies of providing the widest open access possible. In response, the library began working to create similar spaces in the branch libraries so that there would be more such facilities on campus.

The Cost of Providing Service

To determine the cost of delivering services at each site, both physical and virtual, FINL collected a variety of cost elements. These included:

• The cost of staffing, which was calculated from the actual salaries of the librarians, paraprofessionals, and temporary employees staffing each site. This also included wages for student employees at those sites where this category of employee worked.
• The cost of training, both for trainers and trainees (including student employees). This tracked the time devoted to training (including preparation time and posttraining evaluation) for each individual involved during the fall 2004 training cycle and calculated both the total cost and cost per hour of training for reference work.
• The cost of scheduling each desk. FINL tracked the time spent on scheduling each desk for fall 2004 to determine this figure.
• The cost of trading desk hours. FINL used a sample two-week period during fall 2004 to calculate this figure, assuming that each person on the listserv for a particular site averaged one minute on each trade request (which would range from deleting without reading to checking one's calendar and responding with an offer to trade).
• The cost of the hiring process for student and temporary employees (who were hired specifically for reference work).
• The cost of supervising student employees.

In each situation, FINL determined which individuals were involved in the delivery of each aspect of a service or support for those services. The team then calculated their hourly salary or wage (including benefits for permanent employees) and averaged those numbers to create an average cost per hour of staffing particular sites, training, scheduling, and so on. The team discussed overhead costs but ultimately decided not to include costs such as electric bills, the cost (both initial and refresh) of computers for information commons sites, IT support for those computers, the cost of server space, heating and cooling costs for the areas, and similar items.
The team reasoned that since the library was not going to close any of the public sites, those costs would be ongoing; and that, while there was some added cost to providing and maintaining computers for reference staff, the number of those computers was so small compared to the hundreds of public computers that knowing that particular cost would not impact the large cost-effectiveness picture. The team acknowledged that, by not including these costs, it did not discover the full cost of providing services; rather, it learned the cost of staffing the physical and virtual reference sites. (See Appendix C for a chart of the cost data collected.)

It quickly became apparent that the largest single cost of staffing service sites was the salaries of the permanent employees deployed to provide service at each site. This was intuitively obvious in retrospect, but now data were available to demonstrate it. Clearly, the only way to save significant amounts of money on reference services without closing an entire library was to reduce reliance on subject specialist librarians at service sites, who, no matter how senior they may be, are higher paid than paraprofessionals. Fortunately, as noted earlier, such a change was also supported by the question-logging data.

Implementing Changes

Service and Staffing Model

As FINL was collecting data, they were shared widely with the library. Throughout the process, the team regularly reported findings and results of discussions. In both the Science-Engineering Library and the Fine Arts Library, circulation desks and reference desks stood virtually side by side. As noted above, demand for reference service had been dropping steadily at these sites over the years, and circulation had been declining as well. The implementation of self-checkout at these and other sites further decreased the work of the paraprofessionals at circulation desks.

FINL decided to pilot a new service model by combining the reference and circulation desks at the Science-Engineering and Fine Arts Libraries and a new staffing model that put circulation staff on the newly combined desks. Circulation staff who were identified for the pilot were given training for the specialized reference skills required in the Science-Engineering and Fine Arts Libraries. While this training was underway, they also shadowed librarians at the respective reference desks. Librarians also observed the service providers during customer interaction three times in a one-year period. Observing desk staff was new and brought anxiety for many, but over time both the librarians and staff grew to appreciate the process, which was in place to support further learning for desk staff—the observations were not meant to be punitive. When the training was completed (approximately two months into the fall 2004 semester), one of the two service desks at each site was closed.

In the Main Library Information Commons, the situation was different. The circulation desk and the reference desk were on different floors, so offering both services from a single desk was not feasible. The Information Commons, however, was the busiest of all the sites and had always had a mix of librarians and paraprofessionals providing service. Furthermore, the overnight hours (Extended Hours) were staffed entirely by paraprofessionals hired especially to cover those hours.
FINL speculated that, if the Extended Hours staff were expanded, they could cover not only the "graveyard shift," but evening and weekend hours as well. Thus FINL sought and obtained funds from the library to hire four temporary half-time paraprofessionals. Beginning in the middle of the fall 2004 semester, the Information Commons Help Desk was staffed by a small cadre of eight librarians from 9:00 a.m. to 6:00 p.m. Monday–Friday. The small group would allow these individuals to have enough hours to continually build and maintain skills, as well as allowing the members of the group to have a set schedule each week. Permanent paraprofessionals covered all other hours.

Referral System

To support those now working on the service desks and to provide some help for those few questions that required a subject specialist, an upgraded referral procedure was also introduced. In the past, if reference staff could not answer a subject-related question, there was no clear process on what they should do next. Some specialists left their business cards at the desk, so these could be given to customers as a next step. Sometimes customers were directed to the library's Web site, where they could find the e-mail address or phone number of the specialist to contact later. Some staff would search out the specialist's office number and send the customer directly to that person. This haphazard approach did not serve the customer, the desk staff, or the specialist. The onus was on the customer to contact the specialist, and their attempts often proved fruitless. The desk staff had no clear process to follow and were often berated by specialists for either not referring customers or for sending them directly to their offices unannounced. And the specialist was often interrupted by unexpected and demanding customers in the middle of other time-sensitive work.

For FINL to create a model that would take librarians off most of the service desks, the team would need to ensure that desk staff knew how and when to refer customers to specialists; that customers were guaranteed to be connected quickly and directly with the appropriate specialist; and that specialists were given the opportunity to manage their time better while still answering referred questions promptly. The team also wanted to be able to collect enough data to periodically assess and, if necessary, modify the referral system and procedures. This required a new approach that included both real-time support and deferred service when necessary.

Real-time support is available to desk providers at all service sites. Real-time support prescribes that, before referring a customer to a specialist, the desk staff call the specialist for support and advice. The specialist may ask desk staff to immediately make a referral using the automated referral system (as described below); suggest additional resources or search strategies before making a referral; or simply send the customer directly to his or her office for immediate help. If the specialist is not available, or is occupied and unable to provide support, the desk service provider makes a referral using the automated system.

Deferred reference services are managed through a robust automated referral system using Web e-mail forms. To streamline processes and improve the consistency of data collection across sites, a single Web interface was created from which staff could make referrals, collect ongoing reference statistics, and, periodically, log questions electronically. Another benefit of this integrated, Oracle-based system is that it provides a variety of prescripted, up-to-the-minute reports sorted and cross-tabulated by service site, question type, time, and date (to mention a few criteria).
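The library's actual application was an Oracle-based Web system whose schema is not documented in this article. Purely as a minimal sketch of the idea, the following uses Python's built-in sqlite3 module as a stand-in to show how referral records carrying a site, question type, specialist, and timestamp can support the kind of cross-tabulated reports described above. All table and field names, and the sample rows, are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the library's Oracle back end
conn.execute("""
    CREATE TABLE referral (
        id          INTEGER PRIMARY KEY,
        site        TEXT,   -- service site where the referral originated
        qtype       TEXT,   -- e.g. reference, circulation, fines, technology
        specialist  TEXT,   -- person or team the question was referred to
        made_at     TEXT    -- ISO timestamp of the referral
    )""")
conn.executemany(
    "INSERT INTO referral (site, qtype, specialist, made_at) VALUES (?, ?, ?, ?)",
    [
        ("Science Ref.", "reference", "science librarian", "2005-02-01T10:15"),
        ("Main Circ",    "fines",     "billing staff",     "2005-02-01T14:40"),
        ("Fine Arts",    "reference", "music librarian",   "2005-02-02T09:05"),
    ],
)

# One prescripted report: referrals cross-tabulated by service site and question type.
for site, qtype, count in conn.execute(
    "SELECT site, qtype, COUNT(*) FROM referral GROUP BY site, qtype ORDER BY site"
):
    print(site, qtype, count)
```

Routing every referral through a single store like this, rather than through business cards and ad hoc e-mails, is what made it possible for the library to report referral counts by site and type for the first time.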
Data collected from electronic referrals made since bringing the system online confirm several assumptions but also reveal some surprises. About a third of all referrals have been made from the Science Reference desk. This is not surprising, considering that the new staff at this desk had the largest learning curve and are now covering the desk at all hours. Another third of the referrals have come from Circulation or Information desks, and the rest came from the two remaining reference desks (Fine Arts and Information Commons) or from the desktops of individual librarians. While it initially seemed somewhat surprising that very few referrals were made on weekends or evenings from any of our service sites, a review of our most recent question-logging data showed that only a small number of specialist-level reference questions were being asked during these times. Another surprise was the large number of referrals made to traditionally nonreference service sites: more than half of all referrals were made to specialists in areas such as circulation, fines, cataloging, technology, or facilities management.

Chat Reference Staffing

FINL piloted one other change during the 2004–2005 academic year. The library implemented a campuswide chat reference service in 2002. Librarians primarily staffed this service. They were scheduled to provide this service from their offices (over and above their reference desk shifts), making the service available for a total of 38 hours during the week. FINL decided to try offering chat reference from the IC Help Desk. This would involve training Extended Hours staff to provide chat reference but carried with it the risk that adding chat to walk-in and phone reference could lead to overload. On the other hand, chat statistics were not prohibitively high, and there were two major advantages to offering chat reference from the IC Help Desk. First, it would free 38 hours per week of librarian time for other work. And second, it would expand the number of hours that chat reference was available from 38 per week to 142 hours per week—the number of hours that the IC Desk was staffed.

These piloted changes would address both the goal of staffing to need—that is, the questions actually asked—and the need to reduce costs. In the Science-Engineering Library, FINL's pilot eliminated all 60 hours per week of librarian time on the desk. Science-Engineering librarians made an average of $28.00 per hour including benefits, so the pilot savings at this library amounted to $1,680 per week. In the Fine Arts Library, FINL's pilot eliminated all 21 hours per week of librarian time at the desk. Fine Arts librarians made an average of $26.99 per hour including benefits, so the pilot savings amounted to $567 per week. The average salary of the librarians (including one paraprofessional) who provided chat reference prior to FINL's pilot was $30.79 per hour including benefits. Eliminating the separate schedule for chat reference and providing the service at the IC Desk saved $1,170.22 per week. In the Main Library IC, the average salary of those providing the 58 hours (double-staffed at all times) of early evening and weekend reference service was $30.26 per hour including benefits. During FINL's pilot, they were replaced by paraprofessionals (also double-staffed at all times) making an average of $16.91 per hour including benefits. For those 58 hours, the library netted a savings of $774.30 per week. Overall savings during the FINL pilot were $4,191.30 per week during normal weeks (that is to say, no holiday closures) of the fall and spring semesters.
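As a worked illustration of where these weekly figures come from (hours of professional coverage eliminated, multiplied by the gap between the librarians' average hourly cost and the cost of whoever covers those hours instead), here is a minimal sketch using the rates reported above. Because the published hourly averages are rounded, the computed total differs from the reported $4,191.30 by a few cents.

```python
# Weekly pilot savings: librarian hours replaced or eliminated at each site,
# times the difference between the librarians' average hourly cost (with
# benefits) and the hourly cost of the replacement coverage, as reported above.
pilot = [
    # (site,                  hours/week, librarian $/hr, replacement $/hr)
    ("Science-Engineering",   60, 28.00, 0.00),   # desk hours eliminated outright
    ("Fine Arts",             21, 26.99, 0.00),   # desk hours eliminated outright
    ("Chat reference",        38, 30.79, 0.00),   # separate chat schedule dropped
    ("IC evenings/weekends",  58, 30.26, 16.91),  # librarians replaced by paraprofessionals
]

total = 0.0
for site, hours, librarian_rate, replacement_rate in pilot:
    saving = hours * (librarian_rate - replacement_rate)
    total += saving
    print(f"{site:22s} ${saving:8.2f} per week")
print(f"{'Total':22s} ${total:8.2f} per week")  # close to the reported $4,191.30
```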
It is important to recognize that the savings noted above were savings to the information and referral process. What the library really saved was $4,203.30 per week of staff time, which could be redeployed to other work, such as the increased faculty collaborations, grant writing, and digitization projects identified earlier, and which helped the library survive staff cuts.

Testing the Pilot

Even these savings to the process, however, could only be realized if FINL's pilot succeeded in delivering reference service at these sites with, at a minimum, no decrease in customer satisfaction. In spring 2005, FINL repeated its CAP tool at each site and logged questions again to ensure that earlier logs were not providing anomalous data. The last round of question logging yielded results that were remarkably consistent with earlier rounds, both as to the nature of the questions asked and as to the level of expertise needed to answer them. With respect to the analysis of the results of the CAP tool, while various service components shifted somewhat both in relative importance and degree of satisfaction, the data showed that the library was able to implement these cost savings without an overall negative impact on the quality of reference services. Additional survey tools being used by the library, such as LibQUAL+™ and a homegrown online library report card survey, as well as anecdotal evidence from conversations with faculty, further demonstrated that the changes implemented by FINL were not having a negative impact on the quality of reference services.

Conclusion

Once FINL took a serious look at the "cost of reference," it soon became clear that the underlying issue was how the time of professional librarians was utilized. This, in turn, was related to a recurring theme in the library's strategic planning efforts, that of "new work," and how to get it done. This new work included such things as making sure that the library was a player in scholarly communications issues on campus; shifting our instructional role from simply responding to requests from faculty for single library instruction sessions to a curricular approach to information literacy that also integrates reusable learning objects to eliminate duplicative work; and augmenting revenue flow into the library by radically ramping up grant-seeking activity. While the value of this new work was clear to most staff, the perennial question about what could be given up remained. Identifying new work was straightforward, but identifying legacy work that could be stopped was more challenging. Staffing traditional reference desks was perhaps the biggest legacy system of all.

Change Management

During the nearly two years that FINL was actively working, its members had several occasions to speak publicly to colleagues at other libraries about the team's work.
Two questions that always came up, especially from library managers, were how FINL was able to implement such significant changes without engendering resentment on the part of customers and also without inciting a revolt on the part of staff.

Change has been a constant in the library for well over a decade, dating from the initial transformation into a team-based organization.7 One change in particular that ultimately worked to FINL's benefit was the organization's paradigm shift in the early 1990s. During this period, the library moved from the traditional expectation that librarians would function as specialists in a particular library process (for instance, reference, cataloging, acquisitions) to a more holistic approach, asking librarians to engage in a variety of processes targeting a specific customer group, such as chemistry faculty and students. For instance, librarians shifted from being part of the Reference Department or the Cataloging Department to being part of the Science-Engineering Team, the Undergraduate Services Team, and so on. Librarians were now responsible for a broad range of activities, such as reference, collection management, and instruction, within their new team. By the time FINL began its work, there were many librarians working a few hours per week at one reference desk or another, but no librarian's professional identity was tied to being only a "reference librarian."

The UA Library also made extensive use of process improvement (PI) techniques during the mid-1990s.8 While these early PI efforts were focused on technical services and access services functions rather than on reference, they introduced staff to the concept of studying various kinds of library work in terms of discrete processes, which could then be modified as appropriate. When the time came for FINL to apply PI techniques to information and referral, staff were familiar with the PI approach and prepared to accept analysis of reference in that context.

It is important to acknowledge that the library's early attempts at organizational transformation, via process improvement or other approaches, were often greeted with considerable unhappiness by the staff. But over the years, the library administration and staff—often from its mistakes—learned various ways to make change less painful. FINL was able to make use of these lessons when it began to change the library's approach to delivering reference.

However, FINL was unwilling to settle for mere compliance from staff. The team goal was to convince as many staff as possible that the conclusions were reasonable and worthy of active support. To accomplish this, the team made use of many of the change management tools that the library had adopted over the years. One of the foundations of effective change management is communication, and FINL made a concerted effort to communicate early and frequently. During the twenty months of the team's existence, FINL made thirteen presentations at meetings to which all library staff were invited. These presentations typically consisted of a summary of work to date, followed by a question and answer session. In addition, FINL made multiple visits at various stages of work to stakeholder teams (primarily those supplying staff at various reference sites), assimilating feedback from these forums.
The team even experimented, albeit with limited success, with a dedicated intranet blog for staff to engage in an online dialogue on various issues the team was confronting. Thus, staff who cared to follow the work were well aware of the progress of team thinking and planning.

But perhaps just as important as the fact that FINL communicated were the things the team chose to communicate. The team did not simply present staff with brainstormed solutions to a problem or with plans that were already set in stone. Initially, FINL spent a good deal of time publicly reviewing its charge with the library. These dialogues ultimately resulted in a redraft emphasizing that FINL was definitely a process improvement team whose goal was to cut costs—without lowering quality.

FINL's next major communication phase involved sharing with library staff the analysis of the data collected on various information and referral processes. FINL's sense was that some staff, and particularly librarians who had worked at a reference desk for many years, tended to view the demand for reference through rose-colored glasses, as it was in the 20th century rather than as it is now in the 21st. The data FINL gathered made it clear that, whatever impression some librarians may have had, the library was no longer getting lots of challenging traditional reference questions at any of the service sites. Consistent data made it difficult for the few remaining reference traditionalists to claim that their absence from the desk would result in a catastrophic decline in the quality of service. And, indeed, from the perspective of our customers, it was difficult for FINL to determine that they noticed any of the piloted changes. Although faculty and students are given multiple mechanisms to give input to the library, there was no rise in the number of complaints about service. The action gap survey during the FINL pilot showed only minor changes in customer priorities and views on the quality of work. Minimizing any negative impact of our changes on customers was an important part of our charge.

It is instructive to compare the public reaction to FINL's modifications of the library's reference staffing models to their reaction to a major overhaul of the library's Web site, which took place a few months later. The changes to the library's Web presence generated significant amounts of customer feedback, both positive and negative. Changes to the Web site drew immediate attention. Comparable changes to reference desks went virtually unnoticed. FINL interprets the difference in these responses as confirmation of our conclusion that our traditional reference desks do indeed represent a legacy service. And while this certainly does not mean that traditional reference desks can be abandoned immediately or even in the near future, it does suggest that it is reasonable to keep searching for less expensive ways to deliver this service.

Benefits to the Library

FINL's work resulted in multiple benefits to the library. Obviously, the cost savings (more than $200,000 projected over a full year) were significant. Coincident with the completion of FINL's work in the spring of 2005, the library faced another significant budget cut. As before, as a result of recent resignations, the library had several vacant librarian positions.
The time savings created by FINL's work were consolidated into several of these vacant lines, which were turned back to the university to pay for the majority of the budget cut. Thus, the library was able to weather a serious financial crisis without resorting to layoffs, thanks in part to FINL's work. Beyond the fortuitous cost savings, other benefits resulting from FINL's work included the following:

• Chat reference transactions tripled, chiefly because this service was reliably offered across many more hours each day. We were able to serve more customers while substantially reducing the cost of this service simply by changing the delivery model.
• Paraprofessionals from the Materials Access Team (circulation) were able to take on more challenging work.
• Librarian time (obviously not including vacant lines lost) was freed up for other work.
• The new online reference statistics collection system eliminated a great deal of routine data entry work and provided more accurate data on reference processes.
• The new online referral system has enabled the library for the first time to collect accurate data on the number of referrals made to subject specialists, as well as ensuring that referred customers' information needs are promptly addressed.
• The Information Commons Desk went from 33 reference providers working only a handful of hours during the regular work week to a cadre of 10 reference providers working 9 hours per week. The increased hours on the desk allow individuals to build their skills and work a set schedule. With this set schedule, customers see familiar faces at the desk and can anticipate when specific specialists are going to be available.
• Reducing the number of staff providing reference and circulation services at various sites has made it easier for these staff to communicate with colleagues, both at individual desks and throughout the system. It is enabling the library to begin to create a "reference ethos" and service model that is consistent at all the sites throughout the system.
• The process improvement study increased librarians' knowledge of the many non-value-added procedures that had built up over years. Their ability to conceptualize the information and referral process as a system of related inputs, tasks, activities, and outputs paved the way for instituting measurable improvements and cost reductions.
• This process improvement study provided additional information about customers' perceptions of their biggest problems using the library resources. "Help identifying articles and/or books for your research topic" and "Help providing alternative access to missing or checked-out items, or items we don't own" were among the top concerns in importance and need for improvement in the study. These findings reinforced efforts in other process areas being addressed by the library staff: education, materials access, and document delivery.

The sum total of these changes allowed the library to provide improved services at an appropriate cost. Additionally, the efficient data-gathering methods will allow the services in the future to be monitored and studied further as changes warrant.

Notes

1. Ruth Dickstein, Vicki Mills, and Robyn Huff-Eibl, "Scaling Tall Mountains and Crossing the Great Divide: Training and Staffing the Information Commons" (2002). Available online from www.library.arizona.edu/users/vamills/azla2002.htm. [Accessed 1 November 2005].
2. iSixSigma, "DMAIC / Existing Product or Service Improvement" (2005). Available online from www.isixsigma.com/me/dmaic/. [Accessed 1 November 2005].
3. Shelley Phipps, "The System Design Approach to Organizational Development: The University of Arizona Model," Library Trends 53, no. 1 (2004): 77.
4. K.E. Sveiby, The New Organizational Wealth: Managing and Measuring Knowledge-Based Assets (San Francisco: Berrett-Koehler Publishers, Inc., 1997).
5. M.L. Bechtell, The Management Compass: Steering the Corporation Using Hoshin Planning (New York: American Management Association, 1995).
6. John Cravenho and Bill Sandvig, "Survey for Action, Not Satisfaction," Quality Progress 36 (March 2004): 63–68.
7. Laura J. Bender, "Team Organization—Learning Organization: The University of Arizona Four Years into It," Information Outlook: The Monthly Magazine of the Special Libraries Association 1, no. 9 (1997): 19–22; Joseph Diaz and Shelley Phipps, "The Evolution of the Roles of Staff and Team Development in a Changing Organization: The University of Arizona Library Experience," in Finding Common Ground: Creating the Library of the Future Without Diminishing the Library of the Past, ed. C. LaGuardia and B. Mitchell (New York: Neal-Schuman, 1998); Joseph Diaz and Chestalene Pintozzi, "Helping Teams Work: Lessons Learned from the University of Arizona Library Reorganization," Library Administration & Management 13, no. 1 (1999): 27–36.
8. Catherine A. Larson, "Customers First: Using Process Improvement to Improve Service Quality and Efficiency," RSR: Reference Services Review 26, no. 1 (1998): 51–60+; Shelley E. Phipps, "Performance Measurement as a Methodology for Assessing Team and Individual Performance: The University of Arizona Library Experience," in Proceedings of the 3rd Northumbria International Conference on Performance Measurement in Libraries and Information Services: held at Longhirst Management and Training Centre, Longhirst Hall, Northumberland, England, 27 to 31 August 1999, ed. Pat Wressell (Newcastle upon Tyne, England: Information North for the School of Information Studies, University of Northumbria at Newcastle, 1999): 113–117.

APPENDIX A
Help Us Improve Our Service in the IC
[Reproduction of the CAP survey form used at the Information Commons.]

APPENDIX B
Importance, Net Performance, and Action Gap Analysis Charts
[B1: Action Gap Analysis chart, Information Commons, February 2004 (N=170), with service elements grouped into the top, middle, and bottom thirds of importance.]
[B2: Importance rank of services at the Information Commons, by number of responses, from most to least important: Help Id articles; Hardware/software; Finding computer; Provide at times you need; Show an interest; Provide alt. access; Help finding if we own; Refer you; Teach how to solve; Provide all infor; Physically locate; Info that helps answer Q's; Understand question; Explaining search; Other equipment; Appropriate time; Accurate answers; Campus info; Publicize services; Time to wait; Specialist responds; Check to ensure; Provide brochures; Policies. Also, Information Commons net performance, ranked from "does best" (top) to "needs improvement" (bottom): Show an interest; Provide brochures; Appropriate time; Other equipment; Understand question; Explaining search; Campus info; Info that helps answer Q's; Help finding if we own; Policies; Time to wait; Accurate answers; Provide all infor; Hardware/software; Physically locate; Help Id articles; Check to ensure; Refer you; Provide at times you need; Specialist responds; Teach how to solve; Publicize services; Provide alt. access; Finding computer.]

APPENDIX C
Cost Data Collected
[Table of staffing cost elements (full-time salaries, part-time salaries, totals with benefits, student wages, training, scheduling, and administrative support) for each UA Library service site: the Information Commons (day and Extended Hours), the Science-Engineering and Fine Arts reference and circulation desks, Special Collections, chat reference, e-mail reference, and the Main Library circulation and information desks.]