Johnson, Megan, Louise Ochoa, and Geraldine Purpur. "Virtually Usable: A Test of the Information Gardens." The Journal of Academic Librarianship 33:5 (September 2007), pp. 593–601. The original version of this article may be accessed at http://www.elsevier.com/wps/find/journaldescription.cws_home/620207/description#description.

Virtually Usable: A Test of the Information Gardens

Megan Johnson(a), Louise Ochoa(a), and Geraldine Purpur(a)

(a) Belk Library and Information Commons, Appalachian State University, Boone, NC 28608, USA

Available online 27 June 2007.

Abstract

This paper presents the results of a usability study conducted to determine the functionality of a desktop, three-dimensional virtual library designed and supported by the Appalachian State University Distance Learning Library Services team. Formative evaluations were performed with representative students utilizing Morae software. Results influenced the final design of the library.

Introduction

This paper will discuss the results of a usability study conducted to determine the functionality of a three-dimensional, desktop virtual reality library (the Information Gardens). The Information Gardens supports three graduate programs in the Reich College of Education Leadership and Educational Studies Program at Appalachian State University. It is designed and supported by the Appalachian State University Distance Learning Library Services team. The purpose of the usability testing was to identify areas for improvement while the library was still in the design stage. A series of three formative evaluations was conducted on portions of the Information Gardens utilizing representative users. This information led to improvements in the virtual library.

There are three main VR (virtual reality) categories: text-based, desktop, and sensory-immersive virtual reality. In a networked, text-based VR, users interact solely through typing messages on their computer keyboards. Desktop VR is similar to interactive multimedia, using three-dimensional images without being immersive. Sensory-immersive VR immerses the user in a three-dimensional computer-generated world. Visual, auditory, and touch technologies create the illusion of real user presence in the simulated environment.1 The usability testing of a virtual environment can vary widely depending on the type of virtual reality being evaluated, but in all cases the goal is to evaluate the ease of use and learning, the degree of error tolerance of the system, and the overall user satisfaction.2

The AET Zone is an example of desktop VR. Like interactive multimedia, it utilizes combinations of visual, audio, and textual representation. In this environment, avatars represent individuals as they travel through a computer graphic display of a three-dimensional world. The input devices used for travel and interaction with graphic objects are the computer mouse and keyboard arrow keys. A text-based chat allows communication with other individuals present in the AET Zone. Auditory output is another feature that can be incorporated. A unique feature of desktop VR is the extensive use of metaphors to represent real-world objects and functions.
Because selection of appropriate metaphorical models can improve user efficiency and recall, the evaluation of metaphorical design is an important aspect of usability testing in desktop VR.3

Many methods exist for the usability evaluation of human–computer interfaces, although they have primarily been designed for GUIs (graphical user interfaces). Depending upon the type of VR, unique characteristics of the interaction styles in VR may make use of these methods ineffective. For example, in an immersive VR the user may be utilizing whole-body movements as an input device rather than a computer keyboard, mouse, or joystick. In addition, multiple input modes such as voice, gestures, and text chatting can occur simultaneously. This can cause processing difficulty for a single evaluator, necessitating multiple modes of observation or multiple evaluators. The sense of presence, or feeling of being physically located in a simulated environment, is also something that is not encountered in traditional user interfaces. To accurately assess presence, the evaluator must not be seen or heard, as the evaluator is not part of the virtual world and may interfere with the user's perception of presence.

Literature Review

A review of the literature reveals that a number of usability evaluation methods have been applied to virtual reality systems. Most of these are common methods originally designed for human–computer interaction or two-dimensional systems. A listing of the major methods, accompanied by a sampling of the researchers utilizing them, follows.

The cognitive walkthrough is an approach taken by Polson, Lewis, Rieman, and Wharton for GUI usability and is modeled on their CE+ theory of exploratory learning.4 In this type of evaluation, a group of evaluators analyzes each task required to achieve a user's goal, particularly examining the cognitive processes involved in each step. It attempts to identify design errors that interfere with a user's ease of learning. This approach is especially suited to understanding the needs of first-time users or exploratory learners.

Nielsen described the use of a heuristic evaluation for usability inspection.5 It was further developed by Sutcliffe and Gault in their evaluation of a virtual reality environment.6 In this method, several independent experts analyze a design utilizing a set of relevant guidelines or heuristics. Like the cognitive walkthrough, no users are involved. Results are then combined and ranked to prioritize any redesign issues.

The formative evaluation is a term originally coined by Scriven7 and utilized by Hix et al.8 Scriven developed formative evaluation for use in the instructional design process, and it was later adopted by the field of human–computer interaction.7 Formative evaluation is performed early in the design process and assesses a user interface by having representative users complete tasks while observing and recording their performance data. The data collected can be qualitative (user comments, critical incidents, reactions) or quantitative (timed tasks, counted errors).

Summative evaluation was also coined by Scriven7 and has been utilized by Bowman and Hodges,9 among others. Summative evaluation is generally performed after a design has been completed and may statistically compare one or more designs side by side. Representative users may be utilized in a method analogous to the formative evaluation process, or an expert review by specialists may be employed.
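To make the two kinds of formative evaluation data concrete, the sketch below shows one minimal way an evaluator might record timed tasks and counted errors alongside qualitative comments. It is an illustrative sketch only; every name in it is hypothetical, and the present study captured its data with Morae rather than with custom code.

```python
# Illustrative sketch only (hypothetical names): recording the quantitative
# side of a formative evaluation (timed tasks, counted errors) together with
# qualitative user comments, then summarizing results per task.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class TaskResult:
    participant: str                              # e.g., "Subject A"
    task: str                                     # task attempted
    seconds: float                                # timed task (quantitative)
    errors: int                                   # counted errors (quantitative)
    comments: list = field(default_factory=list)  # remarks, critical incidents (qualitative)

def summarize(results):
    """Group results by task; report mean completion time and total errors."""
    by_task = {}
    for r in results:
        by_task.setdefault(r.task, []).append(r)
    for task, rs in by_task.items():
        print(f"{task}: mean {mean(r.seconds for r in rs):.0f}s, "
              f"{sum(r.errors for r in rs)} errors across {len(rs)} users")

if __name__ == "__main__":
    summarize([
        TaskResult("Subject A", "Find the newspaper link", 95.0, 2,
                   ["Did not notice the mouse-over cue"]),
        TaskResult("Subject B", "Find the newspaper link", 40.0, 0),
    ])
```

A tabulation of this kind is what lets an evaluator see that several representative users "stumble" on the same task, the signal that formative evaluation is designed to surface.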
A post hoc questionnaire is a technique utilized by Slater, Usoh, and Steed in their analysis of the walking metaphor in a virtual environment.10 The questionnaire is a written set of questions administered after participation in a usability evaluation. It can be the sole form of data collection or used to supplement another form of evaluation. Data gathered may include demographic information and user experiences, interests, or reactions. Questionnaires are convenient and relatively easy to administer.

The interview is described as a usability technique by Hix and Hartson11 and is one of the techniques utilized by Bowman et al.12 An interview is much more personal than a questionnaire, as the interviewer works directly with the respondent. Interviews are useful for obtaining in-depth information on a topic. They may be formally structured, with the same questions asked of each respondent, or informal, with no predetermined questions.

In addition to the main evaluative techniques, there are several commonly used data collection techniques. Verbal protocol taking, or "thinking aloud," is described by Hix and Hartson in their usability text.11 In concurrent protocol taking, the participant is encouraged to talk aloud about what they are doing during an evaluation session. This can also be done in a post hoc or retrospective session. An advantage of post hoc protocol taking is that it offers less opportunity to interfere with task performance or timing during an evaluation. Videotaping and audiotaping are both frequently employed data collection techniques.11 Videotaping is especially useful as it enables accurate recording of the participant's physical motions, facial expressions, and speech. Both techniques provide valuable backups to other data collection techniques and assist an evaluator in observing multiple data inputs during an evaluation.

To take advantage of the best features of individual research techniques, researchers frequently utilize a combination of techniques.8,12,13 In their longitudinal study of the design of a collaborative virtual environment, Tromp, Steed, and Wilson13 made use of the cognitive walkthrough, heuristic evaluation, and video observations in their formative evaluations. Hix and colleagues have developed a cost-effective approach for design and evaluation of navigation in a virtual environment consisting of iterative use of expert heuristic evaluation, followed by formative and summative evaluations.8 Bowman and colleagues developed a test bed approach to evaluation which incorporated both interviews and a summative evaluation.12

Due to the variety of virtual reality systems and associated user interfaces, it has not been possible to define a set method or methods applicable to all. Also, techniques that are suitable for evaluation of one aspect of a design may not be suitable for others. While many researchers have successfully applied human–computer interface usability techniques to virtual environments, there is a need for usability engineering research specifically for virtual environments.

Methodology

The Information Gardens is the second library created within the AET Zone. A decision was made to depart from the more traditional look and feel of the original library. The overall metaphor of the Information Gardens is that of a garden where growth and exploration take place. Individually themed gardens house a variety of resources within the building.
In keeping with the garden theme, non-traditional objects frequently link to library services or resources. These secondary metaphors may not be as obvious as those in the original library within the AET Zone. The goal of our study is to evaluate the usability of the new Information Gardens design, which includes user response to the metaphorical design (see Figure 1, Figure 2, and Figure 3). Although it may influence the test, we are not concerned with the functioning of user interfaces, since we have no control over the overall systems design.

The evaluation technique consisted of iterative formative evaluations which included the following data collection techniques: questionnaires, post hoc interviews, and recording of the screen, audio, and video with Morae software. Morae software, produced by TechSmith, was set up to record live screen shots of the test users navigating through the Information Gardens. Audio was also recorded, and a single video camera was utilized to record the facial expressions of the participant. A brief demographic questionnaire was given to the participants, as well as a post hoc interview consisting of both structured and informal questions. Participants were all asked to perform the same set of three specific tasks. They were encouraged to talk aloud during the testing.

Figure 1. Information Gardens Library.

Figure 2. Initial Zen Garden.

Figure 3. Redesigned Italian Renaissance Garden.

A pilot test was run with three individuals to ensure that all aspects of the usability test would operate satisfactorily. Minor changes were made in the testing procedure and in the VR design as a result of the pilot test. Following the pilot test, an additional formative evaluation was completed with three new participants. We redesigned the elements in the virtual library space based on the results of this evaluation and then re-tested with two more students. Results from the final formative evaluation may spur additional design changes.

A total of nine representative participants were selected to act as evaluators. The majority were students, and they were chosen so that a wide range of abilities and ages could be represented. The target audience for the AET Zone is adult distance education students, so we were particularly interested in subjects who match the profile of a "typical" distance learner (female, over thirty-five).

Since we were testing human subjects, our first step was to obtain an Institutional Review Board waiver (from the Office of Research and Grants). After this step we recruited test participants, scheduled testing times, and devised an observational test (see Appendix C). The software that we planned to use to record the sessions, TechSmith's Morae, had some compatibility issues with the AET Zone (both use the computer's video card, so the AET Zone was not appearing), but we received technical assistance to troubleshoot this issue and were able to successfully record.
At the beginning of each session we:
• had the user sign a consent form (see Appendix B);
• had the user answer the questionnaire: whether they were familiar with the AET Zone environment, whether they played computer games, their age range (twenties, thirties, forties, or fifties), and whether they were familiar with libraries;
• reassured users that it was a test of the design, not of them;
• encouraged the user to talk aloud;
• told the user the test was brief and there was no pressure to rush;
• as needed, gave them an introduction to the AET Zone (for example, holding the shift and arrow keys when your avatar becomes stuck); and
• allowed them to practice navigating in the environment.

Generally, with observational usability testing, if you test four to seven users, they all tend to "stumble" at the same place. This demonstrates a problem with the interface (not the users) that needs to be clarified. The designers address the "problem area" of the interface; then, ideally, they retest with a new group of users to gauge the success of the adjustments. Observational tests can be extremely illuminating to the designers, since a designer cannot see the interface with "new eyes" after having worked with it over a length of time.

One downside of this type of testing is the "Hawthorne effect." This refers to a psychological phenomenon that occurs when people know they are being observed. Because they know they are being watched, they may change their behavior. For example, they may try to figure out an interface for longer than they would if they were alone. We had a subject state, "If I were away from here [the testing room] I would probably get very frustrated with this." Another limitation of observational testing is that watching a user for a brief period of time, when they are new to an interface, does not help a designer understand how a user's behavior changes over time.14 As a user becomes familiar with an interface, some things become clearer, while other aspects of the interface may become annoying or problematic.

Subject Profiles

The target audience for the AET Zone is distance education students in the Instructional Technology, Higher Education, and Library Science graduate programs in the Reich College of Education. These students typically vary in age from their twenties through fifties, with a wide range of technical skills. Because it can be difficult by definition to test "distance" students, we recruited local test subjects who would closely match the demographics of our target audience.

Participants for the pilot study were: Subject A, a female student in her thirties who was familiar with the AET Zone environment; Subject B, a male in his twenties who was unfamiliar with the environment and a regular gamer; and Subject C, a female in her fifties who was unfamiliar with the AET Zone and not a gamer. The test subjects who represented the target distance learning population were three women and one man.
Participants in the second formative evaluation were Subject D, a female student in her thirties who was unfamiliar with the AET Zone and not a gamer; Subject E, a female student in her forties who was unfamiliar with the AET Zone and not a gamer; and Subject F, a female student in her fifties who was unfamiliar with the AET Zone and not a gamer. All three are graduate students in the College of Education, though none had taken classes in the AET Zone. The participants in the final formative evaluation were Subject G, a male student in his twenties who was unfamiliar with the AET Zone and also a gamer, and Subject H, a female in her fifties who was unfamiliar with the AET Zone and not a gamer.

Task List Summation and Commentary

Go to the Zen Garden and Find the Link to the Web Version of the Charlotte Observer

Two subjects (F and C) found this difficult; they did not understand mousing over objects or did not recognize the symbol for the newspaper. Younger or more tech-savvy users had relatively little difficulty.

Go to the Italian Renaissance Garden and Find the Academic Search Premier Database. Find an Article on Virtual Learning Environments and E-mail it to Yourself

Participants were drawn to the metaphor of the computer, but were confused as to why all computers were not the same. They figured out, eventually, that each was a database, but as one participant stated, "You'd think a computer would go to all, no?" The second part of this question was asked just to see how well subjects navigate the online databases. No one had difficulty with this task, which is refreshing, since in the early days of database interfaces this was often a challenge.

Find the Object in the Information Gardens which Links You to the Library Catalog. What is it?

This was problematic for users. It is the same object in both the Zen Garden and the Italian Renaissance Garden: a treasure chest. However, users continued to expect it to be a computer or a book. The days of the physical card catalog are long gone; no one was expecting a metaphor that is not a computer, and, intuitively, a treasure chest does not scream "catalog." However, part of the purpose of a virtual world is exploration, and once users discovered the catalog they did not have a problem with the metaphor of a treasure chest, though one participant suggested that "maybe a pile of books would be better."

Questionnaire Results

Would You Want to Take a Class in this Environment? Follow Up with Why?

Almost every user said yes, although Subject D replied in the negative. Subject E said, "not my type of format, but I would not 'not' take one." Every person over thirty said they thought it would be good for younger users and perceived this kind of environment as having future potential.

Overall Comments or Suggestions?

Four users commented on wanting signage. Our youngest participant (Subject B, a male in his twenties who is a regular gamer) missed the alternate "gaming" keys (w-a-s-d act as arrows in many games). Four users did not notice the corresponding Web page that appeared when they clicked on an object. One suggestion was to enlarge the Web page screen. Another was to have an audio cue. Two users wished they had more time to explore. Gratifyingly, all of the users thought the Information Gardens was visually attractive and more interesting than a traditional-looking library.

Conclusions and Observations

Not surprisingly, the younger test subjects acclimated quickly to the Information Gardens.
They were comfortable using the metaphors in the Gardens; when asked to travel from one Garden to the next, Subject C responded, "What would a Zen Garden look like?" and "What would an Italian Renaissance Garden look like?" They did not have any navigational problems and consequently experienced a high level of satisfaction and comfort. They completed the tasks fairly quickly, resulting in increased productivity.

However, it was a different story with the more mature test subjects (those over forty). It took them much longer to acclimate. Granted, they were not in the Information Gardens very long, but the difference between the two age groups was marked. Most of the subjects in this group experienced lower levels of satisfaction and comfort and took longer to accomplish the tasks, due to awkwardness with navigation and a general lack of experience in a virtual environment.

From our observations, we were able to draw several conclusions. The first is that, just as in the real world, users do not read signage. The first participants in the pilot study asked for signage, so we added it, but when testing with the next round of users the signage was ignored (see Issues for Future Consideration). The second result was that, after observing actual users, we determined that offering orientation tours of the Information Gardens space would help users and hopefully cut down on frustration. The first of these orientation tours has taken place and seems to be helpful to students.

A number of the subjects had a tendency to stay in one location physically and use the mouse to click on objects within their line of sight. We observed that from certain angles and distances, tool tips (the windows that pop up when you mouse over objects) do not work, and inexperienced users found that frustrating. Younger users and those with more experience moved closer to the object, so the tips worked. In a related issue, this resulted in difficulty noticing objects that were placed near walls or in corners. With experience, we expect that most users will gain confidence in navigation that will eliminate these problems. We are now aware of the problem and can angle and move objects away from less visually accessible areas to help alleviate this difficulty. Also, people had difficulty seeing across spaces (traveling from the Zen Garden to the Italian Renaissance Garden, the Italian Garden does not come into view until about halfway across the lobby). This is really a limitation of the AET Zone software, not so much an error on the designer's part, and perhaps in future releases the graphics will be more sophisticated.

User C commented that she is not a visual learner and preferred text. This is a reminder that designers should be conscious of different learning styles in the creation of virtual environments.

Another observation (see Appendix A) was that the sundial object in the Information Gardens is used to link to the Belk Library Web site (Subject F, when she noticed this, said, "Oh, the sundial will always go to the library page."). However, in other classes in the AET Zone, it goes to the discussion board. Conformity of metaphorical symbols across all virtual worlds in the AET Zone would be helpful to users. These researchers conclude that a Web-based clearinghouse of the symbols used, and what they represent, for all designers in the AET Zone would strengthen these worlds.
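The tool-tip behavior noted above, working only from certain angles and distances, is typical of desktop VR engines that gate mouse-over feedback on proximity and facing. We have no access to the AET Zone's source, so the following is a purely hypothetical sketch of such a check; every name and threshold in it is invented for illustration.

```python
# Purely hypothetical sketch (not the AET Zone's actual code): a common way
# desktop VR engines gate mouse-over tool tips is a proximity-and-facing test.
import math

def tooltip_visible(avatar_pos, facing_deg, object_pos,
                    max_distance=8.0, max_angle_deg=45.0):
    """Show a tool tip only if the object is close and roughly in front."""
    dx = object_pos[0] - avatar_pos[0]
    dy = object_pos[1] - avatar_pos[1]
    if math.hypot(dx, dy) > max_distance:
        return False                      # too far away: user must move closer
    angle = math.degrees(math.atan2(dy, dx)) - facing_deg
    angle = (angle + 180) % 360 - 180     # normalize to [-180, 180]
    return abs(angle) <= max_angle_deg    # too oblique an angle: no tip

# Standing across the lobby (distance 20), no tip appears:
print(tooltip_visible((0, 0), 0.0, (20, 0)))   # False
# Moving the avatar closer makes the tip work:
print(tooltip_visible((15, 0), 0.0, (20, 0)))  # True
```

Under a rule like this, the behavior we observed follows directly: tips fail at a distance or a sharp angle, and moving closer, as the younger users did instinctively, restores them.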
In the first round of testing, we had the user perform in the "first person." In this mode, your avatar is not seen by you, only by others in the virtual world. This makes things easier for the participant: it improves navigation and allows the avatar to get physically closer to objects. However, in making a video to show others, it is helpful to actually see the subjects' avatars. In the final round of testing, we attempted to use the "third person" view, since it is more visually interesting for presenting and easier for the researcher to observe what the subject is doing. However, using the "third person" actually makes things more difficult for the user (they have a harder time getting an angle for tool tips, for example), so we gave up on this idea and returned to recording the user in "first person."

Issues for Future Consideration

The usability study has been beneficial in that it has helped us identify areas for improvement in the design of the Information Gardens, as well as areas that are functioning well and do not need to be changed. One area that needs improvement is the arrangement of electronic resources in the Italian Renaissance Garden. Users were confused by the organization of the library databases and the fact that each computer workstation only links to one specific database. We will redesign the resources in this garden to improve functionality, based on the comments of the users and our observations.

Some users appeared lost when they first entered the Information Gardens. What can we do to change that? Maybe the users were expecting a traditional-looking library, and that is the cause of their initial confusion. This is an extremely important issue. We want users to have success in the library, not leave out of frustration. We have already started to discuss this and plan to experiment with different aids, tools, or devices. We will continue to get feedback from the students to determine the most effective way to guide people through the Information Gardens. Because users requested signage, we added it, but this did not greatly increase orientation. We are contemplating changing the text in the directory and directional signs to images. Perhaps images will be more intuitive.

Another area that deserves further consideration involves the metaphorical design of the Information Gardens. Some of our users expected the name of each garden to coincide with the resources that were in the garden. When this was not the case, they became "befuddled." The Zen Garden is an example of a garden in which the materials within correspond with the name: a place for quiet and contemplative reading, it contains links to relevant reading material such as popular reading, newspapers, and journal articles. On the other hand, the image of an Italian Renaissance Garden does not bring to mind electronic resources and library databases. While we may not be able to match resources with garden names as well as we would like, one thing we can do is change the name of a garden to better describe the resources it contains. One garden which illustrates this is the NorthWoods Special Collections Garden. Our special collections can be accessed in this garden, and the metaphors were clear and even amusing to test subjects.

One of the users suggested adding HELP windows or pop-ups throughout the Information Gardens. Perhaps this would also aid in alleviating frustration.
Although more mature users were generally less productive and experienced greater frustration in this environment, we are fairly confident, based on our own experiences, that this will improve with time spent in the AET Zone. A longitudinal study of this user group would be a potential subject of further research.

Notes and References

1. Klaus-Peter Beier, "Virtual Reality: A Short Introduction" (2004) (http://www-vrl.umich.edu/intro/index.html).
2. Foraker Design, "Introduction to Usability," http://www.usabilityfirst.com/intro/index.txl.
3. C. Borgman, "The User's Mental Model of an Information Retrieval System: An Experiment on a Prototype Online Catalog," International Journal of Man-Machine Studies 24 (1986), pp. 27–64.
4. Peter Polson, Clayton Lewis, John Rieman, and Cathleen Wharton, "Cognitive Walkthroughs: A Method for Theory-Based Evaluation of User Interfaces," International Journal of Man-Machine Studies 36 (1992), pp. 741–773.
5. Jakob Nielsen, "Heuristic Evaluation," in Jakob Nielsen and Robert Mack, editors, Usability Inspection Methods, John Wiley & Sons, New York (1994), pp. 25–62.
6. Alistair Sutcliffe and Brian Gault, "Heuristic Evaluation of Virtual Reality Applications," Interacting with Computers 16 (4) (2004), pp. 831–849.
7. M. Scriven, "The Methodology of Evaluation," in R.E. Stake, editor, Perspectives of Curriculum Evaluation, American Educational Research Association Monograph, Rand McNally, Chicago (1967).
8. Deborah Hix, J. Edward Swan, Joseph L. Gabbard, Mike McGee, Jim Durbin, and Tony King, "User-Centered Design and Evaluation of a Real-Time Battlefield Visualization Virtual Environment," paper presented at IEEE Virtual Reality 1999.
9. Doug A. Bowman and L. Hodges, "An Evaluation of Techniques for Grabbing and Manipulating Remote Objects in Immersive Virtual Environments," paper presented at the Proceedings of the ACM Symposium on Interactive 3D Graphics 1997.
10. Mel Slater, Martin Usoh, and Anthony Steed, "Taking Steps: The Influence of a Walking Metaphor on Presence in Virtual Reality," ACM Transactions on Computer-Human Interaction 2 (3) (1995), pp. 201–219.
11. Deborah Hix and H. Hartson, Developing User Interfaces: Ensuring Usability Through Product & Process, John Wiley and Sons, New York (1993).
12. Doug Bowman, D. Johnson, and L. Hodges, "Testbed Evaluation of VE Interaction Techniques," paper presented at the Proceedings of the ACM Symposium on Virtual Reality Software and Technology 1999.
13. Jolanda Tromp, Anthony Steed, and John Wilson, "Systematic Usability Evaluation and Design Issues for Collaborative Virtual Environments," Presence 12 (3) (2003), pp. 241–267.
14. Jonas Löwgren, Thoughtful Interaction Design: A Design Perspective on Information Technology, MIT Press, Cambridge, MA (2004).

Appendix A. Changes Made to the Information Gardens After Each Test

Table A1. Pilot Study

Problem: Subjects had trouble locating the Zen Garden. Resolution: Added directional signs for the Zen Garden.
Problem: Subjects had trouble locating the Renaissance Garden. Resolution: Added directional signs for the Renaissance Garden.

Table A2. Formative Evaluation No. 1
Problem: Subject D became stuck in the open space between the Zen and English Gardens. Resolution: Removed the open space.
Problem: Subjects were not using the directory sign. Resolution: Enlarged the directory sign and changed the background color from blue to green.
Problem: Subjects were not using directional signs. Resolution: Changed the background color from blue to green and changed the color of the text from …
Problem: Subjects were unable to determine the arrangement of the databases in the Italian Renaissance Garden. Resolution: Added signs which identified groupings of the databases by subject area.
Problem: Subjects expected the sundial to link to a discussion board instead of the library Web site (the sundial links to discussion boards in the courses in the AET Zone). Resolution: Removed the link to the library Web site from the sundial and added it to the white statue. Future Resolution: We will link the sundial to a discussion board.
Problem: Subject recommended "one-stop shopping" to facilitate finding resources more quickly. Resolution: Placed the large white statue with a link to the library Web site in multiple places throughout the Information Gardens (all library resources can be accessed from this link).

Table A3. Formative Evaluation No. 2

Problem: Subjects were still not reading/using the directory and directional signs. Future Resolution: Replace the text on the signs with images or add audio cues.
Problem: Subject did not see the signs showing the arrangement of the databases in the Italian Renaissance Garden and assumed they would be arranged alphabetically. Resolution: Re-position the signs which organize the databases by subject area so they are more obvious. Future Resolution: Major reorganization of this area to eliminate confusion and facilitate ease of use.
Problem: Subjects had trouble finding the treasure chest (the object representing the catalog). Resolution: Changed the color of the treasure chest so it is more noticeable.

Table A4. Task Results

Task No. 1: Zen Garden. Results: Subjects wandered around until they found the Zen Garden. Two subjects did not understand the cue for the newspaper. Four subjects missed the corresponding Web page.
Task No. 2: Italian Renaissance Garden. Results: Subjects thought the computers would link to all of the databases, not just one specific database. Subjects were unable to figure out the arrangement of the databases. Subjects were able to find an article without any trouble, once they located the Academic Search Premier database.
Task No. 3: Library Catalog. Results: Subjects expected the object to be a book or a computer. Subjects found the link to the library Web site (the sundial) instead.

Appendix B. Usability Test Consent Form

Belk Library and Information Commons, Appalachian State University

Please read and sign this form. In this usability test:
• You will be asked to perform certain tasks in a virtual library.
• We will conduct an interview with you.
• You will be asked to fill in a questionnaire or survey.
• Your voice or a video may be recorded. These recordings will be used only by the library web committee.

Participation in this usability study is voluntary. All information will remain strictly confidential. The descriptions and findings may be used to help improve the web site. However, at no time will your name or any other identification be used. You can withdraw your consent to the experiment and stop participation at any time. If you have any questions after today, please contact Geri Purpur at 828-262-6903 or purpurgm@appstate.edu.
I have read and understood the information on this form and had all of my questions answered.

You may contact the Appalachian State University Institutional Review Board at the following address and telephone number at any time during this study if you feel your rights have been violated:

Chairperson, Institutional Review Board
c/o Graduate Studies and Research
B.B. Dougherty Administration Building
Appalachian State University
Boone, NC 28608
828-262-2130

Appendix C. AET Zone Observational Test

Date:

Usability Study Questions for the AET Zone Information Gardens

You will begin out in front of the Information Gardens. To maneuver around, click in the window and use your arrow keys. You will practice navigating here before you go into the Gardens and attempt the tasks. If you get stuck, use the shift/arrow keys to get unstuck. For this usability study you will be using the Zen Garden and Renaissance Garden only.

Please describe your experience with Web browsers and tools: □ beginner □ intermediate □ advanced

Participant is: □ Freshman □ Sophomore □ Junior □ Senior □ Staff □ Faculty □ Other

Department (if applicable): _____________________________________

1. Go to the Zen Garden and find the link to the Web version of the Charlotte Observer.
2. Go to the Italian Renaissance Garden and find the Academic Search Premier database. Find an article on virtual learning environments and e-mail it to yourself.
3. Find the object in the Information Gardens which links you to the library catalog. What is it?
4. Would you want to take a class in this environment? Follow up: why?
5. Overall comments or suggestions?