Hitting the Road Towards a Greater Digital Destination: Evaluating and Testing DAMS at University of Houston Libraries

Annie Wu, Santi Thompson, Rachel Vacek, Sean Watkins, and Andrew Weidner

Annie Wu (awu@uh.edu) is Head of Metadata and Digitization Services, Santi Thompson (sathompson3@uh.edu) is Head of Repository Services, Rachel Vacek (evacek@uh.edu) is Head of Web Services, Sean Watkins (slwatkins@uh.edu) is Web Projects Manager, and Andrew Weidner (ajweidner@uh.edu) is Metadata Services Coordinator, University of Houston Libraries.

ABSTRACT

Since 2009, tens of thousands of rare and unique items have been made available online for research through the University of Houston (UH) Digital Library. Six years later, the UH Libraries’ new digital initiatives call for a more dynamic digital repository infrastructure that is extensible, scalable, and interoperable. The UH Libraries’ mission and the mandate of its strategic directions drive the pursuit of seamless access and expanded digital collections. To answer the calls for technological change, the UH Libraries administration appointed a Digital Asset Management System (DAMS) Implementation Task Force to explore, evaluate, test, recommend, and implement a more robust digital asset management system. This article focuses on the task force’s DAMS selection activities: needs assessment, systems evaluation, and systems testing. The authors also describe the task force’s DAMS recommendation based on analysis of the evaluation and testing data, a comparison of the advantages and disadvantages of each system, and system cost. Finally, the authors outline their DAMS implementation strategy: a phased rollout consisting of system installation, data migration, and interface development.

INTRODUCTION

Since the launch of the University of Houston Digital Library (UHDL) in 2009, the UH Libraries have made tens of thousands of rare and unique items available online for research using CONTENTdm. As we began to explore and expand into new digital initiatives, we realized that the UH Libraries’ digital aspirations require a more dynamic, flexible, scalable, and interoperable digital asset management system that can manage larger amounts of materials in a variety of formats. We plan to implement a new digital repository infrastructure that accommodates creative workflows and allows for the configuration of additional functionalities such as digital exhibits, data mining, cross-linking, geospatial visualization, and multimedia presentation. The new system will be designed with linked data in mind and will allow us to publish our digital collections as linked open data within the larger semantic web environment.

The UH Libraries Strategic Directions set forth a mandate for us to “work assiduously to expand our unique and comprehensive collections that support curricula and spotlight research.
We will pursue seamless access and expand digital collections to increase national recognition.”1 To fulfill the UH Libraries’ mission and the mandate of our Strategic Directions, the UH Libraries administration appointed a Digital Asset Management System (DAMS) Implementation Task Force to explore, evaluate, test, recommend, and implement a more robust digital asset management system that would provide multiple modes of access to the UH Libraries’ unique collections and accommodate digital object production at a larger scale. The collaborative task force comprises librarians from four departments: Metadata and Digitization Services (MDS), Web Services, Digital Repository Services, and Special Collections. The core charge of the task force is to:

• Perform a needs assessment and build criteria and policies based on evaluation of the current system and requirements for the new DAMS
• Research and explore DAMS on the market and identify the top three systems for beta testing in a development environment
• Generate preliminary recommendations from stakeholders’ comments and feedback
• Coordinate installation of the new DAMS and finish data migration
• Communicate the task force’s work to UH Libraries colleagues

LITERATURE REVIEW

Libraries have maintained DAMS for the publication of digitized surrogates of rare and unique materials for over two decades. During that time, information professionals have developed evaluation strategies for testing, comparing, and evaluating library DAMS software. Reviewing these models and associated case studies provided insight into common practices for selecting systems and informed how the UH Libraries DAMS Implementation Task Force conducted its evaluation process.

One of the first publications of its kind, “A Checklist for Evaluating Open Source Digital Library Software” by Dion Hoe-Lian Goh et al., presents a comprehensive list of criteria for library DAMS evaluation.2 The researchers developed twelve broad categories for testing (e.g., content management, metadata, and preservation) and generated a scoring system based on the assignment of a weight and a numeric value to each criterion.3 While the checklist was created to assist with the evaluation process, the authors note that an institution’s selection decision should be guided primarily by defining the scope of their digital library, the content being curated using the software, and the uses of the material.4 Through their efforts, the authors created a rubric that other organizations can use when selecting a DAMS.

Subsequent research projects have expanded upon the checklist evaluation model. In “Choosing Software for a Digital Library,” Jody DeRidder outlines major issues that librarians should address when choosing DAMS software, including many of the hardware, technological, and metadata concerns that Goh et al. identified.5 Additionally, she emphasizes the need to account for personnel and service requirements through a variety of activities: conducting usability testing and estimating associated costs; performing a formal needs assessment to guide the evaluation process; and taking a tiered testing approach, which calls upon evaluators to winnow the number of systems.6 By considering stakeholder needs, from users to library administrators, DeRidder’s contributions inform a more comprehensive DAMS evaluation process.
In addition to creating evaluation criteria, the literature on DAMS selection has also produced case studies that reflect real-world scenarios and identify use cases that help determine user needs and desires. In “Evaluation of Digital Repository Software at the National Library of Medicine,” Jennifer L. Marill and Edward C. Luczak discuss the process that the National Library of Medicine (NLM) used to compare ten DAMS, both proprietary and open source.7 Echoing Goh et al. and DeRidder, Marill and Luczak created broad categories for testing and developed a scoring system for comparing DAMS.8 Additionally, Marill and Luczak enriched the evaluation process by implementing two testing phases: “initial testing of ten systems” and “in-depth testing of three systems.”9 This method allowed NLM to conduct extensive research on the most promising systems for their needs before selecting a DAMS to implement. The tiered approach appealed to the task force, and influenced how it conducted the evaluation process, because it balances efficiency and comprehensiveness.

In another case study, Dora Wagner and Kent Gerber describe the collaborative process of selecting a DAMS across a consortium. In their article “Building a Shared Digital Collection: The Experience of the Cooperating Libraries in Consortium,”10 the authors emphasize additional criteria that are important for collaborating institutions: the ability to brand consortial products for local audiences; the flexibility to incorporate differing workflows for local administrators; and the shared responsibility of system maintenance and costs.11 While the UH Libraries will not be managing a shared repository DAMS, the task force appreciated the article’s emphasis on maximizing customizations to improve the user experience.

In “Evaluation and Usage Scenarios of Open Source Digital Library and Collection Management Tools,” Georgios Gkoumas and Fotis Lazarinis describe how they tested multiple open-source systems against typical library functions—such as acquisitions, cataloging, digital libraries, and digital preservation—to identify typical use cases for libraries.12 Some of the use cases formulated by the researchers address digital platforms, including features related to supporting a diverse array of metadata schema and using a simple web interface for the management of digital assets.13 These use cases mirror local feature and functionality requests incorporated into the UH Libraries’ evaluation criteria.

In “Digital Libraries: Comparison of 10 Software,” Mathieu Andro, Emmanuelle Asselin, and Marc Maisonneuve discuss a rubric they developed to compare six open-source platforms (Invenio, Greenstone, Omeka, EPrints, ORI-OAI, and DSpace) and four proprietary platforms (Mnesys, DigiTool, YooLib, and CONTENTdm) around six core areas: document management, metadata, engine, interoperability, user management, and Web 2.0.14 The authors note that each solution is “of good quality” and that institutions should consider a variety of factors when selecting a DAMS, including the “type of documents you will want to upload” and the “political criteria (open source or proprietary software)” desired by the institution.15 This article provided the UH Libraries with additional factors to include in their evaluation criteria.
Finally, Heather Gilbert and Tyler Mobley’s article “Breaking Up with CONTENTdm: Why and How One Institution Took the Leap to Open Source” provides a case study for a new trend: selecting a DAMS for migration from an existing system to a new one.16 The researchers cite several reasons for their need to select a new DAMS, primarily their current system’s limitations with searching and displaying content in the digital library.17 They evaluated alternatives and selected a suite of open-source tools, including Fedora, Drupal, and Blacklight, which combine to make up their new DAMS.18 Gilbert and Mobley also reflect on the migration process and identify several hurdles they had to overcome, such as customizing the open-source tools to meet their localized needs and confronting inconsistent metadata quality.19

Gilbert and Mobley’s article most closely matches the scenario faced by the UH Libraries. Our study adds to the limited literature on evaluating and selecting DAMS for migration in several ways. It demonstrates another model that other institutions can adapt to meet their specific needs. It identifies new factors for other institutions to take into account before or during their own migration process. Finally, it adds to the body of evidence for a growing movement of libraries migrating from proprietary to open-source DAMS.

DAMS EVALUATION AND ANALYSIS METHODOLOGY

Needs Assessment

The DAMS Implementation Task Force fulfilled the first part of its charge by conducting a needs assessment. The goal of the needs assessment was to collect the key requirements of stakeholders, identify future features of the new DAMS, and gather data in order to craft criteria for evaluation and testing in the next phase of its work. The task force employed several techniques for information gathering during the needs assessment phase:

• Identified stakeholders and held internal focus group interviews to identify system requirement needs and gaps
• Reviewed scholarly literature on DAMS evaluation and migration
• Researched peer and aspirational institutions
• Reviewed national standards around DAMS
• Determined both the current and projected uses of the UHDL
• Identified UHDL materials and users

Task force members took detailed notes during each focus group interview session. The literature research on DAMS evaluation helped the task force find articles with comprehensive DAMS evaluation criteria. After reviewing the NISO Framework of Guidance for Building Good Digital Collections, the task force also listed the NISO criteria for core types of entities in digital library collections and applied them to the evaluation.20 More than forty peer and aspirational institutions’ digital repositories were benchmarked to identify website names, platform architecture, documentation, and user and system features. The task force analyzed the rich data gathered from needs assessment activities and built the DAMS evaluation criteria that prepared it for the next phase of evaluation.

Evaluation, Testing, and Recommendation

The task force began its evaluation process by identifying twelve potential DAMS for consideration, which were ultimately narrowed down to three systems for in-depth testing. Using data from focus group interviews, literature reviews, and DAMS best practices, the group generated a list of benchmark criteria.
These broad evaluation criteria covered features in the categories of system functionality, content management, metadata, user interface, and search support. Members of the task force researched DAMS documentation, product information, and related literature to score each system against the evaluation criteria. Table 1 contains the scores of the initial evaluation. From this process, five systems emerged with the highest scores:

● Fedora (and, closely associated, Fedora/Hydra and Fedora/Islandora)
● Collective Access
● DSpace
● Rosetta
● CONTENTdm

The task force eliminated Collective Access from the final systems for testing because of its limited functionality: it is designed around archival content only and is not widely deployed. The task force decided not to test CONTENTdm because its functionality was already well known from firsthand experience. After the initial elimination process, Fedora (including Fedora/Hydra and Fedora/Islandora), DSpace, and Rosetta remained for in-depth testing.

DAMS | Evaluation Score*
Fedora | 27
Fedora/Hydra | 26
Fedora/Islandora | 26
Collective Access | 24
DSpace | 24
Rosetta | 20
CONTENTdm | 20
Trinity (iBase) | 19
Preservica | 16
Luna Imaging | 15
RODA† | 6
Invenio† | 5

* Total possible score: 29.
† Removed from evaluation because the system does not support Dublin Core metadata.

Table 1. Evaluation scores of twelve DAMS using broad evaluation criteria

The task force then created detailed evaluation and testing criteria by drawing from the same sources used previously: focus groups, literature review, and best practices. While the broad evaluation focused on high-level functions, the detailed evaluation and testing criteria for the final three systems closely analyzed the specific features of each DAMS in eight categories:

● System Environment and Function
● Administrative Access
● Content Ingest and Management
● Metadata
● Content Access
● Discoverability
● Report and Inquiry Capabilities
● System Support

Prior to the in-depth testing of the final three systems, the task force researched timelines for system setup. Rosetta’s setup timeline proved to be prohibitive, so the task force eliminated Rosetta from the testing pool and moved forward with Fedora and DSpace.

To conduct the detailed evaluation, the task force scored the specific features under each category using systems testing and documentation. A score ranging from zero to three (0 = None, 1 = Low, 2 = Moderate, 3 = High) was assigned to each feature evaluated. After all features were evaluated, the scores were tallied for each category. Our testing revealed that Fedora outperformed DSpace in over half of the testing sections: Content Ingest and Management, Metadata, Content Access, Discoverability, and Report and Inquiry Capabilities. See table 2 for the tallied scores in each testing section.

Testing Sections | DSpace Score | Fedora Score | Possible Score
System Environment and Function | 21 | 21 | 36
Administrative Access | 15 | 12 | 18
Content Ingest and Management | 59 | 96 | 123
Metadata | 32 | 43 | 51
Content Access | 14 | 18 | 18
Discoverability | 46 | 84 | 114
Report and Inquiry Capabilities | 6 | 15 | 21
System Support | 12 | 11 | 12
TOTAL SCORE | 205 | 300 | 393

Table 2. Scores of the top two DAMS from testing using detailed evaluation criteria
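To make the scoring mechanics behind table 2 concrete, the short sketch below reproduces the 0–3 tallying step described above. It is a minimal illustration only: the feature names and scores are invented placeholders, not the task force’s actual testing data.

```python
# Minimal sketch of the detailed evaluation scoring: each feature receives a
# 0-3 rating, and ratings are tallied per category. All data here is
# hypothetical placeholder data, not the task force's actual scores.
SCALE = {0: "None", 1: "Low", 2: "Moderate", 3: "High"}

# category -> list of (feature, score) pairs recorded during hands-on testing
scores = {
    "Metadata": [("Dublin Core support", 3), ("Batch metadata editing", 1)],
    "Content Access": [("User authentication", 3), ("IP-based restriction", 2)],
}

for category, features in scores.items():
    tally = sum(score for _, score in features)
    possible = 3 * len(features)  # every feature can earn at most 3 points
    print(f"{category}: {tally} of {possible}")
    for feature, score in features:
        print(f"  {feature}: {score} ({SCALE[score]})")
```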
After review of the testing results, the task force conducted a facilitated activity to summarize the advantages and disadvantages of each system. Based on this comparison, the DAMS Task Force recommended that the UH Libraries implement a Fedora/Hydra repository architecture with the following course of action:

● Adapt the UHDL user interface to Fedora and re-evaluate it for possible improvements
● Develop an administrative content management interface with the Hydra framework
● Migrate all UHDL content to a Fedora repository

Fedora/Hydra Advantages | Fedora/Hydra Disadvantages
Open source | Steep learning curve
Large development community | Long setup time
Linked data ready | Requires additional tools for discovery
Modular design through API | No standard model for multi-file objects
Scalable, sustainable, and extensible |
Batch import/export of metadata |
Handles any file format |

Table 3. Fedora/Hydra advantages and disadvantages

The primary advantages of a DAMS based on Fedora/Hydra are a large and active development community; a scalable and modular system that can grow quickly to accommodate large-scale digitization; and a repository architecture based on linked data technologies. This last advantage, in particular, is unique among all systems evaluated and will give the UH Libraries the ability to publish our collections as linked open data. Fedora 4 conforms to the World Wide Web Consortium (W3C) Linked Data Platform recommendation.21

The main disadvantage of a Fedora/Hydra system is the steep learning curve associated with designing metadata models and developing a customized software suite, which translates to a longer implementation time compared to off-the-shelf products. The UH Libraries must allocate an appropriate amount of time and resources for planning, implementation, and staff training. The long-term return on investment for this path will be a highly skilled technical staff with the ability to maintain and customize an open-source, standards-based repository architecture that can be expanded to support other UH Libraries content such as geospatial data, research data, and institutional repository materials.

DSpace Advantages | DSpace Disadvantages
Open source | Flat file and metadata structure
Easy installation / ready out of the box | Limited reporting capabilities
Existing familiarity through Texas Digital Library | Limited metadata features
User group / profile controls | Does not support linked data
Metadata quality module | Limited API
Batch import of objects | Not scalable / extensible
 | Poor user interface

Table 4. DSpace advantages and disadvantages

The main advantages of DSpace are ease of installation, familiarity of workflows, and additional functionality not found in CONTENTdm.22 Installation and migration to a DSpace system would be relatively fast, and staff could quickly transition to new workflows because they are similar to those in CONTENTdm. DSpace also supports authentication and user roles that could be used to limit content to the UH community only. Commercial add-on modules, although expensive, could be purchased to provide more sophisticated content management tools than are currently available with CONTENTdm.
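As an illustration of the batch import advantage listed in table 4: DSpace ingests batches of objects packaged in its Simple Archive Format (SAF), one directory per item containing a dublin_core.xml metadata file and a contents file listing the item’s bitstreams. The sketch below builds one such package; the directory names, metadata values, and import parameters are illustrative assumptions, not the UH Libraries’ actual workflow.

```python
# Hedged sketch: packaging one item in DSpace's Simple Archive Format (SAF)
# for batch import. All names and values are illustrative placeholders.
import os
from xml.sax.saxutils import escape

def write_saf_item(batch_dir, item_name, metadata, bitstreams):
    """Create one SAF item directory: dublin_core.xml plus a contents file."""
    item_dir = os.path.join(batch_dir, item_name)
    os.makedirs(item_dir, exist_ok=True)

    # dublin_core.xml holds one dcvalue element per metadata field
    dcvalues = "".join(
        f'  <dcvalue element="{el}" qualifier="{q}">{escape(val)}</dcvalue>\n'
        for el, q, val in metadata
    )
    with open(os.path.join(item_dir, "dublin_core.xml"), "w", encoding="utf-8") as f:
        f.write(f"<dublin_core>\n{dcvalues}</dublin_core>\n")

    # The contents file names each bitstream to ingest, one per line
    with open(os.path.join(item_dir, "contents"), "w", encoding="utf-8") as f:
        f.write("\n".join(bitstreams) + "\n")

write_saf_item(
    "saf_batch",
    "item_0001",
    [("title", "none", "Sample photograph"), ("date", "issued", "1940")],
    ["photo_0001.tif"],
)

# The batch would then be loaded with DSpace's command line importer, e.g.:
#   dspace import -a -e admin@example.org -c 123456789/1 -s saf_batch -m map.txt
```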
The disadvantages of a DSpace system mirror the long-term, systemic problems of the current CONTENTdm repository. DSpace uses a flat metadata structure, has a limited API, does not scale well, and is not customizable to the UH Libraries’ needs. Consultations with peers indicated that both CONTENTdm and DSpace institutions are exploring the more robust capabilities of Fedora-based systems. Migration of the digital collections in CONTENTdm to a DSpace repository would therefore provide few, if any, long-term benefits to the UH Libraries.

Of all the systems considered, implementation of a Fedora/Hydra repository aligns most clearly with the UH Libraries Strategic Directions of attaining national recognition and improving access to our unique collections. The Fedora and Hydra communities are very active, with project management overseen by DuraSpace and the Hydra Project, respectively.23,24 Over the long term, a repository based on Fedora/Hydra will give the UH Libraries a low-cost, scalable, flexible, and interoperable platform for providing online access to our unique collections.

Cost Considerations

To balance the current digital collections production schedule with the demands of a timely implementation and migration, the task force identified the following investments as cost effective for Fedora/Hydra and DSpace, respectively:

Fedora/Hydra
Metadata Librarian: annual salary
● manages daily Metadata Unit operations during implementation
● streamlines the migration process

DSpace
Metadata Librarian: annual salary
● manages daily Metadata Unit operations during implementation
● streamlines the migration process
@Mire modules: $41,500
● Content Delivery (3): $13,500
● Metadata Quality: $10,000
● Image Conversion Suite: $9,000
● Content & Usage Analysis: $9,000
● These modules require one-time fees to @Mire that recur when upgrading to a new version of DSpace

Table 5. Start-up costs associated with Fedora/Hydra and DSpace

The task force determined that an investment in one librarian’s salary is the most cost-effective course of action. The new Metadata Librarian will manage daily operations of the Metadata Unit in Metadata and Digitization Services while the Metadata Services Coordinator, in close collaboration with the Web Projects Manager, leads the DAMS implementation process. In contrast to Fedora, migration to DSpace would require a substantial investment in third-party software modules from @Mire to deliver the best possible content management environment and user experience.

IMPLEMENTATION STRATEGIES

The implementation of the new DAMS will occur in a phased rollout consisting of the following stages: System Installation, Data Migration, and Interface Development. MDS and Web Services will perform the majority of the work, in consultation with key stakeholders from Special Collections and other units. Throughout this process, the DAMS Implementation Task Force will consult with the Digital Preservation Task Force* to coordinate the preservation and access systems.

* An appointed task force charged with creating a digital preservation policy and identifying the strategies, actions, and tools needed to sustain long-term access to digital assets maintained by UH Libraries.
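Before the phase overview in table 6, a brief illustration of the linked data foundation the phases build on: in Fedora 4, every repository object is an addressable resource in a Linked Data Platform container, created and read over plain HTTP. The sketch below shows that interaction in minimal form; the base URL, the “uhdl” container, and the record content are hypothetical assumptions, not the UHDL’s actual configuration.

```python
# Minimal sketch: publishing one descriptive record to a Fedora 4 repository
# through its LDP-style REST API. The base URL and the "uhdl" container are
# hypothetical assumptions for illustration.
import requests

FEDORA = "http://localhost:8080/rest"  # assumed local Fedora 4 instance

# A Dublin Core description serialized as Turtle (RDF)
record = """
@prefix dc: <http://purl.org/dc/elements/1.1/> .
<> dc:title "Sample photograph, Houston" ;
   dc:format "image/tiff" .
"""

# POST to an LDP container creates a child resource; Slug suggests its name
resp = requests.post(
    f"{FEDORA}/uhdl",
    headers={"Content-Type": "text/turtle", "Slug": "sample-item"},
    data=record.encode("utf-8"),
)
resp.raise_for_status()
item_uri = resp.headers["Location"]  # URI of the newly created resource

# Any linked data client can now read the record back as RDF
print(requests.get(item_uri, headers={"Accept": "text/turtle"}).text)
```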
Phase One: System Installation
● Set up production and server environment
● Rewrite UHDL front-end application for Fedora/Solr
● Create metadata models
● Coordinate workflows with Digital Preservation Task Force
● Begin development of administrative Hydra head for content management

Phase Two: Data Migration
● Formulate content migration strategy and schedule
● Migrate test collections and document exceptions
● Conduct the data migration
● Create preservation metadata for migrated data
● Continue development of the Hydra administrative interface

Phase Three: Interface Development
● Reevaluate front-end user interface
● Rewrite UHDL front end as a Hydra head or update current front end
● Establish interdepartmental production workflows
● Refine administrative Hydra head for content management

Table 6. Overview of DAMS phased implementation

Phase One: System Installation

During the first phase of DAMS implementation, Web Services and MDS will work closely together to install an open-source repository software stack based on Fedora, rewrite the current PHP front-end interface to provide public access to the data in the new system, and create metadata content models for the UHDL based on the Portland Common Data Model,25 in consultation with the Coordinator of Digital Projects from Special Collections and other key stakeholders. The DAMS Task Force will consult with the Digital Preservation Task Force† to determine how closely the preservation and access systems will be integrated and at what points. The two groups will also jointly outline a DAMS migration strategy that aligns with the preservation system. Web Services and MDS will collaborate on research and development of an administrative interface, based on the Hydra framework, for day-to-day management of UHDL content.

† A working team at UH Libraries that enforces the digital preservation policy and maintains the digital preservation system.

Phase Two: Data Migration

In the second phase, MDS will migrate legacy content from CONTENTdm to the new system and work with Web Services, Special Collections, and the Architecture and Art Library to resolve any technical, metadata, or content problems that arise. The second phase will begin with the development of a strategy for completing the work in a timely fashion, followed by migration of representative sample collections to the new system to test and refine its capabilities. After testing is complete, all legacy content will be migrated from CONTENTdm to Fedora, and preservation metadata for migrated collections will be created and archived. Development work on the Hydra administrative interface will also continue. After the data migration is complete, all new collections will be ingested into Fedora/Hydra, and the current CONTENTdm installation will be retired.

Phase Three: Interface Development

In the final phase, Web Services will reevaluate the current front-end user interface (UI) for the UHDL by conducting user tests to better understand how and why users visit the UHDL. Web Services will also analyze web and system analytics and gather feedback from Special Collections and other stakeholders.
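Whichever interface direction this research points to, the front end rewritten in Phase One will retrieve records by querying the Solr index that sits alongside Fedora. The sketch below shows that retrieval path in minimal form; the Solr core name and field names are assumptions for illustration, not the UHDL’s actual schema.

```python
# Hedged sketch: a front-end search request against a Solr index of UHDL
# records. The core name "uhdl" and the field names are hypothetical.
import requests

SOLR_SELECT = "http://localhost:8983/solr/uhdl/select"  # assumed local core

params = {
    "q": "dc_title:houston",  # hypothetical indexed Dublin Core title field
    "rows": 10,               # page size for the results list
    "wt": "json",             # ask Solr for a JSON response
}
resp = requests.get(SOLR_SELECT, params=params)
resp.raise_for_status()

# Solr wraps matching documents in response.docs
for doc in resp.json()["response"]["docs"]:
    print(doc.get("id"), doc.get("dc_title"))
```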
Depending on the outcome of this user research, Web Services may create a new UI based on the Hydra framework or choose to update the current front-end application with modifications or new features. Web Services and MDS will also continue to develop or adopt tools for the management of UHDL content and work with Special Collections and the branch libraries to establish production workflows in the new system. Continued development work on the front-end and administrative interfaces is both expected and desirable for the life of the new digital asset management system as we maintain and improve the UHDL infrastructure and contribute to the open-source software community in line with the UH Libraries Strategic Directions.

Ongoing: Assessment, Enhancement, Training, and Documentation

Throughout the transition process, MDS and Web Services will undergo extensive training in workshops and conferences to develop the skills necessary to build and maintain the new system. They will also establish and document workflows to ensure the long-term viability of the system. Regular consultation with Special Collections, the branch libraries, and other stakeholders will ensure that the new system satisfies the requirements of colleagues and patrons. Ongoing activities will include:

● Assessing the service impact of the new system
● User testing on the UI
● Regular system enhancements
● Establishing new workflows
● Creating and maintaining documentation
● Training: conferences, webinars, workshops, etc.

CONCLUSION

Transitioning from CONTENTdm to a Fedora/Hydra repository will place the UH Libraries in a position to sustainably grow the amount of content in the UH Digital Library and customize the UHDL interfaces for a better user experience. Building on a linked data platform will make it easier for the UH Libraries to publish our collections for the semantic web. In addition, the Fedora/Hydra architecture can be adapted to support a wide range of UH Libraries projects, including a geospatial data portal, a research data repository, and a self-deposit institutional repository. Over the long term, the return on investment for implementing an open-source repository architecture based on industry-standard software will be improved visibility of our unique collections on the web; expanded opportunities for aggregating our collections with high-profile repositories such as the Digital Public Library of America; and increased national recognition for our digital projects and staff expertise.

REFERENCES

1. “The University of Houston Libraries Strategic Directions, 2013–2016,” accessed July 22, 2015, http://info.lib.uh.edu/sites/default/files/docs/strategic-directions/2013-2016-libraries-strategic-directions-final.pdf.

2. Dion Hoe-Lian Goh et al., “A Checklist for Evaluating Open Source Digital Library Software,” Online Information Review 30, no. 4 (July 13, 2006): 360–79, doi:10.1108/14684520610686283.

3. Ibid., 366.

4. Ibid., 364.

5. Jody L. DeRidder, “Choosing Software for a Digital Library,” Library Hi Tech News 24, no. 9 (2007): 19–21, doi:10.1108/07419050710874223.

6. Ibid., 21.

7. Jennifer L. Marill and Edward C. Luczak, “Evaluation of Digital Repository Software at the National Library of Medicine,” D-Lib Magazine 15, no. 5/6 (May 2009), doi:10.1045/may2009-marill.

8. Ibid.

9. Ibid.
10. Dora Wagner and Kent Gerber, “Building a Shared Digital Collection: The Experience of the Cooperating Libraries in Consortium,” College & Undergraduate Libraries 18, no. 2–3 (2011): 272–90, doi:10.1080/10691316.2011.577680.

11. Ibid., 280–84.

12. Georgios Gkoumas and Fotis Lazarinis, “Evaluation and Usage Scenarios of Open Source Digital Library and Collection Management Tools,” Program: Electronic Library and Information Systems 49, no. 3 (2015): 226–41, doi:10.1108/PROG-09-2014-0070.

13. Ibid., 238–39.

14. Mathieu Andro, Emmanuelle Asselin, and Marc Maisonneuve, “Digital Libraries: Comparison of 10 Software,” Library Collections, Acquisitions, & Technical Services 36, no. 3–4 (2012): 79–83, doi:10.1016/j.lcats.2012.05.002.

15. Ibid., 82.

16. Heather Gilbert and Tyler Mobley, “Breaking Up with CONTENTdm: Why and How One Institution Took the Leap to Open Source,” Code4Lib Journal, no. 20 (2013), http://journal.code4lib.org/articles/8327.

17. Ibid.

18. Ibid.

19. Ibid.

20. NISO Framework Working Group with support from the Institute of Museum and Library Services, A Framework of Guidance for Building Good Digital Collections (Baltimore, MD: National Information Standards Organization, 2007).

21. “Linked Data Platform 1.0,” W3C, accessed July 22, 2015, http://www.w3.org/TR/ldp/.

22. “DSpace,” accessed July 22, 2015, http://www.dspace.org/.

23. “Fedora Repository Home,” accessed July 22, 2015, https://wiki.duraspace.org/display/FF/Fedora+Repository+Home.

24. “Hydra Project,” accessed July 22, 2015, http://projecthydra.org/.

25. “Portland Common Data Model,” https://github.com/duraspace/pcdm/wiki.