Peer Review in Practice: Introduction
doi:10.5860/crl.78.1.2

Peer review is the most common indicator of quality used in academic disciplines. It is primarily employed in the review of research submitted as articles or books. While peer review for research is the mainstay in academia, there is no corresponding process for reviewing practice: in other words, the evaluation of practice or instruction does not usually follow an objective peer review model. Although “best practice” is a term widely used in a variety of settings, it has very little concrete meaning. Evaluation of practice may take place during an annual review by a supervisor, in a classroom observation by a colleague or department, or in the course of tenure and promotion review. However, each of these relies on local norms or standards that may or may not be consistent with disciplinary values and priorities. Identifying best practices is a significant endeavor; having an explicit and consistent standard that is applied evenhandedly across the discipline or profession is invaluable. The ACRL Instruction Section’s Peer-Reviewed Instructional Materials Online Committee has developed a standard and process that provide a model for instructional materials in librarianship; it is a model that could effectively be adopted for other areas of practice in the profession.

~Wendi Kaspar

PRIMO: Peer-Reviewed Instructional Materials Online

Jodie Borgerding, Webster University
Megan Hodge, Virginia Commonwealth University
Bill Marino, Eastern Michigan University

The authors would like to express their gratitude to Lori DuBois, Jennifer Knievel, and Mark Szarko, who provided valuable information about the history of the PRIMO Committee for this editorial.

Background

The Peer-Reviewed Instructional Materials Online (PRIMO) Committee “promotes and shares peer-reviewed instructional materials created by librarians to teach people about discovering, accessing and evaluating information in networked environments.”1 In doing so, it reviews librarian-created online tutorials dealing with information literacy and critical thinking skills, and highlights the highest-caliber projects through its “Site of the Month” posts on the ACRL Instruction Section blog (http://acrl.ala.org/IS/category/committees/primo).

Formed as a spin-off of the Emerging Technologies in Instruction (ETI) Committee and originally known as the Internet Education Project, the PRIMO Committee and database were created to provide a solution to a problem looming large for instruction librarians in the mid-1990s: libraries were creating duplicative online tutorials around the same learning objectives, and those tutorials were scattered across disparate locations. The PRIMO database provided an “effective means for librarians to design and disseminate instruction materials for teaching the academic and research community about information seeking, information sharing, and information evaluation in the networked environment presented by the Internet.”2

Via the database and Site of the Month interviews, PRIMO allows librarians to browse for quality tutorials on a particular topic to use as part of their instruction, saving the time, cost, and effort associated with creating online learning objects in-house. Additionally, PRIMO provides librarians with a means to examine models of best practices in online information literacy materials. Today, the PRIMO database (http://primodb.org) contains 306 tutorials, with the eight most recent added in July 2016.
Scope of PRIMO

PRIMO currently accepts interactive online instructional material that is created for undergraduate or graduate-level audiences and is publicly accessible. However, PRIMO does not review promotional materials and, as stated on its website, “materials may be refused because the technology that they teach and the methodology that they use are already well represented on the site.”3 This statement allows the PRIMO committee to reflect on the state of instructional practices and web content and to adjust its scope as new developments arise: “quality is a greater goal than comprehensiveness.” For example, as web content in general has evolved, the type of material that PRIMO accepts has shifted away from text-heavy, passive instruction toward more visually appealing content that emphasizes interactivity; static webpages, simple pathfinders/LibGuides that employ no interactivity or assessment techniques, and passive screencasts/videos are no longer reviewed for inclusion in the database.

PRIMO’s scope statement defines what is included in the database and, in turn, the content of the database determines what material is highlighted via the Instruction Section’s blog. Each review period, the four highest-scoring tutorials are invited to participate in Site of the Month interviews. These interviews detail the methods employed in creating the most successful PRIMO projects and are meant to inspire other librarians to use, adapt, or build upon the existing content to meet their unique needs.

Criteria

As detailed below, PRIMO’s selection criteria are currently under revision. The last time the criteria were revised was in 2002, at which time they became more rubric-like with the addition of several statements providing specific examples of how each criterion can be met; many of these examples mirror instructional best practices prevalent at the time of their development in the early 2000s. In the intervening fourteen years, the design and form of online learning objects have changed dramatically. The 2015–2016 chairs of the committee requested and were granted permission by the Instruction Section’s Executive Committee to update the selection criteria to bring them up to date. The length of time between revisions was too long in this case; it is clear that regular review of the criteria ought to be part of the committee’s timeline to ensure that they do not fall so far out of line with current practices again. The criteria as they currently stand are as follows:

1. The instructional design is pedagogically effective, i.e., it teaches well according to the scope and learning objectives stated by the submitter.
Like a good lesson plan, each project should include clearly defined learning objectives and support those objectives through its organization and content. Additionally, the best projects will exemplify instructional best practices by engaging students in higher-order thinking skills, supporting different learning styles, and assessing the extent of student learning.

2. The technology used to create the material enhances the learning experience, i.e., it is appropriate and effective.
This criterion specifies that the technology used to run the tutorial is compatible across browsers and supports rather than detracts from student learning, and that plug-ins, if required, are easily acquired, to ensure that the greatest number of students are able to use the tutorial.
3. This material provides instruction using technology in an innovative manner.
To be considered models of online pedagogical best practices, submitted projects should make use of newer technologies and/or use existing ones in creative ways.

4. The content and language of the material are clear and effective.
While all projects included in the PRIMO database are geared towards audiences in higher education, the needs and understanding of graduate engineering students will differ from those of first-semester freshmen. For this reason, the language and content of projects ought to be consistent with the projects’ specified audience(s).

5. All information included within the material is accurate.
To ensure the continued relevance and utility of submitted projects, they should show evidence of regular maintenance, contain no factual errors, and list contact information for a webmaster or the content creator.

6. Organization of the material is clear and easy to use.
As many tutorials are made up of multiple parts, it is important that students be able to navigate easily between those parts as well as back to the tutorial start page. For that reason, projects submitted to PRIMO must be clearly organized and easily navigable.

7. This material demonstrates unique or creative use of graphics, examples, interactive elements such as programmed feedback and flexible learning paths, and other supporting elements.
Exemplifying the types of materials for which PRIMO is best known, this criterion specifies that the best projects are creative and interactive in their design, promoting active over passive learning.

8. This material is relevant to those outside of the developer’s institution because it presents a model for other developers.
Ultimately, the intent of PRIMO is to serve as a repository for exceptional tutorials that can be easily used as-is outside of their home institutions, and to inspire the creation of similarly high-quality tutorials. The technology, structure, and/or instructional approach should therefore be easily adaptable.

Reviewers

Each member of the PRIMO committee serves as a project reviewer. Committee membership is determined via the traditional ACRL appointment cycle, with the Instruction Section vice chair appointing members based upon their application materials. Because the number of volunteers usually exceeds the number of available committee slots, appointed members typically have experience in instruction; all must be ALA and ACRL members.

The Peer Review Process

Submitted projects are first examined by the committee co-chairs to filter out promotional materials, spam, and other submissions not within the database’s scope. Authors whose projects do not pass this first round are e-mailed and given specific reasons for the rejection: for example, static webpages with no interactivity or assessment fall outside PRIMO’s scope. The authors of projects that do pass this first round are notified that their projects will be reviewed by the committee, at which time each project is assigned to a team of three or four reviewers (committee members). The members of each team are randomly determined, as are the projects assigned to the team. PRIMO maintains a strict conflict-of-interest policy: members of the PRIMO committee are prohibited from submitting projects while serving their term, and the co-chairs also check to ensure that no reviewer is assigned a project submitted by their own institution.
Each team uses the predefined rubric of eight criteria (http://acrl.ala.org/IS/instruction-tools-resources-2/pedagogy/primo-peer-reviewed-instruction-materials-online/primo-selection-criteria/) to review its assigned projects. Reviewers submit individual scores for each project, and an average score is determined from those individual scores. Submissions receiving an average score of 32 or above are added to the PRIMO database, and at this point acceptance and rejection letters are distributed. Additionally, the four accepted projects with the highest scores from each semiannual review cycle are invited to provide Site of the Month interviews. From start to finish, the total review process takes about six weeks.

It is important to note that acceptance into the PRIMO database is very competitive, with an acceptance rate that varies depending on the number of submissions received but is typically around 34 percent. When creators of rejected projects are notified that their work will not be included in the PRIMO database, they are pointed towards PRIMO’s public selection criteria and encouraged to revise and resubmit their projects during a future review cycle.

Assigning multiple reviewers to each project and basing acceptance decisions upon a minimum average score ensures that accepted projects are of consistently high quality. Additionally, reviewers’ use of clear criteria to evaluate projects reduces the likelihood of those reviewers reaching contradictory decisions about a submission, as sometimes happens with traditional peer review models.
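To make the scoring step concrete, here is a minimal illustrative sketch in Python; it is not the committee’s actual tooling. It assumes, purely for demonstration, that each reviewer scores all eight criteria on a 1-to-5 scale (so an individual reviewer’s total falls between 8 and 40) and that the 32-point threshold applies to the average of those totals; the editorial does not specify the per-criterion scale, and all project names, function names, and scores below are hypothetical.

from statistics import mean

# Hypothetical constants drawn from the process described above.
ACCEPTANCE_THRESHOLD = 32    # minimum average score for inclusion in the database
SITE_OF_THE_MONTH_SLOTS = 4  # highest-scoring accepted projects invited to interview


def review_cycle(submissions):
    """Decide acceptance for one semiannual review cycle.

    `submissions` maps a project title to the list of total rubric scores,
    one per reviewer (three or four reviewers per project). Returns the
    accepted projects with their average scores, plus the subset invited
    to give Site of the Month interviews.
    """
    averages = {title: mean(scores) for title, scores in submissions.items()}
    accepted = {title: avg for title, avg in averages.items() if avg >= ACCEPTANCE_THRESHOLD}
    invited = sorted(accepted, key=accepted.get, reverse=True)[:SITE_OF_THE_MONTH_SLOTS]
    return accepted, invited


if __name__ == "__main__":
    # Entirely fictional projects and scores, for illustration only.
    cycle = {
        "Evaluating Web Sources": [36, 34, 38],
        "Database Searching Basics": [30, 28, 33],
        "Citation Management Module": [35, 33, 32, 34],
    }
    accepted, invited = review_cycle(cycle)
    print(accepted)  # projects at or above the threshold, with their averages
    print(invited)   # up to four highest-scoring accepted projects

Modeled this way, the value of the multi-reviewer average is easy to see: no single reviewer can unilaterally accept or reject a submission, and the four Site of the Month invitations fall out of a simple ranking of the accepted projects’ averages.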
To ensure that the PRIMO committee’s criteria keep up with current trends in online pedagogy, an additional charge of the committee is to conduct a periodic review of its selection criteria. Reviewers may suggest changes to the criteria, which are then passed by the co-chairs to the Instruction Section’s Executive Committee for consideration. Recently, it became clear that the selection criteria as a whole needed to be updated, as is apparent in the absence of language addressing compatibility with mobile devices and accessibility/universal design, and in the inclusion of now-disproven theories such as learning styles. For this reason, a task force was created to conduct an extensive review of the criteria and to bring them more in line with current technology and trends. As technology has evolved since the committee’s inception and now allows for greater interaction, the task force also felt it critical to raise the importance of project interactivity from encouraged to required. To ensure that the revisions are as comprehensive as possible, ACRL IS chair Jennifer Knievel appointed a combination of current members and recent former members of the PRIMO committee to serve on the task force. This variety in composition ensures familiarity with both the criteria and the need for their revision, as well as diversity of perspective.

The revision process has been collaborative, with each section of the criteria assigned to a different task force member but with all members offering feedback and suggestions on those revisions. After the group drafted the revised criteria, they were used to review four tutorials previously submitted to PRIMO, both accepted and rejected, in order to evaluate the revised criteria’s comprehensiveness and efficacy. During the review process, it was also found that the PRIMO submission guidelines (http://primodb.org/php/submit.php) were in need of revision, so those have also been significantly updated.

Next Steps

While the revision process is still underway, the task force will submit its final recommendations to the ACRL IS Executive Committee for review at Midwinter. Moving forward, it would be advisable for the committee to review its selection criteria on a more frequent basis to ensure that its practices do not become stale. This is a practice that can and should be adopted by other groups using a peer review model as well.

The PRIMO database’s longevity as a source for quality online learning material is the direct result of a peer review process that evolves to meet new developments in pedagogy and technology. Its success is largely due to a system that was intentionally developed to be robust, relying on best practices. The committee’s structure, with two-year staggered terms for members, allows new talent to cycle onto the committee frequently, and its strong ethic of open communication allows members to scan the environment and provide meaningful feedback. This peer review model ensures not only that projects added to the PRIMO database are of consistently high quality, but also that they remain relevant and continue to serve as exemplars of the best practices the committee seeks to advance.

Notes

1. ACRL Instruction Section. (n.d.-a). PRIMO: Peer-Reviewed Instructional Materials Online. Retrieved from http://www.ala.org/acrl/aboutacrl/directoryofleadership/sections/is/iswebsite/projpubs/primo.
2. Hupp, S., Lee, D., MacAdam, B., Morgan, K., Tenofsky, D., & Taylor, T. (1994). User Education for the Internet: Report and Recommendations. Retrieved from https://web.archive.org/web/20020106081751/http://cooley.colgate.edu/etech/iep/report.html.
3. ACRL Instruction Section. (n.d.-b). PRIMO Selection Criteria. Retrieved from http://www.ala.org/acrl/aboutacrl/directoryofleadership/sections/is/iswebsite/projpubs/primo/criteria.