Program Evaluation for Internship/Residency Programs in Academic and Research Libraries

Julie Brewer and Mark D. Winston

Academic libraries are turning increasingly to internship/residency programs to enhance their recruitment efforts. Yet, little evaluative information is available to measure the effectiveness of these programs or to justify funding for them. This article outlines the necessary components of an evaluation model for internship/residency programs based on a survey of academic library deans/directors and program coordinators. The study identifies the key evaluation factors that library administrators consider most important for measuring internship/residency programs, as well as the frequency, format, and sources of input for effective program evaluation.

Julie Brewer is Coordinator of Personnel and Staff Development in the University of Delaware Library; e-mail: jbrewer@udel.edu. Mark D. Winston is an Assistant Professor in the School of Communication, Information and Library Studies at Rutgers University; e-mail: mwinston@scils.rutgers.edu.

Evaluating the effectiveness of library programs is a basic part of good management. Administrators need data from such evaluation to guide future decisions regarding the continuation or modification of library programs. In addition, program evaluation criteria often are needed to justify funding for new programs. Unfortunately, measuring program effectiveness is not always given high priority. Administrators often lack adequate information for judging whether library programs are working as planned or how well they are serving organizational objectives. Program data, where they do exist, are often anecdotal, incomplete, and difficult to share with other institutions.

With the number of internship/residency programs in academic libraries increasing in the past decade, the need for effective program evaluation is becoming more and more apparent. Academic library administrators need to understand how such programs affect recruitment and retention goals, organizational productivity and flexibility, the quality of library services, and the career development of program participants. With so little information, administrators interested in starting such programs have difficulty designing them and justifying their cost.

This article provides information related to program evaluation for library administrators interested in enhancing the effectiveness of existing internship/residency programs or in starting new internships/residencies. It reports the results of a research study on internship/residency programs in academic and research libraries in the United States. The primary focus of this research was to identify the necessary components of an evaluation model for such programs. The study outlines key evaluation criteria and guidelines as identified by current program coordinators and library administrators. Suggestions for both formal and informal feedback measures also are provided.

Background and Review of the Literature

A review of the library literature reveals few systematic attempts to evaluate internship/residency programs.1 The importance of evaluation for such programs is discussed in two publications.
The Association for Library and Information Science Education (ALISE) published a set of guidelines for the administration of residency programs, identifying program evaluation as a key component. However, the ALISE guidelines related to program evaluation are very general, recognizing that residencies vary from library to library. The guidelines focus on how explicit, formal evaluative procedures benefit individual residency programs. They suggest that establishing and defining specific programmatic goals is a key part of the evaluation process. Communicating these goals prior to the recruitment and selection process also is important in planning for program evaluation.2

Julie Brewer discussed program evaluation in a 1998 publication titled "Implementing Post-Master's Residency Programs."3 In addition to identifying various evaluation methods and criteria, she suggested that each program component—such as the interview process, the seminar series, and placement assistance—be evaluated. Moreover, both direct costs (salary and travel allotments) and indirect costs (staff time, equipment, and supplies) need to be considered.

An ARL survey of residency programs in 1992 identified a number of methods and criteria used for evaluating the success of residencies in member libraries.4 Informal feedback from residents and library staff was the most common method of gathering evaluative information. Examples of criteria used to evaluate the success of internship/residency programs included placement in a full-time professional position following the residency, publication in the library literature, or active involvement on a national committee.

A more recent study of internships/residencies in ARL member libraries by Teri Switzer and William Gentz identified similar evaluation methods and criteria.5 Feedback from interns/residents and successful completion of the program were prime indicators. Switzer and Gentz also suggested longitudinal measures, such as follow-up with former interns/residents and their subsequent employers after several years to understand the programs' long-term value.

The perspectives of past interns/residents are very valuable in evaluating internship/residency programs. Many individual accounts of internship/residency experiences are found in the library literature.6 These personal accounts offer valuable insights for library educators, students, and program coordinators. They can assist graduate school advisors and placement officers in informing students about alternative career choices and professional opportunities. Students interested in applying to internship/residency programs can learn what to look for in selecting various types of programs. For library administrators, this type of documentation offers qualitative feedback that may not be captured with more traditional evaluation methods.

The most comprehensive study evaluating internship/residency programs from the perspective of former program participants was conducted in 1994 by ALA's Office for Library Personnel Resources (OLPR).7 More than one hundred former interns/residents provided feedback describing and evaluating their internship/residency experiences. The OLPR study summarized qualitative information about various program components. Attributes of effective supervisors, program coordinators, and assignments were identified. The study provided less guidance on how to evaluate the impact of these programs on career progression.

The literature review for this study shows growing interest in internship/residency programs. Descriptive information about individual programs and personal experiences with them is more available today than ever before. Yet, although library educators and administrators acknowledge the need for program evaluation, there is little evidence of systematic, in-house evaluation processes for internship/residency programs. Only the University of Minnesota has reported on a formal review of its residency program.8 The research described in this article identifies the evaluation criteria and guidelines considered most important for library administrators embarking on an internal program evaluation.

Methodology

To gather data for the study, survey methodology was used, with a direct mailing of questionnaires to the library directors and program coordinators of institutions that have internship and/or residency programs in place. The survey instrument was developed to measure perceptions regarding the importance of a number of factors in providing a comprehensive evaluation of pre- and post-master's internship and residency programs. It was designed to address issues such as the program's nature and duration, the importance of various factors in its evaluation, and staff participation in and frequency of evaluation.

Because the questionnaire developed for this study had not been used or validated in prior research, a pilot study was undertaken to address the issues of intelligibility, ease of answering, and time needed to complete the survey instrument. A draft was sent to three library directors and two personnel/human resources specialists in a total of five different university libraries that do not have residency/internship programs in place. They were asked to complete the questionnaire and to answer additional questions related to the clarity of the questions posed, the overall level of difficulty involved in completing the instrument, and the amount of time required to complete it. All of the pilot study participants indicated that the survey instrument was "easy" or "very easy" to complete and that the questions posed were "understandable." In addition, they made a number of comments regarding format and how to reword certain questions to make them clearer and to gather further information. Their comments formed the basis for the revisions that led to the final questionnaire sent to the survey participants. The pilot study participants indicated that it took less than fifteen minutes to complete the questionnaire.

Participants were asked about the importance of various factors in the evaluation of internship/residency programs, with a specific focus on the size, diversity, and quality of the pool of applicants; the work-related performance of the residents/interns; and resident participation in scholarly and service activities. In addition, participants were asked about the importance of resident completion of the program and placement in subsequent positions in the host institution or other academic libraries, the level of involvement of former residents in the current program, and the change in the minority composition of the library staff and the pool of applicants for other positions. Finally, respondents were asked which members of the library staff should be involved in the evaluation process.

The target population of the study included all academic and research libraries known to "host" post-master's internship/residency programs. Since her participation in the ARL and OLPR studies, Brewer has maintained an informal roster of such programs in the United States, with contact information and program specifics such as duration of the program and name and mailing address of the program coordinator. Much of this information is now available to the public on the ARL Research Library Residency & Internship Programs database on the Internet.9

In total, twenty-two institutions were identified, including nineteen college and university libraries, one law library, an archives, and a federal agency. The American Library Directory and the institutional Web sites were used to identify the current directors of these libraries and their mailing addresses. Questionnaires, with cover letters and stamped, self-addressed return envelopes, were mailed to each of the directors and program coordinators. A follow-up mailing was done as well. In total, twenty institutions replied, reflecting an overall rate of return of 90.9 percent. One of the respondents noted that the institution had discontinued its residency program and did not complete the questionnaire. As a result, the rate of return of usable questionnaires represented nineteen institutions (86.36%).

Two questionnaires were sent to each institution to allow for responses from the library dean/director and the coordinator of the residency/internship program so as to provide an "understanding (of) multiple administrative needs and perspectives, in the event that your responses vary." Further, participants were informed that "What is most important to us [the researchers] is receiving a response from each institution." Thus, in some cases one response was provided and in other cases, two. Thirty responses were received, including one questionnaire that was not completed, which gave an individual rate of return of 65.9 percent (29/44). The data are described on the basis of the nineteen institutional responses or the twenty-nine individual responses (and usable questionnaires returned), depending on the nature of the issue being addressed.

Characteristics of Programs and Respondents

Respondents were asked to characterize the internship/residency programs on the basis of duration, focus, and number of years the programs have been in existence. Fifteen of the nineteen programs represented in the study (78.9%) were of a duration of more than one year and offered only a post-master's in library and information science (MLS) experience for the residents. Two of the post-MLS-only programs were one year in length but offered the option or opportunity for a second year. Three programs were more than one year in length and included both pre- and post-MLS components. Only one of the programs was a one-year post-MLS-only program.

In terms of characterizing the recruitment focus of the programs, nearly two-thirds (63.2%) of the respondents described their programs as focusing on recruiting minority residents, with the other 36.8 percent focusing on "open recruitment." The focus of one of the programs was changing to open recruitment after having focused on minority recruitment for a number of years. And one of the programs was described as involving "open recruitment with a focus on minority recruiting."

While nearly half (47.4%) of the programs had been in operation for four years or less, nearly three-quarters (73.7%) had been in place for ten years or less. On average, the programs had been in existence for approximately nine years. One program had been in place for forty years and a number of others for between ten and eighteen years.

In terms of those who completed the questionnaires, almost 25 percent were deans or directors, 27.6 percent each were either assistant deans/directors or human resources/personnel directors, and 17.2 percent were internship/residency program coordinators.10

Findings and Discussion

Library administrators and program coordinators were asked to indicate the importance of a number of factors related to their evaluation of the programs in their institutions. In addition, they were asked about the importance of input from library staff members involved with the programs, about the importance of a written evaluation or assessment, and how often programs should be evaluated.

TABLE 1
Factors Considered in Program Evaluation
(% of respondents indicating "very important" or "somewhat important")

Placement in other academic libraries                                      100.0%
Quality of applicant pool                                                   96.6%
Completion of program by residents                                          93.1%
Work performance of residents                                               89.7%
Diversity among applicant pool                                              89.7%
Participation of department heads in development of resident assignments    89.6%
Change in minority representation                                           86.2%
Extent to which program supports library's diversity plan                   86.2%
Former residents' involvement in refining the program                       82.8%
Continuation of second year, if applicable                                  75.8%
Resident's committee activities                                             72.4%
Placement in host institution                                               62.1%
Resident's research activities                                              62.0%
Size of applicant pool for the program                                      58.6%
Former residents' assistance with recruitment                               55.2%
Program coordinator's contact with former residents                         55.1%

Evaluation Factors

The evaluation factors to be considered by the study participants related to the nature of the pool of applicants for the programs, the job performance and other activities of the residents, placement, and so on. Table 1 lists the evaluation factors considered in this study in order of importance.

The fact that 100 percent of the respondents indicated that the placement of residents in permanent professional positions in academic libraries is a very important or somewhat important measure of their program indicates a clear commitment to the recruitment of new librarians. Preparing new librarians for continuing successful careers in academic librarianship seemed to be a primary objective of all the internship/residency programs represented in this study. Although former interns/residents may excel in nonacademic library careers after completing their program, this would not be considered successful placement in terms of program objectives. A much smaller percentage of the respondents (26.7%) indicated that intern/resident placement in the host institution after completion of the program was a very important measure of the program. Thus, internship/residency programs appear to support the overall professional interest of attracting new graduates to academic libraries.
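The percentages in table 1 and throughout this section appear to be straightforward tabulations of importance ratings gathered on a Likert-style scale. As an illustration only, the sketch below shows how such figures can be derived from raw responses; the individual ratings listed are hypothetical, and the use of Python is an assumption of this example rather than anything specified by the study.

```python
# Illustrative sketch: tabulating importance ratings into the kind of
# percentages shown in table 1. The factor names follow table 1, but the
# individual ratings below are hypothetical, not the study's raw data.
from collections import Counter

ratings_by_factor = {
    "Quality of applicant pool": [
        "very important", "very important", "somewhat important",
        "very important", "neutral", "very important",
    ],
    "Size of applicant pool for the program": [
        "somewhat important", "neutral", "not important at all",
        "very important", "somewhat important", "of minimal importance",
    ],
}

for factor, ratings in ratings_by_factor.items():
    counts = Counter(ratings)
    favorable = counts["very important"] + counts["somewhat important"]
    share = 100.0 * favorable / len(ratings)
    print(f"{factor}: {share:.1f}% very or somewhat important")
```

Keeping the raw category counts, rather than only the collapsed percentage, also preserves the distinction between "very important" and "somewhat important" ratings that the discussion below relies on.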
Other factors that appear to be most important for program evaluation relate to quality of the applicant pool, completion of the program by the residents, and work performance. The factor identified as "very important" by the largest percentage of respondents (nearly 90 percent) was the quality of the pool of applicants for the program. Attracting the highest-qualified graduates reflects positively on internship/residency programs. Resident/intern completion of the program was rated as "very important" by 72.4 percent of the respondents and "somewhat important" by 20.7 percent. With regard to resident work performance, as evaluated by supervisors, more than two-thirds of the respondents viewed this factor as very important and an additional 20.7 percent believed it to be somewhat important.

Diversity-related factors ranked high among the respondents' concerns. More than three-quarters noted that ethnic diversity among the pool of applicants was very important. An additional 13.8 percent identified diversity among the applicant pool as being "somewhat important," with slightly more than 10 percent not responding or noting that it was not important at all. Change in minority representation on the library staff also was noted as very important by nearly two-thirds of the respondents. In addition, more than half indicated the extent to which the program supports the library's diversity plan as very important.

As might be expected, in a number of instances the nature of program recruitment (i.e., focusing on the recruitment of minority interns/residents in contrast to open recruitment) was associated with significant differences in terms of the importance attached to program evaluation factors. Change in minority representation on the library staff was very important to 75 percent of the study participants associated with minority residency programs, with the remaining 25 percent noting that change in minority representation was somewhat important. All the respondents who indicated that this factor was not important at all or was of minimal importance, or who indicated neutrality, had programs that focused on open recruitment. However, 44.4 percent of the respondents with programs that did not focus specifically on minority residents indicated that change in minority representation was very important. The extent of the difference in responses was represented by a chi-square value of 0.035.

Similarly, the extent to which the program supports the library's diversity plan was identified as being very important by fourteen of the nineteen respondents with minority intern/residency programs and as somewhat important by the remainder of those with such programs. Again, all the study participants who noted that this evaluation factor was either not important or of minimal importance had programs that did not focus on minority residents. However, 66.7 percent of those with open-recruitment programs indicated that supporting the diversity plan was somewhat or very important. The associated chi-square representing the level of difference on the basis of type of program and importance of the evaluation factor was 0.020.
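The figures reported here as "chi-square values" (0.035, 0.020, and similar) read as significance (p) levels from chi-square tests of independence between program type and the importance ratings, judged against the conventional 0.05 threshold; the later phrasing "0.07 > 0.05" points the same way. A minimal sketch of that kind of test follows. The counts are hypothetical, and the use of Python with scipy is an assumption of this example, not a method named in the article.

```python
# Sketch of a chi-square test of independence between recruitment focus and an
# importance rating, of the kind the reported significance figures appear to
# summarize. Counts are hypothetical; scipy is assumed, not cited in the article.
from scipy.stats import chi2_contingency

# Rows: program recruitment focus (minority-focused, open recruitment).
# Columns: rating of "change in minority representation"
# (very important / somewhat important / neutral or less important).
observed = [
    [15, 5, 0],  # minority-focused programs (hypothetical counts)
    [4, 2, 3],   # open-recruitment programs (hypothetical counts)
]

statistic, p_value, dof, _expected = chi2_contingency(observed)
print(f"chi-square = {statistic:.2f}, p = {p_value:.3f}, dof = {dof}")
# A p value below 0.05 would be reported, as in the text, as a significant
# difference between the two program types.
```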
In relation to the importance of ethnic diversity within the applicant pool, 85 percent of those with minority internship/residency programs said this factor was very important, with the remaining 15 percent categorizing it as somewhat important. As represented by a difference that was approaching significance (0.099), slightly more than half (55.6%) of those respondents with open-recruitment programs viewed diversity among the pool of applicants as a very important evaluation factor and 22 percent each identified it as somewhat important or not important at all.

The nature of the program also was associated with a difference in terms of the importance of placement of residents in permanent professional positions in the host institution. As represented by a chi-square of 0.105, 75 percent of the respondents with internship/residency programs that focused on minority recruitment indicated that professional placement in the host institution was very important (30%) or somewhat important (45%). However, 33.3 percent of the study participants with open-recruitment programs described professional placement of the residents in the host institution as not important at all, with 22.2 percent of this group noting that it was of minimal importance.

In the case of the minority internship/residency programs, 57.9 percent of the respondents said that the level of involvement of former residents in refining the program was very important, as compared with slightly more than 10 percent of those with open-recruitment programs, as represented by a chi-square of 0.087. However, 77.9 percent of the respondents with open-recruitment programs noted that former resident involvement with refining the program was somewhat important.

In terms of the importance attached to the program evaluation factors, there was some distinction on the basis of the job title of the person completing the questionnaire. Specifically, in relation to the level of involvement of former residents in assisting with recruitment, the human resources/personnel directors' responses differed from those of the other respondents to a significant degree, as represented by a chi-square of 0.017. This evaluation factor was identified as very important by six of the eight human resources/personnel directors, whereas none of the deans/directors, assistant deans/directors, or program coordinators identified it as very important. It was described as not important at all by one dean/director and of minimal importance by one assistant dean/director; nine respondents, representing all of the positions with the exception of human resources/personnel directors, indicated neutrality with regard to its importance.

In terms of the importance of residents continuing in the program for a second year (where applicable), there was a difference that was approaching significance (i.e., a chi-square of 0.07 > 0.05) in relation to the level of importance as identified by the deans/directors in comparison with their colleagues in other positions in the organization and in relation to the program. Specifically, all the deans/directors noted that the continuation of the residents for a second year was very important, whereas two of eight assistant deans, one of four program coordinators, and two of eight human resources/personnel directors did not provide a response. In addition, three of eight assistant deans/directors and half the program coordinators noted that this evaluation factor was somewhat important.

Generally, respondents with programs that were designed for more than one year and were post-MLS-only indicated that program completion by the residents was a very important factor to a greater extent than did respondents with other types of programs, as represented by a chi-square of 0.000. This distinction was likely based on the potential challenges associated with the participants' completion of this type of program, in comparison to a one-year post-MLS program or a program that involves pre- and post-MLS components.

Factors such as resident involvement in research/scholarly activities and committee activities, size of applicant pool, involvement of former residents in refining the program, and so on were not identified as being of primary importance. However, these factors were identified as at least somewhat important by a large percentage of the respondents.

Respondents also were asked to identify other factors, not already addressed by the survey instrument, that they considered important in terms of program evaluation. Other factors they suggested included: quality of the experience for the intern/resident; quality and appropriateness of assignments available; "overall acceptance of the program by the staff at large," as well as "growth in acceptance of diversity in staff and cultural awareness on [the] part of staff of [the] library"; "recurring funding of the program"; "visibility of the program within the university community [in] further establish(ing) librarianship as a professional academic discipline to others in the university community"; "reputation" of the program via former residents; resident growth in terms of understanding academic libraries and personal confidence and ability; effectiveness of the mentoring provided; and retention of minority librarians in the profession after initial placement.

Sources of Input

In terms of the individuals whose input might be important in program evaluation, more than 96 percent of the respondents indicated that the input of residents, supervisors, and deans/directors was very or somewhat important (table 2), with input from residents apparently having the most weight. Their input was thought to be very important by almost 90 percent of respondents. Whereas the smallest percentage (55.2%) identified input from mentors as very important, more than 80 percent noted that input from the mentors is at least somewhat important, as was the case in terms of the importance of the input of program coordinators.

TABLE 2
Sources of Input in Program Evaluation
(% of respondents indicating "very important" or "somewhat important")

Residents              96.6%
Supervisor             96.6%
Dean/director          96.5%
Program coordinator    89.6%
Mentor                 86.2%

In addition, all the respondents with more than one-year pre- and post-MLS programs and two-thirds of those with post-MLS programs intended for more than one year said that input from supervisors was very important. A third of those with more than one-year post-MLS programs indicated that input from supervisors in program evaluation was somewhat important.

The importance of input from mentors in program evaluation appeared to reflect that mentors were more a part of the minority resident programs. In fact, one-third of the individuals with open-recruitment programs indicated "no response" to this survey item, whereas none of those with minority internship/residency programs did so, as represented by a chi-square of 0.022.

Frequency and Type of Evaluation

It is interesting to note the respondents' perceptions regarding the frequency of program evaluation. Nearly half (48.3%) identified annual evaluation cycles as appropriate for internship/residency programs. An additional 24.1 percent noted less-frequent biennial evaluations as the ideal. Far fewer recommended quarterly, semiannual, or some other evaluation cycle. "Ongoing" evaluation or "continuous feedback" also was considered important. Some respondents identified distinctions with regard to frequent, "informal" evaluation, as compared with more "formal" program evaluations that should be periodic but less frequent.

Respondents were asked about the importance of written evaluation/assessment of the programs. Whereas 21 percent of those who responded to this survey item indicated that written evaluation is very important, half described it as being only somewhat important. In fact, 17.2 percent were neutral in terms of this factor and the remaining individuals noted that it was of minimal importance or not important at all.

Summary and Conclusion

Because internship/residency programs exist to enhance recruitment and provide entry-level professional opportunities in academic and research library settings, often with a particular focus on minority recruitment, evaluating the extent to which the programs are successful in accomplishing these ends is an important consideration. This study identified several evaluation factors that library administrators consider most important for measuring residency programs, such as quality of the applicant pool, completion of the residency program, and subsequent placement in an academic library. These factors will be useful in designing program evaluations for internship/residency programs.

The study also identified which library staff should participate in program evaluations. Participants considered input from residents, supervisors, and deans/directors as most important. Input from program coordinators and mentors was viewed as less important in program evaluation. Interestingly, participants associated with programs focusing on the recruitment of minority interns/residents value input from mentors significantly more than do participants associated with open-recruitment internship/residency programs.

Study participants recommended that program evaluations be conducted annually or biennially. The responses suggest that formal program evaluation be documented in writing at intervals relative to the intended length of the program. Yet, respondents cautioned that formal, written evaluations should not take the place of more frequent, informal feedback gathered from residents, supervisors, and others involved in the residency program.

The evaluation criteria and guidelines identified in this study offer a model for assessing internship/residency programs in the future. Although concerns about cost (primarily in terms of staff time), the daily pressure of addressing immediate program demands, and in many cases the small size and newness of the program may hinder extensive evaluation methodologies, implementing an evaluation component does not need to be an elaborate process. Documenting select criteria at regular intervals, along with continuing informal feedback, can provide valuable program data.

Understanding and documenting how effectively internships/residencies achieve programmatic goals is essential for planning. Library administrators need information related to program successes and shortcomings to plan more effectively and to justify continued funding. In addition, they will be more persuasive in seeking new funding for internships/residencies when they are able to demonstrate a systematic process for program evaluation.

Notes

1. In most, though not all, instances, the term intern refers to the pre-MLS component of the program and the term residency to the post-MLS component.

2. "Guidelines for Practices and Principles in the Design, Operation, and Evaluation of Post-Master's Residency Programs," Library Personnel News 10 (May/June 1996): 1–3.

3. Julie Brewer, "Implementing Post-Master's Residency Programs," Leading Ideas 4 (Sept. 1998): 2–7.

4. Internship, Residency, and Fellowship Programs in ARL Libraries, SPEC Kit #188 (Washington, D.C.: ARL, 1992).

5. Teri Switzer and William Gentz, "Increasing Diversity: Programs and Internships in ARL Libraries," in Advances in Librarianship, ed. Frederick C. Lynden and Elizabeth A. Chapman (New York: Academic Pr., 2000), 169–88.

6. For example, see Jon E. Cawthorne and Teri B. Weil, "Internships/Residencies: Exploring the Possibilities for the Future," in In Our Own Voices: The Changing Face of Librarianship, ed. Teresa Y. Neely and Khafre K. Abif (Lanham, Md.: Scarecrow, 1996): 45–71; Jose Diaz and Kristina Starkus, "Increasing Minority Representation in Academic Libraries: The Minority Librarian Intern Program at the Ohio State University," C&RL 55 (Jan. 1994): 41–46; Deborah Hollis, "On the Ambiguous Side: Experiences in a Predominantly White and Female Profession," in In Our Own Voices: The Changing Face of Librarianship, ed. Teresa Y. Neely and Khafre K. Abif (Lanham, Md.: Scarecrow, 1996): 139–54; Joseph A. Boisse and Connie V. Dowell, "Increasing Minority Librarians in Academic Research Libraries," Library Journal 112 (Apr. 1987): 52–54; Betty Glass, "A Time of Transition," Library Journal 111 (Feb. 1986): 127–28; Sarah Shoemaker, "A Unique Experience," Library Journal 111 (Feb. 1986): 125–26; Molly Mahony, "Preparation for the Future," Library Journal 111 (Feb. 1986): 129–30.

7. Julie Brewer, "Post-Master's Residency Programs: Enhancing the Development of New Professionals and Minority Recruitment in Academic and Research Libraries," College & Research Libraries 58 (Nov. 1997): 528–37.

8. Marilyn H. McClaskey, Obianuju Mollel, and Linda DeBeau-Melting, "Making a Good Thing Better: The Residency Program at the University of Minnesota Libraries," presented at the Diversity Now Conference sponsored by The Big 12 Plus Libraries Consortium and the University of Texas at Austin, Apr. 4, 2000.

9. See http://www.arl.org/careers/residencies.html.

10. It should be noted that in some instances the one institutional response is based on collaboration and completion of the questionnaire by the dean/director, the program coordinator, and/or others, as reported informally to the researchers.