Like many academic research libraries, the University of Michigan Library has a promotion process for its librarians. The Library’s Policy for the Promotion and Appointment of Librarians states that librarians “hold academic appointments and are part of the faculty of the University” and “have the responsibility to determine the rules of governance under which they may seek to advance their careers as librarians.” And, as at many libraries, these policies need to be reviewed on occasion.
In Fall 2016, the Promotion and Appointment of Librarians (PAL) Task Force was charged by the Librarians’ Forum with reviewing our promotion process and making recommendations to better align what we do with the goals of both individuals and the Library. The task force consisted of six librarians at various stages in their professional careers, holding different ranks, some with promotion experience elsewhere. This varied experience brought fresh perspectives as we planned the research process.
The PAL Task Force recognized the scale and complexity of the task early on. With approximately 155 librarians participating in the promotion process, we had about six months to conduct research and analyze our results before writing recommendations for the final report. We knew we would need to use a variety of qualitative and quantitative research methods to get the best data (described below).
It was important to hear the voices of both those who had gone through the process and those who had yet to do so. With so many librarians spread across numerous locations on a huge campus, a survey seemed to be the ideal method: it could reach everyone and make it easy for people to provide input. The survey we disseminated had over twenty questions asking about people’s experiences with the promotion process and whether they thought the process provided value. The survey was at least started by 93 respondents (a 60% response rate) and anonymously provided a great deal of insight into expectations about the process.
The PAL Task Force also felt it was important to provide opportunities for librarians to have face-to-face conversations about their experiences with the promotion process. We conducted eleven focus groups, each with a homogeneous group: administrators, senior managers, Librarian Forum Board members, the Promotion Review Committee, members of Library Human Resources, and individuals from various ranks (Assistant Librarians, Associate Librarians, Senior Associate Librarians, and full Librarians). Additionally, we spoke with Dean James Hilton to hear his perspective on the promotion process, and individually with twelve other librarians who volunteered to provide feedback beyond the survey or focus groups.
As we gathered information from inside the Library, we recognized that it would also be helpful to review the promotion processes of some of our institutional peers and of other units on campus. Our competitive review consisted of collecting promotion documentation that was openly available on other libraries’ websites or provided by institutional peers we contacted. We conducted phone interviews about the promotion process at several Big Ten institutions, as well as at Cornell, UC Berkeley, University of Texas, and Toronto. We also reached out to the Bentley Historical Library, Kresge Library, and Law Library, and spoke with one U-M clinical faculty member, all of whom are on Michigan’s Ann Arbor campus but outside our promotion and appointment process.
Our final data source was quantitative data provided by Library Human Resources staff. This data included the number of people who had applied for promotion, successfully and unsuccessfully, over the past ten years (2007-2017). The data was interesting, but because it contained potentially personally identifiable information, we could share only summarized information in the final report.
Our Approach to Analysis
Analyzing this much data was overwhelming at first. No one on the team had the capacity to dive into all the data, so we decided to break into teams of two. One pair focused on the survey and focus groups, another on the competitive review, and the third pair worked with Human Resources to get statistics that might shine a light on any trends in promotion in the library over the past decade.
Fortunately, Kathy Kosinski, a part-time employee in the Design & Discovery department, was able to assist the task force with the analysis of this data. She used Dedoose to excerpt and code our qualitative data to find the major themes and supporting quotes. For example, in both the focus groups and the survey, the issue most important to librarians was having the Promotion Review Committee’s feedback, whether positive or negative, shared with the candidate (of the 92 quotations mentioning feedback, 35 were suggestions that candidates receive feedback on their portfolios).
[Figure: Suggestions (themes) from the focus groups, with the number of times each suggestion was mentioned]
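The theme-counting step behind that figure can be sketched in a few lines. This is only an illustration of the general technique of tallying coded excerpts, not how Dedoose works internally; the excerpts and theme codes below are invented examples, not our actual data.

```python
from collections import Counter

# Hypothetical coded excerpts: each quotation from a focus group or
# survey response is tagged with one or more theme codes during coding.
coded_excerpts = [
    ("Share committee feedback with candidates.", ["feedback"]),
    ("The timeline was unclear to me.", ["timeline", "communication"]),
    ("I wanted comments on my portfolio.", ["feedback", "portfolio"]),
]

# Tally how many excerpts mention each theme.
theme_counts = Counter(
    code for _, codes in coded_excerpts for code in codes
)

# List themes from most to least frequently mentioned.
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```

Reporting the most frequent themes this way is what let us say, for instance, how many of the 92 feedback-related quotations asked for portfolio feedback.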
Overall, it was reassuring to see trends, or consistent pain points, emerge from the analysis of our research. Each of the PAL Task Force members gained a deeper understanding of the challenges and benefits of various aspects of the promotion process, both in our library and at some of our peer institutions. Because of the sometimes sensitive nature of our work, discussions at our meetings never left the room. This helped us build trust with one another quickly and enabled us to be open and honest with our opinions. We also became comfortable with checking one another’s personal biases.
We discussed at length what to do with all the data that fell outside the scope of our investigation, and ultimately decided to include it, along with recommendations on how to address the issues it raised, in our final report.
I firmly believe that using multiple research methods strengthened the accuracy and validity of our research. Qualitative research data can be hard to process, and a tool such as Dedoose helped highlight themes across the various data sets. The variety of experiences on our team, along with scheduling enough time to discuss the research, also greatly contributed to the team’s success and our ability to wrangle so much data into coherent recommendations.
PAL Task Force members included Denise Leyton, Angie Oehrli, Jim Ottaviani, Carol Shannon, Lance Stuchell (co-chair), and Rachel Vacek (co-chair). Their final report and recommendations are in the University of Michigan’s institutional repository, Deep Blue.