This blog post describes how combining multiple streams of data benefited two recent U-M Library studies. For example, one study merged survey data, U-M human resources data, and Library document delivery data to provide a rich picture of how diverse groups on campus use and experience the Library's document delivery service. The advantages of joining multiple data sources in assessment projects are discussed in the context of these two example studies.
Posts tagged "quantitative"
Chances are the work processes you already have in place are generating data that you could be using to learn more about those processes. In this second blog post, the author continues to highlight steps for working with data that is generated by your daily tasks.
Chances are the work processes you already have in place are generating data that you could be using to learn more about those processes. In two blog posts, the author shares some steps for working with data that is generated by your daily tasks.
Assessment and research activities focused on the experiences of U-M Library faculty, staff, and students happen regularly, and the Library Human Resources (LHR) team often contributes to these activities when not leading the research itself. This work can focus on quantitative data, qualitative data, or a hybrid of the two, and can involve surveys, interviews, and/or some general number-crunching. This post reviews some recent LHR assessment projects.
When planning an assessment project in the Library, one important step is to consider whether your project should be vetted by the Institutional Review Board (IRB) at U-M. The IRB is a committee that ensures studies with human subjects are ethical, that subjects are protected from unnecessary psychological or physical risks, and that subjects participate in a fully informed, voluntary manner. This post explains when your data collection may require a full IRB application and review process.
The 2018 Library Assessment Conference (https://libraryassessment.org/) brought together a community of practitioners and researchers who have responsibility or interest in the broad field of library assessment. This post recaps the conference poster content presented by Laurie Alexander and Doreen Bradley about how analytics advanced the Library's internal understanding of the course-integrated instruction provided by Library staff.
Not everything a library wants to know is available via web-scale analytics tools such as Google Analytics. Custom instrumentation and logging are often the best way to answer usability and analytics questions, and can also offer better protection for patron privacy.