A Framework for Measuring Relevancy in Discovery Environments


Discovery environments are ubiquitous in academic libraries, but studies of their effectiveness and use have mostly centered on user satisfaction, experience, and task analysis. This study aims to create a quantitative, reproducible framework to test the relevancy of results and the overall success of Washington State University’s discovery environment (Primo by Ex Libris). Within this framework, the authors use bibliographic citations from student research papers, submitted as part of a required university class, as the proxy for relevancy. The researchers created a testing model that includes: (1) a process to produce machine-generated keywords from a corpus of research papers to compare against a set of human-created keywords, (2) a machine process to query a discovery environment and produce search result lists to compare against citation lists, and (3) four metrics to measure the comparative success of different search strategies and the relevancy of the results. This framework moves beyond sentiment- or task-based analysis to measure whether materials cited in student papers appear in the results list of a production discovery environment. While this initial test of the framework produced fewer matches between researcher-generated search results and student bibliography sources than expected, the authors note that faceted searches achieved a greater success rate than open-ended searches. Future work will include comparative (A/B) testing of commonly deployed discovery-layer configurations and limiters to measure the impact of local decisions on discovery-layer efficacy, as well as noting where in the results list a citation match occurs.
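The core comparison in the testing model, checking whether works cited in a student bibliography appear in a discovery-layer result list and scoring different search strategies, can be sketched as follows. This is an illustrative assumption, not the authors' actual code: the function names, the loose title-normalization heuristic, and the sample titles are all hypothetical.

```python
def normalize(title: str) -> str:
    """Lowercase a title and strip punctuation so titles compare loosely."""
    return "".join(ch for ch in title.lower()
                   if ch.isalnum() or ch.isspace()).strip()

def match_rate(results: list[str], citations: list[str]) -> float:
    """Fraction of cited works that appear anywhere in the result list."""
    result_set = {normalize(t) for t in results}
    hits = sum(1 for c in citations if normalize(c) in result_set)
    return hits / len(citations) if citations else 0.0

# Hypothetical result lists for one student paper's keyword query.
open_results = ["A History of Tea", "Global Trade Routes"]
faceted_results = ["A History of Tea", "Tea and Empire", "Colonial Commodities"]
bibliography = ["A History of Tea!", "Tea and Empire"]

open_rate = match_rate(open_results, bibliography)        # 0.5
faceted_rate = match_rate(faceted_results, bibliography)  # 1.0
```

In this toy example the faceted search surfaces both cited works while the open-ended search surfaces only one, mirroring the pattern the study reports; a fuller implementation would also record the rank at which each match occurs.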

Author Biographies

Blake L. Galbreath, Washington State University

Core Services Librarian

Alex Merrill, Washington State University

Head of Library Systems and Technical Operations

Corey M. Johnson, Washington State University

Instruction & Assessment Librarian



How to Cite
Galbreath, B. L., Merrill, A., & Johnson, C. (2021). A Framework for Measuring Relevancy in Discovery Environments. Information Technology and Libraries, 40(2). https://doi.org/10.6017/ital.v40i2.12835