
Assessment (Internal Use Only)

Outcomes-Based Statistics and Assessment

 

Start/End Dates Fall 2022
Title Outcomes-Based Statistics and Assessment
Purposes (reasons/motivation)

Requested by CDD Management Team (A recommendation from the 2022 ACRL external reviewers' final report)

Goals (results/directions)

To identify meaningful, impactful, and current statistics and assessment

Objectives (actions/steps)

"One way to work toward a positive vision of the future is to engage in the demonstration of library value, recognizing that the process is not one of proving value, but rather continuously increasing value... Because as librarians learn about library value—that is, what library services and resources enable users to do, what outcomes libraries enable users to achieve—they improve. When academic librarians learn about their impact on users, they increase their value by proactively delivering improved services and resources—to students completing their academic work; to faculty preparing publications and proposals; to administrators needing evidence to make decisions." (ACRL, 2010, p. 140)

 

"Planning is crucial to an effective assessment program. There are an endless number of things one could assess in the average library, and most libraries generate an astronomical amount of data each year without even trying. If you do not approach assessment strategically, you will likely be wasting your time and efforts for little or no payoff." (Fleming-May & Mays, 2021, p. 42)

Step 1. Understand what outcomes-based approaches are

"Rather than calculating inputs and outputs, outcomes-based assessment models focus on the end result of providing a resource or service." (Fleming-May & Mays, 2021, p. 92)

"An outcome is a specific benefit from a library program/service that can be quantitative or qualitative and is expressed as changes an individual perceived in themselves." (Project Outcome 101, What is an outcome?, 00:04:05) For example, patrons learn something new ("knowledge") about using digital resources, gain "confidence" in using digital resources, change their behavior by applying the new knowledge gained to their projects, and increase their awareness of library resources and services (Project Outcome, Introduction to Project Outcome, 00:00:37).

  • Metrics-based (outputs) approaches (How much did we do?): e.g., "I want to see an increase in ebook usage."
  • Outcomes-based approaches (What good did we do? What do all figures/transactions mean to users? Do users achieve their goals or gain something as a result of our collections/services? How have learners been changed as a result of our interactions?): e.g., "I want to see an alignment between ebook usage in mathematics and the courses offered or planned by the Mathematics department."

Simply speaking, it is about what institutional data (e.g., courses offered) we want to connect to or align with our library data (e.g., collection usage by subject).
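The alignment described above can be sketched in code. This is a minimal, hypothetical illustration: the subject names, course counts, and usage figures are invented, and real institutional and Alma data would arrive in very different shapes.

```python
# Hypothetical sketch: aligning institutional data (courses offered per
# subject) with library data (ebook usage per subject). All names and
# figures below are invented placeholders for illustration only.

courses_offered = {"Mathematics": 42, "History": 30, "Chemistry": 25}
ebook_usage = {"Mathematics": 310, "History": 1250, "Chemistry": 980}

def usage_per_course(courses, usage):
    """Return ebook uses per course offered, for subjects present in both datasets."""
    return {
        subject: usage[subject] / n_courses
        for subject, n_courses in courses.items()
        if subject in usage and n_courses > 0
    }

aligned = usage_per_course(courses_offered, ebook_usage)
for subject, ratio in sorted(aligned.items(), key=lambda kv: kv[1]):
    print(f"{subject}: {ratio:.1f} ebook uses per course")
```

Subjects with many courses but comparatively low usage would then become conversation starters with departments, rather than raw counts reported in isolation.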

Step 2. Identify what outcomes/impact/changes/contributions we want to see as a result of our daily work or targeted projects (e.g., curricular/program alignment). [Note: Please set numbers/statistics aside for now! I can share plenty of statistics examples (pp. 103-140) later.]

Step 3. Provide ideas about what we can do to achieve the goal(s) identified in Step 2 by:

  • Proposing conceptual/philosophical approaches
  • Creating worksheets that could help identify meaningful, impactful, and current statistics and assessment
  • Setting up IT and non-IT systems and creating an infrastructure that could facilitate outcomes-based assessment
    • Rod Library's fund codes
      • Revise fund codes to align with "User Details" as much as possible
    • Alma Analytics
    • Springshare
      • Use a "user-friendly" tool as an alternative or supplement to Alma Analytics
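One way to picture the fund-code revision above is as a lookup table that ties each acquisitions fund code to the user-facing categories we want to report against. This is a hedged sketch only: the codes and categories below are invented placeholders, not Rod Library's actual fund structure or Alma's actual "User Details" fields.

```python
# Hypothetical sketch: fund codes revised to encode the categories
# (e.g., department, format) that outcomes reporting would group by.
# Every code and category here is an invented placeholder.

FUND_CODE_MAP = {
    "MATH-MONO": {"department": "Mathematics", "format": "monograph"},
    "MATH-EBK":  {"department": "Mathematics", "format": "ebook"},
    "HIST-MONO": {"department": "History",     "format": "monograph"},
}

def department_for(fund_code):
    """Look up the department a fund code is aligned with, if any."""
    entry = FUND_CODE_MAP.get(fund_code)
    return entry["department"] if entry else None

print(department_for("MATH-EBK"))  # -> Mathematics
```

The design point is simply that if fund codes carry these categories consistently, expenditure reports can later be joined to user and course data without manual recoding.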

Step 4. Create a data management & privacy plan and related documentation that could help us stay up to date with meaningful, impactful, and current statistics and assessment (e.g., Johns Hopkins Libraries Privacy Policy)

Results  
Findings

A link to the Google Drive project folder (including handouts)

  • Figure 1. Areas of Library Value and Potential Surrogates, ACRL (2010, p. 19)
  • Figure 10. Areas of Library Impact on Institutional Missions, ACRL (2010, p. 102)
  • Conclusion, ACRL (2010, p. 140)
  • Appendix A. Academic Library Value Checklist, ACRL (2010, pp. 178-80)
  • More
Staff HSK
Additional Information

A link to a document, "Action Plan for Rod Library Assessment Plan"

A link to a document, "Assessment Planning: Holistic & Question-Driven Approaches"

A link to a document, "LAC2020 How to swim and survive in the Data Sea"

References

Appleton, L. (2017). Libraries and key performance indicators: a framework for practitioners. Chandos Publishing.

Association of College and Research Libraries (ACRL). (2010). The value of academic libraries report.

  • student enrollment (e.g., Recruitment of prospective students, Matriculation of admitted students, Recommendation of current students)
  • student retention and graduation rates (e.g., Fall-to-fall retention, Graduation rates)
  • student success (e.g., Internship success, Job Placement, Job salaries, Professional/graduate school acceptance, Marketable skills)
  • student achievement (e.g., GPA, Professional/educational test scores)
  • student learning (e.g., Learning assessments, Faculty judgments)
  • student engagement (e.g., Self-report engagement studies, Senior/alumni studies, Help surveys, Alumni donations)
  • faculty research productivity (e.g., Number of publications, Number of patents, Value of technology transfer, Tenure/promotion judgments)
  • faculty grants (e.g., Number of grant proposals (funded or unfunded), Value of grants funded)
  • faculty teaching (e.g., Integration of library resources and services into course syllabi, websites, lectures, labs, texts, reserve readings, etc., Faculty/librarian collaborations; cooperative curriculum, assignment, or assessment design)
  • overarching institutional quality (e.g., Faculty recruitment, Institutional rankings, Community Engagement)

ACRL. (n.d.). Project Outcome. [A short video, 00:05:00] & [A long video, 01:00:00]

Elsevier. (2022, September 23). Combining product analytics and user research to improve service and design. [A webinar, 00:08:37]

  • Did they achieve their goal?
  • How easy or difficult was it to use?
  • How does this experience compare?
  • What are their unmet needs?

Fleming-May, R. A. & Mays, R. (2021). Fundamentals of Planning and Assessment for Libraries. American Library Association.

  • Metrics-based (e.g., e-resource usage statistics)
  • Economic models (e.g., cost per use)
  • Standards-based (e.g., RUSA Guidelines)
  • Outcomes-based, focusing primarily on the user (e.g., user engagement or “use” in a manner that goes beyond simply counting)
    • the need or deficit to address
    • the specific actions, resources, or services that might address the need
    • the specific ways in which the resources or services under consideration would address that need
    • the future (short-, mid-, and longer-term) impact of the change(s) under consideration
    • the specific empirical evidence to demonstrate that impact
    • how that evidence will be collected
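The "cost per use" economic model listed above is simple enough to sketch directly. The subscription names, costs, and usage counts here are invented for illustration.

```python
# Minimal sketch of the "cost per use" economic model (Fleming-May & Mays).
# All prices and usage counts below are invented for illustration.

def cost_per_use(annual_cost, uses):
    """Annual cost divided by recorded uses; None when there were no uses."""
    return annual_cost / uses if uses else None

subscriptions = {
    "Journal package A": (12000.00, 4800),
    "Journal package B": (9500.00, 190),
}

for title, (cost, uses) in subscriptions.items():
    cpu = cost_per_use(cost, uses)
    print(f"{title}: ${cpu:.2f} per use")
```

As the quoted material stresses, such a figure is an input to an outcomes conversation (is the high-cost-per-use package tied to a program we must support?), not an end in itself.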

Kelly, M. M. (2020). The Complete Collections Assessment Manual. American Library Association.

Killick, S., & Wilson, F. (2019). Putting library assessment data to work. Facet Publishing.

Grand Challenges in Assessment (n.d.). Grand Challenges Publications.

Oakleaf, M. (2011). Are they learning? Are we? Learning outcomes and the academic library. The Library Quarterly, 81(1), 61-82.

  • The “Shared Learning Standards and Outcomes” comparison table, including the ACRL Standards, AAC&U LEAP Essential Learning Outcomes, AAC&U VALUE Rubrics, ISTE NETS-S Standards, NCTE 21st-Century Curriculum and Assessment Framework, Partnership for 21st-Century Skills, AASL Standards for the 21st-Century Learner, Common Core State Standards, exemplary cocurricular standards (e.g., PSU), and CAS Learning and Developmental Outcomes

  • Table 1. Student Learning Impact Map
  • Fig. 1. Library Impact Model

  • Table 2. Mission Impact Map

  • By capturing, tracking, and reporting the answers to these questions in a student learning assessment plan, librarians can record their impact on student learning.
    • What learning outcomes will be achieved?
    • What are the target student audiences for learning?
    • What opportunities for learning exist?
    • What is known about student learning? Not known?
    • What methods or tools would best assess learning?
    • How will student learning assessment data be analyzed?
    • How will librarians know that students have learned?
    • Who is responsible?
    • What is the timeline for assessment?
    • What resources are required?
    • What are the results of student learning assessment?
    • How will results be presented? To whom?
    • Who can make decisions and recommendations based on results?
    • What decisions and recommendations are made based on results?
    • What is the plan for following through and following up on the decisions and recommendations for change?

Oakleaf, M. (2012). Academic library value: The impact starter kit. Dellas Graphics.

Orr, R. J. (1973). Measuring the goodness of library services: A general framework for considering quantitative measures. Journal of Documentation, 29(3), 315-332.

Weiner, S. (2009). The contribution of the library to the reputation of a university. The Journal of Academic Librarianship, 35(1), 3-13.

  • This paper explores the relationship between a peer-assessed reputation rating for each of the 247 doctoral universities and cross-institutional performance indicators for universities and their libraries. Using multiple regression, library expenditures was the only consistently significant variable across the combinations of variables tested. This report is the first to show empirical evidence of the relationship between reputation and the measures used in higher education and library performance.
  • The variables in the "library" dimension were: library expenditures, library instruction presentations, number of participants at library presentations, reference transactions, and library professional staff. These variables represent functions that cut across boundaries within colleges and universities.

Benefits (e.g., Strategic Goals)

 
Required Resources  
Anticipated Duration  
Possible Deadline