VIVA: Value Metric Project

Virginia's Academic Library Consortium


Findings

VIVA Value Metric  

VIVA Value Metric Revision 2021

Background
In 2021, a small working group formed to update the Value Metric to meet the current and evolving priorities of the consortium. The group's purpose was to ensure that the assessment tool reflected the consortium's priorities (as outlined by the VIVA Steering Committee) with respect to inclusion, diversity, equity, and open access. The working group also wanted to streamline data collection and improve overall workflows so that the tool would be more easily adoptable and actionable at VIVA member institutions.

In considering how to meaningfully update the tool, the working group focused on integrating the values of equity, inclusion, diversity, accessibility, and openness throughout the tool rather than separating them into a standalone category, such as a “diversity” or “accessibility” category. The thinking was that infusing these values across all of the categories would give them more weight and make them less likely to become a checkbox exercise. The group also acknowledged that this was not a static tool and that the metrics would need to be refined and updated regularly to reflect the consortium's priorities; the tool therefore needed a format that could accommodate ongoing revision. The group further prioritized eliminating any existing metric that was not actionable. Finally, the working group ensured that the revised metric did not award points for merely meeting minimum expectations.

The working group met with a wide range of VIVA communities throughout the update process, including publishers and vendors, university DEI committees, and accessibility experts in Virginia. It also solicited and incorporated feedback from VIVA member scholarly communication and collection development experts, including through a community-wide forum, to ensure the tool would meet the broader VIVA community's needs.

Updates 
The updated tool contains 25 metrics organized into seven broad categories: Statewide Relevance; Supports VIVA Values; Curriculum Alignment; Cost Effectiveness; User Experience; Product Administration; and Format Specific Criteria. The added equity, diversity, inclusion, and accessibility criteria fall into three broad areas: Content, Engagement, and Accessibility. These three areas were incorporated into Curriculum Alignment, User Experience, and Format Specific Criteria. The Open Access updates fell within Supports VIVA Values, Cost Effectiveness, and Product Administration.

The tool itself has been streamlined and redeveloped as a Qualtrics survey. The workflow is designed to facilitate collaboration among colleagues and to enable individuals to focus on scoring a single metric or a small set of metrics across all products. The survey approach has made the tool less intimidating for users and more easily adaptable to an individual institution's needs. It also allows for the creation of dynamic reports, so that Collections and Steering Committee members can quickly view updated results from the tool. These reports are meant to serve as a framework for beginning broader conversation and deeper analysis of the overall collections, rather than as the final decision point.
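
As a rough sketch of the kind of category-level roll-up such dynamic reports might provide, the example below aggregates hypothetical per-metric scores by product and category; the product names, metrics, and scores are invented and do not come from the actual VIVA rubric or Qualtrics export.

```python
from collections import defaultdict

# Hypothetical per-metric scores as they might be exported from a survey tool.
# Product names, categories, metrics, and scores are invented for illustration.
responses = [
    ("Product A", "Cost Effectiveness", "Cost per use", 4),
    ("Product A", "User Experience", "Accessibility conformance", 3),
    ("Product B", "Cost Effectiveness", "Cost per use", 2),
    ("Product B", "User Experience", "Accessibility conformance", 5),
]

# Roll per-metric scores up into a category-level average for each product,
# the kind of summary view a committee report might start from.
scores = defaultdict(lambda: defaultdict(list))
for product, category, _metric, score in responses:
    scores[product][category].append(score)

for product, categories in scores.items():
    for category, values in categories.items():
        print(f"{product} | {category}: {sum(values) / len(values):.2f}")
```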

VIVA Value Metric Task Force 2016-2017

Charge

Design and apply a framework for the coherent and holistic evaluation of VIVA products. The task force will determine what the highest collection development priorities are for the consortium and examine how these can be translated into quantifiable values. Potential factors to consider include relevance to programs, cost avoidance/list price discount, and usage.  Usage factors may be further delineated to include total usage, usage by institution type, ratio of usage by top institution(s), and cost per use. The end result will be an assessment framework and value metric system for the evaluation of shared resources that are reflective of VIVA’s overarching values.  
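
As a minimal, hypothetical worked example of two of the usage factors named in the charge (cost per use and the ratio of usage by the top institution), the sketch below uses invented figures; the institutions, usage counts, and cost are illustrative only, not VIVA data.

```python
# Hypothetical annual usage of one shared resource, broken out by institution.
usage_by_institution = {
    "Institution A": 12000,
    "Institution B": 3000,
    "Institution C": 500,
}
annual_cost = 45000  # hypothetical consortial cost, in dollars

total_usage = sum(usage_by_institution.values())
cost_per_use = annual_cost / total_usage
# Share of total usage contributed by the single heaviest-using institution.
top_institution_share = max(usage_by_institution.values()) / total_usage

print(f"Cost per use: ${cost_per_use:.2f}")                              # -> $2.90
print(f"Top institution's share of usage: {top_institution_share:.0%}")  # -> 77%
```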

Background

In the 2014-2016 biennium, VIVA received a 5% cut to its budget. This necessitated a close review and the cancellation of several key products. During this review it became clear that the consortium needed standardized evaluation criteria to apply when reviewing its resources. Because subscriptions carry annual price increases, the consortium must continually consider which resources it will be able to continue, even if it receives no new cuts. Critical to this process will be the ability to evaluate and clearly articulate the value of VIVA's shared research resources.

Regardless of which strategies are employed to measure the perceived value of content, grounded assessment must begin with establishing the priorities and goals of the institution and its users. Determining value at the consortial level adds a layer of complexity because of the wide range of institution types. All of VIVA's member institutions, for example, are part of the higher education ecosystem within Virginia, but they include both public and private institutions and range from large doctoral research institutions to small two-year community colleges to specialized medical, law, and other institutions. How these institutions perceive the value of a particular resource to their users will naturally differ.

The shared resources to be evaluated are themselves diverse in both format and access model. Formats include e-books, journals, databases, and streaming media, and access and acquisition models vary, encompassing content that is leased, collaboratively owned, demand-driven, open access, and evidence based.

To create a system that can be used to compare the relative value of its shared resources, while reflecting the consortium's highest collection development priorities and accounting for the diversity of materials and models, VIVA's Collections Committee has formed a Value Metric Task Force.

 

Working Group Members (2021-present)

Summer Durrant
University of Mary Washington

Helen McManus
George Mason University

Christopher Lowder
George Mason University

Genya O'Gara
VIVA

Task Force Members (2016-2017)

Genya O'Gara, Chair
Virtual Library of Virginia

Beth Blanton-Kent
University of Virginia

Cheri Duncan
James Madison University

Summer Durrant
University of Mary Washington

Madeline Kelly
George Mason University

Julie Kane
Washington & Lee University

Crystal Newell
Piedmont Virginia Community College

Anne Osterman
Virtual Library of Virginia