Challenges of Tracking Adoption and Impact: Discussion of Cost and Academic Data Management

This article is part of our August 2020 spotlight on open educational resources.

Discussions of open educational resources (OER) adoption and zero textbook cost (ZTC) courses often result in three types of data dissemination: cost savings, satisfaction, and materials. In this article we discuss the pros and cons of these categories, focusing on the ease or difficulty of collecting and managing data within each of them.

An important and necessary first step toward data management and meaningful data dissemination is to agree on definitions of OER and ZTC. From an institutional perspective, these shared definitions help data managers understand which courses to count, when to count students, and which materials qualify as OER and ZTC. Overall, it may be helpful to start with the results desired by all stakeholders (for example, cost savings, satisfaction, and materials) prior to deciding what data to collect.

Cost savings

One of the most widely reported, and admittedly flashy, data pieces is textbook cost savings. On average, across the nation, students report saving $90–$100 per semester when enrolled in an OER or ZTC course. Similarly, faculty nationwide report saving students between $90 and $100 per semester when they opt for ZTC in their courses. When multiplied by student headcount, the numbers add up quickly. In a course with 100 students, saving each student $100 equals a flashy $10,000. When just one class saves students $10,000, the savings and reduction in cost of attendance can skyrocket, and that is newsworthy.

While student cost savings is an important number, it is imperative that a campus decide how to determine and report that number. Academic leaders and faculty need to consider the following:

  • Which student headcount to use: Should the headcount be collected on the first day of classes, after the drop/add deadline, or at the end of the semester? This decision is important for consistency within the institution and for data collection. Centralizing these decisions, and making them early, is essential.
  • Which cost savings number to use: A textbook for the same course could cost $35 used or $300 new; which number should be used when calculating cost savings? The figure also depends heavily on discipline, with STEM textbooks regularly costing significantly more. Consistency and clear decision-making are the important points here: data collection needs to be planned so that data managers can understand and accurately explain what the figures represent. The widely reported $90–$100 cost savings figure represents the average of both new and used textbooks across disciplines that carry very different costs.
  • Which courses to measure: Should only courses that typically require a textbook, such as introduction to biology or English composition, be measured, or should courses that typically do not require textbooks (e.g., yoga) also be counted and measured? Is it appropriate to include adoptions in such courses in the overall cost savings numbers?
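
To make these decisions concrete, the underlying arithmetic can be sketched as follows. All headcounts and dollar figures below are hypothetical illustrations of how different policy choices change the reported number, not recommended values.

```python
# Sketch: how the headcount and cost assumptions discussed above change the
# reported savings figure. All numbers here are hypothetical illustrations.

def cost_savings(headcount: int, cost_per_student: float) -> float:
    """Reported savings = students counted x assumed per-student textbook cost."""
    return headcount * cost_per_student

# The same ZTC section, measured under three different institutional policies:
print(cost_savings(110, 300.00))  # first-day headcount, new-book price: 33000.0
print(cost_savings(100, 100.00))  # post-drop headcount, national average: 10000.0
print(cost_savings(92, 35.00))    # end-of-term headcount, used-book price: 3220.0
```

The same section yields savings figures an order of magnitude apart, which is why the article stresses centralizing these decisions early.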

As these questions illuminate, a data management plan should ensure that student cost savings data are clear and consistent. Faculty and academic leaders need to work together to understand what data are captured so that they can accurately explain what the data mean.

Satisfaction

Satisfaction encompasses three areas: satisfaction surveys, grades, and the number of students affected by ZTC use. Satisfaction can be understood as an overall, university-wide view of the human impact of ZTC. Satisfaction surveys are helpful for understanding students' and faculty's subjective perceptions of ZTC materials (e.g., use, quality, effectiveness, learning, and understanding compared to similar textbooks that cost money). Despite the critique that satisfaction measurement tools are subjective, they are an important element of data collection for understanding the impact of ZTC materials on the university broadly. Validated questionnaires that measure student and faculty satisfaction with ZTC materials are essential for data collection but rare (Redcay et al., in draft). The authors recommend using the validated ZTC satisfaction scale, available for free through Millersville University's repository (https://millersville.tind.io/record/6040) (Redcay et al., in draft). Data should be collected during the semester in which the ZTC materials are used, ideally a few weeks before the end of the term, to ensure a higher participation rate.

The second aspect of satisfaction is grades, which include a student's perception of the grade they will receive, their actual grade as reported objectively by the university, and DFW rates. DFW rates refer to the number of students who earn grades between 60 and 69 (D), earn grades between 0 and 59 (F), or withdraw from the course. DFW rates can be obtained through faculty self-report, but objective data pulled from the university registrar or institutional research office would increase data accuracy.
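The DFW computation described above amounts to a simple tally. The sketch below assumes final letter grades are available as a list; the sample section is invented data, not registrar output.

```python
def dfw_rate(final_grades: list[str]) -> float:
    """Share of enrolled students who earned a D, earned an F, or withdrew (W)."""
    dfw_count = sum(1 for grade in final_grades if grade in {"D", "F", "W"})
    return dfw_count / len(final_grades)

# Invented sample section: 10 students, one D, one F, one withdrawal.
section = ["A", "B", "C", "D", "F", "W", "B", "A", "C", "B"]
print(dfw_rate(section))  # 0.3
```

In practice the grade list would be pulled from the registrar or institutional research office, as the article recommends, rather than self-reported.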

Finally, the number of students affected by ZTC materials includes the total students enrolled in each section of the course. Some demographic variables are also useful to collect, such as gender, course and department (e.g., psychology, biology), student major, college (e.g., College of Math & Science), Pell Grant recipient status, and class standing (e.g., first-year). A study we conducted found a significant difference in ZTC satisfaction by gender and expected grade: male students and students who expected a D grade had significantly lower satisfaction with ZTC materials than did female students and those who expected As, Bs, or Cs (Pfannenstiel et al., under review).

Materials

When defining OER and ZTC, the institution must decide which materials count in each category and how to count courses that use a mix of open and no-cost-to-the-student materials. It is widely accepted that OER items are digitally available with an open license that allows remixing and reuse. It is moderately accepted that OER items include anything digitally available without cost, regardless of license or copyright restrictions. ZTC can include the above items as well as materials provided by the institution, often the library, at no direct cost to students.

Garnering usage statistics is often multifaceted. For digitally available open materials, are common access points used, such as links housed within a learning management system? Clicks from a post in a learning management system can often be tracked, but those from an email link cannot. Are these usage statistics manually reported by faculty to an institutional office, or can they be collected systematically? For course materials the library offers, an authenticated link records a statistic each time the work is accessed, whereas downloading the PDF once and then disseminating it to students registers only a single use. The latter inaccurately reflects real-world activity and stifles the library's ability to prove its worth.

Retaining access to electronic journals, books, and other published materials depends on increasing budgets, as the cost of access to these items increases annually. When budgets remain static or are reduced, research and course material availability diminishes. Librarians are in a unique position to provide expertise in identifying and accessing digital content; they often are aware of specialized repositories of primary source materials and subject-based journal and book collections and have access to streaming video options. While library vendors and publishers offer tools to systematically collect usage statistics, the difficulty lies in determining which items are used as course materials and which are used for research and other purposes.

To accurately assess use counts of subscription and purchased library content used as course material, the institution must rely on faculty to centrally report the content they assigned, thereby enabling the library systems to report appropriate usage metrics.

No matter the method of identifying, collecting, and quantifying the use of zero-cost-to-students course material, consistency is the only way to make the metrics meaningful.


As this discussion has demonstrated, data collection to assess the impact of OER adoption and ZTC courses is complicated. Many data points can help a campus understand the effects of adopting OER and ZTC materials in terms of cost, access, and learning. Data collection, management, and dissemination improve when university leaders coordinate with faculty and librarians to build a complete picture of OER and ZTC impact on the campus.


References

Pfannenstiel, A. N., Redcay, A., & Albert, D. (2020). Student satisfaction with OER/ZTC.

Pfannenstiel, A. N., Redcay, A., & Albert, D. (2020). Student perceptions of OER. https://millersville.tind.io/record/6040

Redcay, A., Pfannenstiel, A. N., & Albert, D. (2020). ZTC student satisfaction scale (ZSS). Millersville Repository.

Redcay, A., Pfannenstiel, A. N., Albert, D., & Freeman, A. (2020). The development and validation of zero textbooks cost (ZTC) satisfaction scale.

A. Nicole Pfannenstiel, PhD, is an assistant professor of digital media in English at Millersville University.

Alex Redcay, PhD, LCSW, is the president of the Pennsylvania Association of Social Work Education and an assistant professor of social work at Millersville University. Dr. Redcay is a licensed clinical social worker in New Jersey and Pennsylvania. Her current research includes the impact of social, legal, and medical transition on mental health and substance use for transgender, genderqueer, and nonbinary individuals.

Krista Higham, MSLS, is an assistant professor in the Library Department and access services librarian at Millersville University.

