Effectively Applying Usage Statistics in E-Resource Collection Development
Using Evidence and Outreach in Decision-Making
ACRL-MD – New Identities: Adapting the Academic Library 
November 14, 2014 
Randy Lowe – Collection Development, Acquisition & Serials Librarian, Frostburg State University
Overview
• Why E-Resources Assessment?
• Usage Statistics – Types, Reports, Collection
• Assessment: Evidence & Outreach
◦ Applying usage statistics to collection management decision-making
◦ Engaging librarians, faculty and administrators in the process
Why E-Resource Assessment?
• Libraries have historically measured use of services (circulation statistics, re-shelving counts, gate counts, etc.)
• The technology upon which e-resources reside inherently allows for extensive collection of usage data – and assessment of that use
• Assessment of use data supports evidence-based collection management
• Libraries operate in a challenging fiscal environment – demonstrating e-resource value and fiscal responsibility is a must
Effective E-Resources Assessment
• Two essential elements in conducting effective e-resource assessments:
◦ Efficient and Accurate Data Collection
◦ Clear and Succinct Analysis
• E-Resource assessment is more than just collecting usage statistics – it is applying them to make sound management decisions regarding library resources
• Usage statistics measure the volume, not the value, of resources
What Can You Do with E-Resources Usage Statistics?
• Track usage / Assess overall collection use
• Track expenditures / Figure cost-per-use (see the sketch after this list)
• Track turnaways
• Assess title, subject, publisher and other usage elements
• Identify user behavior trends
• Assist in making collection development decisions, including acquisition model selection
• Effectively advocate for resources – especially if assessment is tied to institutional goals/strategic plan, curricular initiatives, student learning goals
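
Cost-per-use is the core arithmetic behind several of these bullets: a resource's annual cost divided by its total use over the same period. A minimal sketch in Python, with all titles, costs and use counts hypothetical:

```python
# Cost-per-use = annual cost / total use for the same period.
# All titles and figures below are hypothetical.

def cost_per_use(annual_cost, total_uses):
    """Return cost-per-use, or None when a resource saw no use."""
    return annual_cost / total_uses if total_uses else None

# (annual cost in dollars, COUNTER full-text requests for the year)
subscriptions = {
    "Database A": (5000.00, 12500),
    "Database B": (7500.00, 310),
    "Database C": (1200.00, 0),
}

for title, (cost, uses) in subscriptions.items():
    cpu = cost_per_use(cost, uses)
    label = f"${cpu:.2f} per use" if cpu is not None else "no recorded use"
    print(f"{title}: {label}")
```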
Types of Usage Statistics Reports and When to Use Them
• Vendor-Defined
◦ Analyzing usage data from a single vendor
◦ Obtaining cost information
◦ Comprehensive data files make it easy to analyze combinations of various data elements [Example]
◦ When COUNTER reports do not provide adequate detail
• COUNTER-Compliant (see the loading sketch after this list)
◦ Analyzing usage data across multiple vendors
◦ Ensuring data integrity through adherence to recognized standards
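
When COUNTER reports arrive as CSV exports, they can be loaded directly for cross-vendor comparison. A sketch assuming a Release 4-style BR1 export; the file name, column names, and the number of metadata rows preceding the header vary by vendor, so treat all of them as placeholders:

```python
# Load a COUNTER-style CSV export (e.g., a Release 4 BR1 book report).
# COUNTER exports open with several report-metadata rows before the
# column headers; adjust skiprows to match the vendor's file.
import pandas as pd

usage = pd.read_csv("br1_report.csv", skiprows=7)  # hypothetical file/layout

# Column names are illustrative -- check the actual header row.
top_titles = (
    usage[["Title", "Publisher", "Reporting Period Total"]]
    .sort_values("Reporting Period Total", ascending=False)
)
print(top_titles.head(10))
```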
Collecting Usage Data
• Define Objectives
◦ What you need to know or are trying to find out should drive your data collection decisions
◦ Collecting usage statistics can be a major time commitment
• Use your assessment objectives to determine not only what data to collect, but also when you have collected enough data to analyze
• Properly balancing the time and resources dedicated to data collection and analysis is vital
Collecting Usage Data
• Vendors present data differently – this can be a challenge not only across vendors, but even when combining data elements from a single vendor
• Manipulation / formatting of raw data will likely be necessary
• Example – COUNTER BR1 Report + Acquisition Type Data + Cost Data Compiled Manually = Data for Assessment (see the sketch after this list)
• Schedule time(s) to collect data
• Vendors’ archival policies for maintaining usage statistics vary
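
The BR1 example above can be automated once the manually compiled pieces exist as their own files. A sketch of the join, with every file and column name a placeholder for whatever your local records actually use:

```python
# Join COUNTER BR1 usage with locally maintained acquisition-type and
# cost data to produce a single assessment file. All names hypothetical.
import pandas as pd

usage = pd.read_csv("br1_report.csv", skiprows=7)   # vendor usage export
acq = pd.read_csv("acquisition_types.csv")          # e.g., ISBN, Acquisition Type
costs = pd.read_csv("costs.csv")                    # e.g., ISBN, Cost

merged = (
    usage.merge(acq, on="ISBN", how="left")
         .merge(costs, on="ISBN", how="left")
)
# Zero-use titles produce inf here; flag them rather than hide them.
merged["Cost Per Use"] = merged["Cost"] / merged["Reporting Period Total"]
merged.to_csv("ebook_assessment.csv", index=False)
```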
Assessing Usage Data
You have usage data – what do you do with it?
• It is easy to get overwhelmed by usage data – analysis should be guided by your assessment objectives
◦ What do you want/need to assess?
◦ What questions are you trying to answer?
◦ Who is your audience?
• Have a purpose for using your data
Assessing Usage Data
• Assessment is most powerful when it is tied to an action or potential action (including requests)
• There is no single method for assessing usage statistics in every case – the “right data” to analyze and include in your report is that which will support your assessment objectives
Usage Data Analysis
• Data analysis should be thorough, but presented succinctly
• Conclusions, trends, etc. should be clear and verifiable
• Beware of preconceived notions, perceptions or opinions – the data may either confirm or refute your hypotheses
• State the known limitations of the data you have collected and how they may affect your analysis
Using/Applying Evidence: Writing Your Report
• Know your audience
• Include a brief purpose/introduction
• Write clearly and succinctly
• Reported usage data should support the purpose of the assessment
◦ Only include data that supports your stated objectives – don’t include all collected data; it won’t be read by administrators
Using/Applying Evidence: Writing Your Report
• Reported usage data should support the purpose of the assessment (continued)
◦ Include data within the text of your report where it is necessary and provides clear evidence for the points you are making
◦ It is usually more effective to include visual representations of data (charts, graphs) rather than just figures within the text of reports – see the charting sketch after this list
◦ Larger tables and data sets, if necessary to include, are best placed in appendices
• Conclusions and recommendations should be easily identified and based on the evidence presented
• State the desired action and/or response clearly
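
For the chart point above, a few lines of plotting code can replace a figure-dense paragraph. A sketch using matplotlib; the titles, values and fiscal-year label are invented for illustration:

```python
# Horizontal bar chart of cost-per-use for inclusion in a report body.
# All names and numbers are hypothetical.
import matplotlib.pyplot as plt

titles = ["Database A", "Database B", "Database C", "Database D"]
cost_per_use = [0.40, 1.85, 3.10, 24.19]

fig, ax = plt.subplots(figsize=(6, 3))
ax.barh(titles, cost_per_use)
ax.set_xlabel("Cost per use (USD)")
ax.set_title("Database Cost-per-Use, FY2014")  # hypothetical label
fig.tight_layout()
fig.savefig("cost_per_use.png", dpi=150)
```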
Using/Applying Evidence: The Frostburg Experience
• Effectively applying e-resources data to collection management has been an evolution
• The lay of the land – 2007
◦ We had data (searches & link resolver)
◦ Study to compare journal costs by format
◦ Data sat in a vacuum outside of annual database budgeting
• Needed to establish a frame of reference to begin applying usage statistics in engaging faculty and administrators
Evidence & Outreach Example 1: Faculty Survey – 2007-2008
• Faculty had not previously been engaged systematically in collection development efforts
• User behavior demonstrated in link resolver statistics indicated that users preferred online full text
• Library determined that periodicals and standing orders should be migrated to online format – but which ones?
• Fall 2007: Faculty surveyed regarding the value (content) and usefulness (format) of journals, standing orders and databases
• Spring 2008: Survey results matched link resolver usage statistics
• Subscription cancellations, additions and format migrations made over the next 5 years
Evidence & Outreach Example 2: Underutilized Journals
• Library began collecting full-text article retrievals in 2009-2010 (and re-shelving counts in 2011-2012)
• All journal subscriptions are reviewed by librarians annually (a first-pass sketch follows this list)
• Faculty are involved in a second level of review for underutilized subscriptions
• Objective is to use the process as a means for continued dialogue with faculty in collection development
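
A first-pass annual review like this can be scripted by flagging titles whose cost-per-use exceeds a locally chosen threshold, leaving the judgment calls to the librarian and faculty review. A sketch, with the threshold, file and column names all assumed:

```python
# Flag subscriptions for second-level (faculty) review when cost-per-use
# exceeds a local threshold. Threshold and column names are placeholders.
import pandas as pd

THRESHOLD = 25.00  # hypothetical dollars-per-use cutoff

journals = pd.read_csv("journal_review.csv")  # Title, Cost, Full-Text Retrievals
# clip(lower=1) avoids division by zero; zero-use titles keep their full cost.
journals["Cost Per Use"] = journals["Cost"] / journals["Full-Text Retrievals"].clip(lower=1)

flagged = journals[journals["Cost Per Use"] > THRESHOLD]
flagged.to_csv("underutilized_candidates.csv", index=False)
print(f"{len(flagged)} titles flagged for faculty review")
```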
Evidence & Outreach Example 3: Collaboration with Academic Depts
• Academic departments becoming increasingly engaged in e-resource subscription discussions, including funding
◦ Chemistry – CAS SciFinder
◦ Visual Arts – Artstor
• Current collaboration is with Biology
◦ Department not satisfied with current e-resources
◦ No funds available for additional resources
◦ Reviewed use of current journal subscriptions and content of requested databases
◦ Department suggested journal cancellations to fund databases
◦ New e-resource scenarios developed
Evidence & Outreach Example 4: E-Book Assessment
• Frostburg State University: Report overall use of and expenditures on e-books over time; implement the most cost-effective DDA acquisition model(s) [Report]
• USMAI Consortial E-Book Pilot: Assess the effectiveness of a specific DDA acquisition model for the consortium; track use and expenditures by consortium members and user types; identify possible future program funding models [Report]
Thank You
• Questions?
• Contact Information:
Randy Lowe
Frostburg State University
rlowe@frostburg.edu
