2011 Charleston Conference

BY POPULAR DEMAND: BUILDING A CONSORTIAL DEMAND DRIVEN PROGRAM
Our 45 Minutes

• Overview of Project
• Lessons Learned
  - Process
  - Funding
  - Workflow
  - Communication
  - Marketing
  - Training
• Evaluation
• Questions & Answers
Partners
Alliance Structure

(Org chart)
Council of Library Directors
  Board of Directors
    Task Forces and Teams | Standing Committees | Interest Groups | Alliance Staff

Collaborative Technical Services Team (CTST) and Implementation Team
Timeline

• 2008/2009
  - Alliance Strategic Agenda Adopted
  - May-June 2009: Ebook Task Force Issued Report
• 2010
  - January 2010: Ebook Team Created
  - August 2010: Selected EBL and Demand-Driven Acquisitions Model
• 2011
  - January 2011: Implementation Team Created
  - July 2011: Project Went Live
• 2012
  - Recommendation for Permanent Program?
  - Propose New Funding Model?
Goals

• Leverage Vendor Relationships
• Consortium-Wide Access
• Ownership
• Collaborate with CTST
• Equitable Funding Model
• Relevant Content for All 36 Libraries
• Evaluate the Project
Funding Model

• FTE of 17,000+ = $15,000
• FTE of 8,000-17,000 = $10,000
• FTE of 4,000-8,000 = $7,000
• FTE of 1,000-4,000 = $5,000*
• Community Colleges = $2,500
• Total = $231,000

* Bulk of Contributions
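The tier schedule above reads naturally as a simple lookup. The sketch below is illustrative only; in particular, how an institution sitting exactly on a shared boundary (4,000, 8,000, or 17,000 FTE) is treated is an assumption, since the slide's ranges overlap at their edges.

```python
def pilot_contribution(fte, community_college=False):
    """Annual pilot contribution in USD, per the Alliance tier schedule.

    Boundary handling at shared tier edges (4,000 / 8,000 / 17,000 FTE)
    is an assumption: the slide's ranges overlap at those points.
    """
    if community_college:
        return 2_500
    if fte >= 17_000:
        return 15_000
    if fte >= 8_000:
        return 10_000
    if fte >= 4_000:
        return 7_000
    return 5_000
```

Under this reading, for example, a 9,500-FTE institution would contribute $10,000.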
Collaboration & Organization
Collaboration & Organization

Vendor Relations

• Full Partnership Was Key
• Bring Vendors in Early
Collaboration & Organization

Committee Staffing

• Rely upon proven organizational strategies
• Team overlap was valuable
• Ability to see beyond own library environment
• Technical discussion should be clear and understandable
• It takes more time than anticipated
Collaboration & Organization

Process

• Discovery and access were more challenging than defining the approval profile
• Can’t start work on discovery and access issues too soon
• Complex and multi-level solutions will complicate the process and take more time
Collaboration & Organization

Managing Expectations

• Keep in mind, it’s a pilot
• Remain flexible
• Recognize the apprehension of participating libraries
• For the consortium: have confidence in the team
• Managing expectations should be part of the plan
Workflow
Workflow

Environmental Scan

• Summit
• Innovative, WCL, and Evergreen OPACs
• Varying Needs, Resources, Expectations
Workflow

Goals and Approach

• One “best-of” Process
• Gather & Measure
• Define
Workflow

Requirements

• WCP (WorldCat Cataloging Partners) Records
• Knowledge Base
• Alliance Holdings Symbols
• Monthly Record Delivery Schedule
Workflow

The Workflow

• Knowledge Base-Centric
• Monthly
• Batch-Based
Workflow

Results & Revisions

“Cosmetically” successful, but...

• Unsustainable → Sustainable
• Off-base → On-target
• Tardy → Timely
Workflow

Lessons

• Difficult, Time-Consuming Work
• Build for Long-Term Sustainability
• Target Common Needs
• Flexibility
• Know Your Environment
Communication

NEVER EASY AND ALWAYS NECESSARY
Communication

Communication Structure

• Alliance Structure
• Channels
• Vendors
Communication

Marketing

• End User
• Internal
• Surprise Results
Communication

Training

• Training Sessions
• Online Documentation
• Annual Meeting
Communication

Lessons

• Communications management is important throughout the project
• Responsibilities of teams and team members should be clear from the start
• Set clear expectations of how, where, and when information will be shared
• Can’t repeat the message too many times
Evaluation of the DDA pilot

       WHY ALL THE FUSS?
Real-time Data

• Weekly emailed reports of group usage and expenditures
• Reports posted on Alliance website for archive access
• Monthly emailed spreadsheets with usage and expenditures by library
• Alliance-specific reporting site
Midpoint Evaluation

• What is the midpoint?
• Survey to assess effectiveness of training
• Documents for libraries to self-assess
• Profiles of library usage
Midpoint Evaluation

Usage Profile: Reed College

• Loaded records locally: 7/11/11
• Classes started: 8/29/11
• 118 uses as of 10/14/2011
• Total list price of content used = $12,004.14
• Contribution: $5,000
• Reed College has paid 42% of list price to facilitate access to content at its library
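The 42% figure follows directly from the two dollar amounts in the profile; a quick check of the arithmetic:

```python
# Reed College's cost share: fixed pilot contribution divided by the
# list price of the content its users actually accessed.
contribution = 5_000.00
list_price_used = 12_004.14

share = contribution / list_price_used
print(f"{share:.0%}")  # prints 42%
```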
Final Evaluation

• Yet to come
• Survey of technical and content aspects of pilot
• Evaluation of MARC records and discovery of content
• Close analysis of usage, focusing on content, collaboration, discovery, and return on investment
Additional Information

• Orbis Cascade Alliance DDA Site
  http://www.orbiscascade.org/index/demand-driven-acquisitions-pilot
Thank You

QUESTIONS & ANSWERS


Editor's Notes

  • #2: Our panel is on a demand driven acquisitions pilot project that is a partnership among Orbis Cascade Alliance, EBL, and YBP.
  • #3: I will provide an overview of our pilot project. Each of our panelists will describe an element of our project, along with lessons learned. We believe that our lessons learned are applicable to different consortia and individual libraries.
  • #4: As mentioned this is a partnership between the three parties.
  • #5: Orbis Cascade Alliance is a consortium of 36 academic libraries in Oregon and Washington. Institutions range in size from Chemeketa Community College to the University of Washington. The governance structure consists of a Council of Library Directors that includes a representative from each institution. This body oversees all the Alliance’s activities, including policies, programs, and the budget. The Alliance also has a Board of Directors. There are a variety of groups within the Alliance who manage and carry out different programs and initiatives set by the Alliance Council. The implementation team is a temporary team that is under the Collection Development and Management Committee (CDMC). A Collaborative Technical Services Team (CTST) is also participating in the demand-driven acquisitions pilot project through its ebook working group.
  • #6: In late 2008 and early 2009, the Alliance Council developed a strategic agenda that led to this pilot project. Our pilot project contributes to two of the five elements of the strategic agenda, cooperative collection development and collaborative technical services. Along with several other proposals related to cooperative collection development, CDMC explored different possibilities of a shared ebook program. In spring of 2009, an ebook task force was charged to investigate the opportunities and challenges associated with a shared ebook program. The task force’s report and recommendations led to the creation of a new ebook team. Starting in early 2010, this new ebook team was charged to select an acquisitions model for ebooks. The team investigated different acquisitions models, issued an RFI to ebook aggregators and publishers, and selected a demand-driven acquisitions model with EBL as the vendor of choice. Last January at ALA Midwinter, the implementation team, including representatives from EBL and YBP, held its first meeting. This new team also included members of the CTST. Our implementation team initially set a go-live date of May 2011. As a result of workflow issues, we delayed the start of the pilot project until July 2011. We have reached the midpoint of our pilot project. We expect to continue the project while funds exist, which we anticipate lasting until January or February of 2012. Next week, the Alliance Council will vote on a recommendation to extend funding through the remainder of this fiscal year. In January 2012, the implementation team will make a recommendation on whether this should turn into a permanent program and outline a more sustainable funding model.
  • #7: The implementation team has met all its original goals for a shared ebook program. The participation of EBL and YBP has been critical to our success. We are also meeting our stated goal of participation by all 36 members. We are moving closer to the idea of one collection for 36 institutions through the ownership of ebooks by all the libraries. By selecting an aggregator, we provided content relevant to such a diverse group of libraries.
  • #8: Another key component of our project was every library had to contribute funds to the pilot project. This model is based on our model for e-resources that are purchased through the Alliance. For a permanent program, we hope to use a combination of FTE, acquisitions budget, and usage from the pilot project.
  • #15: As might be expected, the technical services environment across the Alliance libraries is complex. The consortium operates a shared WorldCat Local catalog, Summit. Some of the member libraries also utilize institutional instances of WorldCat Local, others maintain Innovative catalogs, some run both, and one library uses Evergreen. As such, the pilot’s implementation team knew that any successful workflow would need to speak to a diverse set of needs, resources, and expectations.
  • #16: That said, the prospect of creating and managing 36 unique workflows was smartly deemed too time-consuming and expensive. Rather, the implementation team aimed to deliver a single workflow that could be effectively utilized by all. To help identify the best process, the implementation team and the Alliance’s Collaborative Technical Services Team did a lot of information gathering. They solicited library feedback via survey. The team asked the vendors to detail the cataloging options, defining such details as the availability of OCLC control numbers, the cost, and the quality of records. Finally, being a WorldCat Local consortium, the Alliance consulted with OCLC. These efforts provided a map for the workflow’s requirements. The survey revealed, for instance, that the majority of libraries planned to promote discovery through Summit and local record loading and were prepared for OCLC batch record retrieval and loading. The Alliance’s Collaborative Technical Services Team led the communication with OCLC. Of primary concern was how discovery and access would be achieved via Summit, as there was not a locally hosted instance of the shared catalog. The team learned this could be addressed by using a new service, the WorldCat Knowledge Base, which allows libraries to set holdings for electronic collections, facilitating access via “one-click” links.
  • #17: Taken together, the member library, OCLC, and vendor feedback advised the following requirements: the delivery of OCLC WorldCat Cataloging Partners (WCP) records; use of the OCLC Knowledge Base; two Alliance holdings symbols, one to designate non-owned pilot content and one to designate purchased pilot content; and a monthly record delivery schedule.
  • #18: The resulting workflow included several defining features. It was very much focused on updating and tracking the status of the Knowledge Base: twice monthly, EBL sends files to OCLC to facilitate the creation of access links in Summit. It was monthly, with OCLC updating the KB at the end of each month, followed by the delivery of MARC records. And it was batch-based: OCLC made the MARC file available via EDX, the Alliance’s Collaborative Technical Services Team retrieved it and then distributed it to the member libraries via a central website. And, finally, those libraries loading records processed the files.
  • #19: As I’ve described, the Knowledge Base was central to the activities and the schedule of the workflow. Largely, it successfully delivered the automated creation of “single-click” access links in Summit and in member library instances of WCL. For the most part, the Alliance, EBL, and YBP expected, planned for, and weathered the idiosyncrasies and surprises that come with any new product. Managing the workflow was not simple, however, for either the vendors or for the Alliance. It necessitated a large and time-consuming body of manual activities and workarounds. For example, for EBL, ensuring the monthly record orders only included titles with Knowledge Base links required a title-by-title review. For the Alliance, reviewing, troubleshooting, and processing the monthly MARC files involved the efforts of multiple team members and many hours of work. Looking toward the future and a post-pilot program, the workflow was unsustainable. Moreover, member library reporting revealed that the majority of institutions were not relying on Summit or the KB to facilitate discovery and access to the pilot content. In this sense, the focus of our efforts was off-base. Finally, while new content was added to the program on a weekly basis, discovery via Summit and the library catalogs was consistently one to two months delayed. In late September 2011, after an evaluation of the strengths and weaknesses of the workflow, the implementation team created a revised process. Most significantly, the team chose to disconnect the delivery of records from the status of the Knowledge Base, and to receive weekly record deliveries. These changes were made to facilitate a timelier discovery service and to build the foundation for a sustainable, automated long-term workflow. The Knowledge Base remains an integral component of the process, but its status no longer dictates the timing of the workflow’s activities.
In this, the Alliance had to make a difficult compromise: without the careful timing of the original workflow, Summit access for some portions of the pilot content is broken – the needed links do not appear. This was accepted, however, as the new workflow better meets the discovery and access needs of most Alliance users.
  • #20: While the original workflow proved to be problematic and was modified, the implementation team absorbed several valuable lessons. Initially, as I’ve said, the team underestimated the time and resources required. While this mistake was quickly discovered, the project would have benefited from an earlier understanding of it. The team also learned the importance of beginning with a sustainable workflow, rather than building difficult-to-manage, manual processes we hoped would be automated in the future. Activities should be prioritized to address majority requirements and expectations. While Summit is an important resource for the Alliance and the workflow still addresses its needs, it is no longer the central focus of the workflow, as most of the member libraries are not using it as a primary discovery interface. The changes the implementation team made to the workflow highlight the importance of flexibility. From the start, the team understood that modifications, in all areas of the project, might be needed in order to achieve success. Finally and most importantly, all of the team’s activities and decisions were informed by a holistic understanding of the Alliance’s technical services environment. While it could have been used more effectively at times, without this data and its detail, the team would have been lost and the workflow irrelevant.
  • #21: While there were no surprising lessons learned regarding communication, the experiences of the implementation team underline the importance of ongoing management of information sharing. In the following section, we will take a quick look at how communications were handled through the Alliance structure, how training was planned and implemented, and the marketing efforts pursued in support of the project.
  • #22: The existing committee structure has provided the framework for reporting out at the committee, council, and individual level. The implementation team reports back to the CDMC, CTST, and the Board through existing channels. While this has worked well in general, adjustments have been made to address communication problems as they arose. As the project was moving from the planning to the implementation phase, it became clear that representation from each institution was needed to support information sharing between the group and member libraries. As a result, the DDA Liaison group was formed. This group is supported by a new listserv, through which the implementation team can request and disseminate information. One issue that wasn’t recognized until the project was underway was that the two groups working on the project, the implementation team and the ebook cataloging team, were working in parallel with not entirely clear charges. In hindsight, both teams could have worked more effectively had they met early on in the process to compare charges and discuss decision making. The workflow issues that surfaced during implementation required technically difficult conversations about records and this, at times, left the team overwhelmed. This, combined with the lack of clarity of the teams’ charges, led to delays in making decisions and moving the project forward. When the implementation team began grappling with these workflow issues, the decision was made to add the head of the ebook cataloging team to the implementation team. This addition allowed more direct communications about the technical challenges both groups were confronted with. These issues bring into focus the need for team members who are willing to work with complex issues and stay focused on the task at hand. The implementation team greatly benefited from one member, in particular, with the ability to take complicated information and make it comprehensible to the rest of the team.
Finally, in addition to implementation team members and Alliance staff, both EBL and YBP were represented at every meeting. By including vendors in the process from the beginning of the project, the Alliance benefited by having the vendors’ close attention and understanding of the project.
  • #23: For the pilot project, the team decided against the use of end-user promotion. Issues considered in making this decision included limiting variables to be evaluated, the size of the pool, and the nature of pilot projects. This decision raised questions among the public services community but appears to have been adhered to. Along with the other adjustments agreed upon in September, the ban on end-user marketing is now also being reviewed in light of the low usage statistics. It is possible that this policy will change for the final phase of the pilot. While the implementation team has worked hard to manage communication within the Alliance, the need for more internal marketing is an issue and something that is currently being addressed: there was discomfort with the new model; it would have been good to have a general list for updates; people needed reminding that this is a pilot and flexibility is required; and questions could not always be answered right away, as answers were still being worked out. Finally, one possible marketing success is that the project appears to be increasing the acquisition of ebooks within the Alliance at the individual library level. A number of libraries with little or no previous ebook experience have started to purchase ebooks since the implementation phase of the project began.
  • #24: Two training sessions were held in central locations soon after the project launched in July, and training materials were made available online for those who could not attend in person. An annual Alliance meeting also provided the opportunity for an additional training and informational session with members of the CDMC, library directors, and others from the Alliance. Based on survey results, the training sessions were well received overall. That said, it was clear, especially during the general Alliance meeting, that communication channels had not been working as planned, given the number of questions about the basic structure and workflow of the project; the group received questions that had already been covered in the implementation team's FAQ and in email updates to the liaison group. While the training sessions were well received, there was some concern about their timing. Due to the workflow issues that arose late in the implementation process, the scheduling of these sessions was delayed several times. In hindsight, it is clear that explaining the reasons for the delays at the time might have allayed concerns.
  • #26: Why so much emphasis on evaluation? With thirty-six libraries involved in the e-book pilot, we are dealing with a mix of funding levels and a mix of comfort levels with this type of experiment. By emphasizing evaluation, we hope to definitively show return on investment for those libraries that felt uneasy about their level of contribution compared to their small available budgets. We also want to set the stage for a continuing program, past the six-month pilot. If we can show usage from all the libraries and show that their users received a measurable benefit from the program, we will be better equipped to create a continuing program that involves the whole group. Additionally, to make this continuing program work, we are evaluating the structure of the pilot as it progresses to understand which components are successful and which will need to change.
Another factor in evaluation is publisher involvement. We need to demonstrate the value of the model to the publishers involved, as well as to those publishers who decided not to contribute content. If the pilot shows a good outcome for the publishers involved, more may be convinced to join.
Finally, there is the broader impact. We are the first group of this size to attempt a demand-driven pilot, and many other consortia are interested in our progress; we have received calls and emails weekly from consortia interested in the pilot. Creating a detailed and meaningful evaluation will help other groups structure their programs to achieve their goals.
  • #27: An early lesson learned was that monthly expenditure reports posted on our consortium website would not be nearly enough for the thirty-six member libraries. Instead, we moved to weekly reports emailed to a list of liaisons, who were then asked to share as needed with their colleagues. We brought these together into monthly reports that were emailed and posted on the website. These monthly reports expanded the weekly information to provide usage information for each library. Finally, EBL set up a real-time reporting site for the use of the implementation team. The team could access this site at any time to get current information about the pilot usage and expenditures, in order to troubleshoot or answer unexpected questions.
  • #28: For a six-month pilot, we expected to do a midpoint evaluation at three months, only to realize that we had spent far less than half the money at that point. So what really is the midpoint: half the time or half the money? We realized, though, that the midpoint evaluations didn't depend on dividing the time or money exactly in half. We planned to survey the thirty-six libraries to understand the effectiveness of our training, and that needed to be done before the training faded from memory. We had also planned to create usage case studies of a few libraries to show examples of the effect of the program. However, we decided that, even if we included directions on how to calculate return on investment, not all the libraries would go through the process, so we created profiles of all thirty-six instead of a few case studies. We also made the calculation method available in case libraries want to do a later analysis.
  • #30: We are planning the final evaluation for February 2012. We'll start with another survey to learn how the member libraries felt about the process of loading MARC records locally, the profile of titles, the publisher list, and the perceived effect of the choice of discovery option. We will also look at whether the quality of cataloging affected usage, i.e., whether better-cataloged titles were more likely to be purchased. Finally, we'll examine usage by a number of different measures:
Content: usage by subject and publisher
Collaboration and Dissemination: number of unique users overall and per library, and how many schools contributed to the purchases
Discovery: usage based on discovery options and timing of record load
Return on Investment: list price of content accessed/loaned/purchased versus individual library contribution
By creating this final report, we hope to show the value of the program to the Alliance libraries and to the contributing publishers. We also hope to understand how we can evolve the program in the future to best meet the needs of our group.
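The return-on-investment measure above can be sketched in a few lines: total the list price of the content each library's patrons used, then divide by that library's contribution to the pilot. This is only an illustration of the calculation described, not the Alliance's actual method; all figures, field names, and the `roi_by_library` function are invented for the example.

```python
def roi_by_library(usage_records, contributions):
    """Sum the list price of titles each library's patrons used,
    then divide by that library's contribution to the pilot."""
    accessed_value = {}
    for record in usage_records:
        lib = record["library"]
        accessed_value[lib] = accessed_value.get(lib, 0.0) + record["list_price"]
    return {
        lib: round(accessed_value.get(lib, 0.0) / paid, 2)
        for lib, paid in contributions.items()
    }

# Made-up figures: two libraries, three usage events.
usage = [
    {"library": "A", "list_price": 95.00},
    {"library": "A", "list_price": 120.00},
    {"library": "B", "list_price": 80.00},
]
paid = {"A": 100.00, "B": 200.00}
print(roi_by_library(usage, paid))  # → {'A': 2.15, 'B': 0.4}
```

A ratio above 1.0 means a library's patrons used more content (at list price) than the library paid in, which is the kind of per-library profile the midpoint evaluation produced for all thirty-six members.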