Five-Year Evaluation of New York's
Statewide Outcome-Based Evaluation Training Program

Submitted by: Eleanor Carter, Carter Consulting Services, January 2007



New York State Library Plans for Outcome-Based Evaluation (OBE)

Since 2003 the New York State Library has taken a leadership role in working with the Institute of Museum and Library Services (IMLS) to bring Outcome-Based Evaluation training to its library systems and their member libraries, so that the results of each year's activities can be reported to meet the requirements of the Government Performance and Results Act of 1993 (GPRA).

Outcome-Based Evaluation (OBE) is defined by IMLS as a systematic way of assessing the extent to which a program has achieved its intended result. It answers questions such as "What difference did the program make?" and "How did the participant benefit from the program?" OBE is useful both as a planning tool and as an evaluation tool. Outcomes are beneficial changes for program participants, including changes in skills, knowledge, behavior, attitude, status, or life condition.

Although Outcome-Based Evaluation cannot be used for every project, the benefits of using OBE techniques are many. OBE can serve as a planning tool and as an advocacy tool, and it can help communicate the value and success of a program. It can also help libraries focus their limited resources on the most effective programs that address the highest priorities. In its LSTA Five-Year Plan 2002-2007, the State Library made a commitment to OBE training and to applying OBE to all appropriate LSTA projects.

In New York State’s LSTA Five-Year Plan for 2002-2007 the following was proposed:

“EVALUATION PLAN: The process of developing a Request for Proposal for evaluation of New York’s first LSTA Five-Year Plan, intensive work with the evaluation firm that received the contract, and the State Library’s work with a planning consultant on developing a new LSTA Five-Year Plan have all helped the State Library to identify a need to institutionalize evaluation methods as part of ongoing operations.

Over the five years of the next LSTA Five-Year Plan, the State Library intends to develop a training program for State Library and library system staff to assist them in using both performance (outputs) and results (outcomes) measures in their progress towards excellence…

As the State Library moves forward with its plan of incorporating outcome-based evaluation within its operation and with the projects supported by LSTA, grant applicants will be expected to frame their applications to reflect this type of evaluation for categories in which it may be required. They will identify the anticipated outputs and outcomes in their applications and report the results at the end of the project.”

The Five-Year Plan also identified key targets for training State Library librarians and system librarians and for training trainers to provide local-level training. At the time the plan was written, the State Library was not positioned to identify appropriate and accurate key targets. As State Library librarians underwent IMLS OBE training, worked continuously with an OBE consultant beginning in 2004, and worked with the library community to carry out the OBE initiative, the key targets evolved into two documents:

  1. A ten-stage OBE Training Plan for New York and
  2. An OBE Logic Model for Statewide OBE Training.

This report identifies the results of OBE training matched to those two documents. Both documents are contained in full in Appendices A and B of this report.

OBE Training Results (Matched to OBE Training Plan for New York)

 The ten-stage OBE Training Plan for New York was approved by IMLS in 2003 and was implemented without delay.

2003

Stage 1 of the Plan: Training of New York State Library Staff was completed in June 2003. IMLS trained twenty-five participants including New York State Library staff, an evaluation consultant, and selected systems staff to use the OBE model and apply it to their individual areas of responsibility. In that training, participants achieved the desired outcomes of “understanding the components of OBE and practicing building a logic model.” They were each able to “write outcomes and indicators that were acceptable to the trainers for at least one of the programs they administer.”

2004

Stages 2 through 6 of the Plan: Develop Comprehensive Training Materials, Test Materials, Review and Revise Training Material in Preparation for a Statewide OBE Training Program were completed in 2004.

The Division of Library Development OBE project team, working with a consultant, developed a prototype of a comprehensive training package that included a PowerPoint slide program, a training manual for participants, training activities, useful handouts, and a framework for an evaluation plan. A journal template was also developed as a means of capturing feedback on the training package.

The prototype was tested in a two-day pilot-training workshop for fifteen library systems trainers. The presenter and three members of the OBE team served as observers who kept records of issues for revision. Each of the participants completed a post-workshop survey and turned in a journal record of his/her two-day experience. The journals detailed content problem areas, made suggestions for scheduling of activities, identified instructional gaps, called for more library-specific examples of each of the activities, and suggested an alternative to turning in completed logic models for review. The OBE team met to discuss each of the recommendations for change and to brainstorm the best way to evaluate the learning outcomes of OBE training.

Over forty revisions were made to the training package to provide clarification of topics, to include tips for completing the more difficult activities, to include more library-specific examples of outcomes and indicators, and to add an independent end-of-workshop exercise that could be used to evaluate individual learning. The latter was an important addition because the training was designed for teams of three to four professionals to collaborate on a team project. Often the teams were composed of individuals who had very different job situations or responsibilities. While they practiced writing outcomes and indicators and a logic model for a team project, the OBE project team wanted to know if individuals could write outcomes for their own projects. The final independent exercise answered that question.

While the pilot training workshop aimed at refining the comprehensive training package, the OBE project team wanted to ensure that the participants were not short-changed in their learning of OBE concepts. Therefore, the team assessed the participants' OBE skills, their pre- and post-workshop confidence levels, and their satisfaction with the presentation, with the following results:

Pilot Participants’ OBE Skills assessed by review of individual evaluation plans (logic models).

All fifteen participants (100%) completed the program and turned in acceptable program logic models, demonstrating the ability to write outcomes, indicators, data sources, data intervals, targets, and target achievements and to include them in a complete evaluation plan.
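For readers less familiar with logic models, the sketch below shows one hypothetical way to represent the elements just listed (outcome, indicator, data source, data interval, target, target achievement) as a simple data structure. The field names and example values are illustrative only; they are not drawn from the State Library's training materials.

```python
from dataclasses import dataclass

@dataclass
class LogicModelRow:
    """One row of an OBE evaluation plan (logic model); hypothetical field names."""
    outcome: str             # the beneficial change expected for participants
    indicator: str           # observable evidence that the outcome occurred
    data_source: str         # where the evidence comes from (survey, reviewer rating, ...)
    data_interval: str       # when the evidence is collected
    target: int              # number of people the indicator applies to
    target_achievement: str  # the level counted as success

# Illustrative example only, not taken from an actual New York plan:
example = LogicModelRow(
    outcome="Trainees write measurable outcomes for their own programs",
    indicator="# and % of trainees whose logic model is rated acceptable",
    data_source="Trained reviewer rating",
    data_interval="End of workshop",
    target=75,
    target_achievement="90% (68 of 75)",
)
print(example.outcome)
```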

Pilot Participants’ Confidence Levels

Thirteen participants turned in post-workshop surveys. The surveys indicated that prior knowledge of OBE was very limited and prior use of OBE was non-existent. Respondents rated their confidence on a 6-point scale, with 1 indicating low confidence and 6 indicating high confidence. The following chart shows how many respondents placed themselves at the two lowest levels (1-2) at the start of the workshop and at the two highest levels (5-6) at the end. The responses were gratifying considering how many revisions to the prototype were identified. It should be noted that several of the pilot participants later took a revised basic OBE workshop and went on to advanced training.

Skill/Knowledge | Low Confidence at Start (1-2) | High Confidence at End (5-6) | % Change Low to High
Use OBE as a management tool to measure your program outcomes | 9 | 8 | 88.8%
Assist staff in implementing OBE | 10 | 9 | 90%
Identify the basic elements of an OBE Plan | 9 | 9 | 100%
Distinguish outputs from outcomes | 9 | 9 | 100%
Provide at least one reason why measuring program outcomes would benefit the work that you do | 6 | 6 | 100%
Identify the three elements of a program purpose statement | 7 | 7 | 100%
Write outcomes and indicators for a program you wish to measure | 7 | 7 | 100%
Use outcome data to report on program results | 6 | 6 | 100%
Apply OBE to other programs or services you offer | 9 | 9 | 100%
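A note on the derived column: the figures suggest that "% Change Low to High" is the end-of-workshop high-confidence count expressed as a share of the start-of-workshop low-confidence count, apparently truncated to one decimal place. The minimal sketch below reproduces the first three rows under that assumption.

```python
import math

def pct_change_low_to_high(low_at_start: int, high_at_end: int) -> float:
    """Assumed formula: end-of-workshop high-confidence count as a share of the
    start-of-workshop low-confidence count, truncated to one decimal place."""
    return math.floor(high_at_end / low_at_start * 1000) / 10

print(pct_change_low_to_high(9, 8))   # 88.8  ("Use OBE as a management tool...")
print(pct_change_low_to_high(10, 9))  # 90.0  ("Assist staff in implementing OBE")
print(pct_change_low_to_high(9, 9))   # 100.0 ("Identify the basic elements of an OBE Plan")
```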

Satisfaction of Pilot Participants

The following chart records participants' responses on a 6-point scale, showing the number and percentage of responses at the two top levels. It should be noted that the chief concern was the ability to provide examples to major points, which became a major thrust of the revised package.

 

Satisfaction Item | 5 points out of 6 | 6 points out of 6 | Combined 5-6
Satisfied with knowledge of presenter | 2 (15.4%) | 10 (76.9%) | 12 (92.3%)
Presenter ability to respond to your questions | 3 (23.1%) | 7 (53.8%) | 10 (76.9%)
Ability to provide examples to major points | 2 (15.4%) | 7 (53.8%) | 9 (69.2%)
Ability to explain difficult concepts | 4 (30.7%) | 6 (46.2%) | 10 (76.9%)

2004-2006

Stage 7: Implementation of OBE throughout the New York State Library and library systems is an ongoing process. The objective was to apply OBE wherever possible for plans, applications, and reports. Plans and reports were expected to reflect the training provided to library system staff and to yield a rich resource for planning, decision-making, and advocacy. Indeed, that has occurred, if not as pervasively as expected. LSTA grant applications and reports do reflect OBE training. The plan for this to occur was built into the aforementioned document, OBE Logic Model for Statewide OBE Training. Analyses of those results are included under the section heading Stage 8: Train Trainers.

In a 2005 basic training workshop, as part of an exercise, Division of Library Development staff responsible for Plans of Service developed a logic model for using OBE to review and revise plans of service to achieve a pre-identified standard. In subsequent workshops, also as a training exercise, two systems librarians who had received OBE basic and advanced training developed surveys for their member librarians to complete as part of their systems’ development of plans of service. The surveys described a commitment to OBE to the members and sought information that would enable the systems to develop plans of service with an OBE focus. While these preliminary training exercises could serve as models for incorporating OBE into the process of developing and evaluating plans of service, using OBE for Plans of Service has not yet been “required,” so it remains voluntary and sporadic.

In 2006, as a training exercise in an NYSL-sponsored basic OBE workshop, a training team that included a Division of Library Development staff member responsible for the Statewide Summer Reading program developed three items with potential for use in evaluating summer reading programs. It is recognized that IMLS is working on methodology for evaluating summer reading programs; the Division of Library Development took no official action in support of these three items. They are referenced here as examples of the brainstorming about summer reading evaluation that occurred during OBE training. The three items are: 1) a list of behaviors that indicate the impact of summer reading programs on children and parents, 2) a chart of skills that may be outcomes of summer reading and two levels at which the skills might be applied, and 3) a sample parent/older child survey that local libraries could use to examine the impact of summer reading programs.

In addition, as training exercises, several systems librarians and local public librarians worked on summer reading outcomes in state-sponsored basic OBE workshops. They practiced writing qualitative outcomes that would do more than count participants or collect numerical data about program activities. Such outcomes included conducting follow-up activities/events that would enable children and/or parents to demonstrate visible results of summer reading such as:

  • Programs where children can tell stories, act out stories, recite poems, songs, etc. that reflect their summer reading, with parents, grandparents, and other caregivers in attendance;
  • Challenges for children to turn in records of reading accomplishments with evidence that children can articulate something about what they read and/or that they read with understanding;
  • Opportunities to share reading journals in person or online to demonstrate levels at which children can communicate about their reading;
  • Follow-up surveys given to parents and older children that collect qualitative information about the benefit of summer reading programs to users;
  • Follow-up surveys of peer reading groups, year-round book buddy programs, and teen volunteers who read to children;
  • Incentives for parents and children to respond to surveys, turn in reading records, etc.;
  • Follow-up with parents and children on suggested family activities produced by the local library to match summer reading themes;
  • Review of controlled online chat sessions for summer reading participants to share discussions about books and their summer reading.

In 2005, the New York State Library partnered with 12 library systems, 6 graduate schools of library and information science, and the New York Library Association on a proposal for a statewide recruitment project titled Making It REAL! Recruitment, Education, And Learning: Creating a New Generation of Librarians to Serve All New Yorkers. The project included outcomes and indicators from the beginning and was funded ($995,660) by the Federal Institute of Museum and Library Services.

In 2006, the project reported that: “All partners, under the guidance of the grant evaluator, have completed their individual outcome-based evaluation logic models. During the next 6 months of the grant period, the grant evaluator will be following up with all grant partners as to their progress in achieving their outcomes and goals. The outcomes identified in the logic model developed at the outcome-based evaluation workshop in Washington, D.C. in December 2004 are mostly long-term outcomes, so they have not yet been achieved. However, Outcome #1, which states "Scholarship students graduate with MLS/MLIS degrees within grant period" has begun to have results. Since the last report, two students have completed their studies and received library degrees. One student is now a certified school media specialist and the other student will go into law librarianship.”

It is clear from the activities reported under Stage 7 that OBE has made its way, as planned, into many areas of the everyday work of the State Library's Division of Library Development, including plans of service, statewide programs, and partnership activities. Additional ways in which the Stage 7 goals have been met are found in the Stage 8 sections on training results that follow.

Stage 8: Train Trainers for Member Libraries’ Training has been the major thrust and underpinning for the entire NYSL OBE project. As part of the development of a comprehensive training package reported in Stages 2-6 of this report, a logic model was developed to specify intended outcomes and indicators as well as targets and target achievements for the OBE training. This section of the report will address results related to the OBE Logic Model for Statewide OBE Training (Appendix B).

Item 1 in OBE Logic Model

Training Outcome 1: Immediate Outcome: Training participants plan OBE measures of intended program outcomes.

Indicator(s): # and % of training participants who write at least three clearly defined, measurable outcomes in an OBE plan (logic model), as assessed by a trained reviewer during the workshop and by a final independent exercise.
Data Source: Trained reviewer rating of all required elements of measurable outcomes.
Target Audience (To Whom Indicator Is Applied): All who complete the training. N=75
Data Intervals: End of workshop
Target Achievement Level (Goal): 90% (N=68)

The following chart of OBE basic training results shows that the first intended outcome for OBE training was met and exceeded. Thirteen workshops were held for participants from 60 systems in all areas of New York State. A total of 155 participants completed a two-day workshop and were tested on a final independent exercise that demonstrated the ability to write intended outcomes, indicators, data sources, data intervals, targets, and target achievements; 154 completed the exercise successfully. One participant wrote outputs, not outcomes. The overall success rate was 99.4%. Outcome 1 predicted that 75 systems trainers would be trained in the use of OBE and that 68 (90%) would be successful in writing acceptable OBE plans. The predicted training cohort of 75 was exceeded by 106.6%, and the 90% success prediction was exceeded by 9.4 percentage points.
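As a quick check of the figures in the preceding paragraph, the short sketch below recomputes them from the reported counts, rounding to one decimal place (the report itself shows 106.6%, a truncation of the same value).

```python
predicted_trainees = 75      # cohort size predicted in Training Outcome 1
actual_trainees = 155        # participants who completed basic training
passed = 154                 # passed the final independent exercise

success_rate = passed / actual_trainees * 100                                    # 99.35...
over_target = (actual_trainees - predicted_trainees) / predicted_trainees * 100  # 106.67
print(f"{success_rate:.1f}% passed; cohort exceeded target by {over_target:.1f}%")
# -> "99.4% passed; cohort exceeded target by 106.7%"
```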

Date | Type of Training | Location | Attendance | Passed Test | Percent
Summer 2004 | Basic | Brooklyn | 17 | 17 | 100%
Fall 2004 | Basic | Manhattan | 19 | 18 | 94.7%
Fall 2004 | Basic | Painted Post | 12 | 12 | 100%
Fall 2004 | Basic | Poughkeepsie | 12 | 12 | 100%
Fall 2004 | Basic | Utica | 15 | 15 | 100%
Spring 2005 | Basic | Long Island | 6 | 6 | 100%
Spring 2005 | Basic | Batavia | 11 | 11 | 100%
Spring 2005 | Basic | Saratoga | 18 | 18 | 100%
Spring 2006 | Basic | Dunkirk | 11 | 11 | 100%
Spring 2006 | Basic | Utica | 17 | 17 | 100%
Spring 2006 | Basic | Potsdam | 5 | 5 | 100%
Fall 2006 | Basic | Highland | 6 | 6 | 100%
Fall 2006 | Basic | Painted Post | 6 | 6 | 100%

Item 2 in OBE Logic Model

Training Outcome 2: Intermediate Outcome: Training participants use OBE in their grant applications. 

Indicator(s): # and % of training participants who submit a grant application during a subsequent grant cycle and achieve a normalized score of 90.
Data Source: Grant reviewer rating, including an inter-rater check to achieve a normalized score.
Target Audience (To Whom Indicator Is Applied): All who complete the training who also submit a grant during a subsequent cycle.
Data Intervals: End of grant application review
Target Achievement Level (Goal): 50% (N=37)

To evaluate Outcome 2, the plan was to look at the reviewer rating scores on grant applications of individuals who participated in OBE training and subsequently submitted an acceptable grant application. That did not prove to be a measure of whether OBE was part of the approved application, because not all applications that scored as acceptable for funding were appropriate for OBE. As an alternative, all LSTA and Gates grant applications for 2005 and 2006 were reviewed for the presence of OBE elements. Applications from 2004 were also examined, but they had been received before any of the applicants were trained. That review was valuable nonetheless because it showed that none of those proposals used OBE elements.

The data presented here are drawn from 65 applications reviewed. The only applications reviewed for the presence of outcomes elements were those received with a stamped date following the training date of the applicant and those that could and should be evaluated using OBE. In that category there were 26 applications. All 26 (100%) contained multiple elements of outcome-based evaluation. It is important to note that in initial training, participants came primarily from systems. Many of them stressed that their mission is to serve member libraries. In the workshops, they were required to write outcomes that reached the patron level. For example, if a system trained librarians to use databases, it would assess the learning that took place and follow up to see whether the librarians used what they learned. In the workshops they were urged to plan how to ascertain how patrons benefited when the newly trained librarians gave them assistance and/or taught them to use databases independently. Many of the systems librarians felt they had no way to get information about patrons. Four of the 26 applications did advance outcomes to the patron level. As OBE makes its way into the consciousness of the member libraries and as systems' successes in obtaining patron outcomes are observed by the library community, it is expected that more systems will ultimately seek patron outcomes.
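The selection rule described above (examine an application for OBE elements only if it was date-stamped after the applicant's training and the project was appropriate for OBE) can be expressed as a short sketch. Everything below is hypothetical: the record layout, field names, and helper functions are offered only to illustrate that rule, not as the actual review instrument.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class GrantApplication:          # hypothetical record layout for illustration
    applicant: str
    received: date               # stamped date on the application
    trained_on: Optional[date]   # applicant's OBE training date, if any
    obe_appropriate: bool        # could/should the project be evaluated with OBE?
    has_obe_elements: bool       # outcomes, indicators, etc. present in the narrative
    patron_level_outcomes: bool  # outcomes reach the patron (end user) level

def eligible_for_review(app: GrantApplication) -> bool:
    """Selection rule described in the report: received after the applicant's
    training date and suitable for outcome-based evaluation."""
    return (app.trained_on is not None
            and app.received > app.trained_on
            and app.obe_appropriate)

def summarize(apps: list) -> dict:
    reviewed = [a for a in apps if eligible_for_review(a)]
    return {
        "reviewed for OBE": len(reviewed),
        "with OBE elements": sum(a.has_obe_elements for a in reviewed),
        "patron-level outcomes": sum(a.patron_level_outcomes for a in reviewed),
    }

# With the report's figures, such a summary would read:
# {"reviewed for OBE": 26, "with OBE elements": 26, "patron-level outcomes": 4}
```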

When writing indicators to assess whether participants continued to write OBE plans after the workshops, the focus was on use in LSTA and Gates grant applications and in grant reports. An additional indicator should have been written that surveyed participants to find out in what other ways they applied their new learning. The training consultant urged participants to share future work, and participants voluntarily shared 12 additional evaluation plans (logic models) that they had used for applications to other funding agencies or simply for program management.

Item 3 in OBE Logic Model

Training Outcome 3: Long term Outcome: Training participants use OBE in their grant reports to show measurable results of technology training programs.

Indicator(s): # and % of training participants who received a grant during a subsequent grant cycle who report results of intended outcomes, as assessed by a trained reviewer.
Data Source: Trained reviewer rating
Target Audience (To Whom Indicator Is Applied): All who complete the training who also received a grant during a subsequent cycle.
Data Intervals: End of grant report review
Target Achievement Level (Goal): 90% (N=33)

Twenty-six of the participants received grants and reported during the grant cycle under review. To meet the target achievement level of 90%, 23 would have had to report outcomes. Outcomes were reported in 17 grant reports (65.4%). Several others referred to outcomes, but evaluation data were not available at the time of the report. There continues to be some mixing of the concepts of outputs and outcomes. The language of the grant report form does not lend itself to easy reporting of outcomes; the form contains no question about predicted outcomes and indicators.
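The arithmetic behind this paragraph, as a minimal sketch (the 90% threshold is rounded to the nearest whole report, which matches the 23 cited above):

```python
grantees_reporting = 26        # trained participants who reported during the cycle reviewed
target_rate = 0.90             # target achievement level from Training Outcome 3
reports_with_outcomes = 17

needed = round(grantees_reporting * target_rate)          # 23 reports needed to hit 90%
achieved = reports_with_outcomes / grantees_reporting * 100
print(needed, f"{achieved:.1f}%")                         # 23 65.4%
```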

Of the reports with outcomes, many of the outcomes had to be extracted from a narrative addressing the qualitative results of the project. Five of the reports, however, specified outcome statements, indicators, and results and could be considered model reports.

Item 4 in OBE Logic Model

Outcome 4: Long term Outcome: Training participants report that follow-up mentoring helped them apply OBE principles better.

Indicator(s): # and % of training participants who use follow-up mentoring who score 80 or better on a 100-point evaluation scale.
Data Source: Satisfaction surveys
Target Audience (To Whom Indicator Is Applied): All training participants who request follow-up mentoring
Data Intervals: After 6 months of mentoring service, then annually
Target Achievement Level (Goal): 90% of training participants who use the service

A formal mentoring service has not yet been established. At each training session, the training consultant offered to review any completed evaluation plans (logic models) created after the workshops. Ten evaluation plans were sent electronically to the consultant, and feedback was provided. Two other participants sent the consultant Plans of Service to review, three sent outcomes sections of grant proposals, and two sent data instruments. In every case notes of thanks were received, and several participants sent revisions to show that the comments had, indeed, been useful. Since there was no official mentoring program, those who received informal mentoring were not surveyed.

Unintended Outcome: Advanced Training Added

Several participants expressed the need for advanced training. Many wrote outcome statements and indicators during the basic workshop that called for data instruments they did not know how to develop. Many had no idea how to report outcomes or how to merge output and outcome data in a report. Instead of individual mentoring, two advanced training sessions were offered to those who had successfully completed basic training. The results of advanced training can be reported as unintended but significant outcomes.

Even though the advanced workshops were not originally planned, the trainer wrote outcome statements and indicators before the workshops to provide a prediction for evaluation purposes. The participants came with projects they had developed since initial training. A course requirement was to bring a project that had a completed logic model and required some form of advanced work. We predicted that participants would work on development of data instruments, evaluation timelines, development of reports, or analysis of data. Each participant was expected to end the workshop with one usable product. Results are as follows.

Outcome: Participants apply OBE techniques to individualized projects meeting standards for collecting outcomes.
Indicator 1: # and % of participants who complete at least one acceptable OBE product as assessed by trained observers during the workshop.
Indicator 2: # and % of participants who complete at least one acceptable OBE product as assessed by participant self-report at the end of the workshop.
Target: 21 participants already OBE trained.
Target Achievement: 21 (100%)

Results: Indicator 1: The two trained observers report that 21 participants (100%) achieved the predicted outcome; most developed multiple products. Indicator 2: All 21 participants report satisfactory development of at least one advanced OBE product; most reported several products developed. The participants were also asked to complete open-ended questions to see if they could articulate the value of the training. Specific patterns of response are as follows:

Skills Developed During Workshop | Number of Participants | Percent of Participants
Rubric development | 15 | 71.4%
Survey development | 15 | 71.4%
Open-ended question design | 15 | 71.4%
Interpreting survey data using a rubric | 6 | 28.6%
Progression of outcomes to checklists to rubrics to survey to analysis | 4 | 19.0%
Distinguishing intended and unintended outcomes and reporting both | 10 | 47.6%
Writing disclaimers to guarantee privacy | 15 | 71.4%
Creating data collection plans | 7 | 33.3%
Refining outcomes (e.g., recasting in patron terms, distinguishing managerial outcomes from patron impact) | 6 | 28.6%
Developing checklists | 14 | 66.6%
Distinguishing outputs and outcomes and reporting both | 5 | 23.8%

 

Products Developed: Rubrics (15); Surveys (15); Checklists (14); Self-assessment tools (3); Revised outcomes/indicators (6)

Most Helpful: One-on-one help from presenters (10); Outcomes refinement (4); Survey development (7); Rubric development (8); Open-ended question design (4); Report writing (5); OBE review (3); Sharing and feedback (9); Application of OBE model to my own work (7)

Least Helpful: Group sharing (1); Down time while others were being helped (5); Report writing (2)

Still Want to Learn: Combining outcomes and outputs into a comprehensive report (6); Data analysis (6); Data collection (3); More practice developing products (3)

 

Summary of Advanced OBE Workshops: 21 participants who had received basic training attended two-day advanced training workshops. Each participant (100%) successfully completed at least one product, and most completed several. The products included the development of instruments (rubrics, checklists, surveys), OBE reports, data interpretations, and data collection plans.

Date | Sponsor | Type of Training | Location | Attendance | Produced Advanced Products | Percent
Fall 2005 | Gates Foundation | Advanced | East Greenbush | 10 | 10 | 100%
Fall 2006 | Gates Foundation | Advanced | East Greenbush | 11 | 11 | 100%

Stages 9 and 10: Library Systems develop training plans for OBE implementation and System staff train member libraries’ staff.

The original plan was that systems trainers would attend the training provided by the State Library and would in turn train member libraries' staff in these two stages. Several variables led to different actions and a mix of solutions. In some cases the systems did not send their trainers to the initial training. While those individuals may have applied OBE to their own areas of responsibility, and many of them did, they were not positioned to train member library staff. Others did not feel that the initial two-day training equipped them to train others; many felt they were still novices, not experts.

Five approaches were taken in response to this reality.

  1. Train the Trainer manuals were designed and produced. The manuals provided trainers with commentary to accompany each of the slides in the basic training program. Information was provided to help trainers respond to issues, problems, misunderstandings, and objections that commonly occur in workshops. Exercises were developed for future trainers to practice how to help trainees improve the elements of an OBE plan while working in a workshop setting. Sample evaluation instruments were included. An independent exercise was developed for trainers to test their ability to identify trainee mistakes and correct them in a positive manner. Ten participants who had attended basic training attended a Train the Trainer workshop in October 2005. Each of these participants also attended an additional advanced training workshop. After 6 days of OBE training in the three types of workshops, these 10 individuals were certified by the State Library as OBE Trainers and listed on the OBE website for systems to contact for member library training.

Summary of Train the Trainer Workshop: Ten participants who had completed basic OBE training attended a two-day Train the Trainer workshop. At the end of the workshop each completed a final independent exercise. The exercise included samples of typical errors made by OBE learners. The participants demonstrated their ability to identify the problems and propose solutions in a manner similar to what would be required of an OBE trainer. All ten (100%) completed the exercise successfully and were certified by the New York State Library, Division of Library Development as OBE Trainers.

Date | Sponsor | Type of Training | Location | Attendance | Passed Test & Certified | Percent
Fall 2005 | Gates Foundation | Train the Trainer | East Greenbush | 10 | 10 | 100%

  2. The Division of Library Development chose to continue to offer basic training workshops, now open directly to member libraries. More are planned for 2007.
  3. Individual certified trainers have organized and offered workshops in their areas for all types of librarians, including academic and medical librarians. Two such workshops served 31 participants, who reported learner success with OBE applications and an average satisfaction rating of 4.1 on a 5-point scale. The next step is to follow up with all the certified trainers and the systems to determine how further training will occur.
  4. OBE Website: The New York State Library website has an OBE section accessed from the Division of Library Development homepage. The website contains the OBE basic training program, the Train the Trainer manuals, a report on OBE activities including workshops, the list of certified OBE trainers, the State Library's Ten Stage Training Plan, and links to other OBE information. Individuals who wish to work independently to learn OBE can use the manuals on the website.
  5. Gates/WebJunction 2005-2006 Rural Sustainability Program for rural and small libraries serving fewer than 25,000 people: The State Library incorporated OBE awareness training into this program for rural libraries. In a series of 11 regional workshops given by a State Library certified trainer who had participated in the Train the Trainer workshop, 497 participants were required to develop an action plan with an evaluation component and were introduced to the concepts of OBE.

Overall Report Summary

The OBE accomplishments of the State Library's Division of Library Development have surpassed what was originally included in the LSTA Five-Year Plan. When the plan was written, there was a commitment to incorporate OBE evaluation methodology into the operations of New York's libraries, but no specified means to achieve that goal. The subsequent Ten Stage Training Plan and the OBE Logic Model made the OBE goals concrete. The Division of Library Development set out systematically to achieve all aspects of the Training Plan and the Logic Model, with a high degree of success. The process has remained fluid, with revisions and changes continually being made to respond to the needs of New York's libraries.

OBE Training has included all 73 systems across New York State as follows:

Type of Training | Number of Participants
IMLS training of State Library staff | 25
Pilot training of systems' staff | 15
OBE 2-day Basic Training | 155 (from 60 systems)
Train the Trainer 2-day workshops | 10 (of original 155 trainees)
Advanced Training 2-day workshops | 21 (of original 155 trainees)
Gates Regional Workshops | 497
Totals | 723 (minus 31 overlapping = 692)
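The totals row can be verified directly from the counts above; the 31 overlapping participants are the 10 Train the Trainer and 21 Advanced attendees, who are already included in the 155 basic trainees.

```python
counts = {
    "IMLS training of State Library staff": 25,
    "Pilot training of systems' staff": 15,
    "OBE 2-day Basic Training": 155,
    "Train the Trainer 2-day workshops": 10,
    "Advanced Training 2-day workshops": 21,
    "Gates Regional Workshops": 497,
}
total = sum(counts.values())      # 723
overlap = 10 + 21                 # already counted among the 155 basic trainees
print(total, total - overlap)     # 723 692
```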

OBE practice has made its way into several LSTA grant program applications, data gathering, and reports. It has been used for some plans of service and for rural library action plans. It has been used by some systems to seek funds from sources other than LSTA and by some for general management activities. It clearly has become part of the consciousness of librarians in New York State.

Recommendations for New Five-Year Plan:

It is recommended that the State Library:

  1. Continue to provide direct training to library system and member library staff.
  2. Follow-up with certified trainers to identify future needs and report results of their training efforts.
  3. Follow closely and participate in federal-level efforts to identify outcomes for major categories of library activity, such as Summer Reading. Communicate findings to the library community as they become available so that member libraries need not “reinvent the wheel.”
  4. Communicate best practices, e.g. well-written outcomes reports to library community.
  5. Survey the systems regarding best future approaches for OBE training.
  6. Revise grant forms (applications and reports) to reflect OBE terminology.
  7. Integrate OBE methodology with Plans of Service requirements.

Note that the last two recommendations grow out of comments made by many workshop participants that OBE is difficult, a lot of work, and not really “required.”

Appendix A: Outcome-Based Evaluation Training Plan

Rationale

The New York State Library (NYSL) proposes to develop a training package to help the staff of the State Library and library systems to build their capacity for using results-oriented evaluation in their State and Federal programs. Over the long term, the NYSL expects the training to spread to the systems' member libraries as well, and this plan includes a "train-the-trainer" component to assist the systems with that long-term goal.

During the process of evaluating the first five years of the LSTA program, the NYSL learned through its evaluation consultant and evaluation facilitator that there is a great need for outcome-based evaluation throughout the library community. The data currently being collected are insufficient for measuring the impact of LSTA on the library services of the State, because these data focus more on the activities of those providing the services than on user outcomes. More and more, the numerous funding agencies (State government, Federal government, local government, private corporations) are asking library service providers to show the impact of their services. At the same time, library professionals are not trained in how to do this. Even people with backgrounds in education find the demands of results-oriented evaluation confusing and difficult.

New York State's libraries and library systems are facing some difficult times over the next two to three years, as are many libraries in other states. The NYSL believes that it is even more important in hard times than good ones for librarians to be able to show evidence that libraries have value for their users and that programs libraries offer them affect their lives. This evidence can be presented to funders in justifying budgets. It can also be presented to the users themselves to help organize users as advocates for libraries.

Finally, the NYSL, looking ahead to the next five-year evaluation of LSTA, believes that the process of training librarians in outcome-based evaluation will improve that next major evaluation. Linking the results viewpoint to advocacy will also help the NYSL in implementing its new advocacy plan.

As a result of identifying weaknesses in its evaluation of programs in the first five-year evaluation of LSTA, the NYSL affirmed its intent to develop a comprehensive results-based approach in its new Five-Year Plan. It expects to adopt OBE methodology broadly for its work, not just for LSTA programs. The NYSL proposes a multi-stage project to train key participants in OBE as described in the following pages. There are ten stages. The timeline for the whole plan depends on setting the date for the workshop in Stage 1. Once the training in Stage 1 is complete, the NYSL expects to implement Stages 2-6 within a year to eighteen months. The remaining stages will probably take another two to three years.

Training Plan

Stage 1

Overall goal: To increase NYSL staff capacity to use an Outcome Based Evaluation model to measure outcomes of all New York's library programs, State and Federal.

Target audience: NYSL staff who profess an interest in OBE and are willing to apply their OBE knowledge to the programs they administer. An evaluation consultant working with the NYSL on the training package will also be included in this stage.

Desired outcomes:

      1. NYSL staff will understand the components of OBE and be able to build a logic model.
      2. NYSL staff will be able to write good outcomes and indicators for at least one program they administer.

NYSL staff person: Sara McCain

Dates: Sometime in June, 2003 would be best as the rest of the summer will be taken up with New York's FY 2004 LSTA grant applications. Staff could be available from May 26 through June, except for June 2-4 when a number of staff will be attending a public library system conference.

The NYSL requests the following from the Federal Institute of Museum and Library Services:

  1. Trainers to conduct a two-day introduction to OBE--in Albany--for selected staff from the Division of Library Development (LD) and the Research Library (RL) and the evaluation consultant. The three LD staff who have already attended training in Washington will also participate in this training because they will be part of the LD project team that will implement Stages 2-8.
  2. Training materials with individual copies for each attendee.
  3. Any prerequisite homework assignments (readings, exercises, etc.) that attendees would have to complete or read before the training.
  4. Any follow-up activities that attendees would have to complete.
  5. Follow-up by either email or telephone to review attendees' completed assignments and advise changes and revisions.

Stage 2

Objective: NYSL OBE project team will develop a comprehensive OBE training package.

The elements of this package include:

  • Rationale for training including how OBE provides a stronger basis for advocacy and helps make tough decisions in hard times.
  • Prerequisite component (advance homework) that participants will be expected to complete before arriving at training.
  • Instructor/trainer manual.
  • Learner toolkit (Project team will review available toolkits or materials available from other states before developing something new.)
  • Participants in the training in Stage 1 will serve as reviewers of all prototype materials.
  • NYSL will ask IMLS to review the prototype training materials.

Stage 3

Objective: Test training package

Use the comprehensive training package to train library system staff with a focus on technology projects, such as technology training. This type of project was the focus of the Outcomes Logic Model prepared by LD staff for their two IMLS training events. The trainers for this stage may include the evaluation consultant, some NYSL staff, and contract trainers. The project team will provide some follow-up assistance to the library system staff to help them complete any assignments from the training workshop. They will also help them as the system staff begin implementation of OBE.

Stage 4

Objective: Conduct review of training

During the training events and follow-up calls, etc., the project team will:

  • Capture applications issues that arise during training.
  • Analyze obstacles and barriers to implementation.
  • Capture ideas for revising training materials.

Stage 5

Objective: Provide advanced training and technical assistance

The NYSL will request from IMLS a one-day workshop in Albany for hands-on problem-solving of issues identified during Stage 4 for a small number of selected system and LD staff who can represent the range of issues. The NYSL may also request some follow-up technical assistance after this workshop by IMLS by email or telephone.

Stage 6

Objective: Revise training package based on testing experience and hands-on assistance from Stage 5.

Use the results of the problem-solving workshop and feedback on initial training to revise trainer and learner materials. Publish comprehensive training package after this stage is concluded.

Stage 7

Objective: Implement OBE throughout NYSL and library systems. NYSL staff will use OBE for all guidelines for plans, applications and reports. Plans and reports will reflect the training provided to library system staff and will yield a rich resource for planning, decision-making, and advocacy.

Stage 8

Objective: Train trainers for member libraries' training.

Conduct training workshops for one trainer from each system and give each trainer a training template for conducting member library training. Systems would then have trainers and tested training materials for training member libraries and other system staff.

Stage 9

Objective: Library systems will develop training plans for implementing OBE in their member libraries.

The systems will submit a training plan to the NYSL that outlines how they will implement the OBE training in their system. Library Development staff will provide technical assistance in refining the plans.

Stage 10

Objective: System staff will train member libraries' staff.

Library Development staff will provide technical assistance to the systems in carrying out this responsibility. They will also assist the system staff in evaluating the effectiveness of their training.

Appendix B: OBE Logic Model for Statewide OBE Training

Training Outcome 1: Immediate Outcome: Training participants plan OBE measures of intended program outcomes.

Indicator(s): # and % of training participants who write at least three clearly defined, measurable outcomes in an OBE plan (logic model), as assessed by a trained reviewer during the workshop and by a final independent exercise.
Data Source: Trained reviewer rating of all required elements of measurable outcomes.
Target Audience (To Whom Indicator Is Applied): All who complete the training. N=75
Data Intervals: End of workshop
Target Achievement Level (Goal): 90% (N=68)

Training Outcome 2: Intermediate Outcome: Training participants use OBE in their grant applications.

Indicator(s): # and % of training participants who submit a grant application during a subsequent grant cycle and achieve a normalized score of 90.
Data Source: Grant reviewer rating, including an inter-rater check to achieve a normalized score.
Target Audience (To Whom Indicator Is Applied): All who complete the training who also submit a grant during a subsequent cycle.
Data Intervals: End of grant application review
Target Achievement Level (Goal): 50% (N=37)

Training Outcome 3: Long term Outcome: Training participants use OBE in their grant reports to show measurable results of technology training programs.

Indicator(s): # and % of training participants who received a grant during a subsequent grant cycle who report results of intended outcomes, as assessed by a trained reviewer.
Data Source: Trained reviewer rating
Target Audience (To Whom Indicator Is Applied): All who complete the training who also received a grant during a subsequent cycle.
Data Intervals: End of grant report review
Target Achievement Level (Goal): 90% (N=33)

Outcome 4: Long term Outcome: Training participants report that follow-up mentoring helped them apply OBE principles better.

Indicator(s): # and % of training participants who use follow-up mentoring who score 80 or better on a 100-point evaluation scale.
Data Source: Satisfaction surveys
Target Audience (To Whom Indicator Is Applied): All training participants who request follow-up mentoring
Data Intervals: After 6 months of mentoring service, then annually
Target Achievement Level (Goal): 90% of training participants who use the service

The New York State Library Division of Library Development 2004
