A Companion to the PowerPoint slide program entitled Outcome-Based Evaluation for Technology Training Projects


Outcome-Based Evaluation (OBE) for Technology Training Projects
Trainer's Manual

Developed for The New York State Library
Division of Library Development
by Eleanor M. Carter, Ed.D. (Carter Consulting Services)
© 2005

Contents

Workshop Guidelines

Preparation of Facilities

Arrange the room with tables spread out so that groups seated at the tables are able to:

  • See the screen showing the PowerPoint teaching program
  • Work together without being distracted by groups around them

Place a flip chart with a marker near each table so that all the groups have a view of each flip chart without obstructing the forward view.

Place participant name tents and participant manuals at tables according to predetermined work groups. Generally participants are asked to submit homework (see homework heading) before the training. The presenter selects groups based on common interests or common work assignments. One project is selected for each group to use when working together on an OBE plan.

Set up a table near the projector for presenter materials. It helps to spread out materials for easy access. Include a different colored marker for making changes on flip charts.

Introductions

Suggestions for introductions include:

  • Asking participants to introduce themselves giving brief information about where they work and what they do
  • Sharing your credentials including your evaluation (especially OBE) experience

Agenda

Prepare and distribute an agenda. Explain that the agenda is flexible. Groups vary, taking longer on different activities. The agenda should include general headings about content of the training and breaks and lunches. General recommendations about where we like to be at different points in the instruction are:

  • A.M. First Day: Instructor covers Rationale, OBE Distinguishing Features, Overview of OBE Process, Step 1, Assumptions about Program Need, and Step 2, Program Purpose. Groups complete Activities 1, 2, and 3, and write Activity 3 (Program Purpose) on flip charts. Note: sometimes it is possible for the instructor to review the flip charts of all the groups with the large group before lunch. Sometimes the review begins or is completed after lunch.
  • P.M. First Day: Instructor covers Step 3, Inputs, Activities, and Outputs. Groups complete Activity 4 fairly quickly. Instructor circulates and assists groups as needed. Groups do not write Step 3 on flip charts. If something occurs in a small group worth noting, instructor summarizes for the large group. Instructor covers OBE Process Overview and Step 4, Part 1, Writing Outcomes. Groups complete Activity 5, Part 1 only, and write outcomes on flip charts. Instructor reviews outcomes with the large group, making changes as necessary. This activity should be done with great care, taking time to be sure the outcomes meet all the requirements. Invite groups to make recommendations that might assist other groups.
  • A.M. Second Day: Instructor covers Step 4, Parts 2, 3, and 4: Indicators, Data Sources, and Data Intervals. Groups complete those three parts of Activity 5 and write on flip charts. Instructor reviews with the large group.
  • P.M. Second Day: Instructor covers Step 4, Parts 5 and 6, Target Audience and Target Achievement Level. Groups complete the last two parts of Activity 5 and post numbers on flip charts next to indicators. Instructor reviews with the large group. Instructor covers reports and summary. Individuals complete Activity 6 (Final Independent Exercise) and the Workshop Evaluation.

Roles of Trainers

It is recommended that trainers work in pairs. If both feel comfortable, the presentation can be shared and decisions about who will present what can be made in advance. The most important need for a team is to circulate among the groups assisting with the activities. The developers advise both trainers to move from group to group separately. Some groups need more assistance. When that happens, one team member can generally cover more than one group while the other gives more in-depth assistance to the group that needs it. As trainers circulate and hear discussion or observe successes or difficulties, they can alert each other to what should be shared with the larger group. The developers do not recommend having groups present their work. It is very time consuming and the time is more productively spent giving groups ample time to work out the activities together. Rather, we recommend that one of the trainers quickly review the group work on the flip charts.

When groups complete activities, even though it appears they are on track, they sometimes write something different from what the trainer expects on the flip charts. Sometimes it is an improvement on what was expected and that should be noted. Sometimes it is a return to prior misconceptions and the trainer should suggest changes using a different colored marker as the review takes place.

Homework

Before the workshop send a homework notice similar to the following to all participants:

Please select a technology-training project (existing or planned) in your organization (or one you would like to do) and describe it briefly, giving the information requested. If the project does not in some way plan to change participants' knowledge, skills, behavior, or attitudes, please select a different project.

Your name
Your organization
Your organization's mission

Project Title:

Project Description (answering the following questions)

What will the project do?
Who is the project for?
How will the participants benefit?

Please send your completed description by email to (insert name and e-mail address here) by (insert deadline date here).

Review the homework submissions before the workshop. If participants come from many different library types or systems, you can group them according to library or system type. Try to keep group size to 4 or 5 participants and the number of groups to 4. Then select one homework assignment that seems best suited to the OBE process for each group to use for activities. Make enough copies of the assignment selected to hand out to the groups before they start Activity 1. If none of the homework projects are ideally suited to OBE, pick the one you think can most easily be adapted and prepare your explanation to the group about how the project will need to be tweaked for purposes of the training. Remember, it's the OBE process, not the project, that is important for learning OBE. (See practice exercises for evaluating homework.)

Teaching With the Program

The slide program developed for teaching outcome-based evaluation (OBE) was used in initial training provided by the New York State Library, Division of Library Development to systems' librarians around New York State. Early in the process, since the program was being used in many host institutions around the state and equipment varied, technical difficulties called for flexibility in program presentation. The developers found that when equipment did not cooperate, it was easy to use the participant manuals in lieu of the slides. At least two participants attended more than one session because they had been part of the pilot testing of materials. They expressed their preference for the use of manuals instead of slides. The developers also preferred using the manuals because the content of OBE is not linear and the slides almost force trainers to teach in a linear mode. As the number of initial training sessions increased, experience led to significant revision of the participant manuals that may mitigate some of the concerns about using the slides. The developers suggest you experiment and choose what works best for you.

Whether you use the slides or the participant manuals as a participation tool, some discussion about the slides will provide important background for your teaching. This section includes:

  • The text of the slides grouped by topic
  • "Discussion" segments that provide additional information about the preceding series of slides. If you read these segments before teaching, they can help you answer questions and strengthen your understanding of the content.
  • "Point of view" segments that address some of the recurring issues surrounding OBE as a methodology and offer alternative ways to view the issues. All training participants and trainers have past experiences, basic assumptions, or underlying value systems. It is important to recognize and deal with them to effect successful OBE training. If the trainer shares the objections that participants have to using OBE and commiserates with the participants, the whole training is compromised. Even if the trainer continues to share reservations, it is important to know how to address the more common participant reservations about OBE.
  • "Thought process" segments to provide tips for carrying out the OBE steps

If you review the discussion, point of view, and thought process sections preliminary to teaching, it should help you address issues as they arise. The questions may occur at different points in the workshop than we have identified, so you should be prepared to use any of those segments whenever it appears to be appropriate. The developers purposely did not include those segments in the instruction for two reasons: 1) the time constraints of offering a significant amount of instruction and practice during the workshop and the need to allow time for assessment of the instruction in the final independent exercise, and 2) not every issue is raised in every workshop and different groups have difficulty with different concepts. The commentary on the slides is intended to help the trainer enhance the workshop experience as needed.

Slides 2-5: Rationale

2. Outcome-Based Evaluation: a Practical, Smart Evaluation Choice

  • Stakeholders (local, state, federal) require visible results for customers; OBE specializes in user outcomes
  • Funding agencies are seeking outcomes information as a condition of funding
  • OBE offers side benefits of improved planning, decision-making, and reporting
  • Consistency of data among libraries improves the ability of state and federal agencies to advocate for library appropriations.
  • OBE objectively identifies services that benefit customers the most

3. Outcome-Based Evaluation
program evaluation benefits

  • Examines the need for data before the program begins; important user impact data is planned versus an afterthought
  • Organizes the collection of data
  • Focuses on what the user, not the librarian, does
  • Strengthens proposals when the process is competitive

4. Outcome-Based Evaluation
management benefits

  • Helps make the argument for change when change is resisted
  • Justifies choices internally and externally
  • Supports decisions about staffing needs and assignments and provides staffing justifications
  • Enables comparison to other libraries with similar programs to make management decisions

5. Outcome-Based Evaluation
advocacy benefits

  • Shifts focus from the activities of librarians to the benefits to patrons
  • Results make the library look good in the community; helps articulate the benefits of technology and other programs to the community
  • Proof of impact generally improves customer relations
 

Discussion

It is important to share the rationale for using OBE. Think about the bullets on each slide, consider examples in your professional life when having true patron impact data would have been highly desirable, and use those examples as you work through the rationale. Have you ever had a difficult time explaining a decision that looks subjective to others? Outcomes data keeps you objective. Have you ever had to trim budgets without good information about the impact of various programs? Outcomes information doesn't remove the pain, but it clarifies choices for everyone. Have you ever looked back on work you have done and wished you had gathered different information, but it is too late? OBE makes highly informed evaluation predictions and data gathering decisions from the beginning. Have you had positive experiences where solid impact data helped you meet an important professional goal? Use some of your own experiences now that you know what OBE can do.

Point of View

* In training sessions the developers have heard a familiar refrain from participants. The words vary but it goes something like this: "This is the going fad. In our business fads come and go."

Contrary to the "mind set" that sees OBE as a fad that we can "wait out," OBE seems to be a growing phenomenon. We know that IMLS is dedicated to using OBE whenever we can look at end user impact of how we spend our energy and our dollars. We know that many stakeholders (funding agencies and politicians in particular) are beginning to demand OBE evaluation of projects. We also know there is a new term being adopted to make decisions in some states, namely, OBE budgeting. In the latter case, OBE has caught on so well that major budget decisions, including public funding for libraries, are being made using outcome-based evaluations of program impact.

* Sometimes extensive evaluation experience of either trainers or participants can get in the way of learning about OBE. There are other methodologies that use some of the same terms. The word "outcome" is one of those overlapping terms that can cause confusion. The developers suggest that participants be encouraged to suspend what they know about evaluation and concentrate on the specifics of OBE methodology.

Slides 6-8: OBE Project Selection

6. Select OBE to Evaluate Your Project When the Project:

  • Is designed for a clearly defined audience
  • Addresses specific needs of that audience
  • Provides a variety of activities to address the need
  • Provides for repeated contacts or follow-up with the target audience
  • Is designed for the target audience to acquire new or improved skills, knowledge, attitudes, or behaviors that are predictable and measurable

7. OBE Projects Must Be Predictable and Measurable

  • What you predict and measure:
    • A change in skill, attitude, knowledge, or behavior
  • Examples:
    • Specific skills librarians learn in a training program
    • Specific skills patrons learn in a training program

8. Select Another Evaluation Method:

  • When you can't predict user benefits
  • When you are evaluating something other than user benefits
 

Discussion

There are three important themes in this set of slides. One is that OBE is not the best evaluation method for every project. Tell the groups that when we select one homework project for each group to use in the OBE learning process, we select the one best suited to OBE. When the projects chosen still aren't ideally suited for OBE, in order to learn the process, we ask for adjustments. We are not criticizing any of the projects; rather we are making sure we have projects that will enable us to practice all elements of the OBE process.



The process matters, not the project for this learning experience.



The second important focus is on the specific action the end-user will take as a result of the planned program.



OBE concentrates on what patrons do, not on what service providers do. Patrons are viewed as customers. Note when librarians receive training, they are the customers of the training and the people that benefit from the training are also customers.



The third key theme is predictability. It is what sets OBE apart from other methods. It is repeated in several slides and you need to focus on it from the beginning. In OBE we predict the specific actions the end user of our programs will take because of what we did.



What customers do as a result of a project is intentional (predicted) and measured from the beginning.



All three themes are intentionally repeated often because the workshop groups are being asked to do something different with their projects from what they originally envisioned and need reminders that the process, not the project, is important for purposes of learning OBE. The customer and predictability emphases are repeated because they are the most difficult to apply.
 

Point of View

* We are accustomed to describing our projects and programs in terms of what we will do. We identify all the steps we will take to make the program happen. We usually know who our potential patrons are and what we will make available for them to use. We often think in terms of providing resources and access. In OBE methodology we have to turn our thinking around to speak in terms of what the patron will do, not in terms of numbers of circulations, questions answered, events attended, or hits on a database or web site, but the specifics of how the user will benefit from what we provide. What will they learn, what will they do with what they learned, what is the real impact on them, and how can we measure it? We may be interested in the benefit to the library and we may capture data that answers the question, but it can't be our primary concern.

* Participants in OBE workshops often express a wish to know more about alternative evaluation methodologies. Evaluation methodologies are the subjects of whole courses in graduate schools and can't be covered in an OBE workshop. It is more important to focus on OBE so participants can do the steps when required by stakeholders and understand when a project cannot be evaluated using OBE. When a project requires some other method of evaluation, a literature search or expert advice may be a solution.

Thought Process

When we decide if our project can be evaluated using OBE methodology we think:

  • If my project can only be evaluated using numbers without answering qualitative questions, it does not suit OBE. For example, if it yields only the number of web hits versus predicting that a specified number of students in a specified age range will use an online web service for successful completion of homework and research papers, the former would not work while the latter would, as long as one can define "successful" and can gather feedback from student web users.
  • If my project involves management systems, cost savings, or other librarian activities that do not have a clear connection to end user actions, such projects are valid and necessary, but not OBE projects.
  • Training can always be evaluated using OBE and should whenever possible be extended to evaluate benefits to patrons.

See Practice Exercise

Slides 9-14: Distinguishing Features

9. Outcome-Based Evaluation

  • Defined as a systematic way to assess the extent to which a program has achieved its intended (predicted) results
  • Asks the key questions:
    • How has the program changed the knowledge, skills, attitudes, or behaviors of program participants?
    • How are the lives of the program participants better as a result of the program?

10. Outcomes

  • Defined as a target audience's changed or improved skills, attitudes, knowledge, behaviors, status, or life condition brought about (partly or wholly) by experiencing a program.
  • What your customer can do and does as a result of the program

11. Outcomes: Customers

  • A targeted group of library staff if you are teaching/training them to do something and you can predict and measure the results
  • A targeted group of library patrons when you can predict the resulting behavior and you can measure the results
 

12. Outcomes: Examples

  • Immediate-A librarian learns how to do advanced health database searching
  • Intermediate-A librarian uses advanced health databases to help patrons
  • Long term-A librarian teaches X adults in the community how to find advanced health information
  • Impact-More adults in X community will report successful use of advanced health databases

13. Outcomes "Lite" Examples

  • Access-More adults in the community will have access to health databases (usage/outputs; impact unknown)
  • Satisfaction-Adults in the community will like the library's health-related databases (can be present with no real results)
  • Benefits to the Institution-More adults in the community will use health-related databases (usage/outputs; impact unknown)

14. OBE Definition of a Program

  • Activities and services leading toward intended (predictable) outcomes
  • Generally has a definite beginning and end
  • Designed to change attitudes, behaviors, knowledge, or increase skills and abilities based on assumed need

Discussion:

* This slide series gives an overview of OBE. You will observe a lot of repetition, hitting on the themes of customer service, predictability, and measurability of programs. Note that slide 12 shows a progression of program impact in outcomes ranging from the immediate to the long-term. The workshop groups may begin with a program with an immediate goal of training librarians and not be thinking much beyond the immediate goal. To really grasp OBE, it is important to get work groups to think about how such initial training at the system level can reach the library patron and to write long-term outcomes.

* Slide 13 presents a concept that bears repeating throughout the process. Outcomes "lite" refers to predicted results that generally don't specify specific actions of customers. In actuality they are outputs of a program, not outcomes (program impact). OBE does not minimize the importance of providing access to resources; it does not minimize the importance of favorable community reaction to services; it does not minimize the importance of library usage, nor does it minimize the importance of good public relations (raising awareness). However, the results of all of these efforts are generally found in quantitative reports (numbers). OBE is a qualitative methodology that uses predicted numbers and predicted customer actions that are specific. An illustration of the difference would be as follows:



Example "Lite Outcome" Library provides access to health databases to entire community and publicize availability of resource. Result: Number of uses of health databases; possibly increase in number of uses of health databases; possibly expressed satisfaction on survey. There may be increased uses but no notion if the uses were successful. Patrons often express satisfaction even if they never use the information learned. Awareness of access does not mean any action was taken. The numbers are important, but "lite" on impact.

Example OBE: Patrons are trained by library staff to do advanced health database searching with a prediction that X number will be trained, that some percentage of that number will demonstrate the searching skills during training, and that some percentage will describe successful searches conducted post training. Also, the prediction may specify how users will benefit from searches post training, e.g. they may be helped to make an important personal health decision, to choose medical care options for a specific health ailment, to understand the diagnosis of self or a close friend or relative, or to have a life-saving experience. Patrons who were trained can be asked to volunteer information about post training uses of databases that can be compared to a checklist of the initial predictions. The results of such an outcome are truly impact results versus numbers. Using the numbers from this outcome in concert with the usage data can be even more powerful. For example, if you know that 100 people who volunteered information had successful life-changing experiences using health databases, and there were 3000 uses, then you might conclude the program had a greater life-changing impact than 100 and you can also describe the impact with more than a number.



* Slide 14 gives the definition of a program as stated by the original developers of OBE for IMLS. That a program usually has a beginning and an end doesn't resonate with libraries that usually have ongoing programs and services. For workshop purposes, even if a program is ongoing, if participants define a time period for a project evaluation, the program can be used.
 

Point of View:

* Defining customers in OBE terms is difficult for system participants to do, and that is understandable. It is well understood that systems serve member libraries and may not have any direct library patron contact. This, however, makes applying OBE to a project difficult. System participants are asked to write outcomes all the way to the patron level, knowing that the further an outcomes developer is from the patron the harder it is to do. For OBE, the customer is ultimately the end user. When the system mounts a project there is a customer hierarchy.

The member library (staff) is the system customer.

If the member library is a 3Rs customer, its customers may be other member libraries or library patrons.

If the member library is a public library system customer, its customers are library patrons.

If the member library is a school library system customer, its customers are school personnel and students.

In the workshops, systems are asked to write outcomes for all levels in the hierarchy, even those they don't directly serve. The key point is that the system can and should ask the member library to report outcomes that relate to patron action that can be connected to the original system service. Such reports can be designed and communicated when the original service is provided. Note: Some systems have learned that incentives at all levels can be useful for getting the desired feedback.

* Another way to think about this issue can be illustrated by some training examples.

When systems train librarians or staff from member libraries to implement management systems or cost saving measures, it is difficult to say what the patron of any member library will do as a result. One can do cost/benefit studies or time studies and look at management results. One could even have member library staff track additional activities they could accomplish with time saved. However, this would not be an OBE project beyond the initial training. The system could and should write outcomes for what the member library staff will be able to do with training received (specific skills) and it should follow up to see if the member library staff are using the new skills. Those would be outcomes to add to the efficiency results. We don't use such projects for OBE training because they don't enable us to get into outcomes in depth and because the goal is primarily a management goal.

When a system trains member library personnel to provide patron services or to use resources that patrons will also use or can be trained to use, then it makes an ideal OBE project. The system can predict what the member library staff can do as a result of training. The system can predict how the member library staff can help patrons. The system can predict how the member library staff can train patrons to use resources independently. The system can predict what the patron will be able to do either when receiving help from staff or when functioning independently. Throughout the hierarchy, there were learners who acquired new skills or behaviors that can be tracked to discover the impact of the initial training.

Slide 15: OBE Process

15. OBE Plan Overview

  • Step One: Identify assumptions
  • Step Two: Include purpose statement
  • Step Three: List inputs, activities, and outputs
  • Step Four: Write measurable outcomes

Discussion:

This is an overview of the whole process of developing an OBE plan (sometimes called a simplified program logic model). Steps 1-3 are the ones we are most accustomed to doing. What continues to distinguish this methodology is the approach to writing outcomes that are intentional and predicted. The goal of the workshop is to work through the four steps and achieve a completed OBE plan.

Slides 16-21: OBE Process; Step 1: Assumptions About Program Need

16. Assumptions About Program Need

  • Programs are developed as a result of assumptions about people's needs
  • Assumptions can be drawn from:
    • Experience of your institution
    • A program partner's experiences
    • Formal or informal research

17. Assumptions: Three Parts

  • Part 1: Need: A need identified among a group of individuals based on their common characteristics.
  • Part 2: Solution: A program that will change or improve behaviors, knowledge, skills, attitudes, life condition, or status related to the need.
  • Part 3: Desired results: The change or improvement you intend (or predict) to achieve.

18. Assumptions Part 1: Example

  • Assumption--Need
    • Based on a survey of the state's library systems, many library professionals lack the ability to help library patrons find information they seek using electronic resources. Further, there is little time available for them to learn these new skills.
 
19. Assumptions Part 2: Example

Assumption--Solution

  • Provide goal-directed learning opportunities to help library staff learn to use electronic information resources effectively.

20. Assumptions Part 3: Example

Assumption--Desired Results

Library staff throughout the state will develop the ability to:

  • Help patrons find desired information from electronic sources
  • Teach targeted community groups to use electronic sources

21. Step 1 Checklist

Did you:

"Identify a common need of a target group?
" describe the solution?
"consider the desired results?
"transfer to obe plan: evaluation framework

Discussion:

Logic tells us that if there is no need for a program, we should not waste resources to develop and offer it. We always have some underlying assumptions about need before we begin. The key to this series of slides appears in slide 17. Part 1 calls for knowing who the customers are at all levels of the hierarchy, including end-users. Part 2 is even more important because the proposed solution is expected to enable the customers to learn new skills. If the program is management, not skill related, it may be viable and valuable, but it is not suited for OBE. Part 3 makes clear that you need to be able to predict the change that will be manifested in the customer. The final slide in the series is a checklist that can be used by the workshop participant to be sure that all elements are present, but also by the trainer when reviewing the work of the groups.

If a group is working with a project and need has not been fully considered, urge the group to write a need that makes sense for the project and identifies a logical customer group. We are not asking groups to invent need, but to have the experience of putting together a plan with all the requisite parts. To repeat: The OBE process, not the project, is where we are concentrating the efforts of the workshop.

Point of View:

A needs assessment is a valuable tool. If a needs assessment has been done that relates to our planned project we should summarize it for our OBE plan. Needs assessments are labor intensive and not necessary for all projects. Many services began by offering patrons what was determined to be a need based on professional observation and experience. It is important to distinguish OBE data gathering from purist research. If you need to run tests of significance, prove reliability and validity of instruments, form control groups and any of the other typical research activities for the benefit of a stakeholder or as required by a funding agent, by all means do so. For most projects, however, needs statements need not follow a full-fledged needs assessment and impact data can be gathered without meeting strict "research" standards.

Slides 22-26: OBE Process; Step 2: Program Purpose

22. Program Purpose

Program purpose is driven by assumptions about need. It relates to the organization's mission statement and program stakeholders. It defines audience, services, and benefits.

Translation: Who does what, for whom, for what benefit?

23. Program Purpose

Before you write the purpose statement:

  • Consider the program stakeholders (who needs to know the results? Boards? Legislators? Managers?)
    • Who are the program influencers?
    • What do they want to know?
    • How will the results be used?
  • Consider your organization's mission

24. Purpose Statement

  • Who does what? Identify the service provider(s) and the service to be offered.
  • For whom? Identify the target audience(s). What specific group will be served?
  • For what benefit? State in terms of changed, improved or demonstrated knowledge, skills, behaviors or attitudes.

25. Program Purpose-Example of a Program Purpose Statement

System-wide Electronic Resources Training provides training in Internet technology (who does what) for library professionals (for whom) in order for them to:

  • Search electronic genealogy resources
  • Help patrons use electronic genealogy resources
  • Teach targeted community groups to use electronic genealogy resources (for what benefits) in order for community members to search electronic genealogy resources independently (for what benefits)

26. Step 2 Checklist

Did you:

  • Consider the stakeholders?
  • Communicate the mission?
  • Identify the service provider(s) and services?
  • Identify the target audience?
  • Describe the program benefits?
  • Put it all together in a program purpose statement?

Transfer to OBE Plan-Outcomes Evaluation

 

Discussion:

* Again, logic tells us that we will not offer programs that are in conflict with our mission. If our organization has a mission statement we plug it into this section of our OBE Plan. If we don't have a mission statement, for purposes of the workshop exercise, we quickly summarize what the organization is intended to do. Our time is needed for outcomes, so if a workshop group is struggling with this section, urge them to write two or three simple sentences about the kinds of programs that are offered by the organization and to whom they are offered, and to move on with the purpose statement.

* Note that the purpose may seem somewhat repetitious of the assumptions about need and the proposed solutions. That is very legitimate. The important thing is that need is considered, a target audience is described, and a succinct purpose statement grows out of that.

* A few words about stakeholders. Thinking about stakeholders may change the way a program is defined. If we do that before we write our purpose we are unlikely to forget a key desired benefit, since the desires of stakeholders do count. For OBE purposes, stakeholders are not defined as anyone who has a "stake" in the project. If that were the case, the librarians, library staff, and the end-users would be stakeholders. Rather, stakeholders are power brokers, decision makers, authority figures, or program funding agents. They may also be end-users of our libraries, but it is their influential role that makes them a stakeholder, not their patron role. Typical stakeholders are legislators, grantors, trustees, related state and national agencies, and supervisors.

* Stop! Since the assumptions about need and program purpose are so interwoven conceptually, the first break for workshop groups to practice begins at this point. Workshop participants complete Steps 1 and 2 on Activity Sheets 1-3. As trainers circulate among work groups, look over Assumptions and give advice as needed. Ask a member of each group to write the purpose statement on the flip chart provided. Go over all of the purpose statements, making sure to comment on how they are OBE appropriate or how the group may have revised them to make them so.

Slides 27-29: OBE Process; Step 3: Inputs, Activities and Services, and Outputs

27. Typical Examples of Inputs, Activities, Services, Outputs

Inputs: Resources dedicated to or consumed by the program.

Staff, computers, facilities, materials, money (source), consultants, web site, software, Internet, instructors

Activities: Program actions that are management related.

Recruiting, coordinating, promotional, curriculum development, purchasing, scheduling, and evaluating activities.

Services: Program actions that directly involve end users.

Conducting workshops, mentoring, online offerings, following up with customers.

Outputs: Numbers of direct program products.

Participants served; participants completed; materials developed and used; workshops; web hits.

28. Outputs vs. Outcomes

Caution-Outputs are not Outcomes

  • Outputs: A direct program product, typically measured in numbers (participants served, workshops given, web-site hits, etc.)
  • Outcomes: A target audience's changed or improved skills, attitudes, knowledge, behaviors, status, or life condition brought about (partly or wholly) by experiencing a program. These changes are intentional and measured from the beginning.

29. Checklist

Step 3 Checklist

Did you:

  • Identify all inputs (consumable resources)?
  • List activities and services?
  • Identify all outputs?
  • Transfer to OBE Plan-Outcomes Evaluation
 

Discussion:

* The chart on slide 27 contains the definitions of inputs, activities and services, and outputs, as well as typical examples of each. These are fairly commonplace activities of program planning and can be done by workshop groups quickly.

* Slide 28 is an exact duplicate of slide 30. The reason for this is that groups will be identifying outputs. It is often difficult for newcomers to OBE to distinguish outputs from outcomes. They are most familiar with outputs. The usage data they customarily report falls into this category. Before they list outputs it is helpful to note the difference as shown in slide 28. Then when they begin outcomes, the same slide is repeated to make sure they don't write outcomes statements for which only output data are available.

Stop for groups to do Step 3, Activity 4. As you circulate among the groups, look at their work and make suggestions for additions. Be sure to look for activities that relate to evaluation, e.g. development of evaluation instruments, data collection, and data analysis. When the activity is complete, comment to the whole group on any ideas that seem worth sharing. Groups need not record this activity on flip charts.

Slides 30-32: OBE Process; Step 4: Outcomes Overview

30. Caution-Outputs are not Outcomes

  • Outputs: A direct program product, typically measured in numbers (participants served, workshops given, web-site hits, etc.)
  • Outcomes: A target audience's changed or improved skills, attitudes, knowledge, behaviors, status, or life condition brought about (partly or wholly) by experiencing a program. These changes are intentional and measured from the beginning.

31. Outcomes: Six Parts

  • Part 1: Outcomes: Identify specific intended or predicted changes in participants and pick a few important ones to measure (what the customer can do)
  • Part 2: Indicators: Measurable conditions or behaviors that show an outcome was achieved
  • Part 3: Data Sources about conditions being measured
  • Part 4: Data Intervals: When will you collect data?
  • Part 5: Target audience: the population to be measured
  • Part 6: Target or Achievement Level: the amount of impact desired

32. Before You Write Outcomes

  • Name each group that will learn a skill or change a behavior. Start at the top with what the system does. Who learns from what the system does? Who learns from those the system taught? Who receives help from those the system taught? Keep going to the last group of end users. Write predictions for the specific action each group will take with what they learned.
  • Avoid added verbiage: Use action verbs
  • Avoid increase and improve unless you have baseline data
 

Discussion:

* Slide 30 repeats slide 28. The groups have just finished listing outputs, the quantitative data that will be gathered. This repeats the distinction between outputs and outcomes, placed just before writing outcomes. Slide 31 is an overview of the six parts to the outcomes step. Some groups express a wish to take all six parts together, but experience dictates writing and refining outcome statements before beginning indicators; creating indicators, data sources, and intervals as a separate activity; and finishing with target audience and target achievement levels. Slide 31 enables you to give the groups an idea of where you are headed, and it is useful to tell participants that the sequencing of these steps is important.
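
For trainers who want a compact picture of how the six parts fit together, here is a minimal sketch in Python (not part of the original workshop materials; all field names are illustrative assumptions) showing one outcome with an indicator whose # and % placeholders stay empty until Parts 5 and 6:

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Indicator:
    # Part 2: a measurable condition or behavior showing the outcome was achieved
    description: str        # e.g. "# and % of staff who score 80 or better ..."
    data_source: str        # Part 3: e.g. "quiz", "observation", "survey/rubric"
    data_interval: str      # Part 4: e.g. "at end of course", "after 6 months/annual"
    target_number: Optional[int] = None     # the "#" placeholder, set in Part 6
    target_percent: Optional[float] = None  # the "%" placeholder, set in Part 6

@dataclass
class Outcome:
    statement: str          # Part 1: what the customer can do and does
    target_audience: str    # Part 5: the population to be measured
    indicators: list = field(default_factory=list)

# Example drawn from the EGR slides; the placeholders stay empty for now.
outcome_1 = Outcome(
    statement="Library staff search electronic genealogy resources",
    target_audience="library staff attending system training",
    indicators=[
        Indicator(
            description="# and % of staff who score 80 or better on 3 "
                        "searches prescribed in a quiz",
            data_source="quiz",
            data_interval="at end of course",
        )
    ],
)

Filling target_number and target_percent only at the end mirrors the sequencing advice above: outcome statements first; then indicators, sources, and intervals; and targets last.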

* Slide 32 contains tips developed through considerable experience. Before the developers start to write outcomes, we always list all the "learners" (end-users, patrons, customers) in the hierarchy of learners attached to the project. That helps us make sure we have all the major outcomes that should be measured. When we work with the groups creating outcomes, we ask questions like: who is the first group of customers the project will reach? If the system is training, the first group may be librarians, library staff, school library media specialists, agency personnel, hospital personnel, etc. If the initial trainees will use what is learned to help others, there is another group of customers. If the initial trainees will train others, there is another group of customers. Some of the "customers" of initial training for librarians that we have encountered include: job seekers, students, teachers, librarians and library staff, legislators, agency personnel, agency clients, and hospital patients. At this point groups should also be sure that if there are customer qualifiers, they are specified. Some examples are teenagers served by the public library, pre-school children served by the public library, targeted community groups with a specific interest, and students in grades 5 and 6.

* Slide 32 also suggests that outcomes using minimal verbiage and action verbs are best. Examples:

Librarians will have improved ability to teach genealogy databases to targeted community groups.
Librarians teach targeted community groups to use genealogy databases. (less verbiage, action verb)
Students will develop skills to use NOVEL databases to complete assignments.
Students search NOVEL databases for successful completion of assignments. (less verbiage, action verb)

* Lastly, slide 32 cautions about the use of the words increase and improve in outcomes unless baseline data are available or unless pre- and post-testing data are embedded in the program. Sometimes what is thought to be baseline data are really usage data. Usage is generally a direct output, not an outcome, of a program. When you describe customer changes in knowledge, skills, and behaviors, it is less usual to have baseline data. For example, an outcome would not be to increase usage of NOVEL databases. That might be highly desirable and you would certainly capture usage and report it, but an outcome would describe how a specific group of patrons would use and benefit from NOVEL databases. It might be: high school sophomores use NOVEL databases to meet an annotated bibliography class requirement. If school library media specialists or teachers had data on numbers and percentages of students who did not meet a specified satisfactory score in a previous year, and if the program provided training for teachers to integrate a collaborative lesson plan to teach NOVEL searching, then the results after the new curriculum was implemented could be compared to the baseline information about previous performance. However, if the only data available were numbers of NOVEL searches, there is no baseline data for comparison.
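
To make the baseline point concrete, here is a small illustration with hypothetical numbers (both rates below are invented for demonstration, not drawn from any real program):

# Hypothetical rates, assumed for illustration only.
baseline_rate = 0.52      # share of last year's sophomores meeting the score
post_program_rate = 0.68  # share after the collaborative NOVEL lesson plan

increase = post_program_rate - baseline_rate
print(f"Increase over baseline: {increase:.0%}")  # prints "16%"

# Without baseline_rate, 0.68 would be a bare count of current performance,
# and the word "increase" would have nothing to be measured against.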

Thought Process:

When writing outcomes try to step out of professional roles. Assume the role of customer at each level and ask the "What's in it for me?" (WIIFM) question. As a patron, what will this program enable me to do that is value added to my life in some manner? What specific knowledge and/or skills will I acquire? How will I use those skills to my benefit? Then as a professional answer those questions in a way that would satisfy you as a customer. Write outcomes that clearly state what each customer group will do as a result of the program. Work hard not to jump ahead to think about measurement. If you state what the ideal outcomes are, then in the next steps, you can work on the difficulty of measurement.

Point of View: It is important before writing outcomes to remember the definition of "lite" outcomes. Avoid stopping at awareness or exposure type outcomes. It is never enough to just expose people to opportunities, especially when we are spending considerable resources to do so. We need to think about how we can ensure that targeted groups will take advantage of what we provide. It is also important to remember that training for the sake of training is not enough. Training is a knee-jerk reaction to problems, and it can be a good one if we make sure the training leads to action. If our stated goals and predictions don't go beyond the training itself, we risk an expensive undertaking with limited payoff.

Slide 33: OBE Process; Step 4, Part 1: Writing Outcomes

33. Part 1: Outcomes

Outcomes: Target audience's changed or improved skills, attitudes, knowledge, behaviors, status or life condition brought about (partly or wholly) by experiencing a program

Examples specific to Electronic Genealogy Resources Training (EGR)

  • Outcome # 1: Library staff search EGR
  • Outcome # 2: Library staff help patrons use EGR
  • Outcome # 3: Library staff teach targeted community groups to use EGR
  • Outcome #4: Patrons report successful use of EGR
 

Discussion:

Note that the examples state what the library staff will do during training. They will search the prescribed databases. After training they will help patrons use those databases. Also after training they will teach targeted groups to use the databases. The patrons involved in outcomes 2 and 3 will successfully use the databases, all as a result of the initial training. Stop for groups to do Activity 5, Outcomes only.

Point of View

We are so accustomed to thinking about what librarians do that it can be difficult to write all of our outcome statements in terms of customer action. If we initially write them that way, we can go back over them and ask ourselves how we can change them to reflect customer action. If we are a system and don't have access to end-users, we still need to identify the benefit to them. End-user information can be provided to systems by member libraries.

Thought Process

As we think through the process of writing outcomes, the key is to identify all customers. If we are conducting training, who learns from the training and what action will they perform during the training and after the training? If a librarian learns a skill that can be used when helping patrons, what will they do with that skill and what will the patron do as a result? If a librarian trains a group of patrons, what will the patrons do during training and after training? The outcome entails a "specific prediction" of the prescribed action a target group will take as a result of training.

Slides 34-39: OBE Process; Step 4, Part 2: Writing Indicators

34. Part 2: Indicators for Each Outcome

Measurable conditions or behaviors that show an outcome was achieved:

  • What you hoped (intended, predicted) to see or know
  • Observable evidence of accomplishment, changes, gains

Indicator Format:

# and % of _________ (target audience) who ____________ (will be able to do what?) as assessed by _________ (the measurement that will show results).

Note: # and % symbols are placeholders for values to fill in later.

35. Before You Write Indicators

  • For each outcome write as many indicators as are needed to show the outcome was achieved.
  • Typical indicators are:
    • # and % of learners (librarians, library staff, teachers, agency personnel, etc.) who will perform X amount of specific skills as assessed by trained observer, or quiz, or final independent activity during training
    • # and % of learners who will report X amount of help given others after training because of what they learned
    • # and % of learners who will report training others and how many others were successful during training
    • # and % of end-users (patrons, students, agency clients) who received help and/or training who report X amount of independent use of new knowledge.

36. Part 2: Examples of Outcome 1 Indicators: System-wide EGR Training

Outcome 1. Library staff search electronic genealogy resources

Indicator(s)

  • # and % staff who can describe how to conduct an effective electronic search for genealogy information as assessed by a trained observer during a workshop
  • # and % of staff who score 80 or better on 3 searches prescribed in a quiz
  • # and % staff who can identify 3 effective search engines as assessed by a trained observer during a workshop
 
37. Part 2: Examples of Outcome 2 Indicators: System-wide Electronic Resources Training

Outcome 2: Library staff help patrons use electronic genealogy resources

Indicator(s): # and % staff who report X number of successful searches with patrons using electronic genealogy resources as assessed by a rubric

38. Part 2: Examples of Outcome 3 Indicators: EGR Training

Outcome 3: Library staff teach targeted community groups to use electronic genealogy resources

Indicator(s)

  • # and % staff who report training X number of community groups to use EGR
  • # and % of targeted community patrons who can identify 3 effective genealogy search engines as assessed by a trained observer during a workshop
  • # and % community patrons who can describe how to conduct an effective EGR search as assessed by a trained observer during a workshop
  • # and % of community patrons who score 80 or better on 3 searches prescribed in a quiz

39. Part 2: Examples of Outcome 4 Indicators: EGR Training

Outcome 4: Patrons report successful use of EGR

Indicator(s)

  • # and % of patrons helped by a librarian who report X number of successful uses of EGR as assessed by a rubric applied to a survey
  • # and % of patrons who received community training who report X number of successful uses of EGR as assessed by a rubric applied to a survey.
 

Discussion:

Experience with several groups tells us that writing indicators is the most difficult of all the OBE steps. Many individuals try to fill in the placeholders at this stage. Urge them not to do so, because the numbers rarely turn out as they expect. The examples of indicators can be used to note that the format always includes # and % as placeholders and may include X placeholders. What the indicator is saying is that some number and some percent of the target audience will succeed. The indicator specifies what they will succeed in doing. Maybe it is using the right methodology to search databases. Maybe it is using newly acquired search skills for a specific purpose when the training is over. The indicator should specify how much success is required, for example, 3 successful searches. Sometimes when you write an indicator, you are not ready to specify how many successful actions are enough. Then you put in an X placeholder and state that the action will include at least X number.

Every indicator has a specified assessment of the desired action, usually preceded by the words "as assessed by." If customers are learning a skill, some number and some percent of those taught will achieve the desired action. If you have direct access to the learners/customers, and you are teaching them, then you can use a variety of assessments including teacher observation, testing, collecting a final independent exercise, role-playing, or any activity that demonstrates that the participant met the prescribed action and the minimum amount specified. If the training is over, the indicator usually calls for a report that tells how the new skill was used. The report does not call for just numbers, but for something substantive that lets you know that the skill you taught was successfully applied. Many indicators say as assessed by a rubric or checklist applied to the report. Urge participants to use such measures as opposed to surveys where respondents simply check off what they did. The following example is intended to demonstrate the difference.

Example of a survey that produces primarily quantitative data.

After training I used the advanced database to help patrons find information. ☐ Yes ☐ No
In the first 6 months following training I assisted _____ number of patrons using the advanced database.
Of the patrons I assisted, ______ number expressed satisfaction with the results.
Note: these are just examples that could be used if the indicator says, "as assessed by a survey."

Example of a survey with a rubric or a checklist applied.

  • For each time you assisted a patron using your advanced database skills acquired in the workshop, create a log that includes:
    • The question asked. __________
    • Description of the search. ___________
    • Description of the information found. ____________
  • The checklist would include questions like: Was the question one likely to be answered using the database taught? Were the search steps appropriate? Did the search results answer the question?

The difference between the two methods is that the survey produces numbers without knowing if the results are in any way connected to the initial training. Also, customers have been known to be satisfied even if a search is not successful. In the open-ended survey, the analyzer can look at the question and the steps described and know if there is a connection to the training, and then look at the results to see if the search yielded results.

This is an overly simplistic explanation of a complex procedure. See pages 34-38 for examples of full-fledged rubrics that could be applied to customer responses to see if the desired outcome was achieved.
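
For trainers who want to walk groups through the mechanics, here is a minimal Python sketch (the log entry, field names, and reviewer judgments are all hypothetical; this is not the manual's actual rubric) of applying the checklist questions above to a single log entry:

# One open-ended log entry, as described in the survey example above.
log_entry = {
    "question_asked": "Find recent clinical trial results for an arthritis drug",
    "search_description": "Used the advanced health database with subject headings",
    "information_found": "Three relevant trial summaries",
}

checklist = [
    "Was the question one likely to be answered using the database taught?",
    "Were the search steps appropriate?",
    "Did the search results answer the question?",
]

# A human reviewer supplies a yes/no judgment for each checklist question
# after reading the entry; the True values here stand in for that review.
judgments = [True, True, True]

# Only an entry that meets every checklist item counts as a "successful use,"
# which is what ties the reported use back to the initial training.
successful = all(judgments)
print("Counts toward the indicator:", successful)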

Point of View

Sometimes the inclination is to leave out outcomes and indicators when there are perceived barriers to collecting the data or to collecting the data in time, particularly during a short grant cycle. Writing those indicators is crucial to knowing the true impact of a program. It is important to communicate all the data needs to all the customers at the beginning of a project. For example, the system tells trainees that there are certain reporting requirements associated with receipt of training. That includes answering questions at designated intervals post-training. It includes providing results achieved by helping or teaching others. It may include asking their patrons to provide information at intervals. Setting up and sharing data collection requirements from the beginning makes it possible to write those difficult indicators. See the data collection section for other suggestions that mitigate some of the other issues surrounding the difficulty of data collection. The concern about short project timelines is genuine, but should not prevent the collection of important impact information even after it is too late for a funding cycle. If you get in the habit of collecting impact information, it helps in future program predictions and future program planning. It may also provide important baseline information so that future outcomes can include "increase or improve" statements.

Slides 40-44: OBE Process; Step 4, Parts 3-4: Data Sources and Data Intervals

40. Parts 3-4: Data Sources and Intervals

Sources: Tools, documents, and locations for information that will show what happened to your target audience, e.g. pre- and post-test scores, assessment reports, observations, anecdotal self-reports, surveys

Intervals: The points in time when data are collected

  • Outcome information can be collected at specific intervals, for example, every 6 months
  • Data can also be collected at the end of an activity or phase and at follow-up
  • Data are usually collected at program start and end for comparison when "increase" data are needed.

41. Parts 3-4: Examples of Data Sources and Intervals for EGR Training

Indicator(s) Outcome 1

  • # and % staff who can describe how to conduct an effective EGR search as assessed by a trained observer during a workshop (Data Source -- observation; Data Intervals -- at end of course)
  • # and % of staff who score 80 or better on 3 searches prescribed in a quiz (Data Source -- quiz; Data Intervals -- at end of course)
  • # and % staff who can identify 3 effective search engines as assessed by a trained observer during a workshop (Data Source -- observation; Data Intervals -- at end of course)

42. Parts 3-4: Examples of Data Sources and Intervals for EGR Training

Indicator Outcome 2

  • # and % staff who report X number of successful searches with patrons using electronic genealogy resources as assessed by a rubric (Data Source -- participant reports; Data Intervals -- after 6 months/annual)

43. Parts 3-4: Examples of Data Sources and Intervals for EGR Training

  • # and % of staff who report EGR training of community groups (Data Source -- participant reports; Data Intervals -- after 6 months / annual)
  • # and % of targeted community patrons who can identify 3 effective genealogy search engines (Data Source -- observation and participant reports; Data Intervals -- at end of course)
  • # and % of community patrons who can describe how to conduct an effective EGR search (Data Source -- observation and participant reports; Data Intervals -- at end of course)
  • # and % of community patrons who score 80 or better on 3 searches prescribed in a quiz (Data Source -- quiz and participant reports; Data Intervals -- at end of course)

44. Parts 3-4: Examples of Data Sources and Intervals for EGR Training

  • # and % of patrons helped by a librarian who report X number of successful uses of EGR as assessed by a rubric applied to a survey (Data Source -- survey / rubric; Data Intervals -- after 6 months / annual)
  • # and % of patrons who received community training who report X number of successful uses of EGR as assessed by a rubric applied to a survey (Data Source -- survey / rubric; Data Intervals -- after 6 months / annual)
 

Discussion:

The blue handout sheet in the participant manuals gives an overview of data sources. Note that in the Output section of the OBE plan, a list was made of all the sources of quantitative information, generally usage statistics. Data sources for the "Outcomes" step in the OBE process are those sources that will match the indicators and outcomes. It is important to remember that the data sources should include qualitative elements, because outcomes by their very nature seek to demonstrate program impact. Data intervals will vary for the different outcomes. It is always advisable to gather data when the customers are present. The interval then becomes the end of the class, intervals during a course, or the end of the workshop. The intervals for follow-up data collection will most likely be determined by what is manageable. If the system is creating the outcomes, it is advisable to call for follow-up reports that coincide with the routine reporting cycle.

Several workshop groups have been interested in satisfaction and confidence levels of program participants. While these are not the primary goals, the information is valuable. Survey questions that elicit such information can be combined with open-ended survey questions that evaluate actions of participants. It is not an either/or choice: a single instrument can be used in multiple ways.

Many OBE workshop participants have asked for an example of a rubric when evaluating the workshop. See the checklists and rubrics section of this training manual.

Stop for participants to do the next three parts of Activity 5 (indicators, data sources, data intervals)

Point of View

Many of us have deeply embedded ideas about research and its focus on the "significance" of results using control groups, reliability and validity standards, and statistical analyses. Unless a program is sponsored by an agency that requires the rigors of formal research, most programs can be analyzed using reputable methods that are much more manageable. OBE need not be seen as an impossible task given busy schedules and limited staffing. Sampling techniques are acceptable for qualitative assessment. The intent is to ask, through interview or survey, how people acted on what they learned. Output data may show that 2,000 people were served. A sampling of the actions of 100 of those to elicit specific impact information can be used to extrapolate the number who were successful. For example, if 75 out of 100 successfully used what they learned in your workshop according to predetermined standards for success, then, as long as you acknowledge that the data came from a sample, you can conclude that most likely 75 percent of 2,000, or 1,500, were successful. If your ultimate prediction was that 60 percent would be successful, you will have exceeded your anticipated outcome and you will have solid information for future predictions. Often sampling is done during selected busy days or weeks.
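
The sampling arithmetic above is simple enough to script. The following is a minimal sketch (Python), using only the numbers from the example; the function name and variables are illustrative, not part of the OBE method itself:

```python
# Extrapolating program impact from a sample, using the example's numbers.

def extrapolate_impact(served: int, sample_size: int, sample_successes: int) -> int:
    """Estimate how many of all people served were successful,
    based on the success rate observed in a sample."""
    rate = sample_successes / sample_size
    return round(served * rate)

served = 2000            # output data: total people served
sample_size = 100        # people sampled for follow-up
sample_successes = 75    # sampled people who met the predetermined standard
predicted_rate = 0.60    # target achievement level set during planning

estimate = extrapolate_impact(served, sample_size, sample_successes)
print(f"Estimated successful: {estimate} of {served} "
      f"({sample_successes / sample_size:.0%} of sample)")
print("Exceeded prediction" if estimate >= predicted_rate * served
      else "Fell short of prediction")
# Estimated successful: 1500 of 2000 (75% of sample) -> Exceeded prediction
```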

Many express concerns about "privacy" when introduced to OBE. Privacy can be an excuse: as long as patrons volunteer the information, and as long as aggregate information is used without identifying any one respondent, privacy is maintained.

Many of us have experienced low returns on requests for information. Past experience can be a barrier to collection of the most meaningful information. OBE practitioners have found that several activities increase responses from library staff to systems and from patrons to library staff. It is important to design questions, rubrics for analysis, and instruments before the project begins and to tell participants at all levels that you need their voluntary help in measuring program impact. Feel free to tell them that future funding may depend on getting good information to the right people. And remember: if you answered the question "what's in it for me" when developing outcomes, and if you demonstrated value added to participants, they will be more likely to cooperate by supplying data. Some practitioners have offered creative incentives at little or no cost. Some have found creative ways to collect follow-up information. One library published a telephone number and an e-mail address in a community newspaper inviting program participants to make contact to answer a few open-ended questions. Some have follow-up forms at library service points with signs asking for cooperation. If conducting a sample, reference librarians can give out forms when they are helping a patron and briefly explain the need for information.

Thought Process:

At the first training session:

  • Have all the collection instruments ready for all target audiences
  • Tell the first target group what follow-up information is needed and why
  • Urge first target group to use instruments with patrons or with people they train
  • Urge first target group to be prepared to explain that privacy will not be violated and response is voluntary
  • Urge first target group to consider what incentives might increase voluntary submittal of follow-up information
  • Periodically remind targets what is needed and of deadlines
  • Urge first target group to periodically remind patrons of the need and deadlines
  • Have data sampling plans ready to share with first target group

Slides 45-50: OBE Process: Step 4, Parts 5-6: Target Audience and Target Achievement Level

45. Parts 5-6: Outcomes Target Audience and Achievement Levels (Goal)

Targets: the population to whom the indicator applies

  • Decide if you will measure all participants, completers of the program, or another subgroup
  • Special characteristics of the target audience can further clarify the group to be measured

Achievement Levels: the stated expectations for the performance of outcomes

  • Stated in terms of a number and/or percent
  • Meets influencers' expectations
  • May be estimated by the program's past performance

46. Target and Achievement Levels - Filling in the Placeholders in the Indicators

Indicator example:

# and % of library staff who report helping at least X number of patrons to use a particular resource.

Target = number of library staff who were successfully trained to help

Target achievement = % who will report helping at least X number

X = minimum achievement level

47. Parts 5-6: Examples of Targets and Achievement Levels for EGR Training

Indicator(s) Outcome 1

  • # and % of staff who can describe how to conduct an effective EGR search as assessed by a trained observer during a workshop
  • # and % of staff who score 80 or better on 3 searches prescribed in a quiz
  • # and % of staff who can identify 3 effective search engines as assessed by a trained observer during a workshop

Target/Level

All library staff who complete the course
N=445
Level = 356 (80%)

  • Note: same for all three indicators
 

48. Parts 5-6: Examples of Targets and Achievement Levels for EGR Training

Indicator(s) Outcome 2

  • # and % of staff who report X number of successful searches with patrons using electronic genealogy resources as assessed

Target/Level

All library staff who complete the course N=445

Level = 222 (50%) of staff who report at least 5 successful searches (X=5)

49. Parts 5-6: Examples of Targets and Achievement Levels for EGR Training

Indicator(s) Outcome 3

  • # and % of staff who report training community groups to use EGR
  • # and % of targeted community patrons who can identify 3 effective genealogy search engines as assessed by a trained observer during a workshop
  • # and % of community patrons who can describe how to conduct an effective EGR search as assessed by a trained observer during a workshop
  • # and % of community patrons who score 80 or better on 3 searches prescribed in a quiz

Target/Level

Indicator 1 (staff): All library staff who complete the course
N=445
Level = 222 (50%)

Indicators 2-4 (patrons): Patrons who complete community training
n=2,220
Level = 1,776 (80%) for each of the three patron indicators

50. Parts 5-6: Examples of Targets and Achievement Levels for EGR Training

Indicator(s) Outcome 4

  • # and % of patrons helped by a librarian who report X number of successful uses of EGR as assessed by a rubric applied to a survey
  • # and % of patrons who received community training who report X number of successful uses of EGR as assessed by a rubric applied to a survey.

Target/Level

Indicator 1: 222 library staff each report 5 help incidents, so patron N=1,110
Level = 111 (10%)

Indicator 2: Patrons successful during community training
N=1,776
Level = 177 (10%)

 

Discussion:

Every indicator has a specified assessment of the desired action, usually preceded by the words "as assessed by." If customers are learning a skill, some number and some percent of those taught will achieve the desired action. If they are in a class, they are the ones who "get it" and "apply" it later. When writing indicators we leave placeholders. Now we fill in the placeholders to complete the target audience and target achievement levels. For each indicator we need a realistic assessment of the target audience number. For example, if the first year of our program will involve workshops for school library media specialists and teachers, there may be hundreds or thousands in the district who need the training. In a grant application we can certainly provide those numbers, and we can say that, if successful, the program will be expanded to reach more people. However, if the workshops are limited by time, facilities, and equipment so that only 20 workshops for 20 people each will be given, then the target audience is 400. If similar limitations mean that only teachers in certain grades will be trained, then our outcome should specify the grade levels involved. Similarly, we may specify an age range of patrons or a characteristic such as job seekers.

The process of making a realistic estimate of the target audience should be applied to each indicator. The numbers are not always the same for every indicator.

For each indicator, we need to predict how many of the target group will be successful, or will carry out the predicted action to the standards specified. This is the percent that we think will be successful. If our number was 400 and we predicted a 60 percent success rate, then the target achievement level is 240 (60%).

If we had an X placeholder, we should fill in a number for X. Example:

Indicator 1: We trained 300 high school teachers to develop lesson plans that integrate database searching into their curriculum. We predict that during training 270 (90%) will develop one acceptable lesson plan.
Indicator 2: We predict that of the 270 who were successful during the workshops, 135 (50%) will implement at least one lesson plan in their classrooms.
Indicator 3: We know that there are approximately 10,000 students in classes taught by the 135 teachers who implemented their lesson plans. We predict that 6,000 (60%) will successfully use the databases taught to complete at least X number of assignments. We decide X = 2 assignments.
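
A minimal sketch (Python) of the arithmetic behind a chain of indicators like the one above; the names and numbers are simply the example's, not part of the OBE method itself:

```python
# Filling in target audience and achievement-level placeholders for a chain
# of indicators. Each step's target audience is derived from the previous
# step's predicted achievers.

def level(target_n: int, rate: float) -> int:
    """Target achievement level: predicted successes out of target_n."""
    return round(target_n * rate)

teachers_trained = 300
ind1 = level(teachers_trained, 0.90)   # 270 develop an acceptable lesson plan
ind2 = level(ind1, 0.50)               # 135 implement at least one plan
students_reached = 10_000              # students taught by those 135 teachers
ind3 = level(students_reached, 0.60)   # 6000 complete at least X=2 assignments

print(f"Indicator 1: {ind1} of {teachers_trained} teachers (90%)")
print(f"Indicator 2: {ind2} of {ind1} teachers (50%)")
print(f"Indicator 3: {ind3} of {students_reached} students (60%), X = 2")
```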

Stop for participants to complete the target audience and target achievement portions of Activity 5.

Point of View

Many believe that stakeholders care only about large numbers. Indeed, some do care about numbers. However, as OBE catches on, there is an evolving understanding that depth of impact is more important than meaningless numbers. OBE makes realistic predictions about how many are in a target audience at any one time and how many will be successful. Because there are multiple outcomes and indicators, the cumulative effect of the impact will be large even if some of the numbers are small. As you work through the indicators, the numbers may increase or decrease depending on the project. There is no value judgment about which is preferable. The cumulative effect is what is important.

Example of increasing numbers:

300 librarians trained; 270 (90%) successful during training

Of the 270 who were successful, 135 (50%) helped at least 10 different patrons (1,350 patrons) with a successful search during the first 6 months.

Numbers increase from a few librarians trained to many patrons experiencing a successful search.

Example of decreasing numbers:

1,000 patrons trained to write acceptable resumes
500 use the resume to apply for jobs
200 get jobs after using the resume
50 previously unemployed are in jobs and holding them after 1 year

Note that while the numbers go down with each indicator, the impact on the economy of such an outcome is significant.

Slide 51: Finishing Step 4: Six Parts

51. Step 4 Checklist:

Did you:

  • Write a few important, measurable outcomes?
  • Identify all the indicators for each outcome?
  • Identify data sources for each indicator?
  • Identify the number and characteristics of the target audience?
  • Note the data interval for each data source?
  • Decide the target achievement level (goal) for each indicator?
  • Transfer to OBE Plan-Outputs Evaluation
 

Discussion:

Once this checklist has been completed, all four steps in the OBE process are also complete, and each participant has a copy of an OBE plan. The green handout sheet in the participant manuals is another example of a completed plan. See the sample OBE plans later in this manual for additional samples that may be useful as you help the work groups.

Slides 52-56: Post-Planning: Reports

52. Post-planning Step: Reports

Summarize the results of outcome data and include:

  • Participant characteristics
  • Inputs, activities and services, outputs, and outcomes
  • Elements requested by stakeholders
  • Comparisons with previous periods
  • Interpretation of the data

53. Post-planning Step: Reports

Outcomes: What did the target audience achieve?

Inputs: What did we use? How much did we spend? How much did we consume?

Activities and services: What did we do?

Outputs: How many units did we deliver? To whom (audience characteristics)?

54. Post-planning Step: Reports

The bottom line of reports for management:

  • We wanted to do what?
  • We did what?
  • So what? (Outcomes)

55. Post-planning Step: Reports

Reporting for State Purposes

  • Relates to needs of target audiences identified in the state LSTA plan
  • Shows relationship to goals
  • Identifies the outcomes achieved by people served in programs

56. Post-planning Step: Reports

Showing relationships

  • Local libraries show achievements >>
  • Library systems show aggregate achievements >>
  • State Library shows statewide achievements

 

Discussion:

When we write program reports we report everything - inputs, activities and services, outputs (all the quantitative/usage data pertinent to the program) - and we report outcomes, telling what was achieved by each group of customers and how achievement relates to what was predicted. The following two charts may be helpful when talking about reports. The first, entitled "Use of Information for Reports," identifies each of the report elements and indicates what form they take in a report. The second chart, entitled "Outcomes in the Context of Outputs," shows how "output" data and "outcome" data can be used together to show program impact.

Slide 57: Summary: Value of OBE

57. OBE Evaluation for New York's Libraries

Staff in all libraries can use OBE to:

  • Evaluate true audience impact
  • Plan programs
  • Seek funding
  • Increase advocacy for programs
  • Submit consistent applications, plans, and reports
 

Discussion:

Wrap up the workshop with this slide, repeating the value of OBE to libraries. Before closing the workshop, each participant should complete the workshop evaluation survey and Activity 6, which is a final independent exercise. This activity is done individually using a new outcome that the participant writes. The results will show how many participants can successfully complete the 6 parts of the outcomes writing process.

Practice Exercises

Selecting Homework Most Suited to OBE

Look at the homework examples below. They are organized according to the groups that will work on a single project. For each group, select one homework item best suited for OBE. Next to each item tell why it was or was not selected.

Homework for Group 1

Your Organization's Mission

The mission of the System is to improve and expand library service in counties through leadership education, advocacy and enhanced resource sharing.

Project Title: Training for Circulation and Maintenance Functions of the Integrated Library System

Project Description (answering the following questions)

What will the project do? Offer repeated training on basic competency requirements of every staff member for using circulation and holdings maintenance functions of the integrated library system

Who is the project for? Staff of member libraries

How will the participants benefit? Staff members will be aware of expected competencies. Training will be offered on a recurring basis so new staff and those wanting refresher sessions will be able to attend in a timely manner. Practices throughout the system will be consistent, making for more efficient and accurate use of the system.

Selected? ___ Yes ___ No Why or why not?

Homework for Group 1

Your Organization's Mission

The mission of the System is to improve and expand library service in counties through leadership education, advocacy and enhanced resource sharing.

Project Title: Positioning Your Library for the Financial World: Finding Grants, Writing Grants

Project Description (answering the following questions)

What will the project do? Provide information and techniques on how to write grants effectively. The outcomes-based techniques learned in this workshop will be applied to an online course in grant writing.

Who is the project for? Anyone with an interest in learning how to write effective grants for library projects

How will the participants benefit? Ideally, the participants will benefit by obtaining grant funding through better-written proposals.

Selected? ___ Yes ___ No Why or why not?

Homework for Group 1

Your Organization's Mission

The mission of the System is to improve and expand library service in counties through leadership education, advocacy and enhanced resource sharing.

Project Title: The Blended Learning Program

Project Description (answering the following questions)

What will the project do? This project will provide library staff (professionals and support staff) with a variety of learning options so that they can participate in a "blended" continuing education program, much of which will be offered as technology training courses and through technology-assisted delivery.

Who is the project for? More than 1,500 staff working in the system's member libraries

How will the participants benefit? Participants will benefit from the knowledge received by participating in workshops, courses, and seminars designed to enhance their skills and abilities.

Selected? ___ Yes ___ No Why or why not?

Homework for Group 1

Your Organization's Mission

The mission of the System is to improve and expand library service in counties through leadership education, advocacy and enhanced resource sharing.

Project Title: No User Left Behind

Project Description (answering the following questions)

What will the project do? Train staff in basic PC readiness for ILS migration and train tech liaisons to gain valuable tech skills.

Who is the project for? Staff

How will the participants benefit? More confident in technology skills and abilities - better equipped to help other staff and the public.

Selected? ___ Yes ___ No Why or why not?

[SEE Answers]

Homework for Group 2

Your Organization's Mission

The system serves a statewide library network of member school libraries by providing quality information services in support of excellence and equality for all learners.

Project Title: Elementary Health Advantage

Project Description

What will the project do? Elementary teachers and library media specialists will work in collaborative teams to develop lesson plans that incorporate the NOVEL HRC database.

Who is the project for? Target audience: elementary teachers and library media specialists. Specifically, each SLS will target 5 schools that have had zero usage of the HRC database, using the statistics received from NOVEL. Teams will be created at each school to include the library media specialists and teachers. School administrators will be included in the selection of each team to create global support in each school.

How will the participants benefit? Awareness and use of accurate and up-to-date resources with students. Create original lesson plans to stimulate increased learning by students. Develop and utilize a collaborative approach to teaching. Long-term use of accurate information promotes a model for reliable research when utilizing the HRC database.

Project will:

  • Change participants' knowledge through hands-on training in which each core team develops lesson plans to be implemented in the classroom.
  • Participants' skills using the HRC database will increase.
  • Continue follow-up contact with participants through a variety of resources: listserv, group meetings, Blackboard electronic communication tool, web site with useful update tips and ideas for awareness and usage.

Selected? ___ Yes ___ No Why or why not?

Homework for Group 2

Your Organization's Mission

The system serves a statewide library network of member school libraries by providing quality information services in support of excellence and equality for all learners.

Project Title: Extranet Training

Project Description

What will the project do? This project will provide orientation training to the member librarians so that they can use the new extranet website for enhanced communication and resource delivery services. The training will focus on reading and posting in the discussion forums, finding resources on the page, staying updated with news postings, and accessing new content.

Who is the project for? This training is being offered to all librarians and library staff from the member districts

How will the participants benefit? After attending this training, the participants will be able to communicate using the new extranet site. This will allow librarians and library staff to communicate with a greater level of efficiency using archived news posts, threaded discussion boards, and RSS syndication.

Selected? ___ Yes ___ No Why or why not?

Homework for Group 2

Your Organization's Mission

The system serves a statewide library network of member school libraries by providing quality information services in support of excellence and equality for all learners.

Project Title: NOVEL Awareness

Project Description

What will the project do? The project will promote awareness and use of the State's NOVEL database by all types of libraries.

Who is the project for? School librarians, teachers, and students

How will the participants benefit? Participants will benefit from staff development in a better understanding of the State's databases and their uses in their own particular environment.

Selected? ___ Yes ___ No Why or why not?

Homework for Group 2

Your Organization's Mission

The system serves a statewide library network of member school libraries by providing quality information services in support of excellence and equality for all learners.

Project Title: Library Automation Training

Project Description

What will the project do? Train library staff to use the new automation system.

Who is the project for? Library staff.

How will the participants benefit? They will learn how to use the new automation system.

Selected? ___ Yes ___ No Why or why not?

[SEE Answers]

Modifying Homework Project to Fit OBE

Look at the two examples of homework. Imagine that none of the homework submitted was really ready for an OBE plan. Tell how you would ask the groups to modify the project so it would work for purposes of learning the OBE process.

Project Title: Using WebMax to Meet Curriculum Needs

Project Description

What will the project do? Increase the number of school staff who use WebMax to locate audiovisual resources, i.e., DVDs, videotapes, laser disks, and/or multiple collection titles, for use in instruction.

Who is the project for? Classroom teachers and librarians.

How will the participants benefit? Increased access to materials for their curriculum needs.

Proposed Modifications:

Project Title: Making an IMPACT! On Your Students & Your Teachers

Project Description

What will the project do? Train participants on the use of IMPACT! (Instructional Media Professionals' Academic Collaboration Tool) software, which will enable them to document library program activities.

Who is the project for? Library media specialists

How will the participants benefit? The software will allow individual Library Media Programs to be documented for accountability purposes and presentations as needed by profiling collaborative planning between teachers and LMSs.

Proposed Modifications:

[SEE Answers]

Improving Purpose Statements

The following purpose statement is spaced so suggested changes can be written in. Use the space to cross out items and add-in others.

The school library system provides training in collaborative planning and classroom integration of NOVEL databases to librarian/teacher teams to:

  • Increase the teams' collaboration and lesson planning skills for teaching NOVEL databases

    Increase evidence of librarian/teacher collaboration

    Increase integration of NOVEL databases into instruction

[SEE Answers]

Creating and Improving Outcome Statements

The following "Outcome" statements are double spaced so suggested changes can be written in. Use the space to cross out items and add-in others.

Outcome: System trainer will provide training to teach library staff to use consumer health information
Outcome: System trainer will teach library staff skills to help patrons find consumer health information
Outcome: Library staff teach targeted community groups to find consumer health information
Outcome: More library patrons have the ability to find consumer health information independently
Outcome: Library staff help patrons search genealogy databases

Using the following program purpose statement, write at least 6 outcomes.

The school library system provides training in collaborative planning and classroom integration of NOVEL databases to librarian/teacher teams to:

  • Produce collaborative lesson plans for teaching NOVEL databases
  • Develop patterns of collaboration between librarians and teachers
  • Integrate NOVEL databases into classroom instruction
  • Develop students' skills using NOVEL databases for assignments

[SEE Answers]

Creating and Improving Outcome Indicators

The following "Indicator" statements are double spaced so suggested changes can be written in. Use the space to cross out items and add-in others.

Outcome: Library staff search advanced health databases
Indicator: # and % of library staff who successfully complete prescribed advanced health database searches
Outcome: Library staff report helping patrons search genealogy databases
Indicator: # and % of library staff who report helping patrons use genealogy databases
Using the following outcome statements, write indicator(s) for each:
Outcome: Librarians and teachers define collaboration skills
Outcome: Librarian/teacher teams collaborate to design instructional units that integrate the us of online databases
Outcome: Librarians and teachers report on-going collaboration activities
Outcome: Librarians and teachers report implementation of lesson plans using online databases wit students
Outcome: Teachers report successful completion of student assignments using online databases
Outcome: Students report independent, successful use of online databases.

[SEE Answers]

Creating and Improving Target and Target Achievement Predictions

In the boxes next to the following "Target Audience and Target Achievement Predictions" write proposed revisions.

Outcome: Library staff search advanced health databases

Indicator: # and % of library staff who successfully complete at least X number of prescribed advanced health database searches as assessed by the trainer during a workshop. Target achievement = 80%
Suggested Change:
Outcome: Library staff report helping patrons search genealogy databases

# and % of library staff who report helping at least X number of patrons successfully use genealogy databases as assessed by a checklist applied to the report form.

Target audience = 250 library staff

Target achievement = 200 (80%)
Suggested Change:
Using the following outcomes and indicators, fill in possible target audience and target achievement information:

Outcome: Librarians and teachers define collaboration skills

# and % of librarians and teachers who score at least 85% on a collaboration skills quiz at the end of the workshop




Outcome: Librarian/teacher teams collaborate to design instructional units that integrate the use of online databases

# and % of librarian/teacher teams who design at least 1 collaborative unit that successfully integrates the use of online databases as assessed by a rubric on unit design during the workshop




Outcome: Librarians and teachers report ongoing collaboration activities

# and % of librarians and teachers who report at least one incident of librarian/teacher collaboration after the workshop as assessed by a collaboration rubric applied to a survey.
# and % of librarians and teachers who achieve a self-assessment score of at least 80 on a collaboration survey.





Outcome: Librarian/teacher teams report implementation of lesson plans using online databases with students

# and % of librarian/teacher teams who report using the lesson plans created during the workshop in the classroom as assessed by a survey




Outcome: Students report independent, successful use of online databases.

# and % of students who report "successful database searches" for different assignments than the initial requirement as assessed by a survey.




Outcome: Teachers report successful completion of student assignments using online databases

# and % of teachers who report the # and % of students who successfully complete at least 1 assignment using online databases as assessed by a "successful database search" checklist applied to the assignment.
Suggested Targets:

[SEE Answers]

Sample OBE Plan -- Outcomes

Outcomes: Public Library Web Site Attracts Teenagers

Outcome 1: Library staff write improved web sites that attract teens.
(An outcome is a change in skill, knowledge, attitude, behavior, life condition, or status. For each indicator below: the indicator is the concrete evidence, occurrence, or characteristic that will show that the desired change occurred; the data source is where data will be found; the data intervals are the points at which information is collected; the target applied to is the segment of the population to which the indicator is applied; and the target achievement level (goal) is the number, percent, variation, or other measure of change.)

Indicator 1: # and % of library staff who write web sites that include 3 teen components deemed acceptable by a trained web developer
Data Source: Observation
Data Intervals: At end of training workshop
Target Applied To: All staff who complete the training (N=20)
Target Achievement Level: 80% (N=16)

Indicator 2: # and % of library staff who edit their web sites with 3 new teen attractors
Data Source: Library reports to system
Data Intervals: Six months after training workshop
Target Applied To: All staff who complete the training (N=20)
Target Achievement Level: 80% (N=16)

Outcome 2: Teens use the web site and teen-targeted web-based library services.

Indicator 1: # and % of teens who provide feedback on the library's web site and the web-based library services before and after teen input is incorporated
Data Source: Pre- and post web evaluations by teens
Data Intervals: Semi-annual system report cycle
Target Applied To: Middle and high school students in the community with web access (N=1,000)
Target Achievement Level: 25% increase in feedback (N=250)

Indicator 2: # and % of teens who participate in teen-targeted events advertised on the web site before and after teen input is incorporated, as reported by library staff
Data Source: Library reports from program records, including teen self-reports submitted to system
Data Intervals: Semi-annual system report cycle
Target Applied To: Middle and high school students in the community with web access (N=1,000)
Target Achievement Level: 10% increase in event participation (N=100)

Outcome 3: Teens design enhancements to the library's web site.

Indicator: # and % of teens enhancing the web site as assessed by changes to web sites based on teen input
Data Source: Program records reported to system
Data Intervals: System reports every two years, when advisory groups change and new teens are trained in web development
Target Applied To: Teens in advisory groups at each site (N=100)
Target Achievement Level: 25% (N=25)

Outcomes: 19-34 year old Black/Hispanic men: non-library and non-computer users

OBE Step Four: Measurable Outcomes

Outcome 1: Target audience (see above) demonstrates electronic resume writing.

Indicator 1: # and % of participants who write a resume in a provided standard format, as assessed by trainer
Data Source: Resume document
Data Intervals: End of course (8 weeks)
Target Applied To: All course participants (N=100)
Target Achievement Level: 50% (N=50)

Indicator 2: # and % of participants who write an error-free resume in a provided standard format, as assessed by trainer
Data Source: Assessment report
Data Intervals: End of course (8 weeks)
Target Applied To: All course participants (N=100)
Target Achievement Level: 50% (N=50)

Outcome 2: Target audience will apply for jobs electronically.

Indicator 1: # and % of participants who identify appropriate jobs electronically, as assessed by trainer
Data Source: Program record checklist
Data Intervals: End of course (8 weeks)
Target Applied To: All course participants (N=100)
Target Achievement Level: 30% (N=33)

Indicator 2: # and % of participants who report that they apply for at least one job using the resume
Data Source: Self-report
Data Intervals: End of course (8 weeks)
Target Applied To: All course participants (N=100)
Target Achievement Level: 30% (N=33)

Outcome 3: Target audience will gain employment using the resume.

Indicator: # and % of participants who report that they obtained their targeted jobs
Data Source: Survey, self-report
Data Intervals: End of course (8 weeks), then at 3- and 6-month intervals
Target Applied To: All course participants (N=100)
Target Achievement Level: 10% (N=10)

Outcome 4: Participants exhibit changed attitudes about the value of the library.

Indicator 1: # and % of participants who apply for a library card
Data Source: Library records
Data Intervals: Weekly during the 8-week course, then at 3-month intervals
Target Applied To: All course participants (N=100)
Target Achievement Level: 75% (N=75)

Indicator 2: # and % of participants who report changed behavior for computer use
Data Source: Focus groups, follow-up survey, observation
Target Achievement Level: 50% (N=50)

Checklists and Rubrics for OBE Evaluation Projects

When many indicator statements are written, it becomes obvious that when we ask participants in a program to tell us about their actions as a result of the program, we need to compare their responses to some standard that we set at the beginning and predicted they would meet. We find ourselves writing indicators that say "assessed by a checklist or rubric." We sometimes say "assessed by a checklist or rubric applied to a survey." In the latter case, we ask open-ended questions about what happened and check our rubric to see if our prediction standard was met.

Checklists are familiar to us. We can easily make a list of skills associated with a training activity. In a follow-up survey, if we ask participants to describe incidents in which they used what they learned during training, we look at our checklist and see if the skills we taught were used. An example of our indicator might be: # and % of target audience who report at least X number of successful uses of at least 3 new (fill in the blank with content) skills within six months of training.

Rubrics are a bit more complicated, but are used in essentially the same way. A rubric allows you to write a continuum of performance levels and to make predictions about how skill levels may increase over the life of a project. For example, if a rubric identifies five levels from basic to advanced, we can predict that patrons (customers) will develop skills at level 3 from beginning training and at level 5 from follow-up training. Or, if our training content is complex, the continuum may show that behaviors changed at least at level 1, but we can still see change that exceeds that level and track it. Or we can predict different percentages of target achievement at each level.

Because so many school library system groups in early OBE training used references to rubrics for collaboration between teachers and school library media specialists, the developers chose that topic to create a checklist and a rubric that could be used for evaluation. We have created an artificial context for each of our samples.

Context: Teacher/librarian teams were taught collaboration skills, e.g., equal treatment of peers, consensus building, effective brainstorming, etc. Our indicator states: # and % of teachers and librarians who report successfully using at least 3 collaboration skills within 6 months of training, as assessed by a checklist applied to a survey.

Survey:

Please describe how you used the collaboration skills you learned during training in the 6 months since the training.

Use the suggested format to record as many incidents as you can recall.

Collaboration skill? _________________________________________________________
Who was involved? _________________________________________________________
What was accomplished? _________________________________________________________

The point of the open-ended question is that we don't lead the respondents to tell us what we want to hear. Rather, in their own words, they tell us how they think they applied what they learned.

We look at their responses, apply them to our checklist of skills, and ask these questions:

Does the skill match one on our list? (Meaning, not exact wording, is what matters.)

Was it practiced in collaboration? (Note that you can brainstorm alone. In the context of collaboration, the skill of brainstorming implies more than one participant.)

Did the respondent practice 3 different skills?

Are the accomplishments listed for at least 3 different skills reasonable and desirable outcomes of collaboration?

If there are yes answers to each question, then all of the indicator standards are met.
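
As a sketch of how this review could be tallied once responses are coded, the following Python fragment is illustrative only: the incident records, the skill names, and the pass/fail thresholds are hypothetical stand-ins for the human judgment described above.

```python
# Hypothetical sketch: tallying coded survey responses against a skills
# checklist. A reviewer still makes the judgment calls (does the described
# skill match one we taught? was it truly collaborative? is the accomplishment
# reasonable?); this only counts respondents who pass all the checks.

CHECKLIST = {"consensus building", "brainstorming", "win-win negotiation"}

# One respondent's coded incidents: (skill matched to the checklist,
# number of people involved, reviewer judged accomplishment reasonable?)
respondent_incidents = [
    ("consensus building", 4, True),
    ("brainstorming", 3, True),
    ("win-win negotiation", 2, True),
]

def meets_indicator(incidents, required_distinct=3):
    """A respondent meets the indicator if they report the required number
    of distinct checklist skills, each practiced with others and judged
    to have a reasonable collaborative accomplishment."""
    distinct = {
        skill
        for skill, people, reasonable in incidents
        if skill in CHECKLIST and people > 1 and reasonable
    }
    return len(distinct) >= required_distinct

print(meets_indicator(respondent_incidents))  # True
```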

Sample checklist of collaboration skills:

  • Awareness of own attitudes and opinions, recognition of others' attitudes and opinions - ability to understand and compromise
  • Make only positive assumptions about ideas and motives of others
  • Include everyone in the creative process
  • Follow the rules of brainstorming
  • Listen to all sides of an argument
  • Practice consensus building
  • Treat peers equally
  • Practice a win-win philosophy
  • Share rather than take sole responsibility for group success
  • Express thanks and share results
  • Seek peer input
  • Problem-solve
  • Manage conflict effectively
  • Avoid showing favoritism
  • Listen attentively and ask for clarification

Context: Teachers and librarians were taught collaboration skills. Our indicator states: # and % of teacher/librarian teams who agree that their collaborative skills, when working together on a student library skills project after the training, reached at least level four on a collaboration skills continuum rubric.

Collaboration Skills Continuum Rubric

Skill: Self-reflects
Level 1: Locked in own attitudes
Level 2: Little understanding of attitudes or opinions of others
Level 3: Some evidence of understanding others
Level 4: Mostly shows understanding of others' attitudes and conclusions
Level 5: Displays ability to step out of own attitudes; can be objective

Skill: Listens
Level 1: Closed questions, judgmental, no follow-up
Level 2: One or two open questions, mostly judgmental
Level 3: Some open-ended questions; mixed follow-up
Level 4: Mostly open-ended questions; mostly non-judgmental; some follow-up questions
Level 5: Ability to develop broad, open-ended questions based on curiosity; listens non-judgmentally; follows up for clarification

Skill: Builds trust, brainstorms, solves problems
Level 1: Makes negative assumptions, impugns others, does not check out own assumptions, does not include all in creative process, ignores or is unaware of others' issues; advocates own position to detriment of group or goals
Level 2: Makes mostly negative assumptions; not responsive to creativity of others; shows favoritism; takes mostly narrow positions
Level 3: Includes a few in the process; includes some concerns of others; limited brainstorming using appropriate methods
Level 4: Includes most in the collaboration process; excludes few differing positions; includes most in problem-solving; mostly attempts to negotiate differences
Level 5: Displays integrity, reliability, responsiveness, and empathy; achieves common interests when problem-solving; doesn't take narrow positions; manages own preconceptions; includes all in creative process equally; identifies and negotiates differing positions; follows rules for effective brainstorming

Skill: Shares decision-making
Level 1: Makes unilateral decisions
Level 2: Little peer input to decisions
Level 3: Some group input to decisions; still room for win/lose
Level 4: Most participate in decisions; usually consensus is reached
Level 5: Displays forms of consensus to achieve win/win negotiations

Skill: Shares accountability
Level 1: No communication; no group responsibility for positive or negative results
Level 2: Little communication; little group responsibility
Level 3: Some communication; some group responsibility for results
Level 4: Most communicate; most take responsibility for results
Level 5: Open and frequent communication; equal responsibility for successes and results

Context: Teachers and librarians collaborated on lesson plans to integrate information skills into the curriculum. Our indicator states: # and % of teacher/librarian teams who rate at least three collaborative activities (planning, assessment, materials development, teaching, evaluation) related to information skills for students as having reached at least level three on a collaboration activities continuum rubric.

The following table is adapted from a rubric developed by Poland Regional High School and Bruce M. Whittier Middle School, School Union 29, Poland, Maine

Collaboration Activities Continuum Rubric for Teaching Information Skills

Activity: Planning
Level 1: No discussion of assignment occurs prior to activity
Level 2: Informal discussion of assignment precedes activity
Level 3: At least 1 planning session to review existing unit goals, assessments, and activities
Level 4: At least 1 planning session to create unit outcomes, assessments, and activities
Level 5: More than 1 planning session to create outcomes, assessments, and activities

Activity: Assessment
Level 1: Assessment objectives are discussed informally, if at all
Level 2: Assessments do not target information skills specifically
Level 3: Assessments may target information skills
Level 4: Assessments target subject area discipline and some relevant information skills
Level 5: Assessments target integration of subject area skills and all relevant information skills

Activity: Materials development
Level 1: Materials are not provided or prepared before activity
Level 2: Materials are already developed, are described and/or provided
Level 3: Materials may be adjusted based on discussions and available resources
Level 4: Some materials are planned collaboratively using available resources
Level 5: All materials are developed using a collaborative planning model and available resources

Activity: Teaching
Level 1: No instruction, or teacher brings resources to classroom
Level 2: Library media specialist may give brief orientation to resources and/or does some pre-searching to identify available resources
Level 3: Library media specialist teaches information skills while teacher focuses on content
Level 4: Teaching of information skills and content are somewhat shared but may fall along role lines
Level 5: Teaching of content and information skills areas are shared by teacher and library media specialist using collaborative lesson plans

Activity: Evaluation
Level 1: No evaluation occurs
Level 2: Library media specialist may discuss informal observation with teacher, but does not participate in evaluation of assessments
Level 3: At least one type of information skill evaluation occurs collaboratively
Level 4: Assessment includes at least 2 types of collaborative evaluation of information skills and may include content
Level 5: Library media specialist and teacher share collaboratively in evaluation of information skills and content using informal observation, objective rubric assessment, and review of assignments that require information skills for completion; team compares and interprets results for future planning; team compares results to outcome predictions

Context: Teachers and librarians were taught collaboration skills. The indicator states: "# and % of librarians and teachers who score 15 out of 20 points on a collaboration self-assessment checklist."

Note: The checklist for this indicator could be used as a survey before training and repeated as a follow-up to see if participants perceive that they have changed attitudes or behaviors.

Self-Assessment Checklist (answer Yes or No to each statement)
1. I am aware of my own attitudes and opinions.    
2. I recognize attitudes and opinions of others.    
3. I show understanding of others' attitudes and conclusions.
4. I step away from my own attitudes and opinions and observe others objectively.    
5. I make negative assumptions in work situations.    
6. I impugn others when their opinions don't match mine.    
7. I include others in the creative process.    
8. I show favoritism in work situations.    
9. I follow the rules of civilized brainstorming.    
10. I ignore or am unaware of the concerns of others.
11. I identify and address differing positions.    
12. I am a consensus builder.    
13. I am inclined to make unilateral decisions.    
14. I am a problem-solver.    
15. I get locked in my own position.    
16. I seek peer input.    
17. I treat peers equally.    
18. I know how to achieve win/win results.    
19. I share responsibility for group successes and failures.    
20. I share responsibility for communicating successes and failures.    

Scoring:

  1. Yes = 1 pt
  2. Yes = 1 pt
  3. Yes = 1 pt
  4. Yes = 1 pt
  5. No = 1 pt
  6. No = 1 pt
  7. Yes = 1 pt
  8. No = 1 pt
  9. Yes = 1 pt
  10. No = 1 pt
  11. Yes = 1 pt
  12. Yes = 1 pt
  13. No = 1 pt
  14. Yes = 1 pt
  15. No = 1 pt
  16. Yes = 1 pt
  17. Yes = 1 pt
  18. Yes = 1 pt
  19. Yes = 1 pt
  20. Yes = 1 pt
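
A small sketch of this scoring scheme (Python); the sample answers dict is hypothetical, and items 5, 6, 8, 10, 13, and 15 are the reverse-keyed statements from the scoring list above:

```python
# Scoring the 20-item collaboration self-assessment. Items phrased as
# undesirable behaviors (5, 6, 8, 10, 13, 15) earn a point for "No";
# all other items earn a point for "Yes". 15+ points meets the indicator.

REVERSE_KEYED = {5, 6, 8, 10, 13, 15}

def score(answers: dict[int, bool]) -> int:
    """answers maps item number (1-20) to True for Yes, False for No."""
    return sum(
        1 for item, yes in answers.items()
        if (not yes) == (item in REVERSE_KEYED)
    )

# Hypothetical respondent: answers Yes to everything.
all_yes = {item: True for item in range(1, 21)}
print(score(all_yes))        # 14 -- misses the 6 reverse-keyed items
print(score(all_yes) >= 15)  # False: does not meet the 15-point standard
```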

Frequently Asked Questions

The following are selected and modified examples of frequently asked questions developed by the Institute of Museum and Library Services, 1100 Pennsylvania Avenue, NW, Washington, DC 20506.

What is outcome-based evaluation (OBE)?

Outcome-based evaluation, sometimes called outcomes measurement, is a systematic way to determine if a program has achieved its goals. The organized process of developing an outcome-based program and a logic model (an evaluation plan) helps institutions articulate and establish clear program benefits (outcomes), identify ways to measure those program benefits (indicators), clarify the specific individuals or groups for which the program's benefits are intended (target audience), and design program services to reach that audience and achieve the desired results.

What is an "outcome" and how do you evaluate (measure) them?

An outcome is a benefit that occurs to participants of a program; when the benefits to many individuals are viewed together, they show the program's impact. Typically, outcomes represent an achievement or a change in behavior, skills, knowledge, attitude, status, or life condition of participants related to participation in a program. In OBE, an outcome always focuses on what participants will say, think, know, feel, or be - not on the mechanisms or processes that programs use to create their hoped-for results. Well-designed programs usually choose outcomes that participants would recognize as benefits to themselves. To simplify planning for evaluation, state the outcome you want to produce in simple, concrete, active terms.

Poor Outcome Statements

  • Students will know how to use the Web
  • Health database users will have better health information
  • School library media specialists and teachers will be trained in collaborative curriculum design
  • Students will know how to use NOVEL databases

Better Outcome Statements

  • Students will demonstrate three key web skills
  • Users will use health databases to make healthier life-style choices
  • School library media specialists and teachers will collaborate to develop lesson plans to teach use of NOVEL databases
  • Students will use NOVEL databases to complete classroom assignments.

What is the difference between outputs and outcomes?

Outputs are measures of the volume of a program's activity: products created or delivered, people served, activities and services carried out. Think of outputs as the "things" piece of evaluation. Outputs are almost always numbers: the number of loans, the number of ILLs, the number of attendees, the number of publications, the number of grants made, or the number of times a workshop was presented. Outcomes are the "people" or the "so what" piece - what happened because of the outputs.

Outputs

  • 42 staff members will complete training
  • 37 libraries will participate in technology training
  • 4 workshops will be held
  • 80 participants will receive 3 CEUs

Outcomes

  • School library media specialists and teachers develop lesson plans to teach NOVEL databases.
  • Students use NOVEL databases for successful completion of assigned bibliographies

How do I choose outcomes for my program?

First, carefully think out and describe the purpose of the program. A program is not usually developed only to carry out various actions or tasks. There is a reason for undertaking the tasks and offering the services. Most libraries don't build collections only to own them, or to go through the processes of cataloging, storing, and maintaining them. They develop collections to support the needs of existing or anticipated users for information and education.

Ask, "why you are offering this program, what do you want to accomplish, and who should benefit?" It may be helpful to ask program staff, program partners, and other stakeholders, "If we are really successful with this program, what will the results look like for the people we serve?" Knowing your audience is equally important, their needs and wants, and what your program can do to help them achieve their aims. The answers to those questions should allow you to describe the changes or impact that you want to see as a result of the program. Those hoped-for changes become the intended program outcomes.

What is an indicator?

Indicators are the specific, observable, and measurable characteristics, actions, or conditions that tell a program whether a desired achievement or change has happened. To measure outcomes accurately, indicators must be concrete, well defined, and observable; usually they are also countable.

Poor Indicators

  • The # and % of students who know how to use the Web
  • The # and % of advanced health database users who make healthier choices

Better Indicators

  • The # and % of participating students who can bring up an Internet search engine, enter a topic in the search function, and bring up one example of the information being sought within 15 minutes
  • The # and % of advanced health database users who report they used the database to make one or more life-style changes from a list of 10 key life-style health factors in the last six months

It is easier to construct a good indicator if you use the format:

  • Number and/or percent (use as placeholders)
  • of a specific target population who
  • report, demonstrate, or exhibit an attitude, skill, knowledge, behavior, status, or life condition
  • in a specified quantity in a specified timeframe and/or circumstance

Number and percent: Both number and percent are usually specified to provide adequate information. If only two people participate in your program, after all, reporting that 50% of them benefited could be misleading. Rather, state 30% of 150, 75% of 25, or 10% of 1,500.

Target audience: The group of people the program hopes to affect. Effective programs keep the characteristics of the people they want to benefit clearly in mind. The more narrowly and specifically the group of people who are expected to participate in a program can be described, the greater the likelihood that a program will be designed to actually reach them.

Examples (low to high definition):

  • Monroe County residents; Albany high-school students; Clinton County mothers at literacy level or below
  • Schoharie County residents; county residents age 18-26; job seekers age 18 to 26
  • Montgomery County school districts; Montgomery County teachers and students; Montgomery County 3rd and 4th grade teachers and students

Report, demonstrate, exhibit: Note that all of these are active, observable behaviors or characteristics that don't depend on guesswork or interpretation. More direct verbs can also be used, such as "search" databases, "read" at grade level, "write" an acceptable resume.

Attitude: What someone feels or thinks about something; e.g., to like, to be satisfied, to value.
Skill: What someone can do; e.g., log on to a computer, format a word-processed document, read.
Knowledge: What someone knows; e.g., the symptoms of diabetes, the state capitals, how to use a dictionary.
Behavior: How someone acts; e.g., listens to others in a group, reads to children, votes.
Status: Someone's social or professional condition; e.g., registered voter, high-school graduate, employed.
Life Condition: Someone's physical condition; e.g., non-smoker, overweight, cancer-free.

Specified quantity and specified timeframe or circumstance: This is the measurable part of an indicator. It asks the program developer to choose a quantity of achievement or change that is enough to show the desired result happened, and the circumstances or timeframe in which the result will be demonstrated. Examples: three times per week, in 15 minutes or less, 6 months after the program ends, 4 or higher on a 5-point scale.
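For readers who think in code, a minimal sketch (not from the manual; the function and its parameters are hypothetical) shows how the four-part template assembles into an indicator statement:

    # Hypothetical helper combining the four template parts listed above.
    def build_indicator(population: str, behavior: str, condition: str) -> str:
        # "# and %" placeholder + target population + observable behavior
        # + specified quantity, timeframe, and/or circumstance
        return f"The # and % of {population} who {behavior} {condition}"

    print(build_indicator(
        population="participating students",
        behavior="can bring up an Internet search engine, enter a topic "
                 "in the search function, and bring up one example of the "
                 "information being sought",
        condition="within 15 minutes",
    ))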

Do I have to evaluate every program my institution offers?

No. We believe IMLS constituents will come to know the benefit of OBE and will want to incorporate it in many, if not all, programs, particularly those that have a clear audience to whom a program is targeted. We're urging library and museum staff to choose one program that they offer, and to "pilot-test" OBE with that program. That will provide the experience to decide what skills and resources an institution needs to develop to demonstrate and report outcomes to its stakeholders.

How many program participants have to be evaluated, all or a sample?

For many programs it is possible to evaluate the impact on all participants. Others will have access to only a sample of participants. This is often true, for example, of programs that provide digital resources - collections, exhibits, curriculum tools, or Web sites. Many programs will seek volunteers to answer questionnaires or to participate in focus groups to provide outcome information. This is perfectly acceptable.

Will funders pay for small outcomes?

For IMLS it is less about small or large outcomes than about what you hoped to achieve for an audience, what you learned in the process, and what was reasonable to expect for that audience. In some cases a 10% improvement is very significant, while in others, a 90% impact is reasonable to expect. You need to know your audience and your stakeholders and create appropriate goals and expectations. When that is done, and outcomes still fall short of goals, OBE allows institutions to assess and explain why outcomes fell short, and to learn from the experience. Without OBE, it can appear as if a program just didn't do what it said it would. With OBE, you have the opportunity to learn why and make improvements for the next offering, or the next program.

IMLS turns to its reviewers to decide what projects seem most promising and most needed. If a proposed project can show a clear plan for evaluation that will demonstrate meaningful outcomes (even small ones) concretely and objectively, we believe reviewers will find it very competitive.

Finally, the "size" of the outcome is proportionate to the size of the target audience and the duration or the intensity of their experience in a program. If a project works closely with a small number of participants, the outcome might look small, but might be profound for those participants. If a project offers a rapid service to a very large number of participants or users, the outcome is likely to be minor, but may reach many people. Reviewers' assessments of a proposal consider those factors.

Many proposals make idealized claims for anticipated contributions without offering any concrete information about how project managers will know if their intentions were realized. Some favorite examples include: "if this project is funded, democracy will flourish," and "if other states followed our model they would find it very productive." It is increasingly important to resource allocators and policy makers that programs or projects have concrete audience benefits, with services designed to achieve them for a clearly defined audience, and that managers demonstrate that the benefits were achieved.

Can my program take credit for large outcomes?

Certainly, if the outcomes were logical and closely related results of the services provided. "Attribution" is less concerned with big or small outcomes and more concerned with the logical connection between services and outcomes and the clarity of indicators. Part of the usefulness of OBE is the concrete, objective way it can connect participation in a program or service to specific knowledge, attitudes, behaviors, skills, and other achievements.

What does OBE cost?

On average, an institution should budget 7-10% of a program's total budget to cover the costs of OBE. Almost all funding agents require some evaluation, and substantive evaluation is a requirement for many. As a result they expect that evaluation costs will be included in the budget for any project. Remember that in the case of State Program grants, LSTA funds may be used for evaluation. In the case of IMLS discretionary grants, staff time, and other resources required for evaluation can be used to match funds awarded directly, or funds can be requested for evaluation. The exact cost will depend on the project, the level of evaluation and the knowledge of the organization.

If OBE is not the same as academic research, and the results may not be reliable evidence of outcomes, why should I do it?

Formal research is one way of capturing information, not the only way. OBE is a strong, effective and reliable management tool that provides an institution with information regarding the degree to which a program did what it set out to do. While it does not allow you to determine and claim unique or complete credit for an outcome, it does allow you to demonstrate the degree to which a program contributed to the outcome for individuals. If you have no information, you cannot credibly claim any contribution to impact.

This is a burning question for many in the library and museum worlds, in part because academic training conditions us to look skeptically at any information that is not statistically valid, rigorous in its sample selection, and otherwise derived from the scientific model. In OBE, we're not looking for information we intend to extend to other institutions or contexts. Instead, we're looking to see if what we did had the result we intended. That information helps us make decisions about a particular program: whether to continue it, expand it, improve it, or replace it with another.

OBE normally looks at an individual program's participants for logical, credible evidence that a limited number of very specific, observable attributes or phenomena happened in relatively close proximity to an experience or service designed to produce them for those particular people.

OBE doesn't usually look for signs that participants have more or better of what it's evaluating than non-participants. It is not intended to prove that one program did something more effectively than another (although that's possible).

If a project intends to show unique attribution, to demonstrate the relative worth of one approach measured against others, or to provide a tool for use by other organizations, then of course it needs to turn to the tools and criteria of research. Since the use of the data provided by OBE is limited, we can usually be satisfied with information that is accurate, without requiring statistical rigor, blind or random sampling, or other characteristics of research for which broad applications are intended.

What do I look for in an evaluator?

Someone who has a strong working knowledge of outcome-based evaluation - measuring impact on the people served by a program - and also has knowledge and experience working with your discipline. A good evaluator can quickly assess and learn your specific programs and mission. It helps, but is not a requirement, that they have experience evaluating similar projects.

How many outcomes should my program have?

A program needs to have at least one outcome; most programs are likely to have more than one. It is important to consider what the purpose of the program is and the ways you would expect participants to benefit from your program. These benefits will likely be the outcomes for your program, but you need not measure everything. You may want to set priorities and determine what you and your program's stakeholders really need to know about the program's impact.

What is a logic model and is it necessary?

A logic model is a step-by-step approach for defining and measuring outcomes. It is your program's evaluation plan. It shows how you will measure outcomes, what information you need to collect, who you will collect information about, when you will get the information and what targets you have chosen for the outcomes.

Yes, a logic model is essential to the success of your institution's implementation of outcome-based evaluation. Without one, outcome-based evaluation will not become a reality for your institution.

Logic Model (OBE Plan) Elements and Structure

  • Outcome: Intended impact. Example: Students will have basic Internet skills.
  • Indicator: Observable and measurable behaviors and conditions. Example: The # and % of participating students who can bring up an Internet search engine, enter a topic in the search function, and bring up one example of the information being sought within 15 minutes.
  • Data source: Sources of information about conditions being measured. Example: Searching exercise, trainer observation.
  • Data interval: When data will be collected. Example: At end of workshop.
  • Target audience: The specific group within an audience to be measured (all or a subset). Example: Howard County 7th-8th graders who complete the workshop.
  • Achievement: The amount of impact desired. Example: 85% of approximately 125 participants.
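The same row can also be pictured as a simple record. The sketch below (Python, not part of the manual; the class and field names are hypothetical) mirrors the six logic model elements using the example above:

    from dataclasses import dataclass

    @dataclass
    class LogicModelRow:
        outcome: str          # intended impact
        indicator: str        # observable, measurable behavior or condition
        data_source: str      # where information about the condition comes from
        data_interval: str    # when data will be collected
        target_audience: str  # the specific group (all or a subset) to be measured
        achievement: str      # the amount of impact desired

    row = LogicModelRow(
        outcome="Students will have basic Internet skills",
        indicator="The # and % of participating students who can bring up "
                  "an Internet search engine, enter a topic in the search "
                  "function, and bring up one example of the information "
                  "being sought within 15 minutes",
        data_source="Searching exercise, trainer observation",
        data_interval="At end of workshop",
        target_audience="Howard County 7th-8th graders who complete the workshop",
        achievement="85% of approximately 125 participants",
    )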

How complicated is outcome-based evaluation?

Once the concepts are understood and you have successfully implemented OBE a few times, it becomes a simple process to manage. The key to success is the commitment of the institution and the clear identification of roles in managing OBE.

How much time will it take?

It isn't possible to prescribe a time for all programs. It does take a commitment of time and resources to get it done. The majority of time comes at the front end, particularly as you first begin to implement outcome-based evaluation in your institution. In compensation, once incorporated, OBE can save significant time in planning and management by allowing you to get at the right questions, and answers, early on in the program planning process.

What can outcome based evaluation do for my institution?

Employing outcome-based evaluation and reporting on the impact of the program can have many positive benefits for an organization:

  • First, it can help institutions tell their story in ways their stakeholders and the general public can understand and appreciate. It helps institutions convey important information about the collective impact on their program participants, while maintaining the ability to convey the very powerful and personal stories that show how important the program was to specific individuals.
  • Second, it can help better position institutions to request and receive funding because they can describe the intended benefits and impact of a proposed program in very specific terms by identifying what the program will do for participants. This is particularly important given that more and more funding agents expect programs to identify what they hope to achieve as a result of funding.
  • Third, when OBE becomes part of an organization's management routine, its programs can be improved as a result. Program goals are well planned and established, and these goals are regularly reviewed. Stakeholders are informed about the impact of funded programs. In turn, outcome-based evaluation helps an organization's program staff better communicate the benefits they intend to deliver to program participants - it can aid recruitment and marketing.

Aren't some things difficult to measure?

Some things will seem more difficult to measure (evaluate) than others, and not all things programs accomplish need be measured. It is often more straightforward to measure "hard" impact, such as knowledge, behavior, and skills than it is to measure "soft" impact such as attitudes. Measuring attitude changes or other "soft" impacts is not actually more difficult, but it may require more creativity. Regardless, clarifying the relationship between an outcome and measurable and observable "indicators" is key to success.

How will I know if my outcomes are good enough?

Outcomes are effective if they 1) are closely associated with the purpose of a program and describe what an organization wants to make happen for people, 2) are realistic and within the scope of what the program can affect, and 3) have indicators that allow them to be measured.

How do I report outcome based evaluation information?

Consider what your program's stakeholders want to know about the results of your program when developing reports from outcome-based evaluation data. The institution's Board, its community, and funding agents may want similar information, but this does not mean that one report will satisfy everyone. In general, consider the following as desirable information for reports:

  • Needs identified
  • Inputs (what we used)
  • Activities and services (what we did)
  • Audience (characteristics and participation)
  • Outputs (what we produced)
  • Outcomes (what impact we achieved and how we know)
  • Interpretation (what it all means, why it matters)

Do I have to do this?

IMLS does not currently require its grantees to conduct outcome-based evaluation, but it supports and encourages it as a valuable management tool. At the same time, IMLS is required to report to Congress in outcome-based terms; we cannot do that without input from you. We consider the consistent use of outcome-based evaluation to be an effective and efficient way for all programs to capture critically important information and to tell their story persuasively. IMLS is gradually strengthening information about outcome-based evaluation in guidelines for its discretionary grant programs and its program for State Library Agencies, and is considering the benefit of making outcome-based evaluation for funded programs a requirement at some future time.

OBE Glossary

The following are selected and modified definitions of OBE terms developed by the Institute of Museum and Library Services, 1100 Pennsylvania Avenue, NW, Washington, DC 20506. OBE evaluators should adapt any terms to fit the requirements of a program application or funding request.

Terms and definitions:

Activities: Tasks that support and/or produce a project's products and services; they serve as the basis for estimating the project's schedule, resources, and costs
"Applied to": In OBE usage, a project's target audience or the part of that group for whom you will measure indicators to assess outcomes; e.g., a group of people who participate in a program or use a product in a certain time period or for a specified number of times
Assumption: Proposition or principle which you suppose is true or take for granted
Attitude: What someone feels or thinks about something
Audience: Individual, group, or institution for which the organization's products and services are provided, e.g., library patrons, museum visitors, other libraries, museums, or partner institutions
  target audience: individual, group, or institution that is the focus of the project's goals
Behavior: How someone acts
Benefit: Gain or payoff accruing to the project stakeholders as a result of the project
Characteristics of target audience: Attributes (e.g., age, geographic location, number, job position) of the target audience that you should take into account in analyzing their needs
Data analysis: Organization, processing, and presentation of information you collected for the purpose of making recommendations or drawing conclusions
Data collection interval: The points in time and the frequency with which you will collect or assess information about the project's results
Data source: Instruments, records, or other resources you will use to provide information about your project's needs analysis and evaluation; examples include written surveys, interviews, structured observation, documents
Desired result: Goal you want your project to achieve or provide for its stakeholders; it could be expressed as an outcome or an output
Evaluation:
  evaluation activities: tasks you perform to measure the extent to which your project has met its goals
  evaluation approach: unified set of principles, methods, rules, and processes for assessing and demonstrating the extent to which a project has met its goals
Formal research: Set of systematic procedures recognized by a community of scholars for collecting, analyzing, and interpreting data for some purpose; enables you to draw tentative conclusions about cause-and-effect relationships among phenomena, e.g., your project's services and benefits to the target audience, and about the effectiveness of one approach or project in comparison to another one
Goals: At the organization level, the broad results your organization wants to achieve for its audiences during a specified period of time, which guide the selection of programs and management/operations functions; can be expressed as an output or an outcome
  At the project level, the specific results you want your project to accomplish for its target audience(s), which guide the development of the project's activities and milestones and define the scope of what you can accomplish within a specific time frame; can be expressed as an output or an outcome
Indicators: Measurable conditions or behaviors that can show an outcome was achieved; usually expressed as a number and/or percentage of the target audience that demonstrates a measurable sign or characteristic representing the intended outcome
Informal research: Set of procedures that can be documented to collect, analyze, and interpret data and draw conclusions with enough confidence and credibility to serve the purposes of project management
Knowledge: What someone knows
Life condition: Someone's physical or psychological condition
Mission: Overall purpose of an organization; typically identifies key broad audiences and purposes, and often broadly describes the methods by which the organization will achieve its mission
Outcome: Gain or change in an individual's knowledge, skills, attitudes, behaviors, status, or life condition
Outcome-based evaluation approach (OBE): Set of principles and processes to provide information about the degree to which a project has met its goals in terms of creating benefits for individuals in the form of knowledge, skill, attitude, behavior, status, or life condition
Output: Measure of the amount, the quality, or volume of use of a product or service
Output-based evaluation approach: Set of principles and processes to provide information about the degree to which the project's products and services have achieved the desired result; e.g., the quantity or quality of services, the volume of users or participants, or the number of products that met the target audience's expectations
Output measure: Measurable unit showing that an output was achieved, usually expressed as a number and/or a percentage
Product: Anything created or obtained as a result of some operation or work
Project: Series of related activities that has a discrete beginning and end and is intended to produce a desired result for its target audience
Sample: In conducting your target audience needs analysis or your project evaluation, a representative subgroup to which you will apply the selected data source to gain information about the whole group
Service: Activity carried on to provide people with the use of something
Skill: What someone can do
Solution: Approach, including a product and/or service, to close a gap between the desired result you want to achieve for an audience and the current state or condition
Stakeholder: Any individual, group, or organization that influences or is affected by the project; can be external to your organization, e.g., partners, boards, grant-making organizations, or internal to your organization, e.g., project team members, management personnel
Status: Someone's social or professional condition
Target: Measurable amount of success you believe your project can and should achieve within a certain time frame
  If the target is expressed as an output, it refers to the amount, quality, or volume of use you believe your project can and should achieve within a certain time frame
  If the target is expressed as an outcome, it refers to the measurable amount of success you believe your project can and should achieve with regard to the target audience's knowledge, skills, attitudes, behaviors, status, or life condition, within a certain period of time
Target audience: Individuals, groups, or organizations that will be the focus or beneficiary of the products or services of your project

Suggested Reading

The following are selected and modified suggested readings developed by the Institute of Museum and Library Services, 1100 Pennsylvania Avenue, NW, Washington, DC 20506. Please take note that:

  • There are many available OBE resources. As OBE catches on, there are more examples in the literature to help the OBE novice. You should periodically search for newer resources and for answers to specific questions you may have.
  • Terms may vary from publication to publication, but you can usually match concepts easily to those used in your trainer's manual.
  • Many publications appear to use the terms impacts, results, and outcomes interchangeably.
  • Many OBE resources are available at no cost online.

Administration on Children, Youth, and Families, Department of Health and Human Services. The Program Manager's Guide to Evaluation (n.d.). Washington, DC: DHHS. This excellent introduction was developed for grantees of this program and provides very concrete, practical explanations. It is accompanied by additional guides for specific kinds of human services programs funded by the agency.

Bond, Sally L., Boyd, Sally E., and Rapp, Kathleen A. (1997). Taking Stock: A Practical Guide to Evaluating Your Own Programs [.PDF format]. Chapel Hill, NC: Horizon Research, Inc., 111 Cloister Court, Suite 220, Chapel Hill, NC 27514, 919-489-1725 ($25.00, pb). This manual was developed for community-based science education initiatives through funding from the DeWitt Wallace-Readers Digest Fund. Participating advisors included the Association of Science-Technology Centers and the National Science Foundation.

Durrance, Joan and Karen Fisher-Pettigrew (2002). How Libraries and Librarians Help: Outcome-Based Evaluation Toolkit. A simple, flexible, effective methodology for evaluating outcomes, targeting libraries and community-focused services. It includes worksheets and examples, and the method is in the process of being piloted by a group of public libraries from large to small for a variety of typical library programs.

Florida Department of State, Division of Library and Information Services. Compiled at the Division of Library and Information Services, State Library and Archives of Florida by Cherie McCraw, Dr. John C. Bertot, Amy Johnson, and Ruth O'Donnell. The LSTA Outcome-Based Evaluation Toolkit, Tallahassee, FL, 2003. Information in this publication is also available online.

Steffen, Nicole O., Lance, Keith Curry. "Who's Doing What? Outcome-Based Evaluation and Demographics in the Counting on Results Project," Public Libraries, Sept/Oct, 2002, pp 271-279.

Steffen, Nicole O., Lance, Keith Curry, and Logan, Rochelle. "Time to tell the whole story: Outcome-based Evaluation and the Counting on Results Project." Public Libraries, July/August, 2002, pp 222-228.

United Way of America (1996). Measuring Program Outcomes: A Practical Approach. Alexandria, VA: United Way of America, 701 North Fairfax Street, Alexandria, VA 22314, 703-836-7100 ($5.00, spiral bound, to not-for-profit organizations). Developed by United Way for its grantees, this manual led the movement to outcome-based evaluation by funders of not-for-profit organizations. 2003 version is available online.

Sage Publications, Inc., 2455 Teller Road, Thousand Oaks, CA 91320, 805-499-0721, is a commercial publisher that specializes in publications on evaluation and related subjects. They offer many titles that cover aspects of evaluation in detail.

Appendix A: Answers to Practice Exercises

Selecting Homework Most Suited to OBE

Look at the homework examples below. They are organized according to the groups that will work on a single project. For each group, select one homework item best suited for OBE. Next to each item tell why it was or was not selected.

Homework for Group 1

Your Organization's Mission

The mission of the System is to improve and expand library service in counties through leadership education, advocacy and enhanced resource sharing.

Project Title: Training for Circulation and Maintenance Functions of Integrated Library System

Project Description (answering the following questions)

What will the project do? Offer repeated training on basic competency requirements of every staff member for using circulation and holdings maintenance functions of the integrated library system

Who is the project for? Staff of member libraries

How will the participants benefit? Staff members will be aware of expected competencies. Training will be offered on a recurring basis so new staff and those wanting refresher sessions will be able to attend in a timely manner. Practices throughout the system will be consistent, making for more efficient and accurate use of the system.

Selected? ___ Yes ___ No Why or why not?

Answer: No. While training is involved, this project would make it difficult to predict patron behavior. The skills of the trainees could be evaluated, and certainly there could be a cost-benefit or management study, but it does not lend itself to a fully developed OBE plan that benefits a hierarchy of customers.

Homework for Group 1

Your Organization's Mission

The mission of the System is to improve and expand library service in counties through leadership education, advocacy and enhanced resource sharing.

Project Title: Positioning Your Library for the Financial World: Finding Grants, Writing Grants

Project Description (answering the following questions)

What will the project do? Provide information and techniques on how to write grants effectively. The outcome-based techniques learned in this workshop will be applied to an online course in grant writing.

Who is the project for? Anyone with an interest in learning how to write effective grants for library projects

How will the participants benefit? Ideally, the participants will benefit by obtaining grant funding through better-written proposals.

Selected? ___ Yes ___ No Why or why not?

Answer: Yes. The grant-writing skills of the participants can be evaluated during training and after. Success with grants can be measured, and since OBE would be used to evaluate the grants, predictions could be made about patron benefits.

Homework for Group 1

Your Organization's Mission

The mission of the System is to improve and expand library service in counties through leadership education, advocacy and enhanced resource sharing.

Project Title: The Blended Learning Program

Project Description (answering the following questions)

What will the project do? This project will provide library staff (professionals and support staff) with a variety of learning options in order for them to participate in a "blended" continuing education program, much of which will be offered through technology training courses and technology-assisted instruction.

Who is the project for? More than 1,500 staff working in the system's member libraries

How will the participants benefit? Participants will benefit from the knowledge received by participating in workshops, courses, and seminars designed to enhance their skills and abilities.

Selected? ___ Yes ___ No Why or why not?

Answer: No. Any individual course could be evaluated using OBE. It would be difficult to predict customer action given the broad range of options planned and the lack of specificity of how customers will benefit.

Homework for Group 1

Your Organization's Mission

The mission of the System is to improve and expand library service in counties through leadership education, advocacy and enhanced resource sharing.

Project Title: No User Left Behind

Project Description (answering the following questions)

What will the project do? Train staff in basic PC readiness for ILS migration and train tech liaisons to gain valuable tech skills.

Who is the project for? Staff

How will the participants benefit? More confident in technology skills and abilities - better equipped to help other staff and the public.

Selected? ___ Yes ___ No Why or why not?

Answer: No. Another management program aimed at the efficiency of librarians. The librarian training could be evaluated using OBE, but it is unclear how patrons would benefit in a way that the change could be predicted and measured.

Homework for Group 2

Your Organization's Mission

The system serves a statewide library network of member school libraries by providing quality information services in support of excellence and equality for all learners.

Project Title: Elementary Health Advantage

Project Description

What will the project do? Elementary teachers and library media specialists will work in collaborative teams to develop lesson plans that incorporate the NOVEL HRC database.

Who is the project for? Target audience: elementary teachers and library media specialists. Specifically, each SLS will target 5 schools that have had zero usage of the HRC database, using the stats received from NOVEL. Teams will be created at each school to include the library media specialists and teachers. School administrators will be included in the selection of each team to create global support in each school.

How will the participants benefit? Awareness and use of accurate and up-to-date resources with students. Create original lesson plans to stimulate increased learning by students. Develop and utilize a collaborative approach to teaching. Long-term use of accurate information promotes a model for reliable research when utilizing the HRC database.

Project will:

  • Change participants' knowledge through hands-on training in which each core team develops lesson plans to be implemented in the classroom.
  • Participants' skills using the HRC database will increase.
  • Continue follow-up contact with participants through a variety of resources: listserv, group meetings, the Blackboard electronic communication tool, and a web site with useful update tips and ideas for awareness and usage.

Selected? ___ Yes ___ No Why or why not?

Answer: Yes. While this project speaks of awareness, it also focuses on skills for several target groups.

Homework for Group 2

Your Organization's Mission

The system serves a statewide library network of member school libraries by providing quality information services in support of excellence and equality for all learners.

Project Title: Extranet Training

Project Description

What will the project do? This project will provide orientation training to the member librarians so that they can use the new extranet website for enhanced communication and resource delivery services. The training will focus on reading and posting in the discussion forums, finding resources on the page, staying updated with news postings, and accessing new content.

Who is the project for? This training is being offered to all librarians and library staff from the member districts

How will the participants benefit? After attending this training, the participants will be able to communicate using the new extranet site. This will allow librarians and library staff to communicate with a greater level of efficiency using archived news posts, threaded discussion boards, and RSS syndication.

Selected? ___ Yes ___ No Why or why not?

Answer: No. This is primarily a management project and should be evaluated for results. What the staff learns can be evaluated using OBE, but the options for applying the training are so broad that it would be difficult to predict specific outcomes in terms of staff behavior after training.

Homework for Group 2

Your Organization's Mission

The system serves a statewide library network of member school libraries by providing quality information services in support of excellence and equality for all learners.

Project Title: NOVEL Awareness

Project Description

What will the project do? The project will promote awareness and use of the State's NOVEL database by all types of libraries.

Who is the project for? School librarians, teachers, and students

How will the participants benefit? Participants will benefit from staff development in a better understanding of the State's databases and their uses in their own particular environment.

Selected? ___ Yes ___ No Why or why not?

Answer: No. This is an awareness (outcomes "lite") project.

Homework for Group 2

Your Organization's Mission

The system serves a statewide library network of member school libraries by providing quality information services in support of excellence and equality for all learners.

Project Title: Library Automation Training

Project Description

What will the project do? Train library staff to use the new automation system.

Who is the project for? Library staff.

How will the participants benefit? They will learn how to use the new automation system.

Selected? ___ Yes ___ No Why or why not?

Answer: No. This is a management project and should be evaluated for results. The training can be evaluated using OBE, but it is more of an efficiency study.

Modifying Homework Project to Fit OBE

Look at the two examples of homework. Imagine that none of the homework submitted was really ready for an OBE plan. Tell how you would ask the groups to modify the project so it would work for purposes of learning the OBE process.

Project Title: Using WebMax to Meet Curriculum Needs

Project Description

What will the project do? Increase the number of school staff who use WebMax to locate audiovisual resources, i.e., DVDs, videotapes, laser discs, and/or multiple collection titles for use in instruction.

Who is the project for? Classroom teachers and librarians.

How will the participants benefit? Increased access to materials for their curriculum needs.

Proposed Modifications:

Answer: The project as submitted is basically an access project. If the project aimed at teachers and librarians using WebMax to locate materials for instruction and included the next step of integrating use of those materials in lesson plans, predictions could be made about librarian, teacher, and student performance.

Project Title: Making an IMPACT! On Your Students & Your Teachers

Project Description

What will the project do? Train participants on the use of IMPACT! (Instructional Media Professionals' Academic Collaboration Tool) software, which will enable them to document library program activities.

Who is the project for? Library media specialists

How will the participants benefit? The software will allow individual Library Media Programs to be documented for accountability purposes and presentations as needed by profiling collaborative planning between teachers and LMSs.

Proposed Modifications:

Answer: As written, it sounds like a management project. LMSs learn to use software for accountability purposes. The software skills of the initial training can be evaluated using OBE. To get to the end-user level with outcomes, this project could specify a secondary project that would be a specific collaborative effort between teachers and LMSs. The evaluation could document the value of IMPACT! software skills in tracking collaborative planning and student performance, thereby providing outcomes information as well as documenting the outcomes themselves. The group would need to select a specific goal of a teacher/LMS collaborative effort.

Improving Purpose Statements

The following purpose statement is spaced so suggested changes can be written in. Use the space to cross out items and add in others.

The school library system provides training in collaborative planning and classroom integration of NOVEL databases to librarian/teacher teams to:

  • Increase the teams' collaboration and lesson planning skills for teaching NOVEL databases

  • Increase evidence of librarian/ teacher collaboration

  • Increase integration of NOVEL databases into instruction

The school library system provides training in collaborative planning and classroom integration of NOVEL databases to librarian/teacher teams to:

  • Produce collaborative lesson plans for teaching NOVEL databases
  • Develop patterns of collaboration between librarians and teachers
  • Integrate NOVEL databases into classroom instruction
  • Develop students' skills using NOVEL databases for assignments

Note: The use of" increase" is risky unless there has been a previous study An interpretation of the desired benefits of this program is as follows:

  • During training librarians and teachers will learn collaboration skills
  • During training librarians and teachers will work collaboratively to create lesson plans that can be used to integrate NOVEL instruction in the classroom
  • After training there will be evidence that librarians and teachers continued to use their collaboration skills and that lesson plans were actually implemented.
  • Ultimately there will be evidence that students used NOVEL databases for useful purposes.

Creating and Improving Outcome Statements

The following "Outcome" statements are double spaced so suggested changes can be written in. Use the space to cross out items and add-in others.

Outcome: System trainer will teach library staff to use consumer health information

Answer: Library staff (customer, not provider of service) use consumer health information for prescribed searches

Outcome: System trainer will teach library staff skills to help patrons find consumer health information

Answer: Library staff report helping patrons successfully locate consumer health information

Outcome: Library staff teach targeted community groups to find consumer health information

Answer: Targeted community groups use consumer health information for prescribed searches.

Outcome: More library patrons have the ability to find consumer health information independently

Answer: Patrons who were helped or trained by librarians report successful, independent consumer health information searches

Outcome: Library staff help patrons search genealogy databases

Answer: Library staff report helping patrons search genealogy databases

Using the following program purpose statement, write at least 6 outcomes.

The school library system provides training in collaborative planning and classroom integration of NOVEL databases to librarian/teacher teams to:

  • Produce collaborative lesson plans for teaching NOVEL databases
  • Develop patterns of collaboration between librarians and teachers
  • Integrate NOVEL databases into classroom instruction
  • Develop students' skills using NOVEL databases for assignments

Answer: Outcome: Librarians and teachers define collaboration skills

Answer: Outcome: Librarian/teacher teams collaborate to design instructional units that integrate the use of online databases

Answer: Outcome: Librarians and teachers report on-going collaboration activities

Answer: Outcome: Librarians and teachers report implementation of lesson plans using online databases with students

Answer: Outcome: Teachers report successful completion of student assignments using online databases

Answer: Outcome: Students report independent, successful use of online databases.

Creating and Improving Outcome Indicators

The following "Indicator" statements are double spaced so suggested changes can be written in. Use the space to cross out items and add-in others.

Outcome: Library staff search advanced health databases
Indicator: # and % of library staff who successfully complete prescribed advanced health database searches

Answer: # and % of library staff who successfully complete at least X number of prescribed advanced health database searches as assessed by the trainer during a workshop (minimum standard and method of assessment added)
Outcome: Library staff report helping patrons search genealogy databases
Indicator: # and % of library staff who report helping patrons use genealogy databases

Answer: # and % of library staff who report helping at least X number of patrons successfully use genealogy databases as assessed by a checklist applied to the report form.

Using the following outcome statements, write indicator(s) for each:

Outcome: Librarians and teachers define collaboration skills

Answer: # and % of librarians and teachers who score at least 85% on a collaboration skills quiz at the end of the workshop
Outcome: Librarian/teacher teams collaborate to design instructional units that integrate the us of online databases

Answer: # and % of librarian/teacher teams who design at least 1 collaborative unit that successfully integrates the use of online databases as assessed by a rubric on unit design during the workshop
Outcome: Librarians and teachers report on-going collaboration activities

Answer: # and % of librarians and teachers who report at least one incident of librarian/teacher collaboration after the workshop as assessed by a collaboration rubric applied to a survey.

# and % of librarians and teachers who achieve a self-assessment score of at least 80% on a collaboration survey.
Outcome: Librarians and teachers report implementation of lesson plans using online databases with students

Answer: # and % of librarian/teacher teams who report using the lesson plans created during the workshop in the classroom as assessed by a survey
Outcome: Teachers report successful completion of student assignments using online databases

Answer: # and % of teachers who report the # and % of students who successfully complete at least 1 assignment using online databases as assessed by a "successful database search" checklist applied to the assignment.
Outcome: Students report independent, successful use of online databases.

Answer: # and % of students who report "successful database searches" for assignments different from the initial requirement as assessed by a survey.

Creating and Improving Target and Target Achievement Predictions

In the boxes next to the following "Target Audience and Target Achievement Predictions" write proposed revisions.

Outcome: Library staff search advanced health databases

Indicator: # and % of library staff who successfully complete at least X number of prescribed advanced health database searches as assessed by the trainer during a workshop

target achievement = 80%
Answer: Need to assign a number to the target audience.
Need to apply the percentage to the target audience to calculate target achievement.
e.g., number in target audience = 200
target achievement = 160 (80%)
Need to apply a number to X; e.g., X = 3 prescribed searches
Outcome: Library staff report helping patrons search genealogy databases

# and % of library staff who report helping at least X number of patrons successfully use genealogy databases as assessed by a checklist applied to the report form.

target audience = 250 library staff

target achievement = 200 (80%)
Answer: Need to apply a number to X; e.g., X = at least 10 patrons helped.
250 librarians times 10 patrons helped:
target audience = 2,500
target achievement = 2,000 (80%)
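The arithmetic in these worked answers follows one pattern; here is a minimal sketch (Python; the function name is hypothetical), under the assumption that target achievement is simply the predicted percentage applied to the target audience count:

    # Hypothetical helper mirroring the worked answers above.
    def target_achievement(target_audience: int, percent: float) -> int:
        # Apply the predicted percentage to the target audience count.
        return round(target_audience * percent / 100)

    print(target_achievement(200, 80))       # 160 staff, as in the first answer
    print(target_achievement(250 * 10, 80))  # 2,000 patrons, as in the second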
Using the following outcomes and indicators, fill in possible target audience and target achievement information:
Outcome: Librarians and teachers define collaboration skills

# and % of librarians and teachers who score at least 85% on a collaboration skills quiz at the end of the workshop
Answer: 50 librarian/teacher teams = 100 librarians and teachers
target achievement = 90 (90%)
Outcome: Librarian/teacher teams collaborate to design instructional units that integrate the use of online databases

# and % of librarian/teacher teams who design at least 1 collaborative unit that successfully integrates the use of online databases as assessed by a rubric on unit design during the workshop
Answer: 50 teams
target achievement = 45 (90%)
Outcome: Librarians and teachers report ongoing collaboration activities

# and % of librarians and teachers who report at least one incident of librarian/teacher collaboration after the workshop as assessed by a collaboration rubric applied to a survey.
# and % of librarians and teachers who achieve a self-assessment score of at least 80% on a collaboration survey.
Answer: 100 librarians and teachers
target achievement = 30 (30%)
100 librarians and teachers
30 survey responses anticipated
target achievement = 27 (90%)
Outcome: Librarian/teacher teams report implementation of lesson plans using online databases with students

# and % of librarian/teacher teams who report using the lesson plans created during the workshop in the classroom as assessed by a survey
Answer: 50 teams
target achievement = 25 (50%)
Outcome: Students report independent, successful use of online databases.

# and % of students who report "successful databases searches" for different assignments than th initial requirement as assessed by a survey.
Answer: 50 teachers, 30 students each = 150 students
target achievement = 10 (20%)
Outcome: Teachers report successful completion of student assignments using online databases

# and % of teachers who report # and % of students who successfully complete at least 1 assignment using onlin databases as assessed by a "successful database search" checklist applied to the assignment.
Answer: 50 teachers, 30 students each = 1,500 students
target achievement = 1,200 (80%)


February 9, 2006
For questions or comments, contact Linda Todd.
url: http://www.nysl.nysed.gov/libdev/obe/trainer/index.html

Last Updated: June 3, 2009