New York State Library

Division of Library Development

Making it REAL! Recruitment, Education, And Learning:
Creating a New Generation of Librarians to Serve All New Yorkers

IMLS Grant Partners Program Evaluation Workshops

Presentations, June 1-2, 2005

Evaluation of REAP Change Consultants Workshop on June 1 and 2, 2005


Overview

REAP Change Consultants (REAP Change) conducted all-day workshops on June 1 and 2, 2005, for grant partners in the Institute of Museum and Library Services-funded New York State Library grant "Making It REAL! Recruitment, Education, and Learning: Creating a New Generation of Librarians to Serve All New Yorkers." The purposes of the workshops were as follows:

  1. Make technical assistance from the REAP Change Consultants team, charged with the overall IMLS grant evaluation, available to all IMLS grant partners in planning, designing, and thinking through their individual IMLS grant evaluations.
  2. Enable partners to meet one another and work directly with the REAP Change Consultants team.
  3. Inform all partners about REAP Change Consultants' current plans for conducting the overall multi-site evaluation of IMLS grant outcomes, and about the evaluation data that would need to be obtained from partners.

All three REAP Change evaluators, Dr. Stephen C. Maack, Associate Professor Clara Chu, and Dr. Suzanne Stauffer, attended both days and participated as workshop presenters. Fifteen (15) of 18 partners sent representatives to the workshops; nine (9), or 60 percent, attended the first day and six came the second day. Ten of twelve (83%) Teaching Libraries sent representatives, and an equal proportion of Library Schools (five of six) did so as well. No Palmer School representative attended, and Onondaga County Public Library and Northern New York Library Network also were unable to send representatives. Teaching Libraries and Library Schools attended in the same pattern, with 60 percent of each group attending on June 1 and 40 percent on June 2. Amanda Tehonica, one of the scholarship recipients from the North Country Library System, joined the group on June 1, and Michael Borges, Director of the New York Library Association (NYLA), attended on June 2.

Bar chart shows which days people attended the workshop.

Workshop participants were asked to complete a one-page evaluation of the workshop at the end of each day (see attached), and 18 did so. Evaluative comments provided by the first day attendees helped guide and improve the second day of workshops. For example, participants said that the REAP Change team had spent too much time explaining and justifying their connection to New York, so that topic was covered more briefly on June 2. Due to a technology failure, Professor Chu had to present her Diversity Overview with just handouts the first day, but was able to present using her PowerPoint presentation the second day. Two or three people complained about the tight schedule, a cold room on June 1, a nice day outside, and reliance on PowerPoint for presentations. One person on the first day felt that the workshop could be improved with "more dynamic interactive presentations." Responding to this feedback, and with a smaller group attending the second day, REAP Change staff used a more informal and interactive presentation style on June 2. This was well received.

Outcome-Based Evaluation Experience

Although most of the grant partners at the workshop had previously attended Outcome-Based Evaluation (OBE) training, many did not feel especially experienced with Outcome-Based Evaluation. While 70 to 80 percent rated themselves "somewhat experienced" in OBE before the workshop, 20 to 30 percent rated themselves "not very experienced" and only the project manager rated herself "very experienced." Most partners came into the workshop with some exposure to Outcome-Based Evaluation, but also a degree of insecurity about doing it.

Bar chart shows level of experience with OBE prior to workshop.

This showed up in an open-ended response to Question 7 (What was the best thing about the workshop?): "Third time through OBE -- it finally begins to make sense." This is a new skill for the grant partners. The workshop technical assistance was needed and helped most partners improve their understanding of OBE, as shown below.

Review of Evaluation Concepts

The workshops were never intended to replicate the Outcome-Based Evaluation training that most partners had already taken. However, Dr. Maack did review key evaluation concepts after lunch. Respondents to the evaluation questionnaire were asked to rate each of the four major presentations on the information presented and the quality of the presentation on a scale of "excellent," "good," "fair," or "poor." This session received an overall median rating of "good," but a modal (most common) rating of "excellent" regarding information presented, and median and modal ratings of "good" on presentation quality.
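The median and modal summaries used throughout this report can be reproduced by coding the ordinal rating scale numerically. The following is a minimal sketch in Python using hypothetical ratings (not the actual workshop data) to illustrate how a session can have a "good" median yet an "excellent" mode:

```python
from statistics import median, mode

# Ordinal rating scale from the evaluation form, coded numerically.
SCALE = {"poor": 1, "fair": 2, "good": 3, "excellent": 4}
LABELS = {v: k for k, v in SCALE.items()}

def summarize(ratings):
    """Return (median label, modal label) for a list of rating labels."""
    coded = sorted(SCALE[r] for r in ratings)
    med = median(coded)  # with an even count, this can fall between two scale points
    if med in LABELS:
        med_label = LABELS[med]
    else:
        lo = int(med)
        med_label = f"between {LABELS[lo]} and {LABELS[lo + 1]}"
    # statistics.mode returns the first most-common value (Python 3.8+ on ties)
    return med_label, LABELS[mode(coded)]

# Hypothetical ratings: "excellent" is the most common value, but the
# middle (4th of 7 sorted) rating is "good."
print(summarize(["poor", "fair", "good", "good",
                 "excellent", "excellent", "excellent"]))
# → ('good', 'excellent')
```

This also handles the "between good and excellent" case reported for the small-group planning session, where an even number of responses puts the median halfway between two scale points.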

On the first day this session followed a PowerPoint presentation, while on the second day the group opted for more of a dialogue, with partners asking questions and Dr. Maack responding, using the PowerPoint slides as appropriate to make a point and talking through examples. The adjustment in response to the first day group's suggestions did improve the ratings. The first day's median and modal ratings of both information presented and quality of presentation were "good," while the second day's median and modal ratings were "excellent." A favorable comment the second day in response to the "best thing" about the workshop was "Not being read to. Not a 'cookie cutter' workshop -- 'canned' (like in Washington) -- follows our needs." Another partner appreciated learning the "Definition of outputs and outcomes" covered and emphasized in this session.

However, another partner complained about the length of time spent the first day discussing the New York connections of the Los Angeles-based REAP Change Consultants, noted "Could use extra time on OBE examples + concepts," and added, in response to the question on the "muddiest" concept left at the end of the day, "Don't feel my understanding of OBE improved as much as I hoped." Generally, though, most partners felt that the session went well.

Small Group Individual Program Evaluation Planning

The small group individual program evaluation planning that occurred toward the end of the day was the best received of the group sharing activities, and one of the most important sessions offered. The most common rating of the small group OBE planning session was "excellent," although the median rating (half of respondents rated above, half below) fell between "good" and "excellent." The second day, the small groups went even better: both the median and the modal ratings were "excellent."

Especially during the first day, the REAP Change Consultants team rotated among the small groups so that partners could take advantage of the individual expertise of the consulting team members. The approach was successful, but it was not repeated the second day, when groups were smaller and worked well with the first consultant assigned to them. Partners highlighted this session in these comments about "the best thing about the workshop":

"One-on-one discussion with consultants."
"Individual help with outcomes."
"Interaction with the consultants."
"Sharing info with other grant partners+ talking with 2 REAP consultants (Chu + Stauffer)."

As the last comment illustrates, it wasn't just the work with the consultants that was important; the work with other partners on individual OBE plans also contributed to the success of this part of the workshops. During both days the camaraderie of the assembled group increased during the small group sessions in ways noticeable to observers, and some small groups wanted to keep talking instead of moving to the final session of the day. These comments about "the best thing about the workshop" capture that camaraderie:

"Establishing relationships."
"Meeting the other grant participants -- exchanging ideas."
"Chance to meet other participants."
"Meeting library school partner & other participants."
"Meeting/networking."
"Building Partnerships."
"Opportunity for grant partners to connect."
"Exchange with other institutions."

While some of the networking and exchange had already taken place earlier in the day, on both days some partners took advantage of the small groups to work directly together: Teaching Library with similar Teaching Library, University with University, or Teaching Library with University partner. On June 1, a group of rural Teaching Libraries started to strategize on how they might approach Syracuse University (which sent a representative on June 2) to increase or obtain online course offerings for their students in areas of interest to their rural setting. They also spoke of cooperating with one another to create synergy where few resources exist. Rural teaching libraries gained program ideas from the partners from larger urban teaching libraries. Two school library system partners, Capital Region BOCES SLS and Franklin-Essex-Hamilton BOCES SLS, decided to work jointly toward the same OBE plan, which each would implement, thus creating a "natural experiment" within the broader multi-site program effort (they have followed through on that approach since the workshop). For the evaluation team, these were most pleasurable and exciting developments to see and to support.

Partners were left at the end of the small group sessions understanding what more they needed to do to perfect their OBE plans. Some of the "muddiest point" comments included these:

"Wondering if we should aim too high in the target achievement or goals."
"Is the chart I completed really workable? I'd like to know if these outcomes really will indicate the work we're doing."

Other partners had gotten to the point of realizing that they needed to go back and touch base with others in their organizations to move their OBE planning forward. One recommended a workshop improvement of "More time for feedback on small group work." The REAP Change team reassured partners that they would continue to be available for long distance consultation by telephone and e-mail as partner OBE plans continued to develop. Dr. Maack commented that the partners were ending the day about where they should be: engaged in their OBE planning, partway done, and aware of how much and what more each would need to do to produce a good OBE plan.

Perhaps because it came at the end of the day when people were eager to leave, the sharing of small group results received less attention each day than the small groups themselves. Only eight of the 18 people who attended the workshops rated these sessions, and they gave them "good" median and modal ratings. The response to this and other sharing sessions was mixed. One partner commented, "not enough time for discussions. Tight schedule." However, another who attended the first day recommended "Shorten sharing times" as a way to improve the workshops. See below for other positive comments related to partner sharing.

Evaluation Dialogue

The camaraderie of the small groups had partly been set up during the Diversity Dialogue and the Evaluation Dialogue. Dr. Stauffer led the brief but important Evaluation Dialogue bridging session. She started the session with an exercise demonstrating the diversity within the partner group, having everyone stand up if they fell within certain group criteria. This not only got people moving after a long period of sitting, it also made the point that the group itself was diverse in various ways, even though not particularly ethnically or gender diverse. The majority of those present each day were non-Hispanic Caucasian females, as is typical of the library profession as a whole, yet the exercise surfaced a great deal of diversity in other personal characteristics. Dr. Stauffer then asked the partners to talk about their projects, students, and the evaluations they planned.

The Evaluation Dialogue contributed to the following favorable comments about "the best thing about the workshop":

"Chance to meet other participants."
"Meeting library school partner & other participants."
"Great to hear everyone's project."
"Meeting/networking."
"Learning more about the IMLS grant and the projects planned."
"Opportunity for grant partners to connect."
"The variety in speakers, presentations, break-out sessions."

Overall, the Evaluation Dialogue session was rated "good" in terms of information and quality of presentation.

Diversity Overview

Professor Chu's Diversity Overview garnered the highest overall scores, with median and modal ratings of "excellent" on both information presented and quality of presentation. Her examples were especially germane to and understood by the library-oriented audiences. Teaching Library and University partners gave her equivalent highest ratings. She provided an especially rich handout and PowerPoint presentation that discussed "diversity," defined as broadly as the grant does, in both individual and institutional context. Those who attended the workshops received handouts showing the slides for each presentation, and either the slide presentations themselves or links to them have been placed on the Making It Real! Yahoo! Group that the New York State Library has created.

Diversity Dialogue

Clara Chu followed her presentation with an open discussion with the partners about diversity in relation to their programs and students. The median and most common rating of this session was "good."

The first day discussion was especially rich, with many partners speaking in a heartfelt way about some of their diverse scholarship students and the ways in which they are diverse. Someone pointed out that males remained under-represented in the diversity of the library field, as well as among the candidates recruited and offered scholarships. Partners also surfaced issues about how library clients related to diverse staff in ways that did not respect their abilities (e.g., approaching another reference librarian after asking a minority librarian a reference question -- a possible indication that the client doubted the quality of the minority librarian's response). Conversely, one partner spoke of an excellent African-American reference librarian who gave such good service that people sought her out over other reference librarians. The discussion included comments about expectations (self, other staff, clients), mentoring (including review of and feedback on such situations), assumptions about the quality of diverse librarians -- and the pressure diverse individuals put on themselves to do better than everyone else's "normal" in order to be viewed as "qualified."

Interestingly, the majority of the partners on the first day rated the Diversity Dialogue "good," while the majority of those on the second day rated it "excellent." The discussion was also rich on the second day, and the smaller group may have helped the ratings.

Making It REAL! Grant Program Overview

Dr. Maack followed the Diversity Dialogue with a summary of his understanding of the current state of the Making It REAL! grant project, based on the grant proposal itself and the introductory discussions that REAP Change evaluators had held with the partners during May. His approach was not only to discuss the grant but also to present his views of its strengths, weaknesses, opportunities, and threats. The median rating of this presentation was "excellent" on information provided and between "good" and "excellent" on presentation quality. Favorable comments in response to the question about the "best" thing about the workshop included:

"Learning more about the IMLS grant and the projects planned."
"Great to hear everyone's project. Hear we 'did this together.'"
"Getting overall view of the grant."

Some issues about the grant program surfaced during this presentation, especially on the first day. One issue related to a realization evidenced by the comment on the evaluation that the worst thing about the workshops was "Finding out we need to do extensive evaluation." This realization displeased some partners, who indicated that they had not previously understood the implications of the self-evaluation requirement. REAP Change staff had mentioned this requirement to the partners during their introductory telephone calls and had several times heard responses suggesting that partners, notably some of the Teaching Libraries, had thought their primary involvement with the grant ended with recruiting library school candidates and selecting one. The discontent led to a direct question on the first day, at the end of the IMLS grant overview presentation, about what the REAP Change Consultants were doing if the grant partners were to evaluate their own programs. The answer was that REAP Change is handling the overall evaluation, and that an explanation of current plans for it would take place in the afternoon. It appears that the individual self-evaluation expectation had previously either not been clearly communicated or not been fully understood or accepted by the partners as their responsibility.

In this presentation Maack identified communication as a potential threat to the project, since it has so many partners spread across the state, and the use of Internet and web technology as a potential opportunity to overcome communication issues. Judging from comments raised by partners in discussions before and during the workshops, communication of expectations had not been clear, or expectations had been communicated inconsistently across partners.

Related comments on the workshop evaluation form include:

How could the workshop be improved?
"More discussion of what NYS has in mind for outcomes -- levels of effect"

What is the muddiest point or question that you are left with about the IMLS grant and its evaluation?
"Why aren't we required to use evaluation criteria submitted in the grant"
"What are overall IMLS expected outcomes?"
"Specific timeline for what we need to do"
"Probably a sort of a time-frame type calendar for completion of all the facets (requirements) of forms, paperwork, etc."
"A concerns assessment may have been helpful"
"Needed a Q + A period for grant partners to ask Questions concerning grant mechanics"

A concern that one partner was left with at the end of the day was: "Repercussions of local obstacles/policy that may be hard to overcome…may look to IMLS as a failure, but really demonstrates grant partner differences." Maack had pointed out the potential for problems at the local level (e.g., staff turnover, Board decisions, funding cuts) that had nothing to do directly with the grant program but might have a negative impact on its implementation or success. Another partner was left unsure ("muddiest point") about the "need to deal internally with other 'stakeholders.'" The comments may be a further indication that some partners had not thought about grant implementation or diversity from an organizational perspective before the workshops, but had instead focused only on the immediate task of locating appropriate scholarship candidates. The overall evaluation intends to explore implementation of the grant, including the nature of the involvement of internal stakeholders and unexpected non-grant impacts, rather than looking only at outputs.

Overall IMLS Grant Evaluation Design

The presentation on REAP Change Consultants' overall grant evaluation design received overall median and modal ratings of "good" for both information and quality of presentation. One partner commented in response to the "muddiest point" question: "The data and evaluation scheme outlined by Steve Maack doesn't match my expectations about what we're doing, which is pretty simple. Why make it so complicated?" The second day the presentation itself went more smoothly, but the ratings remained the same.

While this session did appear to communicate to the partners how REAP Change's evaluation work differed from their own, it also raised a number of questions about grant intentions that partners had apparently not previously grasped. For example, the expectation of a relationship between the Teaching Libraries and the University partners, which is poorly defined in the original proposal and the Request for Proposals for the evaluation, was apparently news to many partners. The involvement of Boards in diversity recruitment, mentioned in the RFP for the evaluation, left partners confused (some do not have Boards) and worried about how they would fare in such an evaluation.

The presentation also surfaced the general lack of pre-set criteria for evaluating the overall grant. Part of the reason for this has to do with the multi-site, somewhat exploratory, nature of the grant. On the positive side, Maack was able to stress the extent to which the New York State Library and IMLS have given the partners freedom to design and evaluate their own projects within very broad parameters. Whether partners are ready and willing to creatively design and evaluate their own projects to respond to increasing diversity in New York libraries, and to improve library service to diverse New York communities, remains an open and evaluable question.

Conclusions and Implications

In general, the workshops achieved their objectives at a "good" to "excellent" level. The workshops engaged partners and provided needed additional exposure and technical assistance with OBE planning of individual projects through presentations by Dr. Maack and the one-on-one help of the REAP Change team. The excellent work of Drs. Chu and Stauffer exposed partners to new ways of thinking about and approaching diversity. Over the course of each day, thanks to presentations by Dr. Maack and clarifications by Mary Linda Todd, the partners came to better understand the Making It REAL! goals and expectations, although more work may be needed in this area.

The REAP Change Consultant evaluators were introduced to, and generally well received by, the partners. Mary Linda Todd also met many partners for the first time and was able to transmit additional information about the grant, its goals, and procedures to them. Partners started working and collaborating with one another and coalescing as a group.

The partners themselves underlined the importance of meeting one another and coming together as a grant stakeholder group. This came out in the following responses to the open-ended questions:

What was the worst thing about the workshop?
"Should have been all partners in one day."
"That we couldn't mandate attendance."

How could the workshop be improved?
"Learn about the projects described on June 1."
"More participants in the program taking part in one session."
"Giving more information on projects to others."
"If possible, everybody on one day."
"We should have had the workshop earlier in the grant cycle. Other meetings would be helpful -- not just evaluation focus."
"Have all the participants there all on one day, instead of 2 days."

What is the muddiest point or question that you are left with about the IMLS grant and its evaluation?
"What others are doing."
"A follow up meeting announced in the future (perhaps at NYLA 2005)?"

The partners have spoken: they are eager for more and better communication with each other and with the NYSL project staff about expectations, procedures, and overall grant evaluation criteria. At least one partner recommended NYLA 2005 as a venue for such additional communication. The Making It REAL! Yahoo! Group is now set up, with almost all partners enrolled and able to communicate with one another via the Internet and the web. The evaluative conclusion is that the workshops left the partners inspired to move forward as an important stakeholder group in formation, one that started to gel during the workshops. It is up to those in charge of programming to determine how best to move forward with this stakeholder group.

REAP Change Consultants NYSL IMLS Grant Workshop Evaluation

1) On which day did you attend the workshop?

   __ June 1
   __ June 2

2) Do you represent

   __ a Teaching Library
   __ a University
   __ Other

3) How experienced were you with Outcome-Based Evaluation before today?

   __ Very Experienced
   __ Somewhat Experienced
   __ Not Very Experienced
4) Please rate the information presented in each of the following:

                                           Excellent   Good   Fair   Poor
   Diversity Overview                          __        __     __     __
   IMLS Grant Programs Overview                __        __     __     __
   Review of Evaluation Concepts               __        __     __     __
   Overall IMLS Grant Evaluation Design        __        __     __     __
5) Please rate the quality of the presentation in each of the following:

                                           Excellent   Good   Fair   Poor
   Diversity Overview                          __        __     __     __
   IMLS Grant Programs Overview                __        __     __     __
   Review of Evaluation Concepts               __        __     __     __
   Overall IMLS Grant Evaluation Design        __        __     __     __
6) Please rate the group and small group parts of the workshop:

                                           Excellent   Good   Fair   Poor
   Diversity Dialogue                          __        __     __     __
   Evaluation Dialogue                         __        __     __     __
   Individual Program Evaluation Planning      __        __     __     __
   Sharing of Workshop Results by All          __        __     __     __
7) What was the best thing about the workshop?


8) What was the worst thing about the workshop?


9) How could the workshop be improved?


10) What is the muddiest point or question that you are left with about the IMLS grant and its evaluation?

THANK YOU!! If you have further comments about the workshop, questions about the IMLS evaluation, or need additional technical help, please contact Stephen Maack, REAP Change Consultants, smaack@earthlink.net or 310-384-9717.
