It is Wednesday and the sun is shining.

The thought for today is evaluation use.

A colleague of mine tells me there are four types of evaluation use:

  1. Conceptual use
  2. Instrumental use
  3. Persuasive use
  4. Process use

How are evaluation results used? Seems to me that using evaluation results is the goal–otherwise, is what you’ve done really evaluation? HOW do YOU use your evaluation results?

Is scholarship a use?

Is reporting a use?

Does use always mean taking action?

Does use always mean “so what”?

Jane Davidson (Davidson Consulting, http://davidsonconsulting.co.nz, Aotearoa New Zealand) says there are three questions that any evaluator needs to ask:

  1. What’s so?
  2. So what?
  3. Now what?

Seems to me that you need the “now what” question  to have evaluation use. What do you think? Post a comment.

Hi–Today is Wednesday, not Tuesday…I’m still learning about this technology. Thanks for staying with me.

Today I want to talk about a checklist called The Fantastic Five. (Thank you Amy Germuth for bringing this to my attention.) This checklist presents five questions against which to judge any/all survey questions you have written.

The five questions are:

1. Can the question be consistently understood?

2. Does the question communicate what constitutes a good answer?

3. Do all respondents have access to the information needed to answer the question?

4. Is the question one which all respondents will be willing to answer?

5. Can the question be consistently communicated to respondents?

“Sure,” you say, “…all my survey questions do that.” Do they really?

Let me explain.

1. When you ask about an experience, think about the focus of the question. Is it specific enough for what you want? I used the question, “When were you first elected to office?” and got all manner of answers. I wanted the year elected. I got months (“7 months ago”), years (both “2 years ago” and “in 2006”), and words (“at the last election”). A more specific question would have been “In what year were you first elected to office?”

2. When I asked the election question above, I did not communicate what constitutes a good answer. I wanted a specific year so that I could calculate how long the respondent had been an elected official. Fortunately, I found this out in a pilot test, so the final question gave me the answers I wanted (see the sketch after this list).

3. If you are looking for information that is located in supplemental documents (for example, tax forms), let the respondent know that these documents will be needed to answer your survey. Respondents who don’t have the supporting documentation ready will guess, reducing the reliability of your data.

4. Often, we must ask questions that are of a sensitive nature, which could be seen as a violation of privacy. Using Amy’s example, “Have you ever been tested for HIV?” involves a sensitive subject. Many people will not want to answer that question because of the implications. Asking instead, “Have you donated blood in the last 15 years?” gets you the same information without violating privacy: the Red Cross began testing donated blood for HIV in 1985, so anyone who has donated since then has been tested.

5. This point is especially important with mixed-mode surveys (paper and interview, for example). Again, being as specific as possible is the key. When asking a closed-ended question, make sure that the response options cover what you want to know. Also, make sure that the method of administration doesn’t affect the answers you get.
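To see why the wording of my election question (items 1 and 2 above) matters, here is a minimal sketch of the calculation I wanted to do. This is purely my illustration, not part of Amy’s checklist, and it only works when the answer comes back as a four-digit year:

```python
from datetime import date

def years_in_office(year_elected, survey_date=None):
    """Years served, computed from a four-digit election year."""
    # Answers like "7 months ago" or "at the last election" can't be
    # used in this arithmetic, which is why the question must ask
    # for a year.
    survey_date = survey_date or date.today()
    return survey_date.year - year_elected

# Example: first elected in 2006, surveyed in January 2010 -> 4 years.
print(years_in_office(2006, date(2010, 1, 5)))
```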

This post was based on a presentation by Amy Germuth, President of EvalWorks, LLC, at a Coffee Break webinar sponsored by the American Evaluation Association. The full presentation can be found at:

http://comm.eval.org/EVAL/EVAL/Resources/ViewDocument/Default.aspx?DocumentKey=53951031-ef7a-4036-bc8f-3563f3946026.

The best source I’ve found for survey development is Don Dillman’s 3rd edition of “Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method,” published in 2009 and available from the publisher (Wiley).

Welcome back! For those of you new to this blog–I post every Tuesday, rain or shine…at least I have for the past 6 weeks…:) I guess that is MY new year’s resolution–write here every week, on Tuesdays…now, on to today’s post…

What one thing are you going to learn this year about evaluation?

Something about survey design?

OR logic modeling?

OR program planning?

OR focus groups?

OR…(fill in the blank and let me know…)

A colleague of mine asked me the other day about focus groups.

Specifically, the question was, “What makes a good focus group question?”

I went to Dick Krueger and Mary Anne Casey’s book (Focus Groups, 3rd ed., Sage Publications, 2000). On page 40, they have a section called “Qualities of Good Questions”. These make sense. They say: Good questions…

  1. …sound conversational.
  2. …use words participants would use.
  3. …are easy to say.
  4. …are clear.
  5. …are short.
  6. …are open-ended.
  7. …are one dimensional.
  8. …include good directions.

Let’s explore these a bit.

  1. Since focus groups are a social experience (albeit, a data gathering one), conversational questions help set an informal tone.
  2. If participants don’t/can’t understand your questions (because you use jargon, technical terms, etc.), you won’t get good information. Without good information, your focus group will not help answer your inquiry.
  3. You don’t want to stumble over the words, so avoid complicated sentences.
  4. Make sure your participants know what you are asking. Long introductions can be confusing, not clarifying. Messages may be mixed and thus interpreted in different ways. All this results in information that doesn’t answer your inquiry.
  5. Like being clear, short questions tend to avoid ambiguity and yield good data.
  6. To quote Dick and Mary Anne, “Open-ended questions are a hallmark of focus group interviewing.” You want an opinion. You want an explanation. You want rich description. Yes/No doesn’t give you good data.
  7. You might think synonyms add richness to questioning; in fact, synonyms confuse the participant. Confused participants yield ambiguous data. Avoid synonyms–keeping questions one-dimensional keeps them clear.
  8. Participants need clear instructions when asked to do something in the focus group. “Make a list” needs to have “on the piece of paper in front of you” added. A list in the participant’s head may get lost, and you lose the data.

Before you convene your focus group, have several individuals (3 – 5) who are similar to, but not included in, your target audience review the focus group questions. It is always a good idea to pilot any question you use to gather data.

Ellen Taylor-Powell (at University of Wisconsin Extension) has a Quick Tips sheet on focus groups for more information. To access it go to: http://www.uwex.edu/ces/pdande/resources/pdf/Tipsheet5.pdf

Ok–Christmas is over and now is the time to reflect on what needs to be different…self-deception notwithstanding. I went looking for something salient to say today and found the following 10 reasons NOT to make new year’s resolutions posted on Happy Lists (a blog about personal development and positive change for those who love lists, found at http://happylists.wordpress.com/).

Do these make sense? Remember–you evaluate everyday.

1. They set you up to fail.

2. Everybody does it.

3. Losing weight should be a whole lifestyle change.

4. January is the wrong reason.

5. Just make one.

6. There are better ways.

7. Bad economy.

8. No need for reminders.

9. You’re already stressed out enough.

10. They’re probably the same as last year.

I want to suggest one resolution that I’d like you to consider in 2010…one that will succeed…one that makes sense.

Learn one thing you didn’t know about evaluation.  Practice it.  If you need suggestions about the one thing, let me know (comment on the blog; email me if you want to remain anonymous on the blog; or call me–they all work.)

So if you’ve made some new year’s resolutions, throw them away. Make the changes in your life because you want to.

My wish for you is a wonderful 2010–make some part of your life better than 2009.

HAPPY NEW YEAR!!!

Merry Christmas–the greeting for the upcoming holiday. Hanukkah ended December 18 (I hope yours was very happy–mine was); Solstice was last night (and the sun returned today–a feat in Oregon, in winter, so Solstice was truly blessed);

Kwanzaa won’t happen until Dec 26–and the greeting there is Habari Gani (Swahili for “What’s the news?”).

Now, how do I get an evaluation topic from that opening…hmmm…perhaps a gift…yes…a gift.

The gift I give y’all is this:

Think about your blessings.

Think about the richness of your life.

Think about those for whom you care.

And remember…even those thoughts are evaluative because you know how blessed you are; because you know how rich (we are not talking money here…) your life is; because you have people in your life for whom you care AND who care for you.

The light returns regardless of the tradition you follow, and that, too, is evaluative–because you can ask yourself, “Is the light enough?”–and if it isn’t, you CAN figure out how to solve that problem.

Next week, I’ll suggest some New Year’s resolutions–evaluative, of course, with no self-deception–you CAN do evaluation!

Hi! It’s Tuesday, again.

I was thinking–if evaluation is an everyday activity, why does it FEEL so monumental–you know–overwhelming, daunting, even aversive?

I can think of several reasons for that feeling:

  • You don’t know how.
  • You don’t want to (do evaluation).
  • You have too much else to do.
  • You don’t like to (do evaluation).
  • Evaluation  isn’t important.
  • Evaluation limits your passion for your program.

All those are good reasons. Yet, in today’s world you have to show your programs are making a difference. You have to provide evidence of impact. To do that (show impact, making a difference) you must evaluate your program.

How do you make your evaluation manageable? How do you make it an everyday activity? Here are several ways.

  • Set boundaries around what you evaluate.
  • Limit the questions to those you must have answered. Michael Patton says only collect data you are going to use, then use it. (To read more about evaluation and use, see Patton’s book, Utilization-Focused Evaluation.)
  • Evaluate key programs, not every program you conduct.
  • Identify where your passion lies and focus your evaluation efforts there.
  • Start small. You probably won’t be able to demonstrate that your program ensured world peace; you will be able to know that your target audience has made an important change in the desired direction (see the sketch below).
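One concrete way to start small is to compare participants’ ratings before and after your program. Here is a toy sketch, purely my illustration with made-up numbers, of what “an important change in the desired direction” can look like when higher scores are better:

```python
def mean_change(pre_scores, post_scores):
    """Average pre-to-post change per participant.

    A positive value means change in the desired direction
    (assuming higher scores are 'better').
    """
    diffs = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(diffs) / len(diffs)

# Hypothetical pre/post ratings from five participants:
print(mean_change([2, 3, 2, 4, 3], [4, 4, 3, 5, 3]))  # 1.0 -> desired direction
```

A single before/after comparison like this won’t prove causation, but it does show whether the change you care about happened.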

We can talk more about the how, later. Now it is enough to know that evaluation isn’t as monumental as you thought.

Welcome back.  It is Tuesday.

Some folks have asked me–now that I’ve pointed out that all of you are evaluators–where I will take this column. That was food for thought…and although I’ve got a topic ready to go, I’m wondering if jumping off into working evaluation is the best place to go next. One place I did go is to update the “About” tab on this page…

Maybe thinking some more about what you evaluated today; maybe thinking about a bigger picture of evaluation; maybe just being glad that the sun is shining is enough (although the subfreezing temperatures remind me of Minnesota without the snow). The saying goes, “Minnesota has two seasons–green and white.” Maybe Oregon has two seasons–dry and wet. That is an evaluative question, by the way. Hmmm…thinking about evaluation, having an evaluation question of the week sounds like a good idea. What’s yours—small or large? I may not have an answer, but I will have an idea.

Ok–so now that I’ve dealt with the evaluative question of the day–I think it is time to go to more substance, like “what exactly IS program evaluation?”  Good question–if we are going to have this conversation, then we need to be using the same language.

First, let me address why the link to Wikipedia is on the far right in my Blogroll list. I’ve learned Wikipedia is a readily available, general reference that gets folks started understanding a subject. It is NOT the definitive word on that subject. Wikipedia (see link on the right) describes program evaluation as “…a systematic method for collecting, analyzing, and using information to answer basic questions about projects, policies, and programs.” Well…yes…except that Wikipedia seems to be defining evaluation when it includes “projects and policies.” Program evaluation deals with programs. Wikipedia does have an entry for evaluation as well as an entry for program evaluation. Read both.

Evaluation can be applied to projects, policies, personnel, processes, performances, proposals, products AND programs. I won’t talk much about personnel, performances, proposals, or products. Projects may be another word for program; policies usually result in programs; and processes are often part of programs, so they may be talked about sometimes.

Most of what this blog will address is program evaluation because most of you (including me) have programs that need evaluating. When I talk about program evaluation I am talking about “…determining the merit, worth, or value of (a program).” (Michael Scriven uses this definition in his book, Evaluation Thesaurus, 1991, Sage Publications.)

It is available through the publisher.

So, for me, what you need to know about program evaluation is this:

  • The root of evaluation is value (OED lists the etymology as [a.  Fr.  évaluation, f.  évaluer, f.  é- =es- (: L.  ex) out + value])
  • Program evaluation IS systematic.
  • Program evaluation DOES collect, analyze, and utilize information.
  • Program evaluation ATTEMPTS to determine the merit, worth, or value of a program.
  • Program evaluation ANSWERS this question:

“What difference does this program make in the lives and well-being of (fill in the blank here—citizens of Oregon, my 4-H club, residents of the watershed, you get the idea)?”

NOTE: I talk about “lives and well-being” because most programs are delivered to individuals who will be CHANGED as a result of participating in the program, i.e., experience a difference.

For those of us in Extension, when we do evaluation we are trying to determine if we “improved something;” we are not trying to “prove” that what we did accomplished something. Peter Bloom always said, “Extension is out to improve something (attribution), not prove something (causation).” We are looking for attribution, not causation.

Many references exist that talk more about what program evaluation is. My favorite reference is by Jody Fitzpatrick, Jim Sanders, and Blaine Worthen and is called Program Evaluation: Alternative Approaches and Practical Guidelines (2004, Pearson Education).

It is available through the publisher.

So you need to evaluate what you do?

And you don’t know where to begin?

And you don’t even think you can?

You don’t think of yourself as an evaluator?

If you answered “yes” to any of these questions, I want to persuade you that you (going in reverse order):

  1. ARE an evaluator
  2. CAN evaluate
  3. DO know where to begin

Keep in mind, “Evaluation IS an everyday activity!”

Let me tell you how:

You wake up and you ask:

  • Do I  get out of bed?

The criteria used are relatively simple:

  • Is it a work day?
  • Do you feel OK?

Answer: It is a work day AND you feel fine.
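Just to show how mechanical this everyday evaluation is, here is a toy sketch, purely illustrative, of that decision as criteria applied to evidence:

```python
def get_out_of_bed(is_work_day, feel_ok):
    """Everyday evaluation: two criteria, one judgment."""
    return is_work_day and feel_ok

# It is a work day AND you feel fine -> True: up you get.
print(get_out_of_bed(is_work_day=True, feel_ok=True))
```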

You then (after a variety of preparatory activities) must decide what to wear.

The criteria you use get more complicated.

  • What time of the year is it?
  • What will you be doing today?
  • What is the weather now?
  • What is the forecast for the rest of the day?
  • What is clean and pressed?

I can go on…the point is …FROM THE FIRST MOMENT OF ANY DAY TO THE END OF THAT DAY…

  • YOU EVALUATE

Evaluation is an everyday activity–what did you evaluate today?

Want to know more–about what, how, or why to evaluate?

Subscribe to the blog and let me know what you think.