There are three topics I want to touch on today.

  • Focus group participant composition
  • Systems diagrams
  • Evaluation use (Patton’s Utilization-Focused Evaluation)

In reverse order:

Evaluation use: I neglected to mention Michael Quinn Patton’s book on evaluation use. Patton has been advocating for use longer than almost anyone else. The title of his book is Utilization-Focused Evaluation. The 4th edition is available from the publisher (Sage) or from Amazon (and if I knew how to insert links to those sites, I’d do it…another lesson…).

Systems diagrams: I had the opportunity last week to work with a group of Extension faculty all involved in Watershed Education (called the WE Team). This was an exciting experience for me. I helped them visualize their concept of the WE Team by drawing a systems diagram. This is an exercise in which individuals or small groups quickly draw a visualization of a system (in this case, the WE Team). This is not art; it is not realistic; it is only a representation from one perspective.

This is a useful tool for evaluators because it can help them see where there are opportunities for evaluation, where there are opportunities for leverage, and where there might be resistance to change (force fields). It also helps evaluators see relationships and feedback loops. I have done workshops with Andrea Hegedus for the American Evaluation Association on using systems tools in evaluating multi-site systems (a systems diagram is one such tool). Although this isn’t the diagram the WE Team created, it is an example of what a systems diagram could look like. I used the software called Inspiration to create the WE Team diagram. Inspiration has a free 30-day download, and it is inexpensive (the download for V. 9 is $69.00).
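
Inspiration is what I used; if you prefer a scriptable alternative, here is a minimal sketch using the open-source graphviz Python package. This is my own generic example with hypothetical nodes and labels, not the WE Team’s diagram or the tool I used for it; it simply draws a few elements, their relationships, and one feedback loop.

```python
# A minimal sketch, assuming the open-source "graphviz" Python package
# (pip install graphviz) plus the Graphviz binaries are installed.
# The nodes and feedback loop below are hypothetical placeholders,
# not the WE Team's actual diagram.
from graphviz import Digraph

diagram = Digraph(comment="Generic program system")

# One node per element or actor in the system
diagram.node("inputs", "Inputs (funding, staff, partners)")
diagram.node("activities", "Activities (education, outreach)")
diagram.node("outcomes", "Outcomes (knowledge, behavior change)")
diagram.node("context", "Context (community, policy)")

# Relationships between the parts
diagram.edge("inputs", "activities")
diagram.edge("activities", "outcomes")
diagram.edge("context", "activities", label="influences")

# A feedback loop: what we learn from outcomes shapes future inputs
diagram.edge("outcomes", "inputs", label="feedback", style="dashed")

# Writes generic_system.png to the working directory
diagram.render("generic_system", format="png", cleanup=True)
```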

Focus group participant composition.

The composition of focus groups is very important if you want to get data that you can use AND that answers your study question(s). Focus groups tend to be homogeneous, with variations to allow for differing opinions. Since the purpose of the focus group is to elicit in-depth opinions, it is important to compose the group with similar demographics (depending on your topic) in

  • age
  • occupation
  • use of program
  • gender
  • background

Comfort and use drive the composition. More on this later.

It is Wednesday and the sun is shining.

The thought for today is evaluation use.

A colleague of mine tells me there are four types of evaluation use:

  1. Conceptual use
  2. Instrumental use
  3. Persuasive use
  4. Process use

How are evaluation results used? Seems to me that using evaluation results is the goal; otherwise, is what you’ve done really evaluation? HOW do YOU use your evaluation results?

Is scholarship a use?

Is reporting a use?

Does use always mean taking action?

Does use always mean “so what”?

Jane Davidson (Davidson Consulting, http://davidsonconsulting.co.nz, Aotearoa New Zealand) says there are three questions that any evaluator needs to ask:

  1. What’s so?
  2. So what?
  3. Now what?

Seems to me that you need the “now what” question to have evaluation use. What do you think? Post a comment.

Hi–Today is Wednesday, not Tuesday…I’m still learning about this technology. Thanks for staying with me.

Today I want to talk about a checklist called The Fantastic Five. (Thank you, Amy Germuth, for bringing this to my attention.) This checklist presents five questions against which to judge any and all survey questions you have written.

The five questions are:

1. Can the question be consistently understood?

2. Does the question communicate what constitutes a good answer?

3. Do all respondents have access to the information needed to answer the question?

4. Is the question one which all respondents will be willing to answer?

5. Can the question be consistently communicated to respondents?

“Sure,” you say, “…all my survey questions do that.” Do they really?

Let me explain.

1. When you ask about an experience, think about the focus of the question. Is it specific for what you want? I used the question, “When were you first elected to office?” and got all manner of answers. I wanted the year elected. I got months (“7 months ago”), years (both “2 years ago” and “in 2006”), and words (“at the last election”). A more specific question would have been “In what year were you first elected to office?”

2. When I asked the election question above, I did not communicate what constitutes a good answer. I wanted a specific year so that I could calculate how long the respondent had been an elected official (see the small sketch after point 5 below). Fortunately, I found this out in a pilot test, so the final question gave me the answers I wanted.

3. If you are looking for information that is located on supplemental documents (for example, tax forms), let the respondent know that these documents will be needed to answer your survey. Respondents will guess without having the supporting documentation ready, reducing the reliability of your data.

4. Often, we must ask questions that are of a sensitive nature, which could be seen as a violation of privacy. Using Amy’s example, “Have you ever been tested for HIV?” involves a sensitive subject. Many people will not want to answer that question because of the implications. Asking instead, “Have you donated blood in the last 15 years?” gets you the same information without violating privacy: the Red Cross began testing donated blood for HIV in 1985, so anyone who has donated since then has, in effect, been tested.

5. This point is especially important with mixed-mode surveys (paper and interview, for example). Again, being as specific as possible is the key. When a question includes response options, make sure the options cover what you want to know. Also, make sure that the method of administration doesn’t affect the answers you get.
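
To make point 2 concrete, here is a small sketch in Python. The free-text answers are the kinds of responses the vaguer wording produced (examples, not real data), and the survey year is an assumption of mine; with a numeric year, calculating tenure is one line.

```python
# A small illustration of point 2, with made-up answers and an assumed
# survey year: free-text answers can't be turned into years in office
# without guesswork, while numeric years can.
SURVEY_YEAR = 2010  # assumption: the year the survey was administered

# Responses to "When were you first elected to office?"
free_text_answers = ["7 months ago", "2 years ago", "in 2006", "at the last election"]

# Responses to "In what year were you first elected to office?"
year_answers = [2006, 2008, 2002]

years_in_office = [SURVEY_YEAR - year for year in year_answers]
print(years_in_office)  # [4, 2, 8]
```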

This post was based on a presentation by Amy Germuth, President of EvalWorks, LLC at a Coffee Break webinar sponsored by the American Evaluation Association. The full presentation can be found at:

http://comm.eval.org/EVAL/EVAL/Resources/ViewDocument/Default.aspx?DocumentKey=53951031-ef7a-4036-bc8f-3563f3946026.

The best source I’ve found for survey development is Don Dillman’s 3rd edition of “Internet, mail, and mixed-mode surveys: The tailored design method,” published in 2009 and available from the publisher (Wiley).

Welcome back!   For those of you new to this blog–I post every Tuesday, rain or shine…at least I have for the past 6 weeks…:) I guess that is MY new year’s resolution–write here every week; on Tuesdays…now to today’s post…

What one thing are you going to learn this year about evaluation?

Something about survey design?

OR logic modeling?

OR program planning?

OR focus groups?

OR…(fill in the blank and let me know…)

A colleague of mine asked me the other day about focus groups.

Specifically, the question was, “What makes a good focus group question?”

I went to Dick Krueger and Mary Anne Casey’s book (Focus Groups, 3rd ed., Sage Publications, 2000). On page 40, they have a section called “Qualities of Good Questions.” These make sense. They say good questions…


  1. …sound conversational.
  2. …use words participants would use.
  3. …are easy to say.
  4. …are clear.
  5. …are short.
  6. …are open-ended.
  7. …are one dimensional.
  8. …include good directions.

Let’s explore these a bit.

  1. Since focus groups are a social experience (albeit, a data gathering one), conversational questions help set an informal tone.
  2. If participants don’t/can’t understand your questions (because you use jargon, technical terms, etc.), you won’t get good information. Without good information, your focus group will not help answer your inquiry.
  3. You don’t want to stumble over the words, so avoid complicated sentences.
  4. Make sure your participants know what you are asking. Long introductions can be confusing, not clarifying. Messages may be mixed and thus interpreted in different ways. All this results in information that doesn’t answer your inquiry.
  5. Like being clear, short questions tend to avoid ambiguity and yield good data.
  6. To quote Dick and Mary Anne, “Open-ended questions are a hallmark of focus group interviewing.” You want an opinion. You want an explanation. You want rich description. Yes/No doesn’t give you good data.
  7. You might think synonyms add richness to your questioning; in fact, synonyms confuse the participant. Confused participants yield ambiguous data. Avoid synonyms: keeping questions one-dimensional keeps them clear.
  8. Participants need clear instructions when asked to do something in the focus group. “Make a list” needs to have “on the piece of paper in front of you” added. A list in the participant’s head may get lost, and you lose the data.

Before you convene your focus group, have several individuals (3 to 5) who are similar to, but not included in, your target audience review the focus group questions. It is always a good idea to pilot any question you use to gather data.

For more information, Ellen Taylor-Powell (University of Wisconsin Extension) has a Quick Tips sheet on focus groups. To access it, go to: http://www.uwex.edu/ces/pdande/resources/pdf/Tipsheet5.pdf