Ryan asks a good question: “Are youth-serving programs required to have an IRB for applications, beginning and end-of-year surveys, and program evaluations?”  His question leads me to today’s topic.

The IRB is concerned with “research on human subjects”.  So you ask: When is evaluation a form of research?

It all depends.

Although evaluation methods have evolved from social science research, there are important distinctions between the two.

Fitzpatrick, Sanders, and Worthen list five differences between the two, and it is in those differences that one must consider IRB assurances.

These five differences are:

  1. purpose,
  2. who sets the agenda,
  3. generalizability of results,
  4. criteria, and
  5. preparation.

Although these criteria differ for evaluation and research, there are times when evaluation and research overlap.  If the evaluation study adds to knowledge in a discipline, or research informs our judgments about a program, then the distinctions are blurred, a broader view of the inquiry is needed, and IRB approval may be required.

IRB considers children a vulnerable population.  Vulnerable populations require IRB protection.  Evaluations with vulnerable populations may need IRB assurances.  IF you have a program that involves children AND you plan to use the program activities as the basis of an effectiveness evaluation (as opposed to program improvement) AND use that evaluation as scholarship, you will need IRB.

Ryan asks, “What does publish mean?”  That question takes us to what scholarship is.  One definition is that scholarship is creative work that is validated by peers and communicated.  Published means communicating to peers in a peer-reviewed journal or at a professional meeting, not, for example, in a press release.

How do you decide if your evaluation needs IRB?  How do you decide if your evaluation is research or not?  Start with the purpose of your inquiry.  Do you want to add knowledge in the field?  Do you want to see if what you are doing is applicable in other settings?  Do you want others to know what you’ve done and why?  Then you want to communicate this.  In academics, that means publishing it in a peer-reviewed journal or presenting it at a professional meeting.  And to do that using the information provided to you by your participants, who are human subjects, you will need IRB assurance that they are protected.

Every IRB is different.  Check with your institution.  Most work done by Extension professionals falls under the category of “exempt from full board review”.  It is the shortest review and the least restrictive.  Work involving vulnerable populations, audio and/or video recording, or sensitive questions is typically categorized as expedited, a more stringent review than the “exempt” category, which takes a little longer.  IF you are working with vulnerable populations and asking for sensitive information, doing an invasive procedure, or involving participants in something that could be viewed as coercive, then the inquiry will probably need full board review (which has the longest turnaround time).
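
To make that triage concrete, here is a minimal Python sketch of the decision logic described above.  The function and its flags are my own illustrative assumptions, not any institution’s policy; your IRB makes the actual determination.

```python
def likely_irb_review_tier(vulnerable_population: bool,
                           sensitive_questions: bool,
                           recording: bool,
                           invasive_or_coercive: bool) -> str:
    """Illustrative triage only -- every IRB is different; check yours."""
    # Vulnerable populations PLUS sensitive information, invasive
    # procedures, or anything arguably coercive -> probably full board.
    if vulnerable_population and (sensitive_questions or invasive_or_coercive):
        return "full board review (longest turnaround)"
    # Vulnerable populations, recording, or sensitive questions alone
    # typically land in the expedited category.
    if vulnerable_population or recording or sensitive_questions:
        return "expedited (more stringent, a little longer)"
    # Most routine Extension work: the least restrictive category.
    return "exempt from full board review (shortest review)"


# Example: an end-of-year survey of adult participants, no recording.
print(likely_irb_review_tier(False, False, False, False))
```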

September 25 – October 2 is Banned Books Week.

All of the books shown below have been or are banned, and the American Library Association has once again published a list of banned or challenged books.  The September issue of the AARP Bulletin listed 50 banned books.  The Merriam-Webster Dictionary was banned in a California elementary school in January 2010.

Yes, you say, so what?  How does that relate to program evaluation?

Remember, the root of the word “evaluation” is value.  Someplace in the United States, some group used some criteria to “value” (or not) a book: to lodge a protest, successfully (or not), to remove a book from a library, school, or other source.  Establishing criteria means that evaluation was taking place.  In this case, those criteria included being “too political,” having “too much sex,” being “irreligious,” being “socially offensive,” or some other criterion.  Someone, someplace, somewhere has decided that the freedom to think for yourself, the freedom to read, the importance of the First Amendment, and the importance of free and open access to information are not important parts of our rights, and they used evaluation to make that decision.

I don’t agree with censorship; I do agree with the right a person has to express her or his opinion, as guaranteed by the First Amendment.  Yet in expressing an opinion, especially an evaluative opinion, an individual has a responsibility to express that opinion without hurting other people or property: to evaluate responsibly.

To help evaluators evaluate responsibly, the American Evaluation Association has developed a set of five guiding principles for evaluators, and even though you may not consider yourself a professional evaluator, considering these principles when conducting your evaluations is important and responsible.  The Guiding Principles are:

A. Systematic Inquiry: Evaluators conduct systematic, data-based inquiries;

B. Competence: Evaluators provide competent performance to stakeholders;

C. Integrity/Honesty: Evaluators display honesty and integrity in their own behavior, and attempt to ensure the honesty and integrity of the entire evaluation process;

D.  Respect for People:  Evaluators respect the security, dignity, and self-worth of respondents, program participants, clients, and other evaluation stakeholders; and

E. Responsibilities for General and Public Welfare: Evaluators articulate and take into account the diversity of general and public interests and values that may be related to the evaluation.

I think free and open access to information is covered by principles D and E.  You may or may not agree with the people who challenged a book and, in doing so, used evaluation.  Yet, as someone who conducts evaluation, you have a responsibility to consider these principles, making sure that your evaluations respect people and are responsible for general and public welfare (in addition to employing systematic inquiry, competence, and integrity/honesty).  Now, go read a good (banned) book!

There are three topics on which I want to touch today.

  • Focus group participant composition
  • Systems diagrams
  • Evaluation report use (Patton’s Utilization-Focused Evaluation)

In reverse order:

Evaluation use: I neglected to mention Michael Quinn Patton’s book on evaluation use.  Patton advocated for use before most everyone else.  The title of his book is Utilization-Focused Evaluation.  The 4th edition is available from the publisher (Sage) or from Amazon (and if I knew how to insert links to those sites, I’d do it…another lesson…).

Systems diagrams: I had the opportunity last week to work with a group of Extension faculty all involved in Watershed Education (called the WE Team).  This was an exciting experience for me.  I helped them visualize their concept of the WE Team by drawing a systems diagram.  This is an exercise whereby individuals or small groups quickly draw a visualization of a system (in this case, the WE Team).  This is not art; it is not realistic; it is only a representation from one perspective.

This is a useful tool for evaluators because it can help them see where there are opportunities for evaluation, where there are opportunities for leverage, and where there might be resistance to change (force fields).  It also helps evaluators see relationships and feedback loops.  I have done workshops on using systems tools in evaluating multi-site systems (of which a systems diagram is one tool) with Andrea Hegedus for the American Evaluation Association.  Although the generic systems diagram pictured here isn’t the diagram the WE Team created, it is an example of what a systems diagram could look like.  I used the software called Inspiration to create the WE Team diagram.  Inspiration has a free 30-day download and it is inexpensive (the download for V. 9 is $69.00).
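
If you don’t have Inspiration, one alternative is to script a simple diagram.  The sketch below is my own substitution using Python and the graphviz package (not the Inspiration workflow mentioned above), with invented node names standing in for the parts of a generic program system.

```python
# A minimal systems-diagram sketch using the graphviz package
# (pip install graphviz; also requires the Graphviz binaries).
# Node names are invented placeholders, not the WE Team's diagram.
from graphviz import Digraph

dot = Digraph("system", comment="Generic systems diagram")

# Parts of the system.
dot.node("program", "Program activities")
dot.node("participants", "Participants")
dot.node("outcomes", "Outcomes")
dot.node("evaluation", "Evaluation")

# Relationships between the parts.
dot.edge("program", "participants")
dot.edge("participants", "outcomes")
dot.edge("outcomes", "evaluation")

# The feedback loop that makes it a system, not just a flow chart.
dot.edge("evaluation", "program", label="feedback")

dot.render("systems_diagram", format="png", cleanup=True)  # writes systems_diagram.png
```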

Focus group participant composition.

The composition of focus groups is very important if you want to get data that you can use AND that answers your study question(s). Focus groups tend to be homogeneous, with variations to allow for differing opinions. Since the purpose of the focus group is to elicit in-depth opinions, it is important to compose the group with similar demographics (depending on your topic) in

  • age
  • occupation
  • use of program
  • gender
  • background

Comfort and use drive the composition. More on this later.
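
As a rough illustration of that grouping step, here is a minimal Python sketch that sorts a recruitment pool into homogeneous groups on a couple of the demographics above.  The field names and participants are invented for the example; real focus-group recruitment involves judgment that no script captures.

```python
# Illustrative only: group a recruitment pool into homogeneous focus
# groups by demographics.  Field names and data are invented examples.
from collections import defaultdict

pool = [
    {"name": "A", "age_band": "18-25", "occupation": "student", "used_program": True},
    {"name": "B", "age_band": "18-25", "occupation": "student", "used_program": True},
    {"name": "C", "age_band": "40-60", "occupation": "farmer",  "used_program": False},
    {"name": "D", "age_band": "40-60", "occupation": "farmer",  "used_program": True},
]

# Key on the demographics that matter for the study question; keep
# enough variation *within* a group to surface differing opinions.
def group_key(person):
    return (person["age_band"], person["occupation"])

groups = defaultdict(list)
for person in pool:
    groups[group_key(person)].append(person)

for key, members in groups.items():
    print(key, [m["name"] for m in members])
```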

It is Wednesday and the sun is shining.

The thought for today is evaluation use.

A colleague of mine tells me there are four types of evaluation use:

  1. Conceptual use
  2. Instrumental use
  3. Persuasive use
  4. Process use

How are evaluation results used?  Seems to me that using evaluation results is the goal; otherwise, is what you’ve done really evaluation?  HOW do YOU use your evaluation results?

Is scholarship a use?

Is reporting a use?

Does use always mean taking action?

Does use always mean “so what”?

Jane Davidson (Davidson Consulting, http://davidsonconsulting.co.nz, Aotearoa New Zealand) says there are three questions that any evaluator needs to ask:

  1. What’s so?
  2. So what?
  3. Now what?

Seems to me that you need the “now what” question to have evaluation use. What do you think? Post a comment.