Complexity.

I predict a bright future for complexity. Have you ever considered how complicated things can get, what with one thing always leading to another?

~ E. B. White, Quo Vadimus? or The Case for the Bicycle (Garden City Publishing, 1946)

 

Everything is connected.

One thing does lead to another. And connections between them can be drawn. So let’s connect the dots.

In 2002, as AEA president, I chose as the theme of the meeting "Evaluation: A Systematic Process That Reforms Systems."

Evaluation doesn’t stand in isolation; it is not something that is added on at the end as an afterthought.

Many program planners see evaluation that way, unfortunately: only as an add-on, at the end.

Contrary to many people's belief, evaluators need to be included at the outset of the program. They also need to be included at each stage thereafter (Program Implementation, Program Monitoring, and Program Delivery; Data Management and Data Analysis; Program Evaluation Utilization).

Systems Concepts.

Shortly after Evaluation 2002 (in 2004), the Systems in Evaluation Topical Interest Group was formed.

In 2007, AEA published "Systems Concepts in Evaluation: An Expert Anthology", edited by Bob Williams and Iraj Imam (who died as the volume was going to press). To order, see this link.

This volume does an excellent job of demonstrating how evaluation and systems concepts are related.

It connects the dots.

In that volume, Gerald Midgley writes about “the intellectual development of systems field, how this has influenced practice and importantly the relevance for all this to evaluators and evaluation”. It is the pivotal chapter (according to the editors).

While it is possible to trace ideas about holistic thinking back to the ancient Greeks, systems thinking is probably best attributed to the ideas of von Bertalanffy [Bertalanffy, L. von (1950). The theory of open systems in physics and biology. Science, 111, 23–29.]

I would argue that the complexity concept goes back to at least Alexander von Humboldt (1769–1859), way before von Bertalanffy. He was an intrepid explorer and created modern environmentalism. Although environmentalism is a complex word, it really describes a system. With connections. And complexity.

Suffice it to say, there are no easy answers to the problems faced by professionals today. Complex. Complicated. And one thing leading to another.

my two cents.

molly.

Professional Development.

AEA365 shares some insights into evaluating professional development.

The authors cite Thomas Guskey (1, 2). I didn’t know Thomas Guskey. I went looking.

Turns out, Donald Kirkpatrick (1924–2014) was the inspiration for Thomas Guskey's five-level evaluation model.

Kirkpatrick has four levels in his model (reaction, learning, behavior, results). I’ve talked about them before here and here. I won’t go into them again.

Guskey has added a fifth level. In the middle.

He talks about participant reaction (level 1) and participant learning (level 2), like Kirkpatrick.

His third level is different. Here he talks about organization support and change.

Then he adds two levels that parallel Kirkpatrick's levels 3 and 4: participant use of new knowledge and skills (Kirkpatrick's behavior; Guskey's level 4) and student learning outcomes (Kirkpatrick's results; Guskey's level 5).
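Here is a minimal sketch, in Python, of how the two models line up. The pairing follows the discussion above; the dictionary layout and labels are my own illustration (neither Guskey nor Kirkpatrick publishes code):

    # How Guskey's five levels line up with Kirkpatrick's four.
    # The dict layout and labels are my own illustration.
    GUSKEY_TO_KIRKPATRICK = {
        1: ("Participant reactions", "Reaction"),
        2: ("Participant learning", "Learning"),
        3: ("Organization support and change", None),  # Guskey's added middle level
        4: ("Use of new knowledge and skills", "Behavior"),
        5: ("Student learning outcomes", "Results"),
    }

    for level, (guskey_label, kirkpatrick_label) in GUSKEY_TO_KIRKPATRICK.items():
        counterpart = kirkpatrick_label or "(no Kirkpatrick counterpart)"
        print(f"Guskey level {level}: {guskey_label} <-> {counterpart}")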

Trustworthiness. An interesting topic.

Today is November 9, 2016. An auspicious day, to be sure. (No, I’m not going to rant about November 8, 2016; just post this and move on with my living.) Keep in mind trustworthiness, I remind myself.

I had the interesting opportunity to review a paper recently that talked about trustworthiness. It gave me much to think about, as I was troubled by what was written. I decided to go to my source, "Naturalistic Inquiry". Given that the paper used a qualitative design, employed a case study method, and talked about trustworthiness, I wanted to find out more. This book was written by two of my long-time evaluation guides, Yvonna Lincoln and Egon Guba. (Lincoln's name may be familiar to you from the Sage Handbook of Qualitative Research, which she co-edited with Norman Denzin.)

Trustworthiness

On page 218, they talk about trustworthiness: the conventional criteria for trustworthiness (internal validity, external validity, reliability, and objectivity) and the questions underlying those criteria.

They talk about how the criteria formulated by conventional inquirers are not appropriate for naturalistic inquiry. Guba (1981a) offers four new terms as they have "…a better fit with naturalistic epistemology." These four terms, and the conventional terms they replace, are credibility (internal validity), transferability (external validity), dependability (reliability), and confirmability (objectivity).

AEA365 ran a blog on vulnerability recently (August 5, 2016). It cited Brené Brown's TED talk on the same topic. Although I really enjoyed the talk (I haven't met a TED talk I didn't like), it was more than her discussion of vulnerability that I enjoyed (although I certainly enjoyed learning that vulnerability is the birthplace of joy, and that connection is why we are here).

She talked about story and its relationship to qualitative data. She said that she is a qualitative researcher and that she collects stories. She said that "stories are just data with a soul". That made a lot of sense to me.

See, I've been struggling to figure out how to turn the story into a meaningful outcome without reducing it to a number. (I do not have an answer yet. If any of you have ideas, let me know.) She said (quoting a former research professor) that if you cannot measure it, it does not exist. If it doesn't exist, then is whatever you are studying a figment of your imagination? So is there a way to capture a story and aggregate it with other similar stories to get an outcome WITHOUT REDUCING IT TO A NUMBER? Given that stories are often messy, often complicated, and rich in what they tell the researcher, it occurred to me that stories are more than themes and content analysis. Stories are "data with a soul".

Qualitative Data

Yet in any book on qualitative data analysis (Bernard and Ryan's, for example, among others) you will see that there is confusion in the analysis process. Is it the analysis of qualitative data, OR is it the qualitative analysis of data? Where do you put the modifier "qualitative"? To understand the distinction, a 2×2 visual might be helpful. (Adapted from Bernard, H. R. & Ryan, G. W. (1996). Qualitative data, quantitative analysis. Cultural Anthropology Methods Journal, 8(1), 9–11. Copyright © 1996 Sage Publications.)

The 2×2, with the kind of data across the top and the kind of analysis down the side:

                              Qualitative data    Quantitative data
    Qualitative analysis             A                    B
    Quantitative analysis            C                    D

We are doing data analysis in all four quadrants. In cell A, we are analyzing and capturing the deeper meaning of the data. Yes, we are analyzing data in the other cells (B, C, and D), just not capturing the deeper meaning of those data. Cell D is the quantitative analysis of quantitative data; cell B is the qualitative analysis of quantitative data; and cell C is the quantitative analysis of qualitative data. So the question becomes "Do you want deeper meaning from your data?" or "Do you want a number from your data?" (I'm still working on relating this to story!)
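To make the quadrant logic concrete, here is a toy lookup in Python. It is my own illustration (Bernard and Ryan offer no code); given the kind of data and the kind of analysis, it returns the cell:

    # The cell depends on the kind of data and the kind of analysis
    # applied to it. My illustration, not Bernard & Ryan's.
    CELLS = {
        ("qualitative", "qualitative"): "A",    # deeper meaning lives here
        ("quantitative", "qualitative"): "B",
        ("qualitative", "quantitative"): "C",
        ("quantitative", "quantitative"): "D",
    }

    def analysis_cell(data_type: str, analysis_type: str) -> str:
        """Return the cell (A-D) for a (data, analysis) pairing."""
        return CELLS[(data_type, analysis_type)]

    print(analysis_cell("qualitative", "qualitative"))   # A: a close thematic reading of interviews
    print(analysis_cell("qualitative", "quantitative"))  # C: word counts from the same interviews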

It all depends on what you want when you analyze your data. If you want to reduce it to a number, focus on cells B, C, and D. If you want deeper meaning, focus on cell A. What you want (and how you interpret the data) is where personal and situational biases occur. No, you cannot be the "objective and dispassionate" scientist. That doesn't happen in today's world (probably ever, though I can only speak of today's world). Everyone has biases, and they rear their heads (perhaps ugly heads) when least expected.

You have to try. Regardless.

my two cents.

molly.

 

 

It has been almost a month since I last blogged. When I last blogged, I talked about evaluation history. That blog was a bunny path away from what I had been talking about: methodology. I was talking about the implementation, monitoring, and delivery of interventions which are to be evaluated. Another methodology I want to talk about is case study. I went through the archives to locate the blog posts relating to case study. They are below.

http://blogs.oregonstate.edu/programevaluation/2015/01/15/blogging-case-study/

http://blogs.oregonstate.edu/programevaluation/2010/04/13/other-ways-to-gather-information-the-case-study/

http://blogs.oregonstate.edu/programevaluation/2013/06/12/causation/

http://blogs.oregonstate.edu/programevaluation/2013/06/07/one-of-the-5cs-clarity/

Focus groups are a wonderful data collection methodology. Not only are there different skills to learn for interviewing, but the analysis also gives you the opportunity to explore qualitative data analysis. (It is all related, after all.)

Now, I will confess that I've only ordered the 5th edition of the Krueger and Casey book (I don't have it yet). I'm eager to see what is new. So I'll settle for the 4th edition and try to regale you with information you may not know. (I will talk in a future post about the ways virtual focus groups are envisioned.)

The term focus group describes (although sometimes incorrectly) a variety of group processes. Krueger and Casey give the reader a sense of what to pay attention to and what is based on faulty data. So, starting at the beginning, let's look at an overview of what exactly a focus group is.

Groups are experiences that affect the individual throughout life and are used for planning, decision making, advising, learning, sharing, self-help, and problem solving, among others. Yet group membership often leaves the individual…

Previously, I talked about surveys (even though I posted it April 27, 2016). Today, I'll collect all the posts about focus groups and add a bit more.

2010/01/05 Talks about the type of questions to use in a Focus Group

2010/01/27 One of three topics mentioned

2010/09/09 Talks about focus groups in terms of analyzing a conversation

2011/05/31 Talks about focus groups in the context of sampling

2011/06/23 Mentions Krueger, my go-to

2013/11/15 Mentions focus groups

2014/10/23  Mentions focus groups and an individual with information

2015/02/11 Mentions focus groups…

2015/05/08 Virtual focus groups

Discovery

Although focus groups are mentioned throughout many of my posts, few posts are exclusively devoted to them. That surprises me. I need to talk more about focus groups. I especially need to talk about what I found when I did the virtual focus groups, beyond what is in that specific post. Given the interest at AEA last year, there needs to be much more discussion.

So OK. More about focus groups.

Although Dick Krueger is my go-to reference for focus groups (I studied with him, after all), there are other books on focus groups. (I just discovered that Krueger and Casey have also revised and published a 5th edition.)

Others, for example (in no particular order):

  1. Stewart, D. W. & Shamdasani, P. N. (1990). Focus groups: Theory and practice. Newbury Park, CA: Sage Publications. (A 3rd edition of this book is available.)
  2. Morgan, D. L. (Ed.) (1993). Successful focus groups: Advancing the state of the art. Newbury Park, CA: Sage Publications.
  3. Greenbaum, T. L. (2000). Moderating focus groups. Thousand Oaks, CA: Sage Publications.
  4. Greenbaum, T. L. (1998). The handbook for focus group research (2nd ed.). Thousand Oaks, CA: Sage Publications.
  5. Carey, M. A. & Asbury, J.-E. (2012). Focus group research. Walnut Creek, CA: Left Coast Press, Inc.

Plus many others published by Sage, available from Amazon, and elsewhere. I think you can find one that works for you.

Mary Marczak and Meg Sewell have an introduction to focus groups here (it is shorter than reading the book by Krueger and Casey).

I think it is important to remember that focus groups:

  1. Yield qualitative data;
  2. Are used in evaluation (just not in a pre-post sense);
  3. Are a GROUP activity of people who are typically unfamiliar with each other.

Next time: More on focus groups.

my two cents.

molly.

NOTE: This was written last week. I didn’t have time to post. Enjoy.

 

Methodology, aka implementation, monitoring, and delivery, is important. What good is it if you just gather the first findings that come to mind? Being rigorous here is just as important as when you are planning and modeling the program. So I've searched the last six years of blog posts and gathered some of them for you. They are all about the survey, a form of methodology. The survey is a methodology often used by Extension, as it is easy to use. However, organizing the survey, getting the surveys back, and dealing with non-response are problematic (another post, another time).

The previous posts are organized by date from the oldest to the most recent:

 

2010/02/10

2010/02/23

2010/04/09

2010/08/25

2012/08/09

2012/10/12

2013/03/13

2014/03/25

2014/04/15

2014/05/19

2015/06/29

2015/07/24

2015/12/07

2016/04/15

2016/04/21 (today’s post isn’t hyperlinked)

Just a few words on surveys today: A colleague asked about an evaluation survey for a recent conference. It will be an online survey, probably using the university system, Qualtrics. My colleague jotted down a few ideas. The thought occurred to me that this book (by Ellen Taylor-Powell and Marcus Renner) would be useful. On page ten, it asks what type of information is needed and wanted. It lists five types of possible information (a small sketch of grouping questions by these types follows the list):

  1. Participant reaction (some measure of satisfaction);
  2. Teaching and facilitation (strengths and weaknesses of the presenter, who may (or may not) change the next time);
  3. Outcomes (what difference/benefits/intentions did the participant experience);
  4. Future programming (other educational needs/desires); and
  5. Participant background (who is attending and who isn’t can be answered here).
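Here is a minimal sketch, in Python, of one way to organize a question bank by those five types before building the survey. The category names come from the booklet; the example questions and the layout are mine, purely for illustration:

    # Draft survey questions grouped by the five information types above.
    # The example questions are hypothetical, for illustration only.
    QUESTION_BANK = {
        "Participant reaction": [
            "Overall, how satisfied were you with the conference?",
        ],
        "Teaching and facilitation": [
            "How clearly did the presenter explain the material?",
        ],
        "Outcomes": [
            "What do you intend to do differently as a result of attending?",
        ],
        "Future programming": [
            "What topics would you like to see addressed next time?",
        ],
        "Participant background": [
            "What is your role in your organization?",
        ],
    }

    for category, questions in QUESTION_BANK.items():
        print(category)
        for q in questions:
            print(f"  - {q}")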

Thinking through these five categories made all the difference for my colleague. (Evaluation was a new area for them.) I had forgotten how useful this booklet is for people being exposed to evaluation, and to surveys, for the first time. I recommend it.


Alan Rickman died this month. He was an actor of my generation, one who provided me with much entertainment. I am sad. Then I saw this quote on the power of stories. How stories explain. How stories can educate. How stories can help reduce bias. And I am reminded how stories are evaluative.

Dick Krueger did a professional development session (then called a "pre-session") many years ago. It seems relevant now. Of course, I couldn't find my notes (which were significant), so I did an online search using "Dick Krueger and stories" as my search terms. I was successful! (See link.) When I went to the link, he had a whole section on story and storytelling. What I remember most about that session is what he has listed under "How to Analyze the Story", specifically the four points he lists under problems with credibility:

  • Authenticity – Truth
  • Accuracy – Memory Problems
  • Representativeness and Sampling
  • Generalizability / Transferability

The next time you tell a story, think of it in evaluative terms. And check out what Dick Krueger has to say.