The OSU Extension Service conference started today (#OSUExtCon). There are concurrent sessions, plenary sessions, workshops, Twitter feeds (Jeff Hino is tweeting), tours, receptions, and meal gatherings. There are lots of activities, and they cover four days. But I want to talk about conference evaluation.

The thought occurs to me: “What difference is this making?” Ever the evaluator, I realize that the selection will be different next year (it was different last year), so I wonder how valuable it is to evaluate the concurrent sessions. Given that time doesn’t stand still (fortunately {or not, depending}), the plenary sessions will also be different. Basically, the conference this year will be different from the conference the next time. Yes, it will be valuable for the presenters to have feedback on what they have done, and it will be useful for conference planners to have feedback on various aspects of the conference. I still have to ask, “Did it make a difference?”

A longtime colleague of mine (formerly at Pennsylvania State University), Nancy Ellen Kiernan, proposed a method of evaluating conferences that I think is important to keep and use. She suggested the use of “Listening Posts” as an evaluation method. She says, “The ‘Listening Posts’ consisted of a group of volunteer conference participants who agreed beforehand to ‘post’ themselves in the meeting rooms, corridors, and break rooms and record what conferees told them about the conference as it unfolded [Not unlike Twitter, but with value; parenthetical added]. Employing listening posts is an informal yet structured way to get feedback at a conference or workshop without making participants use pencil and paper.” She put it in “Tipsheet #5” and published the method in the Journal of Extension (JoE), the peer-reviewed monthly online publication.

Quoting from the abstract of the JoE article: “Extension agents often ask, ‘Isn’t there an informal but somewhat structured way to get feedback at a conference or workshop without using a survey?’ This article describes the use of ‘Listening Posts’ and the author gives a number of practical tips for putting this qualitative strategy to use. Benefits include: quality feedback, high participation and enthusiastic support from conferees and the chance to build program ownership among conference workers. Deficits: could exclude very shy persons or result in information most salient to participants.”

I’ve used this method. It works. It does solicit information about what difference the conference made, not whether the participants liked or didn’t like the conference. (This is often what is asked in the evaluation.) Nancy Ellen suggests that the listening post collectors ask the following questions:

  1. “What did you think of the idea of …this conference?”
  2. “What is one idea or suggestion that you found useful for your professional work?” (the value/difference question)
  3. Then, she suggests, ask the participant to tell you anything else about the conference that is important for us to know.

Make sure the data collectors are distinctive. Make sure they do not ask any additional questions. The results will be interesting.

The US just celebrated Thanksgiving, the annual day of thankfulness. Canada celebrated its Thanksgiving in mid-October (October 12). Although other countries celebrate versions of the holiday, the US and Canada originally celebrated in honor of the previous harvest.

Certainly, the Guiding Principles and the Program Evaluation Standards provide evaluators with a framework to conduct evaluation work. That is the work for which I am thankful.


My friend and colleague, Patricia Rogers, says of cognitive bias, “It would be good to think through these in terms of systematic evaluation approaches and the extent to which they address these.” This was in response to an article on cognitive bias. The article says that the human brain is capable of 10 to the 16th power (a big number) processes per second. Despite being faster than a speeding bullet, etc., the human brain has “annoying glitches (that) cause us to make questionable decisions and reach erroneous conclusions.”

Bias is something that evaluators deal with all the time. There is desired response bias, non-response bias, recency and immediacy bias, measurement bias, and…need I say more? Isn’t evaluation, and aren’t evaluators, supposed to be “objective”? Aren’t we as evaluators supposed to behave in an ethical manner? Haven’t we dealt with potential bias and conflicts of interest? That is where cognitive biases appear. And you might not know it at all.
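To make just one of those biases concrete, here is a minimal sketch (my illustration, not anything from Patricia’s article) that simulates non-response bias in a made-up conference satisfaction survey; every number in it is invented.

```python
# A minimal sketch, assuming a hypothetical conference-satisfaction survey in
# which dissatisfied participants are less likely to return the form. The data
# are invented purely to illustrate non-response bias.
import random

random.seed(1)

# True satisfaction scores (1-5) for 1,000 hypothetical participants.
population = [random.choice([1, 2, 3, 4, 5]) for _ in range(1000)]

# Satisfied people (score >= 4) respond 90% of the time; others only 40%.
respondents = [score for score in population
               if random.random() < (0.9 if score >= 4 else 0.4)]

true_mean = sum(population) / len(population)
observed_mean = sum(respondents) / len(respondents)

print(f"True mean satisfaction:   {true_mean:.2f}")
print(f"Respondent mean (biased): {observed_mean:.2f}")
# The respondent mean overstates satisfaction because unhappy participants
# are under-represented -- that gap is non-response bias.
```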

KASA. You’ve heard the term many times. Have you really stopped to think about what it means? What evaluation approach will you use if you want to determine a difference in KASA? What analyses will you use? How will you report the findings?

Probably not. You just know that you need to measure KNOWLEDGE, ATTITUDE, SKILLS, and ASPIRATIONS.

The Encyclopedia of Evaluation (edited by Sandra Mathison) says that they influence the adoption of selected practices and technologies (i.e., programs). Claude Bennett uses KASA in his TOP model (the Bennett Hierarchy). I’m sure there are other sources.
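Since the question above is what analyses you would use, here is one hedged sketch of a common choice: comparing paired pre/post self-ratings of knowledge (the K in KASA). The ratings, and the use of a paired t-test, are my own made-up illustration, not something prescribed by Bennett or Mathison.

```python
# A minimal sketch, assuming hypothetical 1-5 knowledge self-ratings collected
# before and after a program, one pair per participant. Requires scipy.
from scipy import stats

pre  = [2, 3, 2, 1, 3, 2, 4, 2, 3, 2]
post = [4, 4, 3, 3, 4, 3, 5, 3, 4, 4]

# Average change per participant -- the "difference" in KASA we care about.
changes = [after - before for after, before in zip(post, pre)]
mean_change = sum(changes) / len(changes)

# Paired t-test: is the change larger than chance alone would suggest?
t_stat, p_value = stats.ttest_rel(post, pre)

print(f"Mean change in knowledge rating: {mean_change:.2f}")
print(f"Paired t = {t_stat:.2f}, p = {p_value:.3f}")
# When reporting, describe the size of the change (and for whom), not just
# whether the p-value clears a threshold.
```

The same paired logic works for attitude, skill, and aspiration ratings; the point is simply to choose the analysis before the data arrive.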

I’ve been stuck.

I haven’t blogged for three weeks. I haven’t blogged because I don’t have a topic. Oh, I’ve plenty to say (I am never at a loss for words… 🙂 ). I want something that relates to evaluation. Relates clearly. Without question. Evaluation.

So after 5 years, I’m going to start over. Evaluation is an everyday activity!

Evaluative thinking is something you do every day; probably all day. (I don’t know about when you are asleep, so I said probably.) I think evaluative thinking is one of those skills that everyone needs to learn systematically. I think everyone learns at least a part of evaluative thinking as they grow; the learning may not be systematic. I would put that skill in the same category as critical (not negative but thoughtful) thinking, team building, leadership, communication skills (both verbal and written), and technological facility, as well as some others which escape me right now. I would add systematic evaluative thinking.

Everyone has criteria on which decisions are based. Look at how you choose a package of cookies or a can of corn at the grocery store. What criteria do you use for choosing? Yet that wasn’t taught to you; it was just something you developed. Evaluative thinking is more than just choosing what you want for dinner. AARP lists problem solving as part of the critical thinking skills. I think it is more than just problem solving; I do agree that it is a critical thinking skill (see the graphic of core critical thinking skills from Grant Tilus, Rasmussen College).
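As a small illustration (mine, and entirely made up), here is what the grocery-store choice looks like once the criteria are written down and weighted; that explicit step is what moves an everyday decision toward systematic evaluative thinking.

```python
# A minimal sketch of criteria-based choice. The criteria, weights, and
# 1-5 scores are all invented for the example.
criteria_weights = {"price": 0.4, "taste": 0.4, "nutrition": 0.2}

options = {
    "store-brand cookies": {"price": 5, "taste": 3, "nutrition": 2},
    "bakery cookies":      {"price": 2, "taste": 5, "nutrition": 2},
}

def weighted_score(scores):
    """Combine the 1-5 criterion scores into one number using the weights."""
    return sum(criteria_weights[c] * scores[c] for c in criteria_weights)

for name, scores in options.items():
    print(f"{name}: {weighted_score(scores):.1f}")
# Writing the criteria down (and saying how much each one matters) is the
# systematic part; the arithmetic is the easy part.
```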

So you think thoughtfully about most events/activities/things that you do throughout the day. And you learn over time what works and what doesn’t; what has value and what doesn’t. You learn to discern the conditions under which something works; you learn what changes the composition of the outcome. You begin to think evaluatively about most things. One day you realize that you are critically thinking about what you can, will, and need to do. Evaluative thinking has become systematic. You realize that it depends on many factors. You realize that evaluative thinking is a part of who you are. You are an evaluator, even if you are a psychologist or geologist or engineer or educator first.

 

my two cents.

molly.

I just got back from a road trip across southern Alabama with my younger daughter. We started from Birmingham and drove a very circuitous route ending in Mobile and the surrounding areas, then returned to Birmingham for her to start her second year at Birmingham-Southern College.

As we traveled, I read a book by Bill McKibben (one of many) called Oil and Honey: The Education of an Unlikely Activist. It is a memoir, a personal recounting of the early years of this decade, which corresponded with the years my older daughter was in college (2011-2014). I met Bill McKibben, who is credited with starting the non-profit 350.org in 2008 and is currently listed as “senior adviser and co-founder”. He is a passionate, soft-spoken man who believes that the world is on a short fuse. He really seems to believe that there is a better way to have a future. He, like Gandhi, is taking a stand. Oil and Honey puts into action Gandhi’s saying about being the change you want to see. As the subtitle indicates, McKibben is an unlikely activist. He is a self-described non-leader who led and advises the global effort to increase awareness of climate change/chaos. When your belief is on the line, you do what has to be done.

Evaluators are the same way. When your belief is on the line, you do what has to be done. And, hopefully, in the process you are the change that you want to see in the world. But know that it cannot happen one pipeline at a time. The fossil fuel industry has too much money. So what do you do? You start a campaign. That is what 350.org has done: “There are currently fossil fuel divestment campaigns at 308 colleges and universities, 105 cities and states, and 6 religious institutions.” (Wikipedia, 350.org; scroll down to the heading “Fossil Fuel Divestment” to see the complete discussion.) Those are clear numbers, hard data for consumption. (Unfortunately, the divestment campaign at OSU failed.)

So I see the question as one of impact, though not specifically world peace (my ultimate impact). If there is no planet on which to work for world peace, there is no need for world peace. Evaluators can help. They can look at data critically. They can read the numbers. They can gather the words. This may be the best place for the use of pictures (they are, after all, worth 1000 words). Perhaps by combining efforts, the outcome will be an impact that benefits all humanity and builds a tomorrow for the babies born today.

my two cents.

molly.

 

I keep getting comments about my post, “Does this blog make a difference?”

I want to say thank you to all who read it.


 

I want to say thank you to all who follow this blog.


Mostly, I am continually amazed that people find what I have to say interesting enough to come back.

So: Thank you. For reading. For following. For coming back.

I think that is making a difference.

my two cents.

molly.

P. S. See you in two weeks!

“fate is chance; destiny is choice”.

Went looking for who said that originally so that I could give credit. Found this as the closest saying: “Destiny is no matter of chance. It is a matter of choice: It is not a thing to be waited for, it is a thing to be achieved.”

William Jennings Bryan

 

Evaluation is like destiny. There are many choices to make. How do you choose? What do you choose?

Would you listen to the dictates of the Principal Investigator even if you know there are other, perhaps better, ways to evaluate the program?

What about collecting data? Are you collecting it because it would be “nice”? OR are you collecting it because you will use the data to answer a question?

What tools do you use to make your choices? What resources do you use?

I’m really curious. It is summer, and although I have a (long, to be sure) list of reading, I wonder what else is out there, specifically relating to making choices. (And yes, I could use my search engine; I’d rather hear from my readers!)

Let me know. PLEASE!

my two cents.

molly.