NOTE: This was written last week. I didn’t have time to post. Enjoy.
Methodology (that is, implementation, monitoring, and delivery) is important. What good is it if you just gather the first findings that come to mind? Being rigorous here is just as important as when you are planning and modeling the program. So I’ve searched the last six years of blog posts and gathered some of them for you. They are all about the survey, a form of methodology. The survey is a methodology often used by Extension, as it is easy to use. However, organizing the survey, getting the surveys back, and dealing with non-response are problematic (another post, another time).
The previous posts are organized by date, from the oldest to the most recent:
2016/04/21 (today’s post isn’t hyperlinked)
Just a few words on surveys today: A colleague asked about an evaluation survey for a recent conference. It will be an online survey, probably using the university system, Qualtrics. My colleague jotted down a few ideas. The thought occurred to me that this book (by Ellen Taylor-Powell and Marcus Renner) would be useful. Page ten of the book asks what type of information is needed and wanted, and lists five types of possible information:
Thinking through these five categories made all the difference for my colleague. (Evaluation was a new area.) I had forgotten how useful this booklet is for people being exposed to evaluation, and to surveys, for the first time. I recommend it.
The WECT program was arbitrarily divided into four parts. Those “modules” are:
“Speak your mind, even if your voice shakes.”
The Gray Panthers is a group of people advocating for the rights of oldsters (among other things). Aging is the brunt of many jokes, at least in the US. Unfortunately.
Another long-time friend relayed the NPR story about aging, which says anchovies, rosemary, vino, and leisure are the answers. Now, I’m not saying that anchovies, rosemary, vino, and leisure are the reason evaluation as a discipline has come as far as it has in the last 50+ years; I’m just saying that perhaps we need to look a little deeper than the surface. I think Maggie Kuhn says it clearly: “Speak your mind, even if your voice shakes.”
Stand up for what you believe! (even if your voice shakes).
I believe that evaluation makes a difference.
I believe that there is a need for evaluation.
So how will you stand up today? What choice will you make? Speak your mind unambiguously!
New Topic: I learned today that Will Shadish died on March 27, 2016.
Will was very active as a quantitative psychologist and an evaluator. We served AEA together. I will miss him.
I am a social scientist. I look for the social in the science of what I do.
I am an evaluator as a social scientist. I want to determine the merit, worth, value of what I do. I want to know that the program I’m evaluating (or offering) made a difference. (After all, the root of evaluation is value.)
Keeping that in mind has resulted (over the years) in the comment, “no wonder she is the evaluator” when I ask an evaluative question. So I was surprised when I read a comment by a reader that implied that it didn’t matter. The reader said, “The ugly truth is, it does not matter if it makes a difference. Somewhere down the road someone will see your post and may be it will be useful for him.” (Now you must know that I’ve edited the comment, although the entire comment doesn’t support my argument: Evaluators need to know if the program made a difference.)
So the thought occurred to me: what if it didn’t make a difference? What if the program has no value? No worth? No merit? What if by evaluating the program you find that it won’t be useful for the participant? What does that say about you as an evaluator? You as a program designer? You as an end user? Is it okay for the post to be useful “somewhere down the road”? Is blogging truly “a one way channel to transfer any information you have over the web”? How long can a social-scientist-always-looking-at-the-social continue to work when the information goes out and rarely comes back? I do not know. I do know that blogging is hard work. After six and one-half years of writing this blog almost weekly, writer’s block is my constant companion. (Although, being on a computer, I do not have a pile of paper, just blank screens.) So I’m turning to you, readers:
Does it make a difference whether I write this blog or not?
Am I abdicating my role as an evaluator when I write the blog?
I don’t know. Over the years I have gotten some interesting comments (other than the “nice job” “keep up the work” types of comments). I will pause (not in my writing; I’ll continue to do that) and think about this. After all, I am an evaluator wanting to know what difference this program makes.
Today, I’m going to talk about evaluation use, that is, the use of evaluation findings. Now, Michael Patton wrote the book (actually, more than one) on the topic. And I highly recommend that book (and the shorter version, Essentials of Utilization-Focused Evaluation [461 pages including the index, as opposed to 667]).
I firmly believe that there is no point in conducting an evaluation if the final report of that evaluation sits on someone’s shelf and IS NEVER USED! Not just read (hopefully!), USED to make the program better. To make a difference.
Today, though, I want to talk about how that final report is put together. It doesn’t matter if it is an infographic, a dashboard, an executive summary, or a 300-page document; it all has to be your best effort. So I want to talk about your best effort.
That best effort is accurate, not only in reporting the findings, but also in the spelling, the grammar, the syntax.
For example: The word “data” is a plural word and takes a plural verb. Yep. Check the dictionary, folks. Webster’s Seventh New Collegiate Dictionary says (under the entry data) plural of DATUM. (I’ll bet you didn’t know that the plural of OPUS is OPERA. Just another example of the peculiarities of the English language.) The takeaway here: When in doubt, check it out!
When I put together a final report (regardless of the format), I use the 5Cs as a guideline. (I also use them as a basis for reviewing manuscripts.) Those 5Cs are: Clarity. Coherence. Conciseness. Correctness. Consistency. Following the 5Cs results in a product of which I can be proud.
How do you use your evaluation report? Keep these things in mind!
The Highest Appreciation
– John F. Kennedy
Gratitude must be a habit. Each day needs to be begun and ended with gratefulness. Then, if you can live by that gratefulness, you will utter the words and be grateful. That is what evaluation is all about: holding to the higher ground. Not just doing something to get it done; doing something (in this case, the evaluation) because it is right as you know it today, in this moment, under these circumstances.
Doing evaluation just for the sake of evaluating, because it would be nice to know, is not the answer. Yes, it may be nice to know, but does it make a difference? Does the program (policy, performance, product, project, etc.) make a difference in the lives of the participants? As a social scientist, it is important for me to look at the “social” side of what I do; that means dealing with people, the participants (you know, the social part). I want to determine what the participants are thinking, feeling, and doing. That means I must walk my talk. And be grateful.
There are lots of resources available that help the nascent evaluator do just that. My recommendation is to start with Jody Fitzpatrick’s volume. I would also check out the American Evaluation Association site. There is a lot of information available to non-members (becoming a member is worth the cost). Then, depending on what you specifically want to know, let me know. I’ll suggest references to you.
How many times have you shaken your head in wonder? in confusion? in disbelief?
Regularly throughout your life, perhaps. (Now, if you are a wunderkind, you probably have just ignored the impossible and moved on to something else.) Most of us have looked on in awe, uncertainty, incredulity. Most of us will always look at that which seems impossible and then be amazed when it is done. (Mandela was such a remarkable man, with such amazing insights.)
Alan Rickman died this month. He was an actor of my generation, one who provided me with much entertainment. I am sad. Then I saw this quote on the power of stories. How stories explain. How stories can educate. How stories can help reduce bias. And I am reminded how stories are evaluative.
Dick Krueger did a professional development session (then called a “pre-session”) many years ago. It seems relevant now. Of course, I couldn’t find my notes (which were substantial), so I did an online search using “Dick Krueger and stories” as my search terms. I was successful! (See link.) When I went to the link, he had a whole section on story and storytelling. What I remember most about that session is what he has listed under “How to Analyze the Story”, specifically the four points he lists under problems with credibility:
The next time you tell a story, think of it in evaluative terms. And check out what Dick Krueger has to say.
Recently, I read that 45% of individuals make New Year’s resolutions and only 8% actually achieve success. Hmmm…not a friendly probability. Perhaps intentions about behavior are indeed more realistic. (I haven’t seen the statistics on that potential change. Mazanian et al. (1998) do say stated intention to change is the most significant behavioral indicator.) My intention for 2016 is to provide content related to or about evaluation that gives you something you didn’t have before you read the post (point one). Examples follow: