Filed Under (criteria, program evaluation) by Molly on 23-03-2017

Audience.

We all have an audience (unless we are a reclusive hermit). The issue is what kind of audience.

An absent audience?

An all inclusive audience?

A captive audience?

An exclusive audience?

Reminds me of the saying, “You can please some of the people some of the time, and all of the people some of the time, but not all of the people all of the time.”

So when I teach evaluation (or try to teach evaluation–evaluation is a scary concept), I begin by saying that we are all evaluators, that we evaluate daily. It is truly an everyday activity.

Some people engage at that point. Others are skeptical. Others are deniers.

Like the old saying–some of the people…

Unconsciously (or maybe consciously), you identify criteria that will help you make a decision.

Those criteria give you a decision tree you can use to make decisions throughout the day.

Criteria.

The other day, I had the opportunity to review a “teacher-made” test (a survey, actually) to determine whether it had face validity. The review audience was both similar to and different from the audience that will ultimately take the survey. I needed to capture concerns about the instrument: whether it did what it was supposed to do, and whether the wording was clear.

Then a decision needed to be made about its use. We were using the TOP model (Bennett and Rockwell, 1995*), specifically the KASA (knowledge, attitudes, skills, and aspirations) portion of the model. This model can be used for program planning/development AND program evaluation/performance. It is just one example of a logic model.

There were concerns that the “knowledge” items weren’t specific enough; that the words used were not clear; that the “skills” items were not specific enough; and that not all response options were included in the response set. Some KASA questions could be easily fixed (the response set); some could not (the specificity issue). I sent the compiled responses to the people in charge of the project. I did not make a decision on the specificity. Perhaps if I knew more about the topic, I could have. Obviously, the survey needed changing.
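Here is a minimal sketch (in Python) of how compiled feedback like this might be tallied by KASA category before it goes to the project leads. The items, category names, and concern labels are made up for illustration; they are not the actual survey.

```python
from collections import Counter

# Hypothetical reviewer feedback from a face-validity check, tagged by
# KASA category (knowledge, attitudes, skills, aspirations) and type of concern.
feedback = [
    {"item": "Q1", "kasa": "knowledge",   "concern": "not specific enough"},
    {"item": "Q1", "kasa": "knowledge",   "concern": "unclear wording"},
    {"item": "Q4", "kasa": "skills",      "concern": "not specific enough"},
    {"item": "Q7", "kasa": "attitudes",   "concern": "incomplete response set"},
    {"item": "Q9", "kasa": "aspirations", "concern": "incomplete response set"},
]

# Tally concerns by KASA category so it is easy to see where the
# survey needs revision before it is fielded.
tally = Counter((f["kasa"], f["concern"]) for f in feedback)

for (kasa, concern), count in sorted(tally.items()):
    print(f"{kasa:12s} | {concern:25s} | {count}")
```

A tally like this surfaces the easy fixes (the incomplete response sets) right away; the specificity concerns still need a content expert.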

Decisions.

Is all evaluation about making decisions? It all depends.

Evaluation is determining the merit, worth, or value of a program or project.

You want to find out if you have made a difference in the lives of the target audience.

If you can answer that, you have captured your audience.

Citation.

*Bennett, C. & Rockwell, K. (1995, December). Targeting outcomes of programs (TOP): An integrated approach to planning and evaluation. Unpublished manuscript. Lincoln, NE: University of Nebraska.

https://www.uaf.edu/files/ces/reporting/logicmodel/TOP.pdf


Filed Under (criteria, program evaluation) by Molly on 13-02-2017

Love.

Tomorrow is Valentine’s Day, the day traditionally set aside for lovers–you know the lovey dovey kind. And if you forgot…watch out.

It is the feast day of Saint Valentine (officially Saint Valentine of Terni), a widely recognized third-century Roman saint. Since the High Middle Ages the day has been associated with a tradition of courtly love. It is said that Valentine’s Day was established to counteract the pagan celebration of Lupercalia. There is much we do not know about St. Valentine.

Not courtly love.

I want to talk about a different kind of love (and I do not mean the various definitions of that word). I want to talk about your calling, your passion.

A good friend of mine said: “Know what your calling is, your vocation, and follow it faithfully.”

She also said in that same missive: “When you are most disgruntled, take a moment of conscious breath or five moments of conscious play!”

This is the love I’m talking about. The love for your calling; your vocation (passion).

And what to do when you feel disgruntled (breathe/play).

Passion.

Susan Kistler, AEA Executive Director Emeritus, shares what is perhaps an important message about love:

“Success is made manifest in health and happiness, confidence that you are loved and the capacity to love with others.”

That is passion.

How does that relate to evaluation?

We are all evaluators, and we live and work by criteria, whether they are implicit or explicit. Our passions are found in those criteria. We carry that passion through much of our lives–some of us because of family responsibilities, some of us because it is fun. When we get tired, we stop. We still have the passion, and that passion comes out when we least expect it. Because once an evaluator (whether formally or not), always an evaluator.

So celebrate your passion tomorrow. And remember to breathe…or play!

Filed Under (program evaluation) by Molly on 16-01-2017

Resolutions. Renewal.

Renewal is appropriate for the new year. So are resolutions.

It has been over a month since I blogged here. And the longer I wait for inspiration, the harder it is to write.

But I’m waiting for inspiration. Really difficult, to be sure.

We all know that resolutions have a great tendency to fail.

So how can one find renewal in these difficult times?

Perhaps it is time to re-evaluate your priorities.

Priorities can change. Depending on circumstances.

Is this a time for you to be more articulate?

Or a time to be more proactive?

A time to be more (fill in the blank)?

Writer’s block

Sheila Robinson, the sometime Saturday contributor for AEA 365, …

Filed Under (program evaluation) by Molly on 07-12-2016

Our similarities bring us to a common ground; our differences allow us to be fascinated by each other.

~~Tom Robbins

Fascinated.

I find this quote, by one of my favorite authors, so interesting. My friends fit this description. I find them fascinating. They are all different; all smart; all creative.

So are my daughters. Both different, smart, and creative. I got good material in the nature/nurture discussion. I find them fascinating.

How do we find the common ground?

Perhaps it is like planning a program.

You want to accomplish something (the outcome of a program in economic, environmental, or social terms). You outline what you want to accomplish and make it fit some criteria. Run the program.

Oops. Somewhere the outcome changed. You go back to the drawing board (The Journal of Irreproducible Results notwithstanding). You look at your logic model and at your theory of change to figure out …

Nov 10

Trustworthiness. An interesting topic.

Today is November 9, 2016. An auspicious day, to be sure. (No, I’m not going to rant about November 8, 2016; just post this and move on with my living.) Keep in mind trustworthiness, I remind myself.

I had the interesting opportunity to review a paper recently that talked about trustworthiness. This caused me much thought, as I was troubled by what was written. I decided to go to my source, “Naturalistic Inquiry.” Given that the paper used a qualitative design, employed a case study method, and talked about trustworthiness, I wanted to find out more. This book was written by two of my long-time evaluation guides, Yvonna Lincoln and Egon Guba. (Lincoln’s name may be familiar to you from the Sage Handbook of Qualitative Research, which she co-edited with Norman Denzin.)

Trustworthiness

On page 218, they talk about trustworthiness and about the conventional criteria for trustworthiness (internal validity, external validity, reliability, and objectivity), along with the questions underlying those criteria.

They talk about how the criteria formulated by conventional inquirers are not appropriate for naturalistic inquiry. Guba (1981a) offers four new terms because they have “…a better fit with naturalistic epistemology.” The four terms, and the conventional terms they are proposed to replace, are credibility (for internal validity), transferability (for external validity), dependability (for reliability), and confirmability (for objectivity).

Evaluation is political. I am reminded of that fact when I least expect it.

In yesterday’s AEA 365 post, I am reminded that social justice and political activity may be (probably are) linked and probably share many common traits.

In that post the author lists some of the principles she used recently:

  1. Evaluation is a political activity.
  2. Knowledge is culturally, socially, and temporally contingent.
  3. Knowledge should be a resource of and for the people who create, hold, and share it.
  4. There are multiple ways of knowing (and some ways are privileged over others).

Evaluation is a trans-discipline, drawing from many other ways of thinking. We know that politics (or anything political) is socially constructed. We know that ‘doing to’ is inadequate because ‘doing with’ and ‘doing as’ are ways of sharing knowledge. (I would strive for ‘doing as’.) We also know that there are multiple ways of knowing.

(See Belenky, Clinchy, Goldberger, and Tarule, Women’s Ways of Knowing, Basic Books, 1986, as one.)

OR

(See Gilligan, In a Different Voice, Harvard University Press, 1982, among others.)

How do evaluation, social justice, and politics relate?

What if you do not bring representation of the participant groups to the table?

If they are not asked to be at the table or for their opinion?

What if you do not ask the questions that need to be asked of that group?

To whom ARE your questions being addressed?

Is that equitable?

Being equitable is one aspect of social justice. There are others.

Evaluation needs to be equitable.

 

I will be in Atlanta next week at the American Evaluation Association conference.

Maybe I’ll see you there!

my two cents.

molly.


Filed Under (program evaluation, program planning) by Molly on 08-10-2016

Process is the “how”.

Recently, reminded of the fact that process is the “how,” I had the opportunity to help develop a Vision and a Mission statement.

The person who was facilitating the session provided the group with clear guidelines.

The Vision statement, defined as “the desired future condition,” describes what will change in 2-5 years. We defined the change occurring (i.e., in the environment, the economy, the people), and the group identified what future conditions would be possible. We would write the vision statement so that it would be achievable within 2-5 years, practical, measurable, and realistic. OK…

And be short…because that is what vision statements are.

The Mission statement (once the Vision statement was written and accepted) defined “HOW” we would get to the vision. This reminded me of process–something that is important in evaluation. So I went to my thesaurus to find out what that source said about process. Scriven to the rescue, again.


Process Evaluation

Scriven, in his Evaluation Thesaurus, defines process as the activity that occurs “…between the input and the output, between the start and finish.” Sounds like “how” to me. Process relates to process evaluation. I suggest you read the section on process evaluation on page 277 of that source.

Process evaluation rarely functions as the sole evaluation tool because the connection between process and “output quantity and quality” is weak. Process evaluations will probably not generalize to other situations.

However, PROCESS evaluation “…must be looked at as part of any comprehensive evaluation, not as a substitute for inspection of outcomes…” The factors include “the legality of the process, the morality, the enjoyability, the truth of any claims involved, the implementation…, and whatever clues…” that can be provided.

Describing “how” something is to be done is not easy. It is neither output nor outcome. Process is the HOW something will be accomplished given specific inputs. It happens between the inputs and the outputs.
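To make that concrete, here is a minimal sketch (in Python) of a toy logic model in which process is simply what sits between the inputs and the outputs. The stage names and the example program are invented for illustration; this is not Scriven’s or Bennett’s formulation.

```python
from dataclasses import dataclass


@dataclass
class LogicModel:
    """A toy logic model: process is everything between inputs and outputs."""
    inputs: list    # resources you start with
    process: list   # the HOW: activities between start and finish
    outputs: list   # what is produced
    outcomes: list  # the difference made

# Hypothetical example: a mission statement describes the HOW (the process),
# not the inputs, the outputs, or the outcomes.
model = LogicModel(
    inputs=["facilitator", "stakeholder group", "meeting time"],
    process=["draft the vision", "define 2-5 year conditions", "write the mission (the HOW)"],
    outputs=["accepted vision statement", "accepted mission statement"],
    outcomes=["shared direction for the program"],
)

print("Process (the HOW):", ", ".join(model.process))
```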

To me, the group needs to read about process evaluation while crafting the mission statement in order to get to the HOW.

my two cents.

molly.


Cartoons

 

Chris Lysy draws cartoons.

Evaluation and research cartoons:

http://blogs.oregonstate.edu/programevaluation/files/2014/06/evaluation-and-project-working.jpg

http://blogs.oregonstate.edu/programevaluation/files/2014/06/research-v.-evaluation.jpg

http://blogs.oregonstate.edu/programevaluation/files/2014/06/I-have-evidence-cartoon.png

Logic Model cartoons: http://i2.wp.com/freshspectrum.com/wp-content/uploads/2014/03/Too-complex-for-logic-and-evidence.jpg

Presentation cartoons (the “BS” cartoon from Fresh Spectrum).

Data cartoons: http://i0.wp.com/freshspectrum.com/wp-content/uploads/2013/09/wpid-Photo-Sep-27-2013-152-PM1.jpg

More Cartoons

He has offered an alternative to presenting survey data. He has a wonderful cartoon for this.

“Survey results are in. Who’s ready to spend the next hour looking at poorly formatted pie charts?”
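One common alternative to a page of pie charts is a simple bar chart of the responses. A minimal sketch, assuming matplotlib and made-up response counts (this is my illustration, not Lysy’s own example):

```python
import matplotlib.pyplot as plt

# Hypothetical survey item and response counts, purely for illustration.
options = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]
counts = [4, 9, 15, 32, 21]

fig, ax = plt.subplots(figsize=(7, 3))
ax.barh(options, counts, color="steelblue")
ax.invert_yaxis()                      # show the response scale top-to-bottom
ax.set_xlabel("Number of respondents")
ax.set_title('"The workshop improved my evaluation skills" (n = 81)')
fig.tight_layout()
plt.show()
```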

He is a wonderful resource. Use him. You can contact him through his blog, fresh spectrum.

my two cents.

molly.


Sep 14

Decisions

How do we make decisions when we think none of the choices are good?

(Thank you for this thought, Plexus Institute.)

No, I’m not talking about the current political situation in the US. I’m talking about evaluation.

The lead for this email post was “Fixing the frame alters more than the view“. fixing the frame

Art Markman makes this comment (the “how do we make decisions…” comment) here. He says “If you dislike every choice you’ve got, you’ll look for one to reject rather than one to prefer—subtle difference, big consequences.” He based this opinion on research, saying that the rejection mind-set allows us to focus on negative information about options and fixate on the one with the smallest downside.
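Here is a minimal sketch of that distinction as I read it, with invented options and scores: a selection mind-set looks for the biggest upside, while a rejection mind-set focuses on downsides and keeps whatever has the smallest one.

```python
# Hypothetical options scored on upside and downside (higher = more of each).
options = {
    "Option A": {"upside": 3, "downside": 8},
    "Option B": {"upside": 2, "downside": 4},
    "Option C": {"upside": 5, "downside": 7},
}

# Selection mind-set: prefer the option with the largest upside.
selected = max(options, key=lambda o: options[o]["upside"])

# Rejection mind-set: focus on negative information and keep the option
# with the smallest downside.
kept = min(options, key=lambda o: options[o]["downside"])

print("Selection mind-set picks:", selected)  # Option C
print("Rejection mind-set keeps:", kept)      # Option B
```

Subtle difference, big consequences: the two mind-sets can land on different choices from the same information.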

Filed Under (program evaluation) by Molly on 07-09-2016

Making a difference

I wrote a blog post about making a difference. Many people have read the original post recently, and there have been many comments about it and the follow-up posts. Most people have made supportive comments. For example:

  1. “I think you’re on the right track – being consistent about adding fresh content and trying to make it meaningful for your audience.”–Kevin;
  2. “Mr. Schaefer is taking stock of his blog–a good thing to do for a blog that has been posted for a while. So although he lists four innovations, he asks the reader to “…be the judge if it made a difference in your life, your outlook, and your business.”– Ưu điểm của máy lọc nước nano;
  3. “Yes, your posts were made sense and a difference. If you think that your doing able to help others, keep going and do the best.”– Samin Sadat;
  4. “Its refreshing to see an academic even pose the question “does this blog make a difference’. Success for You.”– Raizaldi; and
  5. “You are getting the comments and that eventually means that yes this blog is making a difference out there. Keep the good work up.”– Himanshu.

Less than a supportive comment

Some people have made a less than supportive comment. For example:

  1. “Wow this pretty outdated by 2016 standards..any updates to the post?”–Dan Tanduro (admittedly, this comment refers to a post I did not link above, although it is linked here).

Some other comments

Some people have made comments that do not relate to content yet are relevant. For example:

  1. “Hello, I have some knowledge of blogspot, but you can teach how to make the blog more faster and enough to our visits. I Think WordPress is better than blogspot, but is only my opinion…”– John Smith; and
  2.  “It’s interesting how careers cross paths, while I am not directly connected to the world of qualitative research, I have found myself trying to understand and integrate it into my daily workload more and more.” –Steinway

Responses

Making a difference. I will keep writing. Making a difference needs to be measured. I keep in mind that stories (comments) are data with a soul.

Less than a supportive comment. What is outdated? I need specific comments to respond to, please. Also, the post being referred to is from April 2012…over four years ago.

Some other comments. I can’t teach how to make a blog faster, for I know nothing about Blogspot. I only know a little about WordPress. Stories are data with a soul–important to remember when dealing with qualitative data.

my two cents.

molly.