Evaluability assessment.

I get my ideas for blog posts from a lot of places. One place I get ideas is from other blogs.

The topic of that blog was Evaluability Assessment.

The blog author says that evaluability assessments tend to cover the following topics:

  • Clarity of the intervention and its objectives: Is there a logical and clear theory of change that articulates how and under what conditions intervention activities influence particular processes of change?
  • Availability of data: Which data are available that can be used in assessing the merit and worth of the intervention (e.g., generated by the intervention, external data sets, policy and academic literature)?
  • Stakeholder interest and intended use: to what extent is there a clear interest (and capacity) among stakeholders to use the evaluation’s findings and recommendations in strategic decision-making, program improvement, learning about what works, etc.?
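To make those three topics concrete, here is a minimal sketch in Python (the class name, field names, and the all-criteria-must-pass rule are my own illustration, not something the blog author specifies) of how they might be captured as a simple checklist:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class EvaluabilityCheck:
    """One evaluability criterion and the evidence gathered for it."""
    criterion: str
    guiding_questions: List[str]
    evidence: List[str] = field(default_factory=list)  # notes collected during the assessment

    def is_satisfied(self) -> bool:
        # A crude rule of thumb: some evidence was recorded for this criterion.
        return len(self.evidence) > 0

# The three topics from the list above, restated as checklist items.
checklist = [
    EvaluabilityCheck(
        "Clarity of the intervention and its objectives",
        ["Is there a logical, clear theory of change?"],
    ),
    EvaluabilityCheck(
        "Availability of data",
        ["Which data (program-generated, external, literature) support a judgment of merit and worth?"],
    ),
    EvaluabilityCheck(
        "Stakeholder interest and intended use",
        ["Who will use the findings, and for which decisions?"],
    ),
]

# One simple reading: the program is evaluable only if every criterion has some evidence behind it.
evaluable = all(check.is_satisfied() for check in checklist)
print(evaluable)  # False here, because no evidence has been recorded yet
```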

What, you ask, is evaluability assessment?

You certainly can go to the blog and read what it says there. Or you can go to Scriven’s book and read the history on page 138.

Suffice it to say that evaluability is the extent to which projects and programs can be evaluated. Scriven goes on to say: “It should be thought of as the first commandment of accountability or as the last refinement of Popper’s (Sir Karl Raimond Popper) requirement of falsifiability.”

 

Sources.

I learned about evaluability assessment (EA) from Midge Smith, in her book by the same name (published by Springer; search for it by title). She says that EA is “…a diagnostic and prescriptive tool for improving programs and making evaluations more useful.” Like all tools used in evaluation, it is systematic and describes the structure of a program.

There is a newer volume of the same name, by Michael S. Trevisan and Tamara M. Walser (they do an AEA365 blog on that topic). It is not, unfortunately, on my shelf. The blurb that accompanies the book (from the publisher, Sage) says: “Evaluability assessment (EA) can lead to development of sound program theory, increased stakeholder involvement and empowerment, better understanding of program culture and context, enhanced collaboration and communication, process and findings use, and organizational learning and evaluation capacity building.”

That is more detail than Midge offers, but then, her book was copyrighted in 1989.

EA is getting a lot of press lately (you may need to search for evaluability assessment when you go to AEA365).

I find it amazing how previously important topics (like EA) come back into vogue.

my two cents.

molly.

 

 

Resolutions. Renewal.

Renewal is appropriate for the new year. So are resolutions.

It has been over a month since I blogged here. And the longer I wait for inspiration, the harder it is to write.

But I’m waiting for inspiration. Really difficult, to be sure.

We all know that resolutions have a great tendency to fail.

So how can one find renewal in these difficult times?

Perhaps it is time to re-evaluate your priorities.

Priorities can change. Depending on circumstances.

Is this a time for you to be more articulate?

Or a time to be more proactive?

A time to be more (fill in the blank)?

Writer’s block

Sheila Robinson, the sometime Saturday contributor for AEA365…

Process is the “how”.

Recently, reminded of the fact that process is the “how”, I had the opportunity to help develop a Vision statement and a Mission statement.

The person who was facilitating the session provided the group with clear guidelines.

The Vision statement, defined as “the desired future condition”, will happen in 2-5 years (i.e., What will change?). We defined the change occurring (i.e., in the environment, the economy, the people). The group also identified what future conditions would be possible. We would write the vision statement so that it would happen within 2-5 years, be practical, be measurable, and be realistic. OK…

And be short…because that is what vision statements are.
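For what it is worth, those guidelines can be jotted down as a simple check. Here is a rough sketch in Python (the 25-word limit and the function name are arbitrary stand-ins of mine; most of the criteria remain judgments the group has to make for itself):

```python
def review_vision(draft: str, judgments: dict) -> dict:
    """Combine the group's judgments with the one check a script can make: brevity."""
    results = dict(judgments)                    # practical, measurable, realistic, time-bound
    results["short"] = len(draft.split()) <= 25  # arbitrary word limit standing in for "be short"
    return results

# Example: the group supplies its own judgments; the script only checks length.
draft = "By 2028, every county program operates from a written, measurable theory of change."
print(review_vision(draft, {
    "time-bound (2-5 years)": True,
    "practical": True,
    "measurable": True,
    "realistic": True,
}))
```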

The Mission statement (once the Vision statement was written and accepted) defined “HOW” we would get to the vision. This reminded me of process, something that is important in evaluation. So I went to my thesaurus to find out what that source said about process. Scriven to the rescue, again.

 

 

Process Evaluation

Scriven, in his Evaluation Thesaurus, defines process as the activity that occurs “…between the input and the output, between the start and finish”. Sounds like “how” to me. Process relates to process evaluation. I suggest you read the section on process evaluation on page 277 of the above-mentioned source.

Process evaluation rarely functions as the sole evaluation tool because of the weak connection between process and “output quantity and quality”. Process evaluations will probably not generalize to other situations.

However, PROCESS evaluation “…must be looked at as part of any comprehensive evaluation, not as a substitute for inspection of outcomes…” The factors include “the legality of the process, the morality, the enjoyability, the truth of any claims involved, the implementation…, and whatever clues…” that can be provided.

Describing “how” something is to be done is not easy. It is neither output nor outcome. Process is the HOW something will be accomplished if you have specific inputs. It happens between the inputs and the outputs.
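To picture where process sits, here is a minimal sketch of a logic-model-style structure in Python (the field names and example entries are my own illustration; Scriven only says that process is what happens between input and output):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProgramLogic:
    """A bare-bones logic model: process is everything between inputs and outputs."""
    inputs: List[str]                                  # the resources you start with
    process: List[str]                                 # the HOW: activities between start and finish
    outputs: List[str]                                 # what those activities produce
    outcomes: List[str] = field(default_factory=list)  # longer-term change, beyond the outputs

mission_logic = ProgramLogic(
    inputs=["staff time", "a facilitator", "stakeholder input"],
    process=["draft the mission statement", "check it against the vision", "revise it with the group"],
    outputs=["an accepted mission statement"],
    outcomes=["progress toward the 2-5 year vision"],
)

# A process evaluation would focus on the middle list: how well those activities were carried out.
```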

To me, the group needs to read about process evaluation in crafting the mission statement in order to get to the HOW.

my two cents.

molly.

 


 

AEA365 is honoring living evaluators for Labor Day (Monday, September 5, 2016).

Some of the living evaluators I know (Jim Altschuld, Tom Chapel, Michael Patton, Karen Kirkhart, Mel Mark, Lois-Ellin Datta, Bob Stake); some of them I don’t know (Norma Martinez-Rubin, Nora F. Murphy, Ruth P. Saunders, Art Hernandez, Debra Joy Perez). One I’m not sure of at all (Mariana Enriquez). Over the next two weeks, AEA365 is hosting a recognition of living evaluator luminaries.

The wonderful thing is that this gives me an opportunity to check out those I don’t know; to read about how others see them, what makes them special. I know that the relationships that develop over the years are dear, very dear.

I also know that the contributions that  these folks have made to evaluation cannot be captured in 450 words (although we try). They are living giants, legends if you will.

These living evaluators have helped move the field to where it is today. Documenting their contributions to evaluation enriches the field. We remember them fondly.

If you don’t know them, look for them at AEA ’16 in Atlanta. Check out their professional development sessions or their other contributions (papers, posters, round-tables, books, etc.). Many of them have been significant contributors to AEA; some have only been with AEA since the early part of this century. All have made a meaningful contribution to AEA.

Many evaluators could be mentioned and are not. Sheila B. Robinson suggests that “…we recognize that many, many evaluators could and should be honored as well as the 13 we feature this time, and we hope to offer another invitation next year for those who would like to contribute a post, so look for that around this time next year, and sign up!”

Evaluators honored

James W. Altschuld

Thomas J. Chapel

Norma Martinez-Rubin

Michael Quinn Patton

Nora F. Murphy

Ruth P. Saunders

Art Hernandez

Karen Kirkhart

Mel Mark

Lois-Ellin Datta

Debra Joy Perez

Bob Stake

Mariana Enriquez (photo not found)

my two cents.

molly.

Thinking. We do it all the time (hopefully). It is crucial to making even the smallest decisions (what to wear, what to eat) and bigger decisions (where to go, what to do). Given this challenging time, even news watchers would be advised to use evaluative and critical thinking, especially since evaluation is an everyday activity.

This graphic was provided by WNYC. (There are other graphics; use your search engine to find them.) The graphic makes good sense to me, and it applies to almost every newscast (even those without a shooter!).

The only way to do great work is to love what you do. If you have not found it yet, keep looking. Do not settle. ~~Steve Jobs.

Last week I wrote about an epiphany I had many years ago, one in which I did not settle.

I made choices about the work I did. I made choices about the life I lived. I did not settle.

It is an easy life to “go with the flow”; to settle, if you will. Convenience is not always the best way even though it might be the easiest. Did I do great work? I don’t know. Did I hear stories of the work I did? I was told after the fact that I had made a difference because of the work I had done. Perhaps, making a difference is doing great work. Perhaps.

However, this quote from Steve Jobs reminded me that loving what one does is important, even if one does not do “great work”. If one does not love what one does, one needs to do what one loves.

Having written about evaluation history previously, I identified those who had influenced my thinking, not necessarily those who could be called evaluation pioneers. I think it is noteworthy to mention those evaluation pioneers who set the field on the path we see today, those whom I didn’t mention and who need to be.

As a memorial (it is Memorial Day weekend, after all), Michael Patton (whom I’ve mentioned previously) is coordinating an AEA365 to identify and honor those evaluation pioneers who are no longer with us. (Thank you, Michael.) The AEA365 link above will give you more details. I’ve also linked to the evaluation pioneers who have been remembered.

Some of these pioneers I’ve mentioned before; all are giants in the field; some are dearly loved as well. All those listed below have died. Patton talks about the recent-dead, the sasha, and the long-dead, the zamani. He cites the historian James W. Loewen when he makes this distinction. Some of the listed are definitely the sasha (for me); some are zamani (for me). Perhaps photos (for whom photos could be found) and dates will help. There are…

I am a social scientist. I look for the social in the science of what I do.

I am an evaluator as a social scientist. I want to determine the merit, worth, value of what I do. I want to know that the program I’m evaluating (or offering) made a difference. (After all, the root of evaluation is value.)

Keeping that in mind has resulted (over the years) in the comment, “no wonder she is the evaluator” when I ask an evaluative question. So I was surprised when I read a comment by a reader that implied that it didn’t matter. The reader said, “The ugly truth is, it does not matter if it makes a difference. Somewhere down the road someone will see your post and may be it will be useful for him.” (Now you must know that I’ve edited the comment, although the entire comment doesn’t support my argument:  Evaluators need to know if the program made a difference.)

So the thought occurred to me, what if it didn’t make a difference? What if the program has no value? No worth? No merit? What if by evaluating the program you find that it won’t be useful for the participant? What does that say about you as an evaluator? You as a program designer? You as an end user? Is it okay for the post to be useful “somewhere down the road”? Is blogging truly “a one way channel to transfer any information you have over the web”? How long can a social-scientist-always-looking-at-the-social continue to work when the information goes out and rarely comes back? I do not know. I do know that blogging is hard work. After six and one-half years of writing this blog almost weekly, writer’s block is my constant companion (although, being on a computer, I do not have a pile of paper, just blank screens). So I’m turning to you, readers:

Does it make a difference whether I write this blog or not?

Am I abdicating my role as an evaluator when I write the blog?

I don’t know. Over the years I have gotten some interesting comments (other than the “nice job” “keep up the work” types of comments). I will pause (not in my writing; I’ll continue to do that) and think about this. After all, I am an evaluator wanting to know what difference this program makes.

my two cents.

molly.

Today, I’m going to talk about evaluation use, that is, the use of evaluation findings. Now, Michael Patton wrote the book (actually more than one) on the topic. And I highly recommend that book (and the shorter version, Essentials of Utilization-Focused Evaluation [461 pages including the index as opposed to 667]).

I firmly believe that there is no point in conducting an evaluation if the final report of that evaluation sits on someone’s shelf and IS NEVER USED! Not just read (hopefully!), but USED to make the program better. To make a difference.

Today, though, I want to talk about how that final report is put together. It doesn’t matter if it is an infographic, a dashboard, an executive summary, or a 300-page document; it all has to be your best effort. So I want to talk about your best effort.

That best effort is accurate, not only in reporting the findings, but also in the spelling, the grammar, the syntax.

For example: the word “data” is plural and takes a plural verb. Yep. Check the dictionary, folks. Webster’s Seventh New Collegiate Dictionary says (under the entry data): plural of DATUM. (I’ll bet you didn’t know that the plural of OPUS is OPERA. Just another example of the peculiarities of the English language.) The takeaway here: When in doubt, check it out!

When I put together a final report (regardless of the format), I use the 5Cs as a guideline. (I also use them as a basis for reviewing manuscripts.) Those 5Cs are: Clarity. Coherence. Conciseness. Correctness. Consistency. Following the 5Cs results in a product of which I can be proud.
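As an illustration only (the 5Cs come from the paragraph above; the 1-5 rating scale and the pass mark are mine), a reviewer’s tally might look like this:

```python
FIVE_CS = ["Clarity", "Coherence", "Conciseness", "Correctness", "Consistency"]

def review_report(ratings: dict) -> list:
    """Return the Cs that still need work; ratings are the reviewer's own 1-5 judgments."""
    missing = [c for c in FIVE_CS if c not in ratings]
    if missing:
        raise ValueError(f"Rate every C before signing off: {missing}")
    return [c for c, score in ratings.items() if score < 4]

# Example: a draft that is clear and correct but rambles and wobbles in style.
needs_work = review_report({
    "Clarity": 5, "Coherence": 4, "Conciseness": 2, "Correctness": 5, "Consistency": 3,
})
print(needs_work)  # ['Conciseness', 'Consistency']
```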

How do you use your evaluation report? Keep these things in mind!

my two cents

molly.

I got back to the office Monday after spending last week in Chicago at the AEA annual conference, Evaluation 2015. Next year AEA will be in Atlanta, October 24-29, 2016. Mark your calendars!

I am tired. I take a breath (many breaths), try to catch up (I don’t), and continue to read my email (hundreds of emails). I’m sure there are some I will miss; I always do. In the meantime, I process what I experienced and pass the conference through my criteria for a successful conference: Did I

  1. See three (and visit with) long time friends: yes.
  2. Get three new ideas: maybe.
  3. Meet three new people I’d like to add to my “friendlies” category: maybe.

Why three? It seemed like a good number; more than one (not representative) and less than five (too hard to remember).