What have you listed as your goal(s) for 2013?

How is that goal related to evaluation?

One study suggests that you’re 10 times more likely to alter a behavior successfully (i.e., get rid of a “bad” behavior; adopt a “good” one) than you would be if you didn’t make a resolution.  That statement is evaluative; a good place to start.  10 times!  Wow.  Yet even that isn’t a guarantee you will be successful.

How can you increase the likelihood that you will be successful?

  1. Set specific goals.  Break the big goal into small steps; tie those small steps to a time line.  You want to read how many pages by when?  Write it down.  Keep track.
  2. Make it public.  Just like other intentions, if you tell someone about your goal, there is an increased likelihood you will complete it.  I put mine in my quarterly reports to my supervisors.
  3. Substitute “good” for “less than desirable”.  I know how hard it is to write (for example).  I have in the past scheduled and protected a specified time to write, and I will again this year for those three articles that are sitting partly complete.  I’ve substituted “10:00 on Wednesdays and Fridays” for the vague “when I have a block of time I’ll get it done”.  The block of time never materializes.
  4. Keep track of progress.  I mentioned it in number 1; I’ll say it again: keep track; make a chart.  I’m going to get those manuscripts done by X date…my chart will reflect that.

So are you going to:

  1. Read something new to you (even if it is not new)?
  2. Write that manuscript from that presentation you made?
  3. Finish that manuscript you have started AND submit it for publication?
  4. Register for and watch a webinar on a topic you know little about?
  5. Explore a topic you find interesting?
  6. Something else?

Let me hear from you as to your resolutions; I’ll periodically give you an update.

 

And be grateful for the opportunity…gratitude is a powerful way to reinforce you and your goal setting.

 

Hanukkah ended last Saturday, two days after the December new moon.

Solstice happens Friday at 11:11 GMT (or the end of the world according to the Mayan calendar).

Christmas is next Tuesday.

Kwanzaa (a harvest festival that includes the lighting of candles) starts the day after Christmas and lasts seven days.

The twelfth day of Christmas occurs on January 6 (according to some western Christian calendars).

All of these holidays are festivals of light…Enjoy.

And may 2013 bring you health, wealth, happiness, and the time to enjoy them.

 

(I’ll be gone next week, hence two posts this week.)

At the end of January, participants in an evaluation capacity building program I lead will provide highlights of the evaluations they completed for this program.  That the event happens to be in Tucson, and that I happen to be able to get out of the wet and dreary Northwest, is no accident.  The event will be the capstone of WECT (Western [Region] Evaluation Capacity Training–say “west”) participants’ evaluations of the past 17 months.  Since each participant will be presenting their programs and the evaluations they did of those programs, there will be a lot of data (hopefully).  The participants and those data could use (or not) a new and innovative take on data visualization.  Susan Kistler, AEA’s Executive Director, has blogged in AEA365 several times about data visualization.  Perhaps these reposts will help.

 

Susan Kistler says: “Colleagues, I wanted to return to this ongoing discussion. At this year’s conference (Evaluation ’12), I did a presentation on 25 low-cost/no-cost tech tools for data visualization and reporting. An outline of the tools covered and the slides may be accessed via the related aea365 post here http://aea365.org/blog/?p=7491. If you download the slides, each tool includes a link to access it, cost information, and in most cases supplementary notes and examples as needed.

A couple of the new ones that were favorites included wallwisher and poll everywhere. I also have on my to-do list to explore both datawrapper and amCharts over the holidays.

But…am returning to you all to ask if there is anything out there that just makes you do your happy dance in terms of new low-cost, no-cost tools for data visualization and/or reporting.”  (This is a genuine request–if there is something out there, let Susan know.  You can comment on the blog, contact her through AEA (susan@eval.org), or let me know and I’ll forward it.)

Susan also says in Saturday’s (December 15, 2012) blog (and this would be very timely for WECT participants):

Enroll in the Knight Center’s free Introduction to Infographics and Data Visualization: The course is online and free, and will be offered between January 12 and February 23. According to the course information, we’ll learn the basics of:

  • “How to analyze and critique infographics and visualizations in newspapers, books, TV, etc., and how to propose alternatives that would improve them.
  • How to plan for data-based storytelling through charts, maps, and diagrams.
  • How to design infographics and visualizations that are not just attractive but, above all, informative, deep, and accurate.
  • The rules of graphic design and of interaction design, applied to infographics and visualizations.
  • Optional: How to use Adobe Illustrator to create infographics.”

 

What do I know that they don’t know?
What do they know that I don’t know?
What do all of us need to know that few of us know?

These three questions have buzzed around my head for a while in various formats.

When I attend a conference, I wonder.

When I conduct a program, I wonder, again.

When I explore something new, I am reminded that perhaps someone else has been here before, and I wonder, yet again.

Thinking about these questions, I had these ideas:

  • I see the first question relating to capacity building;
  • The second question relating to engagement; and
  • The third question (which draws on questions one and two) relating to cultural competence.

After all, don’t both of these (capacity building and engagement) involve entering a “foreign country” and a different culture?

How does all this relate to evaluation?  Read on…

Premise:  Evaluation is an everyday activity.  You evaluate every day, all the time; you call it making decisions.  Every time you make a decision, you are building capacity in your ability to evaluate.  Sure, some of those decisions may need to be revised.  Sure, some of those decisions may just yield “negative” results.  Even so, you are building capacity.  AND you share that knowledge–with your children (if you have them), with your friends, with your colleagues, with the random shopper in the (grocery) store.  That is building capacity.  Building capacity can be systematic, organized, sequential.  Sometimes formal, scheduled, deliberate.  It is sharing “What do I know that they don’t know?” (in the hope that they too will know it and use it).

Premise:  Everyone knows something.  In knowing something, evaluation happens–because people made decisions about what is important and what is not.  To really engage (not just do outreach, which is much of what Extension does), one needs to “do as” the group being engaged.  To do anything else (“doing to” or “doing with”) is simply outreach, and little or no knowledge is exchanged.  That doesn’t mean knowledge isn’t distributed; Extension has been doing that for years.  It just means that the assumption (and you know what assumptions do) is that only the expert can distribute knowledge.  Who is to say that the group (target audience, participants) isn’t expert in at least part of what is being communicated?  It probably is.  It is the idea that…they know something that I don’t know (and I would benefit from knowing).

Premise:  Everything and everyone is connected.  Being prepared is the best way to learn something.  Being prepared by understanding culture (I’m not talking only about the intersection of race and gender; I’m talking about all the stereotypes you carry with you all the time) reinforces connections.  Learning about other cultures (something everyone can do) helps dispel stereotypes and mitigate stereotype threat.  And that is an evaluative task.  Think about it.  I think it captures the “What do all of us need to know that few of us know?” question.

 

 

 

Needs Assessment is an evaluative activity; it is the first assessment a program developer must do to understand the gap between what is and what needs to be (what is desired).  Needs assessments are the evaluative activity in the Situation box of a linear logic model.

Sometimes, however, the target audience doesn’t know what they need to know, and that presents challenges for the program planner.  How do you capture a need when the target audience doesn’t know they need the (fill in the blank)?  That challenge is the stuff of other posts.

I had the good fortune to talk with Sam Angima, an Oregon Regional Administrator who has been charged with developing expertise in needs assessment.  Each Regional Administrator (there are 12) has been given a different charge, so faculty can be referred to the administrator with the relevant expertise.  We captured Sam’s insights in a conversational Aha! moment.  Let me know what you think.