Filed Under (Uncategorized) by Molly on 21-02-2013

Four weeks ago (January 17, 2013), I asked whether this blog was making a difference and invited y’all to post specific examples of how it is making that difference–I was, and am, looking for change, specifically.  I said I would summarize the responses and post a periodic update.  This is the first update.

I’ve gotten many (more than 50) comments on that post.  They are interesting.  No one has offered me a specific example of how this blog is making a difference.  Several agree that page views are NOT an adequate measure of effectiveness.  Several (again) agreed that the length of a visit might be a good indicator.  A few are reading the blog for marketing tips; a few are using their comments to entice me to go to their blogs–I don’t think so, especially when the response is in another language that I have to translate.  (I’m sure this sounds elitist–not my intention, to be sure–rather, it is just a matter of the time it takes to find a translator.)  Most comments just encourage me to keep up the writing because 1) it is clear; 2) it loads quickly; 3) they like/love the blog or its content; or 4) it can be applied to their marketing strategy and their blog (that actually may be a change, only I’d have to do a lot of research to know whether their site benefited).  Some folks just make a comment that seems to be a non sequitur.

So I really don’t know.  Judging from the comments (random though they may be), people seem to be reading it.  I am curious how many people regularly go to this blog (regularly as in weekly, not once in a while).  If I’m representative, I go to other blogs regularly, though not the same blogs each time, so I’m probably one of those once-in-a-while people–even with evaluation blogs.  There are so many out there, and the number is growing.  What I’ve learned is that the title of an individual post is what captures the folks.  Coming up with catchy titles is difficult; coming up with catchy titles that are also optimized for search engines is even harder.

I didn’t post a survey this time; maybe I should.  I will post another update in about a month.

Filed Under (program evaluation) by Molly on 18-02-2013

One of the expectations for the evaluation capacity building program that just finished is that the program findings will be written up for publication in scientific journals.

Easy to say.  Hard to do.

Writing is HARD.

To that end, I’m going to dig out my old notes from when I taught technical writing to graduate students, medical students, residents, and young faculty and give a few highlights.

  1. Writing only happens when words are put on paper (or typed into a computer).  Thinking about writing (I do that a lot) doesn’t count as writing.  The words don’t have to be perfect; good writing happens with multiple revisions.
  2. Schedule time for writing; write it in your planner.  You are making an appointment with yourself to write.  At 10:00 am every MWF I will write for one hour; then stop.  Protect this time.  You protect your program time; you need to protect your writing time.
  3. Keep the paper’s organization in mind.  Generally, the IMRAD structure works for all manuscripts.  IMRAD stands for Introduction, Methods, Results, And Discussion.  The Introduction is the literature review and ends with the research question.  The Methods section is how the program, experiment, or research was conducted, in EXCRUCIATING detail.  Another evaluator should be able to pick up your manuscript and replicate your program.  The Results are what you discovered, the lessons learned, what worked and what didn’t work.  They are quantitative and/or qualitative.  The Discussion is where you get to speculate; it highlights your conclusions and discusses the implications.  It also ties back to the literature.  If you have done the reporting correctly, you will have gone from the general to the specific and back to the general.  Think of two triangles placed together with their points (apexes) touching.
  4. Follow the five Cs.  This is the single most important piece of advice (after number 2 above) about writing.  The five Cs are Clarity, Coherence, Conciseness, Correctness, and Consistency.  If you keep those five Cs in mind, you will write well.  The writing is clear–you have not obfuscated the material.  The writing is coherent–it makes sense.  The writing is concise–you do not babble on or use jargon.  The writing is correct–you remember that the word data is a plural noun and takes a plural verb (use proper grammar and syntax).  The writing is consistent–you call your participants the same thing all the way through (no, it is not boring).
  5. Start with the section you know best.  That may be what is most familiar; it may be what is most recent; it may be what is most concrete.  Whatever you do, DO NOT start with the abstract; write it last.
  6. Have a style guide on your desk.  Most social sciences use APA; some use MLA or the Chicago style.  Have one (or more) and use it.  Follow the style that the journal requires.  That means you have read the “Instructions to authors” somewhere in the publication.
  7. Once you have finished the manuscript, READ IT OUT LOUD TO YOURSELF.
  8. Run a spell and grammar check on the manuscript–it won’t catch everything, but it will catch most errors.
  9. Have more than one person read the manuscript AFTER you have read it out loud to yourself.
  10. Persist.  More than one manuscript has been published because the author persisted with the journal.

Happy writing.

Filed Under (program evaluation) by Molly on 13-02-2013

One of the outcomes of learning about evaluation is informational literacy.

Think about it.  How does what is happening in the world affect your program?  Your outcomes?  Your goals?

When was the last time you applied that peripheral knowledge to what you are doing?  Informational literacy is being aware of what is happening in the world.  Knowing this information, even peripherally, adds to your evaluation capacity.

Now, this is not advocating that you need to read the NY Times daily (although I’m sure they would really like to increase their readership); rather it is advocating that you recognize that none of your programs (whether little p or big P) occur in isolation. What your participants know affects how the program is implemented.  What you know affects how the programs are planned.  That knowledge also affects the data collection, data analysis, and reporting.  This is especially true for programs developed and delivered in the community, as are Extension programs.

Let me give you a real-life example.  I returned from Tucson, AZ, and the capstone event for an evaluation capacity program I was leading.  The event was an outstanding success–not only did it identify what was learned and what still needed to be learned, it also demonstrated the value of peer learning.  I was psyched.  I was energized.  I was in an automobile accident 24 hours after returning home.  (The car was totaled–I no longer have a car; my youngest daughter and I experienced no serious injuries.)  The accident was reported in the local paper the following day.  Several people saw the announcement; those same several people expressed their concern; some of those several people asked how they could help.  Now, this is a very small local event that had a serious effect on me and my work.  (If I hadn’t had last week’s post already written, I don’t know if I could have written it.)  Solving simple problems now takes twice as long (at least).  This informational literacy influenced those around me.  Their knowing changed their behavior toward me.  Think of what September 11, 2001 did to people’s behavior; think about what the Pope’s RESIGNATION is doing to people’s behavior.  Informational literacy.  It is all evaluative.  Think about it.

 

Graphic URL: http://www.otterbein.edu/resources/library/information_literacy/index.htm

Filed Under (program evaluation) by Molly on 05-02-2013

I try to keep politics out of my blog posts.  Unfortunately (or fortunately, depending on your world view), evaluation is a political activity.  Several recent posts by others remind me of that.  I try to point out how everyday activities are evaluative.

One is the growing discussion (debate?) about gun regulation.  Recently, the Morning Joe show included a clip that was picked up by MoveOn.org.  If you haven’t seen it, you need to.  Although the evaluative criteria are not clear, the outcome is, and each commentator addresses the issue with a different lens (go here to view the clip).

In addition, a colleague of mine shared another blogger’s work on her blog (we are all connected, you know); it demonstrates the difficulty evaluators have being responsive to a client, especially one with whom you do not share the value in question (see Genuine Evaluation).  If you put your evaluator’s hat aside, the original post could be viewed as funny.

How many times have you smelled the milk and decided it was past prime?  Or seen mold growing on the yogurt?  This food blog also has many evaluative aspects (insert use by date blog).  Check it out.

 

I’m back from Tucson, where it was warm and sunny–I wore shorts!  The best gift that I got, serendipitously, was observing peer learning among the participants.  Now I have to compile an evaluation of the program because I want to know, systematically, what the participants thought.  I took a lot of notes, and I know what needs to be added, what worked, and what didn’t.  I got a lot of spontaneous and unsolicited comments about the value of the program–so OK, I’ve got the qualitative feedback (e.g., 18 months ago I wouldn’t have thought of this; knowing I’m not alone in the questions I have helps; I can now find an answer…).  Once I get the quantitative feedback, I’ll triangulate the comments, the quantitative data, and any other data I have.  I am hoping to USE these findings to offer the program again.  More on that later.

 

An update on my making-a-difference query.  I’ve gotten a couple of responses and NO examples.  One response was about not using page views as a measure of success; instead, use average time viewing a page.  A lot of responses assume that this is a marketing blog.  Since evaluation is such a big part of marketing, I can see how that fits.  Only, this is an evaluation blog.  I’m not reposting the survey; it has been closed for weeks and weeks.  I was hoping for examples of how the blog changed your thinking, practice, or world view.

 

Also, just so you know, I was in an auto accident 24 hours after I returned from Tucson.  Mersedes and I have aches and pains and NO serious injuries.  We do not have a car anymore.  Talk about evaluating an activity–think about what you would do without a car (or, if you don’t have one, what you would do with one).  I had to.