At a student conference recently, a man in a plaid jacket with elbow patches was very upset about my poster.  He crossed his arms and made a lot of noises that sounded like “Hmph!”  I asked if he had any questions.  “Where’s your control?? How can you say anything about your results without having a control group?”

The natural sciences and social sciences share common roots in the discipline of philosophy, but the theoretical underpinnings and assumptions of the two fields are completely different.  I don’t know what happened when science and psychology were young siblings, but man, those are two separate monsters now.

Just so you know, I have never taken the Qualitative Methods course.  I have never conducted interviews or analyzed qualitative data.  But I find myself in situations where I need to explain and defend this type of research, as I plan to interview citizen science volunteers about their experience in the program.  I am learning about the difference between qualitative and quantitative data with each step of my project, but there is a LOT I still need to learn.

Here’s what I know about interviews:

  1. They are an opportunity for the participant to think about and answer questions they may literally never have thought about before.  Participants create/reflect on reality and make sense of their experience on the spot, and share that with the interviewer.  Participants are not revealing something to you that necessarily exists already.  Interviewers are not “looking into the mind” of the participant.
  2. It’s important to avoid leading questions, or questions where the answer is built in.  Asking a volunteer “Tell me how you got interested in volunteering…” assumes they were interested when they started volunteering.  Instead, you can ask them to provide a narrative of the time they started volunteering.  When volunteers respond to the prompt “Tell me about when you started volunteering with this program…”  they may tell you what interested them about it, and you can follow up using their language for clarification.  Follow-up and probing questions are the most important.  Good default probes include “Tell me more about that,” and “What do you mean by that?”
  3. You don’t necessarily set the sample size ahead of time, but wait for data saturation.  Let’s say you do 12 interviews and participants all give completely different answers.  You do 12 more interviews and you get fewer new types of responses.  You do 12 more and you don’t get any new types of responses.  You might be done!  Check for new discrepant evidence against your existing claims or patterns.
  4. Reporting qualitative data involves going through your analysis claim by claim, and backing each claim with (4-5 paragraphs of) supporting evidence from the interviews.  I’ve read that there’s no one right way to analyze qualitative data, and your claims will be valid as long as they represent consistent themes or patterns that are supported by evidence.  Inter-rater reliability is another way to check the validity of claims.
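Since I keep having to explain inter-rater reliability to quantitative folks, here’s a minimal sketch of one common agreement statistic, Cohen’s kappa, computed by hand.  The codes and ratings below are invented purely for illustration – they aren’t from any real study:

```python
# Cohen's kappa: agreement between two coders, corrected for chance.
# The code labels ("interest", "social", "place") are made up for this example.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' category assignments."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed proportion of excerpts the two coders labeled identically.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Agreement expected by chance, from each coder's label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

a = ["interest", "social", "interest", "place", "social", "interest"]
b = ["interest", "social", "place",    "place", "social", "interest"]
print(round(cohens_kappa(a, b), 2))  # → 0.75
```

A kappa of 1 means perfect agreement and 0 means agreement no better than chance; many methods texts treat values above roughly 0.7 as acceptable, though conventions vary.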

And to the man in the plaid jacket, there are plenty of fields within the natural sciences that are similar to qualitative research in that they are descriptive, like geology or archeology, or in that it may be impossible to have a control, like astronomy.

Let me know what your experience is defending qualitative research, and what your favorite resources are for conducting interviews!

When thinking about creating outreach for a public audience, who should the target audience be? What types of questions can you ask yourself to help determine this? Is it ok to knowingly exclude certain age groups when you are designing an outreach activity? What setting is best for my outreach activity? How many entry or exit points should my activity have? Should there be a take-away object or just a take-away message? How long should the outreach activity run? How long will people stay once they complete my activity? What types of materials are ok to use with a public audience? For example, is there anything I should avoid, like peanuts? Am I allowed to touch the people doing the activity to help them put something on to complete it? What needs to be washed between each activity to avoid spreading germs? How much information should I “give away” about the topic being presented? What types of questions should I ask the participants about the activity or the information around it? How much knowledge can I assume the audience has about the topic? Where do I find this out? What are some credible resources for creating research-based educational activities?

These are some of the questions I was asked today during a Pre-college Program outreach meeting by another graduate student who works with me on OSU’s Bioenergy Program. Part of our output for this grant is to create and deliver outreach activities around Bioenergy. We plan on utilizing the connections among SMILE, Pre-college Programs and Hatfield Marine Science Center, since outreach opportunities already exist within these structures. As we were meeting, it dawned on me that someone who has never been asked to create an outreach activity as part of their job may find the task overwhelming. As we worked through the questions, activities and specific audience needs of the scheduled upcoming outreach, it was both rewarding and refreshing to hear the ideas and thoughts of someone new to the field of outreach.

What are some questions you have when creating outreach? What are some suggestions for creating outreach for the general public versus middle school students versus high school students? Do you have any good resources you can share? What are your thoughts?

Last week, Dr. Rowe and I visited the Portland Art Museum to assist with a recruitment push for participants in their Conversations About Art evaluation, and I noticed that the education staff involved all have very different styles of recruiting visitors to participate in the project. Styles ranged from the apologetic (e.g. “do you mind if I interrupt you to help us?”), to the incentive-focused (e.g. “get free tickets!”), to the experiential (e.g. “participating will be fun and informative!”).

This got me thinking a lot about the significance of people skills and a researcher’s recruitment style in educational studies this week. How does the style in which you get participants involved influence a) how many participants you actually recruit, and b) the quality of the participation (i.e. do they just go through the motions to get the freebie incentive)? Thinking back to prior studies by FCL alumni here at OSU, I realized that nearly all the researchers I knew had a different approach to recruitment, be it in person, on the phone, or via email, and that it is in fact a learned skill we don’t often talk much about.

I’ve been grateful for my success at recruiting both docents and visitors for my research on docent-visitor interactions, which is mostly the result of taking the “help a graduate student complete their research” approach – one I borrowed from my prior Marine Resource Management colleagues, Abby Nickels and Alicia Christensen, during their master’s research on marine education activities. Such an approach won’t be much help once I finally get out of grad school, so the question to consider is: what factors make for successful participant recruitment? It seems the common denominator is people skills, and by people skills I mean the ability to engage a potential recruit on a level that removes the skepticism around being commandeered off the street. You have to be not only trustworthy, but also approachable. I’ve definitely noticed with my own work that on off days, when I’m tired and have trouble maintaining a smiley face for long periods at the HMSC entrance, recruitment seems harder. All those younger years spent in customer service jobs, learning how to deal with the public in general, seem so much more worthwhile now!

So fellow researchers and evaluators, my question for you is: what are your strategies for recruiting participants? Do you agree people skills are an important underlying factor? Do you over- or underestimate your own personal influence on participant recruitment?

We spent this morning doing renovations on the NOAA tank. We deep cleaned, rearranged rocks and inserted a crab pot to prepare for the introduction of some tagged Dungeness crabs. NOAA used to be a deep-water display tank with sablefish and other offshore benthic and epibenthic species, but it has lost some of its thematic cohesion recently. Live animal exhibits bring unique interpretive complications.

All in-tank elements must meet the needs and observable preferences of the animals. This is an area where we cannot compromise, so preparations can take more time and effort than one might expect. For example, our display crab pot had to be sealed to prevent corrosion of the chicken wire. This would not be an issue in the open ocean, but we have to consider the potential effects of the metal on the invertebrates in our system.

Likewise, animals that may share an ecosystem in the ocean might seem like natural tankmates, but often they are not. One species may prey on the other, or the size and design of the tank may bring the animals into conflict. For example, we have a kelp greenling in our Bird’s Eye tank who “owns” the lower 36 inches of the tank. If the tank were not deep enough, she would not be able to comfortably coexist with other fish.

We’re returning the NOAA tank to a deep-water theme based on species and some simple design elements. An illusion of depth can be accomplished by hiding the water’s surface and using minimal lighting. The Japanese spider crab exhibit next door at Oregon Coast Aquarium also makes good use of these principles. When this is done right, visitors can get an intuitive sense of the animals’ natural depth range—regardless of the actual depth of the tank—before they even read the interpretive text.

We’re also using a new resident to help us clean up. The resident in question is a Velcro star (Stylasterias spp.) that was donated a couple of months back. It is only about eight inches across, but the species can grow quite large. Velcro stars are extremely aggressive, and will even attack snails and the fearsome sunflower stars (Pycnopodia helianthoides) that visitors know from our octopus tank. Our Velcro star will, we hope, cull the population of tiny marine snails that have taken over the NOAA tank’s front window in recent months.

Colleen has been very proactive in taking on major exhibit projects like this, and she has recruited a small army of husbandry volunteers—to whom I’ll refer hereafter as Newberg’s Fusiliers—to see them through. Big things are happening on all fronts, and with uncommon speed.

In the last couple of weeks Katie and I have been testing some options for capturing better quality visitor conversation for the camera system using external mics.

As Katie mentioned last month, each camera’s built-in microphones are proving a little unfruitful at capturing good quality audio for the eventual voice recognition system in “hot-spot” areas such as the touch tanks and front desk. As a result, we purchased some pre-amplified omni-directional microphones and set about testing their placement and audio quality in these areas. This has been no easy process, as the temporary wiring we put in place to hook the mics to the cameras is not as aesthetically pleasing in a public setting as one might hope, and we discovered that the fake touch tank rocks are duct tape’s arch enemy. Plus the mics have been put through their paces by various visitor kicks, bumps and water splashes.

As well as the issue of keeping the mics in place, testing has also meant a steep learning curve in mic level adjustment. When we initially wired them up, I adjusted each mic (via a mixer) one by one to reduce “crackly” noises and distortion during loud conversations. However, I later realized that this adjustment overlooked necessary changes to the cameras’ own audio settings and gain, which affect just how close a visitor has to be to one of the mics to actually be heard, particularly over the constant noise of running water around the tanks.

So today I am embarking on a technical adventure. Wearing wireless headphones and brandishing a flathead screwdriver, I am going to reset all the relevant cameras’ audio settings to a zero gain, adjust the mic levels for mic balance (there are multiple mics per camera) rather than crackly noises, and adjust the gain until the sample audio I pull from the camera system comes out cleaner. I’m not expecting to output audio with the clarity of a seastar squeak, but I will attempt to get output that allows us to capture focal areas of clear conversation, even with the quietest of visitors. Avast me hearties, I be a sound buccaneer!
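For the curious, the kind of level check I’m doing by ear can also be roughed out in code. This is purely an illustrative sketch – the signal and the clipping threshold are made up, and it’s not how our camera system actually works – but it shows the two numbers I’m effectively judging when deciding whether a gain needs nudging up or down:

```python
import math

def level_check(samples, clip=0.99):
    """RMS and peak of normalized samples in [-1, 1]; flags likely clipping."""
    peak = max(abs(s) for s in samples)
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return {"rms": rms, "peak": peak, "clipping": peak >= clip}

# Fake "quiet visitor over running water" sample: a small 220 Hz tone
# sampled at 8 kHz for one second (all values invented for illustration).
quiet = [0.05 * math.sin(2 * math.pi * 220 * t / 8000) for t in range(8000)]
report = level_check(quiet)
print(report["clipping"])  # → False: there is headroom to raise the gain
```

A low RMS with no clipping flag suggests room to bring the gain up; a peak sitting near full scale means the crackle is back and the gain needs to come down.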

Well, the data collection for my research has been underway for nearly two months now – how time flies! For those of you new to this project, my research centers on documenting the practice of science center docents as they interact with visitors. Data collection includes video observations of voluntary docents at HMSC using “visitor-mounted” looxcie cameras, as well as pre- and post-observation interviews with those participating docents.

“Visitor-eye view using the looxcies”

My current focus is collecting the video observations of each of the 10 participating docents. In order to conduct a post-observation interview (which asks docents to reflect on their practice), I need about 10-15 minutes of video data of each docent interacting with the public. This doesn’t sound like much, but when you can’t guarantee a recruited family will interact with a recruited docent, and an actual interaction will likely only last from 30 seconds to a few minutes, it takes a fair few families wearing cameras to get what you need. However, I’m finding this process really enjoyable, both in getting to know the docents and in meeting visitors.

When I first started this project I was worried that visitors would be a little put off by the idea of having their whole visit recorded. What I’m actually finding is that either a) they want to help the poor grad student complete her thesis, b) they think the cameras are fun and “want a go” or c) they totally want one of the HMSC tote bags being used as an incentive (what can I say, everyone loves free stuff, right?!) The enthusiasm for the cameras has gone as far as one gentleman running up to a docent, jumping up and down and shouting “I’m wearing a camera, I’m wearing a camera!” Additionally, for the Star Trek fans out there, a number of visitors and colleagues alike have remarked how much wearing a looxcie makes a person look like a Borg (i.e. cyborg), particularly with that red light thing…

Now how, you may ask, does that not influence those lovely naturalistic interactions you’re supposed to be observing? Well, as many of us qualitative researchers know, unless you hide the fact that you are observing a person (an element our IRB process is not particularly fond of), you can never truly remove that influence, but you can assume that if particular practices are observed often enough, they are part of the landscape you are observing. The influence of the cameras may alter how naturalistic an interaction is, but that interaction is still a reflection of social behaviors taking place. People do not completely change their personality and ways of life simply because a camera is around; more likely, any behavior changes are simply over- or under-exaggerated normative actions. And I am finding patterns, lots of patterns, in the discourse and action taking place between docents and visitors.

However, I am paying attention to how visitors and docents react to the cameras. When filtering the footage for interactions, I look out for any discourse that indicates camera influence is an issue. For example, the docent in the “jumping man” footage reacts with surprise to the man’s sudden shouting, opens his eyes wide and nervously laughs – to which I noted on the video that the interaction from then on may be irregular. In one clip I have a docent talking non-stop about waves, seemingly without taking a breath, for nearly 8 minutes – which I noted seemed unnatural in comparison to their other, shorter dialogue events. Another clip has a docent bursting out laughing at a visitor wearing one of the looxcies attached to his baseball cap using a special clip I have (not something I expected!) – to which I noted that forgetting about the looxcie would likely have been much harder for that visitor.

All in all, however, most visitors remark that they actually forget they are wearing the camera as their visit goes on, simply because they are distracted by their actual visit. This makes me happy, as the purpose of incorporating the looxcies was to reduce the influence of being videoed as a whole. Visitors forget to the point where, during pilots, one man actually walked into the bathroom wearing his looxcie, and recorded some footage I wasn’t exactly intending to observe… suffice to say, I instantly deleted that video and updated my recruitment spiel to include a reminder not to take the cameras into the bathroom. Social science never ceases to surprise me!