Summer is flying by and the hard work in the Cyberlab continues.  If you have been keeping up with previous posts, we have had researchers in residence as part of our Cyber Scholar program, movement on our facial recognition camera installations, and conference presentations taking place around the country and internationally.  Sometimes I forget just how remarkable unobtrusive audio and video collection methods are for the field of visitor research and exhibit evaluation until I talk to another researcher or educator working at another informal learning center.  The methods and tools we are applying have huge implications for streamlining these types of projects.  It is exciting to be part of an innovative effort to understand free-choice learning, and after a year in the lab I have gained several new skills, mostly by learning through doing.

As with any research, or any project really, there are highs and lows in trying to get things done and working.  Ideally, everything works the first time (or as soon as it is plugged in), there are no delays, and forward is the only direction.  In reality, of course, there are tool constraints, pieces to reconsider and reconfigure, and several starts and stops along the way to figuring it out.  There is no Cyberlab “manual” – we are creating it as we go – and this has been a great lesson for me personally in how I approach both personal and professional experiences, particularly future opportunities in research.

Speaking of research, this past week I started collecting the data that will go toward my Master’s thesis.  Since I am looking at family interactions and evidence of learning behaviors around the Ideum touch table, I am getting the chance to use the tools of the Cyberlab, but also to gain experience recruiting and interviewing visitors.  My data collection will run through the month of August, with sampling during morning and afternoon hours on every day of the week (a simple sampling frame, sketched below).  This will allow for a broad spectrum of visitors, though I am purposively sampling “multi-generational” family groups, meaning at least one adult and one child using the exhibit together.  After at least one minute of table use, I interview the group about their experience using the touch table, and I will later analyze the footage for the types of learning behaviors that may be occurring.
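For the curious, that sampling frame is simple enough to sketch in a few lines of Python. This is just an illustration; the year and the two fixed session windows are assumptions, not my exact schedule:

```python
from datetime import date, timedelta

# Sketch of an August sampling frame: a morning and an afternoon
# session on every day of the week (illustrative year; actual
# session times vary in practice).
sessions = ["morning", "afternoon"]
start, end = date(2014, 8, 1), date(2014, 8, 31)

schedule = [
    (start + timedelta(days=d), session)
    for d in range((end - start).days + 1)
    for session in sessions
]
print(len(schedule))  # 62 observation slots across the month
```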

During my observations, I have been reflecting on my time as an undergraduate conducting research in marine biology.  At that point, I was looking at the distribution and feeding habits of orange sea cucumbers in Puget Sound.  Now the “wildlife” I am studying is the human species, and as I sit and observe from a distance, I think about how wildlife biologists wait in the brush for the animal they are studying to approach, interact, and depart the area.  Over the course of my sampling sessions I am waiting for a family group to approach, interact, and depart the gallery.  There are so many questions I have been thinking about with regard to family behavior in a public science center.  How do they move through the space?  What exhibits attract particular age groups?  How long do they decide to stay in any particular area, and what do they discuss while they are there?  I am excited to begin analyzing the data I will get.  No doubt it will lead to more questions…

Maybe I’ve been around universities too long, but fall always seems like New Year’s to me.  Part of it, of course, is the excitement of a new school year – new classes, new students and colleagues, new projects.  Classes start this week in Corvallis, and I’m gearing up to teach a class I’ve taught many times before – Communicating Ocean Sciences with Informal Audiences.  If you are not familiar with the class, check out the website here.  One of the reasons I love teaching this class is that even though I was involved from the get-go in helping imagine and design it, it seems new every time I teach it.  Part of that is the constant tweaking that comes with reflecting on what we like and don’t like about our teaching.  But the COSIA class is also a great palette for thinking about and working on the whole variety of themes, ideas, and topics that emerge in informal science education and free-choice learning work.  The twin themes running through my head as I develop the class this year are identity and community.

We just learned last week that we were awarded a new NSF AISL grant called COASSTal Communities of Science. The project partners the FCL Lab with University of Washington researchers Julia Parrish and Jane Dolliver, who run COASST, a very successful and impressive citizen science project spanning beaches from Alaska through Northern California.  With this new grant, COASST is responding to its volunteers, the communities it serves, and national calls for citizen scientists to address the issue of marine debris in the Pacific Northwest.  COASST will be developing new protocols and modules for monitoring marine debris that should bring to that realm the same rigor and engagement its current program has been recognized for.  I’m excited because our role in this project is to carry out research on the recruitment and retention of citizen scientists in both COASST’s traditional programming and the new marine debris modules.  We’ll be looking at a host of factors that affect both, trying to understand the complex relationships among the personal, social, cultural, and ecological factors supporting the program.  I’m even more excited because we have developed an Activity Theory framework for the qualitative and quantitative parts of the study and will be looking explicitly at COASST as a community (or communities) of practice.  We’ll be researching participants’ identities vis-à-vis the science they are involved in and how those identities develop and change over time.

This research focus on communities of practice and identity change will inevitably shape the look and feel of the COSIA class this fall as well.  At the most basic level, we’ll all be working in the class to develop a short-term community of practice around communicating ocean sciences.  But at a larger level, the class itself is designed to help scientists and educators in graduate school at OSU develop identities as people who are comfortable and expert not only in their science, but also in communicating it.  For many folks who take the class, this means changing their understanding of a whole variety of things, from the nature of science to the nature of teaching and learning.  We are encouraging them to do nothing less than become a different kind of person, and they are learning that when we ask people to learn about OUR science, we may be asking them to become different kinds of people: the kind of people who care about and want to be involved in science.  And that’s identity change at work.  Once you recognize that, models of communication based on experts getting knowledge out to publics just don’t hold water anymore.  Communication is about shifting and shaping identities as much as about shaping knowledge.  That means the stakes are always higher than you think, and even the simple act of facilitating a density activity at a local museum might be as much about negotiating identity as about having fun with water!

A while ago, I promised to share some of my experiences collecting data on visitors’ exhibit use as part of this blog. Now that I’ve been back at it for the past few weeks, I thought it might be time to share what I’ve found. As it is winter here in the northern hemisphere, our weekend visitation at the Hatfield Visitor Center is generally pretty low. This means I have to time my data collection carefully if I don’t want to spend an entire day waiting for subjects and perhaps collect data on only two people. That’s what happened on a Sunday last month: the weather on the coast was lovely, and visitation was minimal. I have recently been collecting data in our Rhythms of the Coastal Waters exhibit, which poses additional data collection challenges: it is basically the last thing people might see before they leave the center, it’s dim because it houses the projector-based Magic Planet, and there are no animals, unlike just about every other corner of the Visitor Center. So, I knocked off early and went to the beach. I then rescheduled another planned collection day because it turned out to be a sunny weekend day at the coast.

On the other hand, on a recent Saturday we hosted our annual Fossil Fest. While visitation was down from previous years, only about 650 visitors compared to 900, this was plenty for me, and I was able to collect data on 13 people between 11:30 and 3:30, despite an octopus feeding and a lecture by our special guest fossil expert. Considering that a full data collection cycle, including recruitment, consent, the experiment, and debrief, probably runs 15 minutes, 13 sessions in four hours is close to back-to-back, and I thought that was a big win. In addition, I got only one refusal, from a group that said they were on their way out and didn’t have time. It’s amazing how much better things go if you a) lead with “I’m a student doing research,” b) mention “it will only take about 5-10 minutes,” and c) don’t record any video of them. I suspect it also helps that it’s not summer: this crowd is more local and thus perhaps more invested in improving the center, whereas summer tourists might be visiting more for the experience, to say they’ve been there, as John Falk’s museum visitor “identity” or motivation research would suggest. That seems like a motivation that would not make you all that eager to participate. Hm, sounds like a good research project to me!

Another reason I suspect things went well was that I was generally approaching only all-adult groups, and I needed just one participant from each group, so someone could watch the kids if they got bored. I did have one grandma get interrupted a couple of times by her grandkids, but she was a trooper and shooed them away while she finished. When I was recording video and doing interviews about the Magic Planet, the younger kids in a group often got bored, which made recruiting families and getting good data somewhat difficult, though I never had anyone quit early once they agreed to participate. Also, unlike when we were prototyping our salmon forecasting exhibit, I wasn’t asking people to sit down at a computer and take a survey, which seemed to feel more like a test to some people. Or it could have been the appeal of the exciting new technology I was using: the eye-tracker.

Interestingly, I also had a lot of folks observe their partners as the experiment happened rather than wander off and meet up later. Wandering off happened more with the salmon exhibit prototyping, perhaps because there was not much to see while one person was using the exhibit; with the eye-tracking and the Magic Planet, it was still possible to view the images on the globe because it is such a large exhibit. Will we ever solve the mystery of what makes the perfect day for data collection? Probably not, but it does present a good opportunity to reflect on what did and didn’t seem to work for getting the best sample of your visitorship. The cameras we’re installing are, of course, intended to shed some light on how representative these samples are.

What other influences have you seen that affect whether you have a successful or slow day collecting exhibit use data?

Last week, Dr. Rowe and I visited the Portland Art Museum to assist with a recruitment push for participants in their Conversations About Art evaluation, and I noticed that the education staff involved all had very different styles of recruiting visitors to participate in the project.  Styles ranged from the apologetic (“Do you mind if I interrupt you to help us?”), to the incentive-focused (“Get free tickets!”), to the experiential (“Participating will be fun and informative!”).

This got me thinking a lot about the significance of people skills and a researcher’s recruitment style in educational studies. How does the style in which you get participants involved influence a) how many participants you actually recruit, and b) the quality of the participation (i.e., do they just go through the motions to get the freebie incentive)? Thinking back to prior studies by FCL alumni here at OSU, I realized that nearly all the researchers I knew had a different approach to recruitment, be it in person, on the phone, or via email, and that it is in fact a learned skill we don’t often talk much about.

I’ve been grateful for my success at recruiting both docents and visitors for my research on docent-visitor interactions, which is mostly the result of taking the “help a graduate student complete their research” approach – one I borrowed from prior Marine Resource Management colleagues of mine, Abby Nickels and Alicia Christensen, during their master’s research on marine education activities. Such an approach won’t be much help once I finally get out of grad school, so the question to consider is: what factors make for successful participant recruitment? The common denominator seems to be people skills, and by people skills I mean the ability to engage a potential recruit in a way that removes the skepticism that comes with being commandeered off the street.  You have to be not only trustworthy, but also approachable. I’ve definitely noticed in my own work that on off days, when I’m tired and have trouble maintaining a smiley face for long periods at the HMSC entrance, recruitment seems harder. All those younger years spent in customer service jobs, learning how to deal with the public, seem so much more worthwhile now!

So, fellow researchers and evaluators, my question for you is: what are your strategies for recruiting participants? Do you agree that people skills are an important underlying factor? Do you over- or underestimate your own personal influence on participant recruitment?

How much progress have I made on my thesis in the last month? Since I last posted about my thesis, I have completed the majority of my interviews. Of the 30 I need, I have all but four completed, and three of the four remaining are scheduled. Of about 20 eye-tracking sessions, I have completed all but about seven, with probably three of the remaining scheduled. I also presented some preliminary eye-tracking findings at the Geological Society of America conference in a digital poster session. Whew!

It’s a little strange to have set a desired number of interviews at the beginning and to feel like I have to fulfill that number and only that number, rather than soliciting from a wide population and taking as many as I can get past a minimum. Now, if I were to get a flood of applicants for the “last” novice interview spot, I might want to risk overscheduling to compensate for no-shows (which, as you know, have plagued me). On the other hand, I risk having to cancel if an “extra” subject gets scheduled, which I suppose is not a big deal, but for some reason I would feel weird canceling on a volunteer – would it put them off volunteering for research in the future?
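The overbooking trade-off is easy enough to rough out with a little arithmetic. Here’s a minimal sketch, with the caveat that the 80% show-up rate is an assumption for illustration, not a figure from my own no-show record:

```python
from math import comb

def p_at_least(needed, scheduled, show_rate):
    """Probability that at least `needed` of `scheduled` volunteers
    show up, assuming each attends independently at `show_rate`."""
    return sum(
        comb(scheduled, k) * show_rate**k * (1 - show_rate)**(scheduled - k)
        for k in range(needed, scheduled + 1)
    )

# Needing 1 more interview, with no-shows assumed to run ~20%:
print(p_at_least(1, 1, 0.8))  # 0.80 -- schedule exactly one
print(p_at_least(1, 2, 0.8))  # 0.96 -- overbook by one, but risk canceling
```

So overbooking by one buys a decent jump in the odds of filling the slot, at the cost of that awkward cancellation call.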

Next up is processing all the recordings, backing them up, and getting them transcribed. I’ll need to create a rubric to score the informational answers as something along the lines of 100% correct, partially correct, or not at all correct. Then it will be coding: finding patterns in the data, categorizing those patterns, and asking someone to serve as a second coder to verify my codebook and coding once I’ve made a pass through all of the interviews. Then I’ll have to decide whether the same coding applies equally to the questions I asked during the eye-tracking portion, since I didn’t dig as deeply to root out understanding as I did in the clinical interviews, though I still asked participants to justify their answers with “how do you know” questions.
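As for that second-coder check, a statistic like Cohen’s kappa is one common way to quantify agreement once we have both coded the same transcripts. A minimal sketch, assuming the three-level rubric above and made-up codes (scikit-learn is just one convenient implementation, not necessarily what I’ll use):

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical codes for the same five answers from me and a second
# coder, using shorthand for the rubric levels described above.
my_codes    = ["correct", "partial", "incorrect", "correct", "partial"]
their_codes = ["correct", "partial", "correct",   "correct", "incorrect"]

# Cohen's kappa corrects raw percent agreement for chance agreement.
kappa = cohen_kappa_score(my_codes, their_codes)
print(f"Cohen's kappa: {kappa:.2f}")
```

Values above roughly 0.6 are often read as substantial agreement, though where to draw that line is itself a judgment call.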

We’ll see how far I get this month.

I think what finally turned the tide for me in recruitment was emailing specific colleges at OSU. I had been confused because I thought I wasn’t allowed to email students at all, but it turns out I just wasn’t allowed to use the “All-Student-Email” list. Sending emails to particular department administrators to forward to their own lists is apparently perfectly kosher, if not exactly unbiased recruitment. It generated a flurry of responses, 50 or so in a few days, with maybe 20% of them from men (going by names only). Email to fraternities, however, seemed to be a dud (I’m not even sure any of the messages got forwarded), unless it just took a few days for the guys to sign up and I am confusing them with the ones I thought came from the department emails.

The best scheduling method so far has been calling the folks who provided a telephone number. I got one person on the phone who recalled seeing the Doodle poll I had sent with available interview times, but he also said he wasn’t sure what it was about. So, despite the end of the sign-up survey noting that a Doodle poll would be sent, that information, again, seemed to get overlooked.

Another rather wasted recruiting effort was sitting with a sign in the Dutch Bros. coffee shop, even though I was offering gift cards to their establishment for participation. One engineer inquired why I wasn’t signing up engineers, but otherwise, no bites. Ditto for hanging out in the dining hall; one guy eyed the sign but said he wasn’t a Dutch Bros. guy. Cash, it seems, is king, as long as you can convince your funding source you are not laundering money (hint: get receipts).

Now the question is whether all of them will show up. So far, I’ve had one no-show since the phone calls for scheduling. For the rest of the week I have about six more interviews, which will get me pretty close to finished if everyone shows up. I’m sending email reminders the day before, so I’m crossing my fingers.