Despite our fancy technology, there are some pieces of data we have to gather the old-fashioned way: by asking visitors. One thing we’d like to know is why visitors chose to come on this particular occasion. We’re building on John Falk’s museum visitor motivation and identity work, which began with a survey asking visitors to rate, on Likert (1-5) scales, how applicable a series of statements are to them that day; that work revealed a rather small set of motivations driving the majority of visits. We have also used this framework in a study of three of our local informal science education venues, finding that an abbreviated version works equally well to determine which (if any) of these motivations drives a visitor. The latest version, tried at the Indianapolis Museum of Art, pairs photos with the abbreviated set of statements to help visitors identify their visit motivations.

We’re implementing a version on an iPad kiosk in the Visitor Center for a couple of reasons. First, we genuinely want to know why folks are visiting, and we want to be able to correlate identity motivations with the automated behavior, timing, and tracking data we collect from the cameras. Second, we hope people will stop long enough for us to get a good reference photo for the facial recognition system. Sneaky, perhaps, but it’s not the only place we’re trying to position cameras for good reference shots. And if all goes well with our signage, visitors will be more aware than ever that we’re doing research, and that it is ultimately aimed at improving their experience. We hope that awareness will allay whatever concerns remain about the embedded research tools, concerns we expect to be minimal to start with.

Last weekend a number of us headed off to the Oregon coast for the FCL annual retreat. This year it was held at William M. Tugman State Park near Winchester Bay, OR. As true Oregonians, we stayed in yurts and ran our activities outdoors. Although it was a little chilly (hey, it IS the Oregon coast!), the weather was beautiful and good times were had by all.


The FCL retreat is a student-led professional development opportunity built around a mix of grad-student and social activities. It’s also a chance for us to get to know each other a little better and enjoy some hang-out time for community-building across the FCL-related programs at OSU. Over 20 people attended this year, including Dr. Rowe, Dr. John Falk, and Dr. Lynn Dierking, as well as partners, dogs, and babies, which made for a family atmosphere as well as an academic one! The annual retreat started last year at the Oregon Hatchery Research Center in Alsea, OR, and we hope it will become a tradition for years to come.


Activities centered on a variety of topics and included:

  • Team building
  • Grant writing
  • Sensory drawing
  • Principles of interpretation
  • Working with culturally and linguistically diverse populations
  • Irish dancing
  • Night hiking
  • Yoga
  • Health

Plus, a couple of extra fun campfires and lots of eating! A big thank you to everyone who helped organize and/or participated in the retreat. Some of the highlights included creating interpretive sculptures with modeling clay, watching everyone try to dance in unison during Irish dancing whilst falling over their own feet, and learning some crazy new things we never knew about each other in Dr. Dierking’s icebreaker game. We also discovered that Laia is amazing at cooking chili over a fire, and that Dr. Rowe makes a mean burger!

Check out our photos here. You will also find them on our Facebook page.


For those of you just joining us, I’m developing a game called Deme for my master’s project. It’s a tactical game that models an ecosystem, and it’s meant primarily for adults. I’m studying how people understand the game’s mechanics in relation to the real world, in an effort to better understand games as learning and meaning-making tools.

I stumbled across Roll20 quite by accident while reading the PA Report. What I like about Roll20 is that your table session can apparently be shared as a link (I haven’t started digging yet, as I only found out about it a few hours ago). Also, each token can be assigned a hit counter; damage tracking is something of a hassle in Deme’s current incarnation.

I’ll have more to report after I play around with this for a while. Moving the game from one incarnation and environment to another has forced me to think of it as a system, rather than a product. I want Deme to be portable, and a robust system can be used with just about any tabletop, real or virtual. For an example of a game system, see Wizards of the Coast’s d20 System. The d20 System happens to be a handy model for quantizing events and behaviors—handy enough to inform the data collection framework for our observation systems in the Visitor Center.
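For readers who haven’t run into it, the core trick of a d20-style system is that it quantizes a messy situation into one discrete, recordable outcome. Here is a minimal sketch of that idea in Python; the names (Check, resolve) and the predator example are purely illustrative, not code from Deme or from the Visitor Center observation framework.

```python
import random
from dataclasses import dataclass


@dataclass
class Check:
    """One quantized event: a d20 roll plus a modifier against a target number."""
    modifier: int          # situational bonus or penalty
    difficulty_class: int  # threshold the roll must meet or beat


def resolve(check: Check) -> bool:
    """Roll 1d20, add the modifier, and report success or failure."""
    roll = random.randint(1, 20)
    return roll + check.modifier >= check.difficulty_class


# Example: quantizing "does the predator catch its prey this turn?" to pass/fail.
print(resolve(Check(modifier=3, difficulty_class=15)))
```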

Of course, Deme cannot be run single-player as a tabletop game. That’s a double-edged sword. A tabletop game (even a virtual one) is an immediate social experience. A single-player game is a social experience too, but it’s an asynchronous interaction between the developer(s) and the player. I rather like the tabletop approach because each species has a literal voice. The unearthly torrent of resulting qualitative data may be tough to sort out, but I think that’s a good problem to have so long as I know what I’m looking for.

At this phase, the tabletop version is still officially—as much as I can make something official—just a pilot product. I don’t know if it will become something more, but I feel like it deserves a shot.

If you think you get low response rates for research participants at science centers, try recruiting first- and second-year non-science-major undergrads in the summer. So far, since posting my first flyers in May, I have gotten 42 people to even visit the eligibility survey (either by Quick Response (QR) code or by tinyurl), and a miserable 2 have completed my interview. I only need 18 participants total!

Since we’re a research outfit, here’s the breakdown of the numbers:

Action                        Number   Percentage of those viewing survey
Visit eligibility survey          42   100
Complete eligibility survey       18    43
Schedule interview                 5    12
Complete interview                 2     5
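If you want to reproduce those percentages, here is a quick sketch; the counts are simply hard-coded from the table above, not pulled from the actual survey system.

```python
# Recruitment funnel; percentages are relative to everyone who visited
# the eligibility survey. Counts are copied from the table above.
funnel = [
    ("Visit eligibility survey", 42),
    ("Complete eligibility survey", 18),
    ("Schedule interview", 5),
    ("Complete interview", 2),
]

total = funnel[0][1]
for action, count in funnel:
    print(f"{action:<30} {count:>3} {100 * count / total:>6.0f}%")
```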

Between scheduling and completing, I’ve had 2 no-shows and 1 person who was actually an engineering major and didn’t read the survey correctly. I figure that most of the people who visit the survey but don’t complete it realize they are not eligible (and didn’t read the criteria on the flyer), which is OK.

What is baffling and problematic is the low percentage who complete the survey but then don’t respond to schedule an interview – the dropoff from 18 to 5. I can only figure that they aren’t expecting the Doodle poll of available time slots I send via email, don’t find it, or don’t connect it with the study. It might go to junk mail, or it may not be clear what the poll is about. There’s a section at the end of the eligibility survey letting folks know a Doodle poll is coming, and I’ve sent it twice to most folks who haven’t responded. I’m not sure what else I can do, short of telephoning the people who gave me phone numbers. Honestly, I think that’s my next move.

Then there are the no-shows, which is just plain rude. One did email me later and ask to reschedule, and that interview did get done. Honestly, this part of “research” is no fun; it’s just frustrating. However, this is the week before school starts in these parts, so I will probably set up a table in the Quad with my computer soon and recruit and schedule people there. That might not solve the no-show problem, but if I can get 100 people scheduled and half of them no-show, I’ll have a different, much better problem: cancelling on everyone else! I’m also asking friends who are instructors to let their classes know about the project.

On a side note to our regular readers, as it’s been almost a year of blogging here, we’re refining the schedule a bit. Starting in October, you should see posts about the general Visitor Center research activities by any number of us on Mondays. Wednesdays and Fridays will most often be about student projects for theses and such. Enjoy, and as always, let us know what you think!


Katie Woollven tells us how she’s learning more about getting everyone DOING science research, aka Citizen Science or Public Participation in Scientific Research:

“I’ve been interested in Citizen Science research since I began my grad program, so I was really excited to attend the Public Participation in Scientific Research (PPSR) conference Aug 4-5 in Portland. The speakers were great, and it was nice to see how my questions about citizen science fit with the current research in this field.

Although public participation has been important to science throughout history and is NOT new, the field of research on citizen science IS relatively new, and it is somewhat disjointed. Researchers in this field lack a common language (prime example: should we call it PPSR or citizen science?), which makes it difficult to stay abreast of the latest research. There have been calls for a national PPSR organization, and one of the conference goals was to get feedback from people in the field about what they would want such an organization to do.

One of my favorite talks was from Heidi Ballard of UC Davis, who is interested in all the possible individual, programmatic, and especially community-level outcomes of PPSR projects. She asked questions about the degree and quality of participation, such as: Who participates in these projects, and in what parts of the scientific process? Whose interests are being served, and to what end? Who makes the decisions, and who has the power?

Another interesting part of Heidi’s talk came when she touched on the relative strengths of the three models of PPSR projects. Citizen science projects can be divided into three categories (see the 2009 CAISE report): contributory (generally designed by scientists, with participants collecting data), collaborative (also designed by scientists, but with participants possibly involved in project design, data analysis, or communicating results), and co-created (designed by scientists and participants together, with some participants involved in all steps of the scientific process). I found this part fascinating, because I think learning from the strengths of all three models can make any program more successful. And of course, learning about different citizen science projects during the poster sessions was really exciting! Below are a few of my favorites.

PolarTREC – K-12 teachers go on a 2- to 6-week science research expedition in a polar region and then share the experience with their classroom. I think this is really interesting because of the motivational aspect of kids participating in (and, according to Sarah Crowley, even improving) authentic scientific research.

Port Townsend Marine Science Center Plastics Project – Volunteers sample beaches for micro-plastics around the US Salish Sea. I’ve heard a lot about this center, and the strength of their volunteer base is amazing.

Nature Research Center, North Carolina Museum of Natural Sciences – I really want to visit this museum! Visitors can engage in the scientific process on the museum floor, in one case by making observations on a video feed from a field station.”

Conference talks, poster abstracts, and videos

Katie Woollven is in the Marine Resource Management program, focusing on Marine Education.

ed. note – apologies for the sporadic postings these last few days. Katie Stofer has been out of town, and things weren’t quite as well set up for other lab members to start posting themselves.

Pulling it all together and making sense of things is proving to be one of the hardest tasks for Julie:

“I can’t believe this summer is about over.  I only have 3 days left at Hatfield.  Those 3 days will be filled with frantic work getting the rest of my exhibit proposal pulled together as well as my Sea Grant portfolio and presentation done for Friday.  I go home Saturday morning and I haven’t even figured out when I’m going to pack.  Eek.

But back to the point at hand.  Doing social science has been such a fun experience.  I really loved talking to people to get their feedback and opinions on Climate Change and the exhibit.  I’m so excited for this exhibit.  I want it to be fantastic and I’ve been working very hard on it.  I am stoked to visit next summer to see it in the flesh!

One thing that I find really challenging about doing this kind of research, though, is pulling the data together and putting it into a readable format for something like my End of Summer Final Presentation on Friday! The big survey I did, for instance, was 16 questions, and the data collected is very qualitative and doesn’t fit neatly into a table on a PowerPoint slide. So I have to determine which things to pull out to show and exactly how to do it. I feel confident that I’ll get it down; it’s just going to perhaps rob me of some sleep over the next couple of days.

Today (Tuesday) I finally got to do something that I should’ve done long ago. Mark took me into the “spy room,” as some call it, and showed me all the awesome video footage being recorded in the Visitor Center. It’s really incredible! I was able to download a few videos of myself interpreting at the touch tank, which Mark suggested would be a good addition to my portfolio. Now I feel like a real member of the Free Choice Learning crew.

This summer has given me a wealth of experiences that will really benefit my future… I can’t wait to see what that future holds.”