So last week I posted about the evaluation project underway at the Portland Art Museum (PAM), and I wanted to give a few more details about how we are using the Looxcie cameras.

Looxcies are basically Bluetooth headsets, just like the ones commonly used with cell phones, but with a built-in camera. I am currently using them as part of my research encompassing docent-visitor interactions, and I chose them as a data collection tool because of their ability to generate a good-quality “visitor-eye view” of the museum experience. I personally feel their potential as a research and evaluation tool in informal settings is endless, and I recently had some wonderful conversations with other education professionals at the National Marine Educators Association conference in Anchorage, AK, about where some other possibilities could lie – including professional development practice for educators and exhibit development.

At PAM, the Looxcies will be used to capture that view when visitors interact with exhibition pieces, specifically those related to the Museum Stories and Conversations About Art video-based programs. Fitting visitors with Looxcies will enable us to capture the interactions and conversations visitors have about the art on display as they move through the museum. The video data can then be analyzed for recurring themes in what and how visitors talk about art in the museum setting.

During our meeting with Jess Park and Ally Schultz at PAM, we created some test footage to help train other museum staff on the evaluation procedures. In the clip below, Jess and Ally are looking at and discussing some sculptures; both wore Looxcies to get a sense of how the cameras feel to the user. This particular clip is from Ally’s perspective, and you’ll notice that even Shawn and I have a go at butting in and talking about art with them!

What’s exciting about working with the Looxcies, and with video observations in general, is how much detail you can capture about the visitor experience, down to what visitors are specifically looking at, how long they look at it, and even whether they nod in agreement with the person they are conversing with. Multimodal discourse, eat your heart out!

Yesterday Shawn and I met with Jess Park at the Portland Art Museum (PAM) about an exciting new evaluation project utilizing our Looxcie cameras. We had some great conversations about how to capture visitor conversations and interactions in relation to PAM’s Museum Stories and Conversations About Art video-based programs. The project will be one of the first official evaluation partnerships we have developed under the flag of the FCL lab!

PAM has developed these video-based experiences to deepen visitors’ engagement with objects, with each other, and with the museum. Museum Stories features short video presentations of museum staff talking about specific objects in the collection that hold some personal meaning for them. All of the videos are available on touch-screen computers in one gallery of the museum, which also houses the areas where the stories are recorded as well as some of the objects featured in the stories. These videos are also available online. Conversations About Art is a series of short videos featuring conversations among experts focused on particular objects in the museum’s collection. These are available on handheld devices provided by the museum, as downloads to visitors’ personal handheld devices, and on the museum website. PAM is now looking to expand the program and wishes to document some of the predicted and unexpected impacts and outcomes of these projects for visitors. The evaluation will recruit visitors to wear the Looxcie cameras during their visits to the pertinent exhibits, including the object stories exhibit. We will likely also interview some of the experts and artists involved in creating the videos.

We spent time going over the Looxcie technology and how best to recruit visitors in the Art Museum space. We also created some test clips to help the PAM folks working on the evaluation better understand the potential of the video data collection process. I will post a follow-up next week with some more details about how we’re using the Looxcies.

Shawn and I came back from PAM feeling like the A-Team – we love it when an evaluation plan comes together.

On this most summery of holidays, while her home state experiences powerful summer storms and heat waves, intern Diana adjusts to her summer home and job:

As a native Marylander, I have been thrown into an environment of cold Northwest water and weather. I was definitely not used to wearing pants in the summer or having my hooded sweatshirt as a necessity in my wardrobe.

The first challenge I faced was understanding how the West Coast works in terms of upwelling and why the water here is so cold. Once I understood this, I could understand how the biodiversity that lives and flourishes here is actually able to do so. I am still learning, and I probably will be for at least this summer if not longer, because the Oregon coast is a complex world.

The next step to fitting in here at Hatfield for the summer was to learn about the Visitor Center itself. I had to learn about the animals that live here and the activities and free-choice learning experiences on display, as well as what my project for the summer would be. That was a frustrating task in itself. I do know a good bit about marine biology and ecology, but this place is intense, mainly because I have seen only a few science centers and aquariums that use the water around them to supply their marine animal exhibits. Hatfield relies completely on the bay and its saltwater wedge. If something happens to the water in the bay, then all heck breaks loose in the science center, because that’s the water we use. I know it’s filtered a million times in many different ways, but sometimes things such as bacteria and invertebrates still make it through, and that’s what affects the marine environment.

Then there are the surprises I have gotten while working at this job for almost two weeks: the visitors. No matter how much changes in the center, from the animals to the water quality to their behavior, the visitors still surprise me the most. Each family and person is different, from the moment they walk in the door and are asked for a donation rather than an entrance fee. Some give a little; some give and wish they could give more. There are people from out of town who just want to see the octopus, and people from landlocked states who have never seen an estuary before. You also get visitors who know nothing about the Oregon coast or marine ecology. Then, before you know it, a kid comes in who knows more about sea stars than you ever will, no matter how much you studied. Each visitor has their own story, and that is what makes my job so exciting: not only is science ever-changing, but so are the people who want to learn.

Prototyping describes the process of creating a first-version exhibit, testing it out with visitors, and redesigning. Often we iterate this cycle several times, depending on our budgets of money and time. It’s usually a fruitful way to find out not only which buttons confuse people, but also what they enjoy playing with and which great ideas totally bomb with users.

The problem with prototyping, as with many data collection processes, is that you have to ask the right questions to get useful answers. We are currently redeveloping an interactive exhibit about how scientists use ocean data to predict salmon populations for future harvests. The first round of surveys revealed some areas of content confusion and some areas of usability confusion. Usability confusion is usually easy to rework, but content confusion is harder to resolve, especially if your survey questions were themselves confusing to visitors.

That was unfortunately the case with the survey I wrote, despite a few rounds of reworking it with colleagues. The multiple-choice questions were fairly straightforward, but the open-ended questions tripped people up, making the results harder to interpret and act on. The moral of the story? Prototype (a.k.a. pilot) your survey, too!

Harrison used an interesting choice of phrase in his last post: “time-tested.” As I watched the video they produced, including Bill’s dissection, I realized I don’t know what we’ve done to rigorously evaluate our live programming at Hatfield. But it is just this sort of “time-tested” program that our research initiatives are trying to sort out and put to the test. Time has proven its popularity; data is needed to prove its worth as a learning tool. A quick survey of the research literature doesn’t turn up much, though some science theater programming was the subject of older studies. Live tours are another related program type that could be ripe for investigation.

We all know, as humans who recognize emotions in others, how much visitors enjoy these sorts of programs and science shows of all types. However, we don’t always apply standards to our observations, such as measuring specific variables to answer specific questions. We have a general sense of “positive affect” in our visitors, but we don’t have any data, in the form of quotes or interviews with visitors, to back up our impressions. Yet.

A good example of another need for this came up in a recent dissertation defense here at OSU. Nancy Staus’s research looked at learning from live programs; she interviewed visitors after they watched a program at a science center. She found, however, that the presenter had a lot of influence on learning simply through the way they delivered the program: visitors recalled more topics, and more facts about each topic, when the presentation was more interactive than scripted. She wasn’t initially interested in differences of this sort, but because she had collected data on the presentations themselves, she was able to locate a probable cause for a discrepancy she noticed. So while this wasn’t the focus of her research (she was actually interested in the role of emotion in mediating learning), it points to the need for data not only to back up claims, but also to explain surprising results and open areas for further study.

That’s what we’re working for: that rigorously examining these and all sorts of other learning opportunities becomes an integral part of the “time-honored tradition.”

Mark and I did some guerrilla filmmaking this morning. Despite some hiccups and an uncooperative Sun, we got some good footage. As I type this, Mark is preparing these and other videos for the Sea Grant all-hands meeting tomorrow and Friday.

Communicating what we do is a big part of what we do.  This is ethically necessary for human-subjects research (see Katie’s post from Monday), and it’s also a great way to teach science as a process.  It’s a somewhat recursive approach that can be, oddly enough, difficult to communicate.  I like to think we do a decent job of it.

I think the key point, as always, is that we’re all in this together.  Visitors, researchers, students and educators each have a role to play in this thing we call “Science.”  Researchers can learn about natural phenomena from the observations of the general public, while the general public can learn about research and natural phenomena from our Visitor Center exhibits and outreach products.  It’s a two-way street—nay, a busy four-way, multi-lane intersection—and our job is to facilitate the flow of information in any direction.

Much of what we do is familiar and time-tested—Bill, resplendent in his bloodstained white lab coat, holding aloft the entrails of a found shark before a crowd of excited children.  Such childhood experiences with classic museum interpretation are what drew many of us into this field.

Hopefully, the new strategies and technologies we’re in the process of introducing will come to be equally accepted and enjoyed by visitors.