Happy New Year, everyone!

After all the fun and frivolities of the holiday season, I am left not only with the feeling that I probably shouldn’t have munched all those cookies and candies, but also with the grave realization that crunch time for my dissertation has commenced. I’d like to have it completed by spring and, just like Katie, I’ve hit the analysis phase of my research and am desperately trying not to fall into the pit of never-ending data. All you current and former graduate students out there, I’m sure you can relate – all those wonderful hours, weeks and months I have to look forward to, frantically trying to make sense of the vast pool of data I have spent the last year planning for and collecting.

 

But fear not! ’tis qualitative data, sir! And seeing as I have really enjoyed working with my participants and collecting data so far, I am going to attempt to enjoy discovering the outcomes of all my hard work. To me, the beauty of working with qualitative data is slowly developing a picture of the answers to the questions that initiated the research in the first place. It’s like a jigsaw puzzle where you have only a rough idea of what the image might look like at the end – you keep adding pieces until that image becomes clear. I’m looking forward to seeing that image.

So what do I have to analyze? Namely, ~20 interviews with docents, ~75 docent observations, ~100 visitor surveys and 2 focus groups (which will hopefully take place in the next couple of weeks). I will be using the research analysis tool NVivo, which will aid me in cross-analyzing the different forms of data using a thematic coding approach – analyzing for recurring themes within each data set. What I’m particularly psyched about is getting into the video analysis of the participant observations, where I’m finally going to get the chance to unpack some of that docent practice I’ve been harping on about for the last two years. Here, I’ll be taking a little multimodal discourse analysis and a little activity theory to break down the docent-visitor interactions and interpretive strategies observed.
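For anyone curious what that cross-analysis looks like mechanically, here is a minimal sketch – in plain Python rather than NVivo itself – of tallying recurring theme codes across the different data sources. The CSV file name and column names are purely illustrative assumptions, standing in for whatever a coding export might contain.

```python
# Minimal sketch of cross-tabulating thematic codes by data source.
# Assumes a hypothetical CSV export of coded excerpts ("codes.csv")
# with columns: source (e.g. "interview", "observation", "survey",
# "focus_group") and theme (the code applied to the excerpt).
import csv
from collections import Counter, defaultdict

theme_counts = defaultdict(Counter)  # source -> Counter of themes

with open("codes.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        theme_counts[row["source"]][row["theme"]] += 1

# Print the most frequent themes per data source, so codes that recur
# across several sources stand out for closer qualitative reading.
for source, counts in theme_counts.items():
    print(source)
    for theme, n in counts.most_common(5):
        print(f"  {theme}: {n}")
```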

Right now, the enthusiasm is high! Let’s see how long I can keep it up 🙂 It’s Kilimanjaro, but there’s no turning back now.

 

Here’s a roundup of some of our technology testing and progress lately.

First, reflections from our partners Dr. Jim Kisiel and Tamara Galvan at California State University, Long Beach. Tamara recently tested the iPad with QuestionPro/SurveyPocket, the Looxcie cameras and a few other apps to conduct surveys at the Long Beach Aquarium, which doesn’t have wifi in the exhibit areas. Here is Jim’s report on their usefulness:

“[We] found the iPad to be very useful. Tamara used it as a way to track, simply drawing on a PDF and indicating times and patterns using the app Notability. We simply imported a PDF of the floorplan and then duplicated it for each track. Noting much more than times, however, might prove difficult, due to the limited precision of a stylus. One thing that would make this even better would be having a clock right on the screen. Notability does allow for recording, with a timer that starts when the recording does. This actually might be a nice complement, as it allows for data collector notes during the session. Tamara was unable to use this feature, though, because the iPad could only run one recording device at a time – and she had the Looxcie hooked up during all of this.

Regarding the Looxcie: Tamara had mixed results with this. While it was handy to record remotely, she found that there were many signal drop-outs where the mic lost contact with the iPad. We aren’t sure whether this was a limitation of Bluetooth range, or whether there was just too much interference in the exhibit halls. While this setup would have been ideal for turning the Looxcie on and off remotely, the tendency to drop the connection between devices sometimes made it difficult to activate. As such, she often just turned on the Looxcie at the start of the encounter. It is also worth noting that Tamara used the Looxcie as an audio device only, and the sound quality was fine.
 
Tamara had mixed experiences with SurveyPocket. Aside from some of the formatting limitations, we weren’t sure how effective it was for open-ended questions. I was hoping there was a program that would allow for an audio recording of such responses. She did manage to create a list of key words that she checked off during the open-ended questions, in addition to jotting down what the interviewee said. This seemed to work OK. She also had some issues syncing her data – at one point, it looked like much of her data had been lost, due in part to … [problems transferring] her data from the iPad/cloud back to her computer. However, the support staff were helpful and eventually recovered the data.
 
Other things: the iPad holder (Handstand) was very handy, and people seemed OK with using it to complete a few demographic questions. Having the tracking info on the pad made it easier to juggle papers, although she still needed to bring her IRB consent forms with her for distribution. In the future, I think we’ll look to incorporate the IRB consent into the survey in some way.”
Interestingly, I just discovered that a new version of SurveyPocket *does* allow audio input for open-ended questions. However, OSU has recently purchased university-wide licenses from a different survey company, Qualtrics, which does not yet offer an offline app mode for tablet-based data collection. That feature seems to be in development, though, so we may change our minds about which company we go with when the QuestionPro/SurveyPocket license comes up for renewal next year. It’s amazing how much of the research I did on these apps last year is already out of date.
Along the same lines of software updates kinda messing up your well-laid plans, we’re purchasing a couple of laptops so we can do more data analysis away from the video camera system’s desktop computer and away from the eye tracker. That suddenly confronted us with the Windows 8 vs. Windows 7 dilemma – the software for both of these systems is Windows 7-based, but now that Windows 8 is out, the school had to make a call on whether or not to upgrade. Luckily for us, we’re skipping Windows 8 for the moment, so the new laptops will stay on Windows 7 and we can actually run the software on them; the camera and eye-tracker programs themselves likely won’t be Windows 8-ready until sometime in the new year.
Lastly, we’re still bulking up our capacity for data storage and sharing, as well as internet capacity for video data collection. I recently put in another new server dedicated to handling the sharing of data, with the two older servers acting as slaves and the cameras spread out between them. In addition, we put in a NAS storage system with five 3TB hard drives. Mark assures me we’re getting to the point of having this “initial installation” of stuff finalized …
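For a rough sense of what that storage buys us, here is a back-of-the-envelope sketch. The drive count and size come from the setup above; the RAID overhead and per-camera recording bitrate are assumptions plugged in purely for illustration.

```python
# Back-of-the-envelope storage estimate for the camera system.
# Known from the setup above: five 3 TB drives in the NAS.
# Assumed for illustration only: single-parity RAID (one drive's worth
# of overhead) and a ~4 Mbit/s recording bitrate per camera.
drives = 5
drive_tb = 3.0
usable_tb = (drives - 1) * drive_tb           # ~12 TB after single parity

bitrate_mbps = 4.0                            # assumed Mbit/s per camera
gb_per_hour = bitrate_mbps * 3600 / 8 / 1000  # ~1.8 GB per camera-hour

hours = usable_tb * 1000 / gb_per_hour
print(f"Usable space: ~{usable_tb:.0f} TB")
print(f"Recording capacity: ~{hours:,.0f} camera-hours of video")
```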

Well, data collection for my research has been underway for nearly two months now – how time flies! For those of you new to this project, my research centers on documenting the practice of science center docents as they interact with visitors. Data collection includes video observations of volunteer docents at HMSC using “visitor-mounted” Looxcie cameras, as well as pre- and post-observation interviews with those participating docents.

“Visitor-eye view using the looxcies”

My current focus is collecting the video observations of each of the 10 participating docents. In order to conduct a post-observation interview (which asks docents to reflect on their practice), I need about 10-15 minutes of video of each docent interacting with the public. This doesn’t sound like much, but when you can’t guarantee a recruited family will interact with a recruited docent, and an actual interaction will likely only last from 30 seconds to a few minutes, it takes a fair few families wearing cameras to get what you need. However, I’m finding this process really enjoyable, both in getting to know the docents and in meeting visitors.
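To put some rough numbers on “a fair few families”, here is a quick worked estimate. Every figure below is an assumption for illustration, not a measurement from the study.

```python
# Rough recruitment estimate -- all numbers below are illustrative
# assumptions, not measured values from the study.
import math

target_minutes = 12.5        # middle of the 10-15 minute target per docent
avg_interaction_min = 2.0    # assume a typical interaction runs ~2 minutes
encounter_rate = 0.5         # assume half of recruited families actually
                             # end up interacting with the recruited docent

interactions_needed = math.ceil(target_minutes / avg_interaction_min)
families_needed = math.ceil(interactions_needed / encounter_rate)

print(f"~{interactions_needed} interactions, "
      f"so roughly {families_needed} recruited families per docent")
```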

When I first started this project I was worried that visitors would be a little put off by the idea of having their whole visit recorded. What I’m actually finding is that either a) they want to help the poor grad student complete her thesis, b) they think the cameras are fun and “want a go”, or c) they totally want one of the HMSC tote bags being used as an incentive (what can I say, everyone loves free stuff, right?!). The enthusiasm for the cameras has gone as far as one gentleman running up to a docent, jumping up and down and shouting “I’m wearing a camera, I’m wearing a camera!” Additionally, for the Star Trek fans out there, a number of visitors and colleagues alike have remarked how much wearing a Looxcie makes a person look like a Borg (i.e. a cyborg), particularly with that red light thing…

Now how, you may ask, does that not influence those lovely naturalistic interactions you’re supposed to be observing? Well, as many of us qualitative researchers know, unless you hide the fact that you are observing a person (an element our IRB process is not particularly fond of), you can never truly remove that influence, but you can assume that if particular practices are observed often enough, they are part of the landscape you are observing. The presence of the cameras may alter how naturalistic an interaction is, but that interaction is still a reflection of the social behaviors taking place. People do not completely change their personality and way of life simply because a camera is around; more likely, any behavior changes are simply over- or under-exaggerated versions of their normal actions. And I am finding patterns, lots of patterns, in the discourse and action taking place between docents and visitors.

However, I am paying attention to how visitors and docents react to the cameras. When filtering the footage for interactions, I look out for any discourse that indicates camera influence is an issue. For example, the docent in the “jumping man” footage reacts with surprise to the man’s sudden shouting, opens his eyes wide and laughs nervously – so I noted on the video that the interaction from then on may be irregular. In one clip I have a docent talking non-stop about waves, seemingly without taking a breath, for nearly 8 minutes – which I noted seemed unnatural in comparison to their other, shorter dialogue events. Another clip has a docent bursting out laughing at a visitor wearing one of the Looxcies attached to his baseball cap with a special clip I have (not something I expected!) – which I noted would likely have made it harder for the visitor to forget about the Looxcie.

All in all, however, most visitors remark that they actually forget they are wearing the camera as their visit goes on, simply because they are absorbed in the visit itself. This makes me happy, as the purpose of incorporating the Looxcies was to reduce the influence of being videoed as a whole. Visitors forget to the point where, during pilots, one man actually walked into the bathroom wearing his Looxcie and recorded some footage I wasn’t exactly intending to observe… Suffice it to say, I instantly deleted that video and updated my recruitment spiel to include a reminder not to take the cameras into the bathroom. Social science never ceases to surprise me!

So last week I posted about the evaluation project underway at Portland Art Museum (PAM) and wanted to give a few more details about how we are using the Looxcie cameras.

 

Looxcies are basically Bluetooth headsets, just like the ones regularly seen used with cell phones, but with a built-in camera. I am currently using them as part of my research encompassing docent-visitor interactions, and decided to use them as a data collection tool because of their ability to generate a good-quality “visitor-eye view” of the museum experience. I personally feel their potential as a research/evaluation tool in informal settings is endless, and I had some wonderful conversations with other education professionals at the National Marine Educators Association conference in Anchorage, AK recently about where some other possibilities could lie – including as part of professional development practice for educators and exhibit development.

At PAM, the Looxcies will be used to capture that view as visitors interact with exhibition pieces, specifically those related to the Museum Stories and Conversations About Art video-based programs. Here, fitting visitors with Looxcies will enable us to capture the interactions and conversations visitors have about the art on display as they move through the museum. The video data gathered can then be analyzed for recurring themes around what and how visitors talk about art in the museum setting.

During our meeting with Jess Park and Ally Schultz at PAM, we created some test footage to help with training other museum staff for the evaluation procedures. In the clip below, Jess and Ally are looking at and discussing some sculpture pieces, and both were wearing Looxcies to get a sense of how they feel to the user. This particular clip is from Ally’s perspective, and you’ll notice even Shawn and I have a go at butting in and talking about art with them!

What’s exciting about working with the Looxcies, and with video observations in general, is how much detail you can capture about the visitor experience – down to what they are specifically looking at, how long they look at it, and even whether they nod their head in agreement with the person they are conversing with. Multimodal discourse analysis, eat your heart out!

 

Yesterday Shawn and I met with Jess Park at Portland Art Museum (PAM) about an exciting new evaluation project utilizing our Looxcie cameras. We had some great conversations about how to capture visitor conversation and interactions in relation to PAM’s Museum Stories and Conversations About Art video-based programs. The project will be one of the first official evaluation partnerships we have developed under the flag of the FCL lab!

PAM has developed these video-based experiences in order to deepen visitors’ engagement with objects, with each other, and with the museum. Museum Stories features short video presentations of museum staff talking about specific objects in the collection that have some personal meaning for them. All videos are available on touch-screen computers in one gallery of the museum, which also houses the areas where the stories are recorded as well as some of the objects featured in the stories. These videos are also available online. Conversations About Art is a series of short videos featuring conversations among experts focused on particular objects in the museum’s collection. These are available on hand-held devices provided by the museum, as downloads to visitors’ personal hand-held devices, and on the museum website. PAM is now looking to expand the programs and wishes to document some of the predicted and unexpected impacts and outcomes of these projects for visitors. The evaluation will recruit visitors to wear the Looxcie cameras during their visits to the pertinent exhibits, including the Museum Stories gallery. We will likely also be interviewing some of the experts/artists involved in creating the videos.

We spent time going over the Looxcie technology and how best to recruit visitors in the art museum space. We also created some test clips to help the PAM folks working on the evaluation better understand the potential of the video data collection process. I will post a follow-up next week with some more details about how we’re using the Looxcies.

Shawn and I came back from PAM feeling like the A-Team – we love it when an evaluation plan comes together.