The lab has purchased a bunch of relatively expensive equipment for use by our researchers at HMSC, students who may work mainly at the main campus in Corvallis or on their dissertations elsewhere, and our collaborators in other states and countries. Creating a system that allows the equipment to move around easily yet keeps the collection intact and accounted for is proving to be a complicated task for many reasons.

First, the equipment resides mainly at HMSC in Newport. Right now, only Shawn actually lives in Newport, and he works there roughly 75 percent of the time. Mark lives in Corvallis (about an hour away) but spends roughly half his week in Newport, while Laura and I live in Corvallis and usually spend less than half the week in Newport. For all of us, the split between Newport, Corvallis, and elsewhere is not at all regular. This means that no one is a good go-to person for handling check-in and check-out of the equipment unless a user is enough of a planner to know, well in advance, that they need something (and exactly what they need) and to ask one of us to bring it back to Corvallis.

And in reality, we don’t really want to act like overlords hoarding the equipment and doling it out when we feel like it. We’d like a system where people can access the equipment more freely but responsibly. But our shared spaces have other people going in and out, which makes it difficult to restrict access with the limited number of cabinet keys we have and still function without a main gatekeeper. Plus, things have simply gone walkabout over the years because no one keeps track. People forget they have something, forget where it came from, or leave the school and take it with them, not maliciously, but because no one has the time or the interest to keep up with the equipment. This especially happens with Shawn’s books. Full disclosure: I’m pretty guilty of it myself, at least of having things I borrow sit on my desk far beyond the time it might be reasonable for me to keep them. No one else may be asking to use them, but if a resource isn’t on a shelf or in a database where browsers can see it’s available, in my eyes it’s not really available.

So we’ve struggled with this system. I tried to be the one in charge for a while, but I wasn’t traveling back and forth to Newport regularly, and it was a burden: people had to come to me, then I had to find someone in Newport to pick the item up and bring it to me to hand over to the borrower, and basically reverse the process when it was returned. Technically, the school probably wants us to have people sign off on taking equipment, even items with dollar values as small as these, but that’s another layer of hassle to deal with.

Plus, the database programs we’ve tried for keeping track have proved annoying for one reason or another. Most of them are tied to a single computer, so again one person had to be the gatekeeper. For now, we’ve settled on a paper sign-out sheet on the door of the cabinet holding the equipment, but that doesn’t integrate with any computerized system that would make it easy to track what’s out and in at any given time and when things are due back. The campus multimedia system uses barcode scanners, but the cost of implementing such a system for our small use case is probably prohibitive. Peer-to-peer lending systems make owners responsible for their own stuff, but even those often rely on online databases to track things. Suggestions welcome!
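
For what it’s worth, even a very small home-grown database might beat the paper sheet if we could host it somewhere shared. Below is a minimal sketch (not anything we’ve actually built) of what a check-out log could look like in Python with SQLite; the table and field names are invented, and the real sticking point would still be putting the file, or a simple front end to it, somewhere everyone can reach.

```python
# Minimal sketch of a shared equipment check-out log backed by SQLite.
# Table and field names are hypothetical; a real system would also need
# to live on a shared server or behind a simple web form.
import sqlite3
from datetime import date

conn = sqlite3.connect("equipment_log.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS checkouts (
        item     TEXT NOT NULL,
        borrower TEXT NOT NULL,
        out_date TEXT NOT NULL,
        due_date TEXT,
        in_date  TEXT
    )
""")

def check_out(item, borrower, due_date=None):
    """Record that an item left the cabinet."""
    conn.execute(
        "INSERT INTO checkouts (item, borrower, out_date, due_date) VALUES (?, ?, ?, ?)",
        (item, borrower, date.today().isoformat(), due_date),
    )
    conn.commit()

def check_in(item):
    """Close out any open loan for an item."""
    conn.execute(
        "UPDATE checkouts SET in_date = ? WHERE item = ? AND in_date IS NULL",
        (date.today().isoformat(), item),
    )
    conn.commit()

def whats_out():
    """List everything currently checked out, who has it, and when it is due."""
    return conn.execute(
        "SELECT item, borrower, out_date, due_date FROM checkouts WHERE in_date IS NULL"
    ).fetchall()

# Example: check_out("Looxcie camera #2", "Katie", due_date="2013-01-15")
```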

It’s just another behind-the-scenes part of the research process that most people never think about. And then when you actually go to do research, you either spend way too much time thinking about it, or stuff gets lost.

Here’s a roundup of some of our technology testing and progress lately.

First, reflections from our partners Dr. Jim Kisiel and Tamara Galvan at California State University, Long Beach. Tamara recently tested the iPad with QuestionPro/SurveyPocket, Looxcie cameras, and a few other apps to conduct surveys at the Long Beach Aquarium, which doesn’t have wifi in the exhibit areas. Here is Jim’s report on their usefulness:

“[We] found the iPad to be very useful.  Tamara used it as a way to track, simply drawing on a pdf and indicating times and patterns, using the app Notability.  We simply imported a pdf of the floorplan, and then duplicated it each time for each track.  Noting much more than times, however, might prove difficult, due to the precision of a stylus.  One thing that would make this even better would be having a clock right on the screen.  Notability does allow for recording, and a timer that goes into play when the recording is started.  This actually might be a nice complement, as it does allow for data collector notes during the session. Tamara was unable to use this feature, though, due to the fact that the iPad could only run one recording device at a time–and she had the looxcie hooked up during all of this. 

Regarding the looxcie.  Tamara had mixed results with this.  While it was handy to record remotely, she found that there were many signal drop-outs where the mic lost contact with the iPad.  We aren’t sure whether this was a limitation of the bluetooth and distance, or whether there was just too much interference in the exhibit halls.  While looxcie would have been ideal for turning on/off the device, the tendency to drop communication between devices sometimes made it difficult to activate the looxcie to turn on.  As such, she often just turned on the looxcie at the start of the encounter.  It is also worth noting that Tamara used the looxcie as an audio device only, and sound quality was fine.
 
Tamara had mixed experiences with Survey Pocket.  Aside from some of the formatting limitations, we weren’t sure how effective it was for open-ended questions.  I was hoping that there was a program that would allow for an audio recording of such responses.  She did manage to create a list of key words that she checked off during the open-ended questions, in addition to jotting down what the interviewee said.  This seemed to work OK.  She also had some issues syncing her data–at one point, it looked like much of her data had been lost, due in part to … [problems transferring] her data from the iPad/cloud back to her computer.  However, staff was helpful and eventually recovered the data.
 
Other things:  The iPad holder (Handstand) was very handy and people seemed OK with using it to complete a few demographic questions. Having the tracking info on the pad made it easier to juggle papers, although she still needed to bring her IRB consent forms with her for distribution. In the future, I think we’ll look to incorporate the IRB into the survey in some way.”
Interestingly, I just discovered that a new version of SurveyPocket *does* allow audio input for open-ended questions. However, OSU has recently purchased university-wide licenses from a different survey company, Qualtrics, which does not yet offer an offline app mode for tablet-based data collection. It seems to be in development, though, so we may change our minds about which company we go with when the QuestionPro/SurveyPocket license comes up for renewal next year. It’s amazing that the research I did on these apps last year is already almost out of date.
Along the same lines of software updates kinda messing up your well-laid plans, we’re purchasing a couple of laptops to do more data analysis away from the video camera system’s desktop computer and away from the eye tracker. We were suddenly confronted with the Windows 8 vs. Windows 7 dilemma, though: the software for both of these systems is Windows 7-based, but now that Windows 8 is out, the school had to make a call on whether or not to upgrade. Luckily for us, we’re skipping Windows 8 for the moment, which means the new laptops will stay on Windows 7 and can actually run the software; the camera and eye-tracker programs themselves likely won’t be Windows 8-ready until sometime in the new year.
Lastly, we’re still bulking up our capacity for data storage and sharing, as well as internet bandwidth for video data collection. I recently put in another new server dedicated to handling data sharing, with the two older servers as slaves and the cameras spread out between them. In addition, we installed a NAS with five 3TB hard drives for storage. Mark assures me we’re getting to the point of having this “initial installation” of stuff finalized …
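
For the curious, the housekeeping side of that storage setup is essentially just sweeping finished recordings off the capture machines’ local drives onto the NAS. Here is a hypothetical sketch of that step; the paths are invented, and in practice the camera system’s own archiving tools may handle this for us.

```python
# Hypothetical sketch of moving finished recordings from a capture server's
# local spool directory onto the NAS share. Paths are invented for illustration;
# the real setup may rely on the camera system's own archiving tools.
import shutil
from pathlib import Path

SPOOL = Path("/var/spool/camera_recordings")  # local drive on a capture server (assumed)
ARCHIVE = Path("/mnt/nas/video_archive")      # mounted NAS share (assumed)

def archive_finished_recordings():
    """Copy each finished clip to the NAS, then free the local drive."""
    for clip in sorted(SPOOL.glob("*.mp4")):
        dest = ARCHIVE / clip.name
        if not dest.exists():
            shutil.copy2(clip, dest)  # preserves timestamps
            clip.unlink()
```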

Despite our fancy technology, there are some pieces of data we have to gather the old-fashioned way: by asking visitors. One thing we’d like to know is why visitors chose to visit on this particular occasion. We’re building off of John Falk’s museum visitor motivation and identity work, which began with a survey asking visitors to rate a series of statements on Likert (1-5) scales according to how applicable each was to them that day, and which reveals a rather small set of motives driving the majority of visits. We have also used this framework in a study of three of our local informal science education venues, finding that an abbreviated version works equally well to determine which (if any) of these motivations drives visitors. The latest version, tried at the Indianapolis Museum of Art, pairs photos with the abbreviated set of statements for visitors to identify their visit motivations.
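
To make the scoring idea concrete: each statement maps to one of Falk’s identity-related motivation categories (Explorer, Facilitator, Experience Seeker, Professional/Hobbyist, Recharger), and a visitor’s dominant motivation is simply the category they rate highest. Below is a toy sketch of that logic; the statements and their category assignments are invented for illustration and are not our actual instrument.

```python
# Toy sketch of scoring an abbreviated Falk-style motivation instrument.
# The statements and category assignments below are invented for illustration.
LIKERT_ITEMS = {
    "I came to satisfy my own curiosity": "Explorer",
    "I came mainly so the people with me could learn something": "Facilitator",
    "I wanted to see a well-known local attraction": "Experience Seeker",
    "This visit relates to my work or a serious hobby": "Professional/Hobbyist",
    "I came to relax and recharge": "Recharger",
}

def dominant_motivation(ratings):
    """ratings: dict mapping a statement to its 1-5 Likert rating for one visitor."""
    totals, counts = {}, {}
    for statement, rating in ratings.items():
        category = LIKERT_ITEMS[statement]
        totals[category] = totals.get(category, 0) + rating
        counts[category] = counts.get(category, 0) + 1
    averages = {cat: totals[cat] / counts[cat] for cat in totals}
    return max(averages, key=averages.get)

# Example: a visitor who rates the Facilitator statement highest
print(dominant_motivation({
    "I came to satisfy my own curiosity": 3,
    "I came mainly so the people with me could learn something": 5,
    "I wanted to see a well-known local attraction": 2,
}))  # -> "Facilitator"
```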

We’re implementing a version on an iPad kiosk in the VC for a couple of reasons: first, we genuinely want to know why folks are visiting, and we want to be able to correlate identity motivations with the automated behavior, timing, and tracking data we collect from the cameras. Second, we hope people will stop long enough for us to get a good reference photo for the facial recognition system. Sneaky, perhaps, but it’s not the only place we’re trying to position cameras for good reference shots. And if all goes well with our signage, visitors will be more aware than ever that we’re doing research, and that it is ultimately aimed at improving their experience. Hopefully that awareness will allay whatever fears remain about the embedded research tools, fears we hope will be minimal to start with.

OSU ran three outreach activities at the 46th annual Smithsonian Folklife Festival, and we took the chance to evaluate the Wave Lab’s Mini-Flume wave tank activity, which is related to, but distinct from, the wave tank activities in the HMSC Visitor Center.

Three activities were selected by the Smithsonian Folklife committee to best represent the diversity of research conducted at OSU, as well as the University’s commitment to sustainable solutions and family education: Tech Wizards, Surimi School, and the O.H. Hinsdale Wave Lab’s Mini-Flume activity. Tech Wizards was set up in the Family Activities area of Folklife, and Surimi School and the Mini-Flume activity shared a tent in the Sustainable Solutions area.

Given the anticipated number of visitors to the festival, and my presence as the project research assistant, we decided it would be a great opportunity to see how well people thought the activity worked, what they might learn, and what they liked or didn’t – core questions in an evaluation. The activity was led by Alicia Lyman-Holt, EOT director at the O.H. Hinsdale Wave Lab, and I developed and spearheaded the evaluation. To make the activity and evaluation happen, we also brought four undergraduate volunteers from OSU and two from Howard University in D.C., and both the OSU Alumni Association and the festival supplied additional volunteers on an as-needed basis. We also wanted to try out data collection using the iPads and survey software we’re working with in the FCL Lab.

Due to the sheer number of people we thought would be there, as well as everyone’s divided attention, we decided to go with a straightforward survey. We ended up collecting only a small fraction of the responses we anticipated due to extreme heat, limited personnel, and the divided attention of visitors – after they spent a lot of time with the activity, they weren’t always interested in sticking around even for a short survey.

I’m currently working on data analysis. Stay tuned for more information on the evaluation, the process, and to learn how we did on the other side of the continent.

Ok, I guess I am following suit and forgot to post on Friday! I don’t have quite as good an excuse as Katie. Instead of prepping for conferences, I was recovering from a vacation.

I thought it might be nice to provide an update on the Exploratorium project, where NOAA scientists are embedded on the museum floor with the Explainers (the Exploratorium’s front-line staff of young adults). I have collected so much data for this project that I am beginning to feel overwhelmed.

Here’s the data that I have collected:
– Formal interviews with each of the four groups of scientists, both before and after their experience.
– Informal interviews with all of the scientists. These were done while walking back to the hotel or while grabbing lunch. Both are great times to collect data!
– Interviews with the two Explainer managers plus a survey with open- and closed-ended questions at the end of year 2.
– Interviews with each of the lead Explainers, 8 total. Also, lead Explainers during year 2 completed a survey with open- and closed-ended questions.
– Pre-, mid-, and post-data on what Explainers think atmospheric science is and what atmospheric scientists do. This was not done during the first year, when the topic was ocean sciences.
– I also provided an optional survey for all Explainers so they could share their thoughts and opinions about the project. This gave a reflection opportunity to the Explainers who were not lead Explainers during the project.
– Visitor surveys about their experience in the scientists’ installation. During year 2 these were collected in both paper form and using survey software on the iPad.
– Field notes during meetings and time on the museum floor. During year 2 the field notes were taken on the iPad using survey software.
– And lastly…personal daily reflections.

So the question is: “Now what?” This data provides opportunities for triangulation, but where does one start? I’m spending my final month of summer trying to figure that out.

Hopefully my next blog post will showcase my progress and some findings.

As part of the FCL Lab’s incorporation of the iPad as a research tool in museums, I have been one of a handful of people testing different survey software. The software I tested included PollDaddy, iSurvey, iForm, QuestionPro, and TouchMetric/Surveyor. While each had its own unique set-up and offered basic affordances such as different question types and access to data, I found that QuestionPro far exceeded the others.

Here are a few things that I particularly liked about QuestionPro:
• Great customer service. I made a contact through their online chat, and that person remained my contact, via email, throughout my trial. I was offered a free upgrade to the corporate edition, and my free trial was extended to cover the time I needed to complete a specific project.
• Diversity in question types. For my survey I used pretty basic questions (multiple choice with either one or multiple answers allowed, comment/text boxes, matrices, and scales) but noticed there were several other question options.
• Easy to set up surveys. Selecting a question type, making questions require a response, and branching all had to be done in separate pop-up boxes; however, once you got used to the system it didn’t take much time.
• Ability to jump/branch to follow a logical order. Enough said.
• Surveys can be completed on the iPad app or via a link emailed to participants. The main focus of testing different software was to determine how easy it was to set up and use on an iPad. However, in my study I had participants who were easier to contact via email (i.e., museum staff). Having the capability to send those participants a link to complete the survey on their computers was easy and efficient. QuestionPro lets you see the original email sent to participants, see how many have viewed and how many have completed the survey, and send reminder emails.
• Basic data analysis. I have not explored this feature in any great detail yet, but simple statistics and graphs are easily accessible.

There are few negative things to say about QuestionPro. It does take time and patience to figure out branching/skip logic and some other features, but I wouldn’t say more so than with any other new program one tries to learn. Beware, however, of the preview option when editing the survey. Previewing your survey is helpful, especially if you set up branching, but QuestionPro puts all of the data from those previews into your data folder. It is easy enough to go into your data and delete those results, but I would recommend doing that before collecting data from participants. I have sent an email about this to QuestionPro.
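
One workaround, if preview responses have already snuck into an export, is to filter them out by timestamp before analysis. Here is a hypothetical sketch of that cleanup step; the column name and date format are invented for illustration, so check the headers in your own export.

```python
# Hypothetical sketch of dropping preview/test responses from an exported CSV
# before analysis. The "Response Started" column and its date format are
# invented for illustration; check the actual export headers.
import csv
from datetime import datetime

LAUNCH = datetime(2012, 7, 1)  # date real data collection began (example value)

with open("survey_export.csv", newline="") as src, \
     open("survey_clean.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        started = datetime.strptime(row["Response Started"], "%Y-%m-%d %H:%M:%S")
        if started >= LAUNCH:  # keep only rows collected after launch
            writer.writerow(row)
```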