The funny thing about having the money to revise an exhibit that’s already installed is that, well, there’s already a version out there that people are using, so the revision sometimes falls to a lower priority. Even if you know there are a lot of things that could be better about it. Even if, as I said, the money is in hand to update it. That’s another thing about outreach still being sort of an afterthought in scientific grants: even when the scientists have to report on their grant progress, if the outreach effort isn’t in that report, the grant agencies aren’t always so concerned about it.

So we’re trying to revise our salmon fisheries exhibit, and we have a concept, but we have to constantly remind ourselves to make progress on it. It’s an item on my list that is “Important, but Not Urgent” (one of those Seven Habits of Highly Effective People things), and it keeps being shoved out by the Urgent but Not Important and even the Not Urgent, Not Important (but way more interesting!) things. I think it’s like revising a paper; sometimes, the work it takes to come up with the ideas in the first place is far more interesting than the nitpicky revisions. And, again, a lot less urgent. So, we’re setting interim milestones to make progress:

1) Our visualization collaborator is working on new images.

2) We have text to re-organize and re-write.

3) We’ve sent the developers a basic logic for the new version so they can write the automated data collection tool that records what a user does, when, and for how long (sketched below).

So, we feel confident in our progress, since we’ve farmed out 1 and 3, but that still leaves #2 for us to chip away at. And sometimes, that’s what it takes. A little bit of time here, a little bit there, and eventually, there’s a lot less to get done and the task seems less overwhelming. Maybe the blog will help keep us accountable for getting this exhibit done by … the end of the summer?
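On that third item: here’s roughly the shape of the interaction record we have in mind. This is my own sketch, not the developers’ actual design; the field names are placeholders for illustration.

```python
# A rough sketch of the record an interaction-logging tool could
# produce. Field names are placeholders, not the developers' schema.
import time
from dataclasses import dataclass, field

@dataclass
class InteractionEvent:
    exhibit_id: str          # which exhibit piece the visitor used
    action: str              # e.g., "button_press" or "screen_touch"
    started_at: float = field(default_factory=time.time)
    duration_s: float = 0.0  # filled in when the interaction ends

    def finish(self) -> None:
        """Record how long the visitor stayed with this action."""
        self.duration_s = time.time() - self.started_at

# Open an event when a visitor starts interacting; close it out
# when they walk away.
event = InteractionEvent("salmon_forecast", "screen_touch")
# ... visitor explores the interactive ...
event.finish()
print(f"{event.exhibit_id}: {event.action}, {event.duration_s:.1f}s")
```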

We heard recently that our developer contractors have decided to abandon their efforts to make the first facial recognition system they investigated work. It was a tough call; they had put a lot of effort into it, thinking many times that if they could just tweak this and alter that, they would get better than 60% performance. Alas, they finally decided it was not going to happen, at least not without a ridiculous amount of further effort for the eventual reward. So they are taking a different tack, starting over, almost, though with lots of lessons learned from the first go-round.

I think this dilemma about when it makes sense to try to fix the leaking ship vs. abandon ship and find another is a great parallel with exhibit development. Sometimes, you have a great idea that you try with visitors, and it flops. You get some good data, though, and see a way you can try it again. You make your changes. It flops again, though maybe not quite as spectacularly. Just enough better to give you hope. And so on … until you have to decide to cut bait and either redesign something for that task entirely or, if you’re working with a larger exhibition, find another piece to satisfy whatever learning or other goals you had in mind for the failed piece.

In either situation, it’s pretty heartbreaking to let go of all that investment. When I first started working in prototyping, this happened to our team designing the Making Models exhibition at the Museum of Science, Boston. As an intern, I hadn’t invested anything in the failed prototype, but I could see the struggle in the rest of the team, and it made such an impression that I recall it all these years later. Ultimately, the final exhibit looks rather different from what I remember, but its success is also a testament to the power of letting go. Hopefully, we’ll eventually experience that success with our facial recognition setups!

Prototyping describes the process of creating a first-version exhibit, then testing it out with visitors, and redesigning. Often, we iterate this several times, depending on monetary and time budgets. It’s usually a fruitful way to find out not only what buttons confuse people, but also what they enjoy playing with and what great ideas totally bomb with users.

The problem with prototyping, as with many data collection processes, is that you have to ask the right questions to get useful answers. We are currently re-developing an interactive about how scientists use ocean data to make predictions about salmon populations for future harvests. The first round of surveys revealed some areas of content confusion and some areas of usability confusion. Usability confusion is usually easy to re-work, but content confusion is harder to resolve, especially if your survey questions were themselves confusing to visitors.

This was unfortunately the case with the survey I made up, despite a few rounds of re-working it with colleagues. The multiple-choice questions were fairly straightforward, but the open-ended questions tripped people up, making the results harder to interpret and act on. The moral of the story? Prototype (a.k.a. pilot) your survey, too!

Beverly Serrell, a pioneer in tracking museum visitors (or stalking them, as some of us like to say), has just released a nice report on the Center for the Advancement of Informal Science Education (CAISE) web site. In “Paying More Attention to Paying Attention,” Serrell describes the growing use of the metrics she calls tracking and timing (T&T) in the museum field since the publication of her book on the topic in 1998. As the field has more widely adopted these T&T strategies, Serrell has continued her meta-analysis of such studies and has developed a system to describe some of the main implications of the cumulative findings for exhibition design.

I’ll leave you to read the details, but it really drove home to me the potential excitement and importance of the cyberlab’s tracking setup. Especially for smaller museums with minimal staff, implementing an automated tracking scheme, even on a temporary basis, could save a lot of person-hours in collecting these simple yet vital data about exhibition and exhibit element use. It could allow more data collection of this type in the prototyping stages especially, which might yield important data on the optimum density of exhibit pieces before a full exhibition is installed. On the other hand, if we can’t get it to work, or our automated design proves ridiculously unwieldy (stay tuned for some upcoming posts on our plans for 100 cameras in our relatively small 15,000-square-foot space), it will only affirm the need for the good, literal legwork that Serrell also notes is a great introduction to research for aspiring practitioners. In any case, eye tracking, as an additional layer of information to help explain engagement and interest in particular exhibit pieces, might eventually lead to a measure that lends more insight into Serrell’s Thorough Use.
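To make that concrete, here’s a back-of-the-envelope sketch of two of Serrell’s standard T&T metrics as I understand them from her book: the sweep rate index and the percentage of diligent visitors. The numbers below are invented purely for illustration; an automated tracker would supply the real visit times and stop counts.

```python
# Back-of-the-envelope versions of two of Serrell's T&T metrics.
# All data here is made up for illustration.

def sweep_rate_index(sq_feet: float, avg_minutes: float) -> float:
    """Sweep Rate Index: exhibition size divided by average total
    time spent; lower values suggest more thorough use."""
    return sq_feet / avg_minutes

def pct_diligent(stops_per_visitor: list, n_elements: int) -> float:
    """Percentage of visitors who stopped at more than half of the
    exhibit elements (Serrell's 'diligent visitors')."""
    diligent = sum(1 for s in stops_per_visitor if s > n_elements / 2)
    return 100 * diligent / len(stops_per_visitor)

# Hypothetical tracking results for a 15,000-square-foot space:
visit_minutes = [12, 25, 8, 40, 18]  # total minutes per visitor
stops = [4, 11, 3, 14, 9]            # elements stopped at, out of 20

avg_time = sum(visit_minutes) / len(visit_minutes)
print(f"SRI: {sweep_rate_index(15000, avg_time):.0f} sq ft/min")
print(f"Diligent visitors: {pct_diligent(stops, 20):.0f}%")
```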

(Thanks to the Museum Education Monitor and Jen Wyld for the tip about this report.)


Science!

If you look carefully at the above photo, you can see Ursula sulking in the background. When I put my hand into the tank to check the new camera’s frame rate and motion blur, she turned a sort of red-on-white paisley—an unfamiliar pattern that I interpreted as a statement of disapproval inexpressible in any vertebrate language.

Our improvised test housing was a wooden box of paper towels from the touch pool, with the camera fixed in place by a wad of towels and cloth diapers. For further structural support, we rested the camera on a jar of formalin-preserved octopus eggs inside the box. The final installation will have a rather more stable and elegant housing. Prototyping is a fantastically organic and immediate process.

We’ve been struggling with this potential replacement Octocam for the past week. This was a neat, compact security camera that strongly resembled HAL from 2001. We took it into the Visitor Center, plugged it in, typed in the IP address, and…

“I’m sorry, Dave. I’m afraid I can’t do that.”

We got nothing. We tried a different Ethernet cable. We tried using another port. We tried reconfiguring the network. We tried installing new drivers. After several frustrating days of experimentation, I unplugged the AC adapter to see if one more power cycle would end our troubles. Before I could plug the cord back in, Mark stopped me. The network light was blinking! The camera was happily negotiating a connection with the server on Ethernet power alone.

Apparently, the camera draws its power over Ethernet, and plugging in the AC adapter was turning that Ethernet power off, disabling the Ethernet connection in the process. Plugging the camera in, in other words, caused it to not work. Perhaps that most insulting of tech support questions (“Is your device plugged in?”) doesn’t have as obvious a correct answer as it seems.
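For what it’s worth, most of our “we got nothing” checks boiled down to a reachability probe like the sketch below, written here in Python for illustration; the address and port are stand-ins, not our camera’s actual configuration.

```python
# A minimal reachability check for an IP camera. The address and
# port are stand-ins, not our camera's actual configuration.
import socket

CAMERA_IP = "192.168.1.100"  # hypothetical camera address
HTTP_PORT = 80               # many IP cameras serve a web UI here

def camera_reachable(ip: str, port: int, timeout: float = 3.0) -> bool:
    """Try to open a TCP connection to the camera's web interface."""
    try:
        with socket.create_connection((ip, port), timeout=timeout):
            return True
    except OSError:
        return False

if camera_reachable(CAMERA_IP, HTTP_PORT):
    print("Camera is on the network; try the web interface.")
else:
    print("No response. Check cables, ports, and (apparently) power.")
```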

Once the camera started feeding to the network, we discovered a different problem: the frame rate just wasn’t high enough to meet our standards. This model would make a fantastic security camera, but it made a so-so Octocam. As much as we dislike prolonging our time without a tank-level Octocam, we can’t justify trading one problem for another.
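For anyone wondering how we judge the frame rate: the quick test is simply to count how many frames actually arrive over a fixed window. Here’s a minimal sketch using OpenCV, assuming the camera exposes a network stream (the URL is hypothetical).

```python
# Quick-and-dirty effective frame rate test for a network camera
# stream using OpenCV. The stream URL is a made-up example.
import time
import cv2

STREAM_URL = "rtsp://192.168.1.100/stream1"  # stand-in URL

cap = cv2.VideoCapture(STREAM_URL)
frames = 0
start = time.time()
while time.time() - start < 10:  # sample for ten seconds
    ok, _frame = cap.read()
    if not ok:
        break                    # stream dropped or never opened
    frames += 1
cap.release()

elapsed = time.time() - start
print(f"Effective frame rate: {frames / elapsed:.1f} fps")
```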

We’ll have another model in soon, and hopefully this one will give us what we’ve all been waiting for.