Last month I wrote about Literacy in the 21st Century and the wonderful new project evaluation I’m working on, Project SEAL. I first want to share a blog post that the Model Classroom team wrote about their time with the Project SEAL teachers during the professional development in February. http://www.modelclassroom.org/blog/2013/03/projectsealoregonpd-intro.html. It has a wonderful synopsis of the two days as well as some teacher reflections.

Since the February professional development, I have turned my attention to the family literacy nights. I have never attended a family literacy night: they were not part of my K-12 experience, and I have never encountered them as a researcher/evaluator. The Project SEAL team told me that literacy nights can differ greatly, and that they did not have standards for the schools to follow for these events. This presented some trouble for me as an evaluator: how do you standardize an evaluation tool for something that looks different each time?

After some conversations with the Project SEAL team, we decided on a short and sweet survey: something parents would be willing to fill out during the event, and something that would focus on literacy, ocean science resource use, and the structure of the event. We hope that these literacy nights 1) lead to families checking out ocean-related books (purchased for the libraries through the grant), 2) give parents an opportunity to see the technology being incorporated into literacy (the grant also bought a classroom set of iPad minis for each school), and 3) give teachers and students time to present on learning experiences they’ve had with the iPads and the new reading material available in the library. Here are the questions on the Family Literacy Night survey:

1) What was your (or your child’s) favorite part of this Family Literacy Night?

2) What went well during this Family Literacy Night?

3) What suggestions for improvement do you have for future Family Literacy Nights?

4) What did you hope to take away from tonight’s Family Literacy Night? (check all that apply)

More activities and games to do at home

Information on what is being done in my child’s classroom

Information on assessment in reading and writing

Information about how children learn to read and write

Information on how to work with the school and my child’s teacher

New resources available in the library

Ways to use technology with my child at home

How my child’s class has been using library resources

5) You or your child have checked out ocean science resources to read together at home.     Agree / Neutral / Disagree

6) Your child presented or talked about a class project at this Family Literacy Night.     Agree / Neutral / Disagree

7) You learned what you wanted to learn tonight.     Agree / Neutral / Disagree

8) Tonight I gained new information about ocean science resources available to my child through his/her school library.     Agree / Neutral / Disagree

Hopefully the data will be useful in demonstrating the effectiveness of this project, and will also give the schools some ideas for future family literacy nights.
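For what it’s worth, even a survey this short lends itself to a quick tally once the paper forms are entered. Here is a minimal sketch in Python of how the checkbox and agreement-scale items might be counted; the sample responses and field names are hypothetical stand-ins, not our actual data or instrument.

```python
from collections import Counter

# Hypothetical responses; real data would be entered from the paper surveys.
# q4 is the "check all that apply" item; q7 and q8 use Agree/Neutral/Disagree.
responses = [
    {"q4": ["More activities and games to do at home",
            "New resources available in the library"],
     "q7": "Agree", "q8": "Neutral"},
    {"q4": ["Ways to use technology with my child at home"],
     "q7": "Agree", "q8": "Agree"},
]

# Count how many respondents checked each hoped-for takeaway.
q4_counts = Counter(option for r in responses for option in r["q4"])

# Count the agreement-scale items.
q7_counts = Counter(r["q7"] for r in responses)
q8_counts = Counter(r["q8"] for r in responses)

print("Q4 (hoped-for takeaways):")
for option, n in q4_counts.most_common():
    print(f"  {n:>3}  {option}")
print("Q7 (learned what I wanted):", dict(q7_counts))
print("Q8 (new ocean science resources):", dict(q8_counts))
```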

A while ago, I promised to share some of my experiences in collecting data on visitors’ exhibit use as part of this blog. Now that I’ve been back at it for the past few weeks, I thought it might be time to share what I’ve found. As it is winter here in the northern hemisphere, our weekend visitation to the Hatfield Visitor Center is generally pretty low. This means I have to time my data collection carefully if I don’t want to spend an entire day waiting for subjects and maybe only collect data on two people. That’s what happened on a Sunday last month: the weather on the coast was lovely, and visitation was minimal. I have recently been collecting data in our Rhythms of the Coastal Waters exhibit, which has additional data collection challenges: it is basically the last thing people might see before they leave the center, it’s dim because it houses the projector-based Magic Planet, and there are no animals, unlike just about every other corner of the Visitor Center. So, I knocked off early and went to the beach. Then I rescheduled another planned collection day because it, too, was a sunny weekend day at the coast.

On the other hand, on a recent Saturday we hosted our annual Fossil Fest. While visitation was down from previous years (about 650 compared to 900), this was plenty for me, and I was able to collect data on 13 people between 11:30 and 3:30, despite an octopus feeding and a lecture by our special guest fossil expert. Considering that data collection, including recruitment, consent, the experiment, and debrief, probably runs 15 minutes per person, thirteen sessions comes to nearly 200 minutes of a 240-minute window, so I counted this as a big win. In addition, I only got one refusal, from a group that said they were on their way out and didn’t have time. It’s amazing how much better things go if you a) lead with “I’m a student doing research,” b) mention “it will only take about 5-10 minutes,” and c) don’t record any video of them. I suspect it also helps that it’s not summer: this crowd is more local and thus perhaps more invested in improving the center, whereas summer tourists might be visiting more for the experience, to say they’ve been there, as John Falk’s museum visitor “identity” or motivation research would suggest. That seems like a motivation that would not make you all that eager to participate. Hm, sounds like a good research project to me!

Another reason I suspect things went well is that I am generally approaching only all-adult groups, and I only need one participant from each group, so someone else can watch the kids if they get bored. I did have one grandmother get interrupted a couple of times by her grandkids, but she was a trooper and shooed them away while she finished. When I was recording video and doing interviews about the Magic Planet, the younger kids in the group often got bored, which made recruiting families and getting good data somewhat difficult, though I never had anyone quit early once they agreed to participate. Also, unlike when we prototyped our salmon forecasting exhibit, I wasn’t asking people to sit down at a computer and take a survey, which seemed to feel more like a test to some people. Or it could have been the exciting new technology I was using, the eye-tracker, that appealed to some.

Interestingly, I also had a lot of folks observe their partners as the experiment happened, rather than wander off and meet up later, as happened more often during the salmon exhibit prototyping; perhaps that was because there was not much to see while one person was using the exhibit. With the eye-tracking and the Magic Planet, it was still possible to view the images on the globe because it is such a large exhibit. Will we ever solve the mystery of what makes the perfect day for data collection? Probably not, but it does present a good opportunity for reflection on what did and didn’t seem to work to get the best sample of your visitorship. The cameras we’re installing are, of course, intended to shed some light on how representative these samples are.

What other influences have you seen that affect whether you have a successful or slow day collecting exhibit use data?


Last October, Lincoln County School District received news that it had been awarded an Innovative Approaches to Literacy grant to fund Project SEAL (Students Engaging in Authentic Literacy). Dr. Rowe and I, representing Oregon Sea Grant, are the evaluators for this project. What I enjoy most about working on the evaluation is that it continues to push my understanding of learning, focusing not only on museums but also on classrooms, and keeps me thinking about new ways to bridge the gap between the two.

Project SEAL has many components, including purchasing new ocean-related books for school libraries, stocking each library with a classroom set of handheld devices such as iPads, and hosting family literacy nights. I am sure these will come up in future blog posts, but today I want to focus on the teacher professional development part of Project SEAL. On February 8th and 9th, Project SEAL hosted a Model Classroom (modelclassroom.org) training for around 60 teachers, principals, and media assistants. The Model Classroom has “teachers participate in a set of missions that take them out into the community… [where they will] develop and document project ideas to take back to the classroom.”

We started the training at the Oregon Coast Aquarium, where the first mission was for teachers to go around the aquarium, look at exhibits, and talk to people (anyone they could find, including visitors, educators, and volunteers) about a global issue with a local impact. One group of teachers contacted local grocery stores and talked to the aquarium gift store about plastic bags, while another group asked visitors questions like “what would you do if you found tsunami debris on the beach?” Yet another group ended up on a research vessel docked nearby. The second mission was to use their mobile devices to create a hook to draw their students into the topic, with an end goal of thinking of ways their students could use these devices to communicate ideas and projects from the field. One group of teachers used iMovie to create a trailer about picking up and properly reporting tsunami debris.

The second day of training was spent in the library of a local school. The day started with an in-depth conversation about what literacy was (when the teachers were in school) versus what literacy is now (in the 21st century). The Model Classroom leaders, project staff, and I agreed this was a conversation we’d have to come back to continually because it is so BIG. For most of the rest of the day, teachers divided into groups and explored the school, looking at different spaces and the learning opportunities each can offer. They took pictures, wrote descriptions, and some groups came up with ideas for improvement.

Project SEAL is in its infancy, but it’s such a wonderful project with so many key components. Keep an eye out for future posts on the ongoing evaluation and the tools we develop. In the meantime, learn more about Project SEAL and read the teachers’ blog posts at https://sites.google.com/site/oregontestsite/home.

I want to talk today about something many of us here have alluded to in other posts: the approval (and beyond) process of conducting ethical human research. What grew out of deeply unethical, primarily medical, research on humans many years ago has evolved into something that can take up a great deal of your research time, especially on a large, long-duration grant such as ours. Many people (including me, until recently) thought of this process as primarily something to be done up front: get approval, then more or less forget about it, aside from actually obtaining consent as you go, unless you significantly change your research questions or process. Wrong! It’s a much more constant, living thing.

We at the Visitor Center are in several ways a weird case for our Institutional Review Board office at the university. First, even though what we do is generally educational research, as part of the Science and Mathematics Education program, our research sites (the Visitor Center and other community-based locations) are not typical “approved educational research settings” such as classrooms. Classrooms have been used so frequently over the years that they have a more streamlined approval process, unless you’re introducing a radically different type of experiment. Second, we host several types of visitor populations: the general public, OSU student groups, and K-12 school and camp groups. Each has different privacy expectations and different requirements for attending (none for the public; attendance may be part of a grade for OSU student groups), and thus requires different levels and forms of consent for research. Plus, we’re trying to video record our entire population, and getting signatures from 150,000+ visitors per year just isn’t feasible. However, some of the research we’re doing will involve video recording that is more in-depth than the anonymized overall timing, tracking, and visitor recognition from exhibit to exhibit.

What this means is a whole stack of IRB protocols that someone has to manage. At current count, I am managing four: one for my thesis, one for eyetracking in the Visitor Center for looking at posters and such, one for a side project involving concept mapping, and one for the general overarching video recording for the VC. The first three have been approved, and the last is in the middle of several rounds of negotiation on signage, etc., as I’ve mentioned before. Next up, we need to write a protocol for the wave tank video reflections, and one for groundtruthing the video-recording-to-automatic-timing-tracking-and-face-recognition data collection. In the meantime, the concept mapping protocol has been open for a year and needs to be closed. My thesis protocol has been approved nearly as long, went through several deviations in which I did things out of order or without getting updated approval from the IRB, and itself soon needs to be renewed. Plus, we already have revisions to the video recording protocol ready to file once the original approval happens. Thank goodness the eyetracking protocol is already in place and in a sweet spot time-wise (not needing renewal very soon), as we have to collect some data on eyetracking and our Magic Planet for an upcoming conference, though I did have to check it thoroughly to make sure what we want to do in this case falls under what’s been approved.
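With this many overlapping protocols, even trivial bookkeeping helps. Here is a minimal sketch in Python of the kind of renewal-date tracking I mean; the protocol names, dates, statuses, and the one-year approval window are invented stand-ins, not our actual IRB records.

```python
from datetime import date, timedelta

# Hypothetical protocol list; names, dates, and statuses are made up.
protocols = [
    {"name": "Thesis",             "approved": date(2012, 4, 1),   "status": "open"},
    {"name": "Eyetracking",        "approved": date(2012, 10, 15), "status": "open"},
    {"name": "Concept mapping",    "approved": date(2012, 2, 1),   "status": "needs closing"},
    {"name": "VC video recording", "approved": None,               "status": "in negotiation"},
]

RENEWAL_PERIOD = timedelta(days=365)  # assuming a typical one-year approval
WARNING_WINDOW = timedelta(days=60)   # flag renewals due within two months

for p in protocols:
    if p["approved"] is None:
        print(f"{p['name']}: {p['status']}")
        continue
    renewal_due = p["approved"] + RENEWAL_PERIOD
    if renewal_due - date.today() <= WARNING_WINDOW:
        print(f"{p['name']}: renewal due by {renewal_due} ({p['status']})")
```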

On the positive side, though, we have a fabulous IRB office that is willing to work with us as we break new ground in visitor research. Together, the IRB office, our team, and the OSU legal team are crafting a strategy that we hope will be useful to other informal learning institutions as they proceed with their own research. Without the IRB office’s cooperation, very little of our grand plan could be realized. Funders are starting to realize this, too: before making a final award, they require proof that you’ve at least discussed the basics of your project with your IRB office and that they’re on board.

As the lab considers how to encourage STEM reflection around the tsunami tank, this recent post from Nina Simon at Museum 2.0 reminds us what a difference the choice of a single word can make in visitor reflection:

“While the lists look the same on the surface (and bear in mind that the one on the left has been on display for 3 weeks longer than the one on the right), the content is subtly different. Both these lists are interesting, but the “we” list invites spectators into the experience a bit more than the “I” list.”

So as we go forward, not only the physical booth setup (i.e., allowing privacy or open to spectators) but also the specific wording can influence how much our visitors focus on the task we’re trying to investigate, and how broad or personal their reflections might be. Hopefully, we’ll be able to do some testing of several supposedly equivalent prompts, as Simon suggests in an earlier post, as well as more “traditional” iterative prototyping.
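To make that prompt testing concrete, here is a minimal sketch in Python of how a reflection booth might rotate among supposedly equivalent wordings at random and store the assigned prompt with each reflection, so wording effects can be compared afterward. The prompt texts, file name, and functions are hypothetical illustrations, not our actual setup.

```python
import csv
import random
from datetime import datetime

# Hypothetical prompt variants; note the "I" vs. "we" wording Simon highlights.
PROMPTS = [
    "What did I notice at the wave tank today?",
    "What did we notice at the wave tank today?",
]

def start_session() -> str:
    """Pick a prompt variant at random for this visitor."""
    return random.choice(PROMPTS)

def log_reflection(prompt: str, reflection: str,
                   log_path: str = "reflections.csv") -> None:
    """Store the prompt alongside the reflection for later comparison."""
    with open(log_path, "a", newline="") as f:
        csv.writer(f).writerow([datetime.now().isoformat(), prompt, reflection])

# Example: show the chosen prompt at the booth, then log what the visitor says.
prompt = start_session()
log_reflection(prompt, "We liked watching the waves knock the blocks over.")
```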

It seems that a convenience sample really is the only way to go for my project at this stage. I have long entertained the notion that some kind of randomization would work to my benefit in some abstract, cosmic way. The problem is, I’m developing a product for an established audience. As much as I’d like to reach out and get new audiences interested, that will have to come later.

That sounds harsh, which is probably why I hadn’t actually considered it until recently. In reality, it could work toward my larger goal of bringing in new audience members by streamlining the development process.

I’ve discovered that non-gamers tend to get hung up on things that aren’t actually unique to Deme, but are rather common game elements with which they’re not familiar. Imagine trying to design a dashboard GPS system, then discovering that a fair number of your testers aren’t familiar with internal combustion engines and doubt they will ever catch on. I need people who can already drive.

Games—electronic, tabletop or otherwise—come with a vast array of cultural norms and assumptions. Remember the first time you played a videogame wherein the “Jump” button—the button that was just simply always “Jump” on your console of choice—did something other than jump?* It was like somebody sewed your arms where your legs were supposed to be, wasn’t it? It was somehow offensive, because the game designers had violated a set of cultural norms by mapping the buttons “wrong.” There’s often a subtle ergonomic reason that button is usually the “Jump” button, but it has just as much to do with user expectations.

In non-Deme news, we’re all excited to welcome our new Senior Aquarist, Colleen Newberg. She comes to us from Baltimore, but used to work next door at the Oregon Coast Aquarium. I learned last week that she is a Virginian, leaving Sid as the lone Yankee on our husbandry team. We’ve got some interesting things in the works, and Colleen has been remarkably cool-headed amidst a torrent of exhibit ideas, new and changing protocols, and plumbing eldritch and uncanny.


*I’ve personally observed that button-mapping has become less standardized as controllers have become more complex. I could be wrong, though—my gameplay habits do not constitute a large representative sample. Trigger buttons, of course, would be an exception.