Rejection. It’s an inevitable part of recruiting human subjects to fill out your survey or try out your exhibit prototype. It’s hard not to take it personally, but remember that visitors have often paid to attend your venue today and may or may not be willing to sacrifice some of their leisure time to improve your exhibit.


[Full disclosure: this blog post is 745 words long and will take you approximately 5-10 minutes to read. You might get tired as you read it, or feel your eyes strain from reading on the computer screen, but we won’t inject you with any medications. You might learn something, but we can’t pay you.]


First, you have to decide beforehand which visitors you’re going to ask – is it every third visitor? What if they’re in a group? Which direction will they approach from? Then you have to get their attention. You’re standing there in your uniform, and they may not make eye contact, figuring you’re just there to answer questions. Sometimes rejection is as simple as a visitor not meeting your eye or not stopping when you greet them.
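
For the curious, the “every third visitor” rule is a simple systematic sampling scheme. Here’s a minimal sketch in Python – the function and names are hypothetical, and it assumes you’ve already decided that a whole group counts as one sampling unit:

```python
# Hypothetical sketch of an "every k-th visitor" recruitment rule.
# A group crossing your imaginary line on the floor counts as one unit.

def select_visitors(groups, k=3):
    """Return every k-th group, in arrival order."""
    selected = []
    for i, group in enumerate(groups, start=1):
        if i % k == 0:  # the k-th, 2k-th, 3k-th... arrival is approached
            selected.append(group)
    return selected

arrivals = ["solo A", "family B", "couple C", "solo D", "school group E", "solo F"]
print(select_visitors(arrivals))  # → ['couple C', 'solo F']
```

The hard part, of course, isn’t the arithmetic – it’s deciding beforehand what counts as an arrival, and sticking to the rule even when your chosen group is headed for the restrooms.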

You don’t want to interrupt them while they’re looking at an exhibit, but they may turn and go a different direction before you get a chance to invite them to help you. How far do you chase them once you’ve identified them as your target group? What if they’re going to the restrooms, or leaving the museum from there? When I was asking people to complete surveys about our global data display exhibit, they were basically on their way out the door of the Visitor Center, and I was standing in their way.


If you get their attention, then you have to explain the study and not scare them off by making it sound like a test, with right or wrong answers, even when you have right and wrong answers. You also have to make sure that you don’t take too much of their time.


Then there are the visitors who leave in the middle of the experiment, deciding they didn’t know what they were getting into, or being drawn away by another group member.


Oh, you’re still there? This isn’t too long? It’s not lunchtime, planetarium show time or time to leave for the day? I’ll continue.


If you have an IRB-approved consent form or other informed consent document, this can be another hurdle. If you’re not careful about what you emphasize, visitors may fixate on the “Risks” section that you must tell them about. In exhibit evaluation and research, the risk is often only fatigue, or discomfort when someone feels they don’t know the right answer (despite assurances that no one is judging them). But of course, you have to be thorough and make sure they understand the risks and benefits, who will see the information they give, and how it will be used. Luckily, we don’t often need to collect personal information, or even signatures, if we’re not using audio or video recording.


Then there is the problem of children. We want to assess the visit with the kinds of groups we actually see, that is, mostly families or mixed adult-child groups. However, anyone under 18 needs to have consent given by a parent. Unfortunately, a grandparent, aunt, uncle, sister or brother doesn’t count, so you have to throw out those groups as well. Even if a parent is present, you have to be able to explain the research to the youngest visitor you have permission to study (usually about 8 years old) and, even trickier, walk that child through the assent process without scaring them off. As our IRB office puts it, consent is a process, a conversation, not just a form.


So who knows whether we’re really getting a representative sample of our visitors? That’s definitely a question for sampling theory. Luckily for us at Hatfield, we’re working with our campus IRB office to try to create less-restrictive consent situations, such as waiving the signed consent form when the signature would be the only identifying information we ask visitors to provide. Maybe we’ll even be able to craft a protocol where over-18 family members can provide consent for their younger relatives when a parent didn’t travel with them that day. As this progresses, you’ll be able to follow it on our blog.


Wow, you’ve read this far? Thank you so much, and enjoy the rest of your visit.


Day 2’s sessions ended up focusing on the Communication side of Education/Outreach/Scientific Workforce, and I think that framing drew a bigger audience. One presentation on how to create a video was very similar to Ari Daniel Shapiro’s Education talk the day before on producing radio programs and podcasts, complete with how-to’s, but the audience was much bigger. Is it that “education” and “outreach” are scarier terms than “communicating”? If so, we educators need to think about how to make education more “do-able” for scientists if we want them to take on at least some of it themselves, rather than leaving it all to education professionals.


We wonder, however, why education and communication are separated. Perhaps we have slightly different goals, but perhaps not: communication may have a specific outcome in mind, such as motivating people to think a certain way or do a certain thing, whereas education might more broadly want learners to understand how science works.


One afternoon talk pointed out that, between science and communication at least, the message depends on your audience. For example, scientists focus on what we don’t know, whereas policymakers need to know what science does know. So in communicating and educating, we have to decide whether we’re trying to convey what science is and how it works, or where science stands at the moment.


Throughout the week, COSEE is hosting a lunchtime workshop series on how to do education and communication of your science. These workshops have had about 100 participants, or roughly 2 percent of the conference attendees. By a show of hands on Tuesday, however, many of those were again graduate students.


Last night there was a panel discussion on Bridging the Cultural Gap between Scientists and the Public. I overheard one scientist at the COSEE exhibitor booth pooh-pooh the need to attend the panel; he basically said he knew there was a gap, but that it was the public’s problem. COSEE staff made a valiant effort to convince him that he was actually a vital part of bridging the gap, but regardless, there was still a relatively small audience for the program. Attendees did seem to skew a little more toward early- to mid-career scientists than the other education sessions (perhaps because the grad students had all run off to get beer). We had mixed feelings about the presentation, because we walked away a bit more confused about what we could do. The communication researchers and communicators on the panel offered ideas about where the communication breakdown lies, but we didn’t get a chance to discuss many practical remedies.


The basic premise was that it’s not a problem of literacy: people tend to affiliate with groups and, to maintain that affiliation, attend mainly to information those groups agree with. The other presentation highlighted peculiarities of the journalistic process that complicate the picture, such as editors who play up minor details to add drama and sell their products. So we need to remove the threat that holding a position on a subject would automatically mean you’d no longer be part of a group that’s important to you; that is, we have to change the “meaning” that accompanies the facts. How to do this, however, is what remains for us to figure out.


Okay, so I started things off as a bit of a downer. Considering this is only the third Oceans conference to include education strands, it’s great that it’s being supported.


However, I wonder whether a large scientific conference is the best place to sell outreach to scientists, for a number of reasons. For one thing, the education research sessions basically competed with the scientific sessions, almost as if they were a parallel conference scheduled at the same time, so one physically could not attend both. Shawn noted that the evening and lunch workshops on outreach are often well attended (at least by graduate students), for example, but that doesn’t get our research out there.


For another, the education presentations focused heavily on specific program evaluation results, as I mentioned yesterday. In that way, they really were not speaking to scientists looking to get involved in outreach, beyond making the case for it in terms of personal fulfillment and opportunities for increased funding. The sessions were by and large not aimed at delivering skills that a broad audience could take back to whatever institution they work with. The specificity of many projects showed that such programs are possible and rewarding, without offering people at other institutions ways to get involved. On the other hand, Ari Shapiro of Woods Hole (and often heard on NPR) gave many how-to examples, for either partnering with general media outlets or do-it-yourself podcasting and multimedia presentations. The low-cost, do-it-yourself options of course appealed to the educators in the audience as well.


Nevertheless, for those of us who are going back to our institutions and hoping to help the scientists we work with, there were several interesting findings from the sessions:


1) Scientists are still largely unaware of the work we do, especially that there is educational literature out there about what works.

2) All participants in these programs – educators, scientists, and the ostensible “audience” – play roles as both teachers/facilitators and learners at various times. Educators and public audiences both have frequent opportunities to reflect on their experiences during programs, mostly via feedback to each other, since both groups are fairly familiar with their roles in these situations. The scientists, on the other hand, often lack such opportunities outside of program evaluations to reflect on either of their roles, or even on the fact that they play both roles during the experience. They probably also need tools to help them do that reflection.

3) There are a lot of great programs reaching maybe 50 teachers at a time. If each of those teachers reaches, say, 200 students per year, that’s still only 10,000 students, with perhaps a little more reach via “trickle down” to other teachers the program teachers work with. In a country with maybe 100 million students, we have a lot of work to do. And we need a lot of money to do it. And we need evidence that these things work, and ways to scale them up wherever possible.

4) We have a bit of work to do even within our education community about the value of qualitative data and what it can tell you, including the fact that there are people out there who can help analyze that data if you have it.

5) We need more research that’s applicable to a lot of situations, not just evaluations of great projects.
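
The back-of-the-envelope arithmetic in point 3 is easy to check (the 100-million figure is the post’s ballpark, not an official count):

```python
# Quick check of the reach arithmetic in point 3 above.
teachers_per_program = 50
students_per_teacher = 200           # students reached per teacher per year
us_students = 100_000_000            # rough ballpark from the post

reached = teachers_per_program * students_per_teacher
print(reached)                       # → 10000 students per program per year
print(us_students // reached)        # → 10000 programs needed for full coverage
```

Even before any trickle-down, you’d need on the order of ten thousand such programs just to touch every student once a year – which is exactly the “lot of work” (and money) point 3 is getting at.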


I love being in an emerging field, but some days it’s not emerging fast enough.


No, it’s not the George Clooney movie sequel. It’s a chance for us to blog in near-real-time about our experiences getting the word out about what we do to the scientists we’re trying to work with. Several of the lab members are attending the Oceans 2012 meeting in Salt Lake City this week. The meeting is a joint offering of The Oceanography Society, the American Geophysical Union, and the Association for the Sciences of Limnology and Oceanography.

The meeting is a typical science conference: many many many presentations in all sorts of subdisciplines of ocean sciences, and about one session on education, outreach, communication, evaluation, and student engagement. So, we still have an uphill battle to reach those folks who need to do “broader impacts” for their grants or just as part of the greater good. That is, like recycling, there isn’t always a personal gain, but when you consider the bigger picture, the argument is that it’s the right thing to do (thanks, Jude).

Listening to the first set of education and evaluation presentations this morning, it struck me that nearly every evaluation noted that the majority of participants delivering outreach were graduate students. The good news is that the graduate students were very enthusiastic about the programs and found them rewarding, both for learning new ways of teaching and doing outreach and, at times, for improving their own research. This bodes well for exciting this new generation of scientists to get involved in outreach.

The bad news, though, is that, by and large, the practicing scientists were still missing. Are they thinking that their grad students will “take care of it” for them? Do they still find it a hassle – unrewarding, time-consuming, an activity that basically takes away from their own research? Are they afraid of kids? Have they tried it and had a bad experience (probably because they got little guidance on how to do it in the first place)? This is a large cadre of professionals that we can’t afford to ignore, no matter how excited the next generation is. We can’t let them, or ourselves as outreach and education professionals and researchers, cop out.

So, the question remains, how do we reach this population? I’m hoping to talk up our program and work as the week progresses, but this venue is challenging, to say the least, with only a few general mixers and the sheer number of simultaneous presentations.

Dare we hope in 2014 for a plenary talk about education and outreach for *all* scientists? Or educational presentations that aren’t competing with the scientific ones? Or scientists that start to attend education and outreach conferences? A woman can dream …

Beverly Serrell, a pioneer in tracking museum visitors (or stalking them, as some of us like to say), has just released a nice report on the Center for the Advancement of Informal Science Education (CAISE) web site. In “Paying More Attention to Paying Attention,” Serrell describes the growing use of the metrics she calls tracking and timing (T&T) in the museum field since the publication of her book on the topic in 1998. As the field has adopted T&T strategies more widely, Serrell has continued her meta-analysis of these studies and has developed a system for describing some of the main implications of the summed findings for exhibition design.

I’ll leave you to read the details, but the report really drove home to me the potential excitement and importance of the cyberlab’s tracking setup. Especially for smaller museums with minimal staff, implementing an automatic tracking scheme, even on a temporary basis, could save a lot of person-hours in collecting these simple yet vital data about exhibition and exhibit element use. It could also allow more data collection of this type in the prototyping stages, which might yield important findings about the optimum density of exhibit pieces before a full exhibition is installed. On the other hand, if we can’t get it to work, or our automated design proves ridiculously unwieldy (stay tuned for some upcoming posts on our plans for 100 cameras in our relatively small 15,000-square-foot space), it will only affirm the need for the good literal legwork that Serrell also notes is a great introduction to research for aspiring practitioners. In any case, eye tracking as an additional layer of information to help explain engagement and interest in particular exhibit pieces might eventually lead to a measure that lends more insight into Serrell’s Thorough Use.
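
To make that concrete, here’s a hypothetical sketch of what an automated setup could do with raw tracking logs: reduce them to two Serrell-style summary measures, the sweep rate index (exhibition square footage divided by average total visit time in minutes) and the percentage of “diligent” visitors (those who stop at more than half the elements). The log format and all the names here are my own assumptions, not anything from the report:

```python
# Hypothetical reduction of tracking-and-timing logs to summary measures.
# Assumed log format: (visitor_id, exhibit_element, minutes_at_element).

from collections import defaultdict

def summarize(visits, floor_area_sqft, n_elements):
    """Return (sweep rate index, percent diligent visitors) for a set of logs."""
    total_time = defaultdict(float)   # total minutes in the exhibition, per visitor
    stops = defaultdict(set)          # distinct elements each visitor stopped at
    for visitor, element, minutes in visits:
        total_time[visitor] += minutes
        stops[visitor].add(element)
    avg_time = sum(total_time.values()) / len(total_time)
    sweep_rate_index = floor_area_sqft / avg_time          # sq ft per minute
    diligent = sum(1 for v in stops if len(stops[v]) > n_elements / 2)
    pct_diligent = 100 * diligent / len(stops)
    return sweep_rate_index, pct_diligent

logs = [("v1", "wave tank", 4.0), ("v1", "touch pool", 6.0), ("v2", "wave tank", 2.0)]
print(summarize(logs, floor_area_sqft=1500, n_elements=3))  # → (250.0, 50.0)
```

The point isn’t the code, which is trivial; it’s that the expensive part of T&T has always been generating those (visitor, element, minutes) tuples by hand with a clipboard.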

(Thanks to the Museum Education Monitor and Jen Wyld for the tip about this report.)


Dr. Rowe advises several students each year, from many of the programs on campus that have education tracks as well as the main Science and Math Education Free-Choice Learning program. We meet as a group regularly, and yesterday we got into the subject of time management. Dr. Rowe has responsibilities both as a professor and as the Interim Director of Education for Oregon Sea Grant, and he was sharing that in the face of his administrative responsibilities, especially, the “research activities” often get pushed to the side.

As a PhD candidate, I am in the process of tweaking my proposal to send to my committee. Yet it is so much more tempting to spend my time on development of the cyberlab tools, which I am paid to work on about 20 hours a week. Right now, that work seems so much more concrete and efficient. For example, for my proposal, I’ve just spent about half an hour in a frustrating (and so far futile) search on the web and in the school library for an article to cite for a fact that I know but haven’t had to cite in a while. If I had spent that half hour updating the lab’s inventory database instead, I would have tangible results in the form of organized entries for a number of our new technology items.

Forcing myself to write or revise is a chore, but ultimately, when I get into it, it is intellectually rewarding, futile citation searches aside. Breaking writing tasks into chunks more manageable than “write a research proposal” seems to be a lot harder than seeing the finite chunks of lab development. What other strategies do we use as researchers to make sure we keep making research progress, and don’t let things “drag” on our to-do lists while we accomplish more obvious, yet perhaps less important, tasks?