Hello everyone,

In light of this week’s lab discussions on defining the many “literacies” out there, and the search for a perhaps more appropriate term (such as “fluency,” suggested by Katie Stofer), I would like to stretch the debate to discuss Environmental Literacy (EL) in particular. Since I wasn’t in lab this week, here are my two cents on how the term has been defined in the literature:

Generally, a “desired outcome” of environmental education (EE) is to create a public that is environmentally literate (whatever that means). Many EE programs and materials state this as a purpose. However, definitions and measurement tools for environmental literacy (EL) have remained elusive. Some national surveys have attempted to measure the literacy of the general public, and a few states have attempted to periodically survey their citizenry to gather EL data. While these are important attempts, I believe that many of the questions in the instruments used still fall short of accurately measuring the “degree” of EL defined in their proposals. Further, I believe these instruments fail to account for cultural and educational-system differences and don’t always take accepted benchmarks for EE into consideration.

When the term “literacy” first appeared, it was associated solely with the ability to read and write. Michaels & O’Connor (1990) attempted to provide a better understanding of the concept, proposing that “… we each have, and indeed fail to have, many different literacies. Each of these literacies is an integration of ways of thinking, talking, interacting, and valuing, in addition to reading and writing … [literacy] is rather about ways of being in the world and ways of making meaning…” 

Disinger & Roth (1992), in their Environmental Literacy Digest, credited Charles E. Roth with coining the term “environmental literacy” in 1968. They reviewed various definitions of EL and suggested that it should be based on an ecological paradigm that includes the interrelationships between natural and social systems. A person who is environmentally literate combines his/her values with knowledge to generate action. Here is a brief list of EL definitions given by various authors and organizations since then (some referring to it as ecological literacy) that highlights the complexity of the discourse:

“[EL] is the capacity of an individual to act successfully in daily life on a broad understanding of how people and societies relate to each other and to natural systems, and how they might do so sustainably. This requires sufficient awareness, knowledge, skills and attitudes to incorporate appropriate environmental considerations into daily decisions about consumption, lifestyle, career, and civics, and to engage in individual and collective action.” (Elder, 2003) 

 “Ecological Literacy presumes a breadth of experience with healthy natural systems… a broad understanding of how people and societies relate to each other and to natural systems and how they might do so sustainably… the knowledge necessary to comprehend interrelatedness… an attitude of care or stewardship… in a phrase, it is that quality of mind that seeks out connections… Ecological Literacy is driven by the sense of wonder, the sheer delight in being alive in a beautiful, mysterious, bountiful world… to become ecologically literate, one must certainly be able to read… to know what is countable and what is not… to think broadly, to know something of what is hitched to what… to see things in their wholeness… to know the vital signs of the planet… to know that our health, well-being, and ultimately our survival depend on working with, not against, natural forces…” (Orr, 1992) 

“EL is a set of understandings, skills, attitudes, and habits of mind that empowers individuals to relate to their environment in a positive fashion, and to take day-to-day and long term actions to maintain or restore sustainable relationships with other people and the biosphere … The essence of EL is the way we respond to the questions we learn to ask about our world and our relationship with it; the ways we seek and find answers to those questions; and the ways we use the answers we have found.” (Roth, 2002) 

 “Ecological Literacy is the ability to ask: And now what?” (Garret, 1999) 

“EL should aim to develop:  

  • Knowledge of ecological and social systems, drawing upon the disciplines of the natural sciences, social sciences, and humanities; 
  • The ability to go beyond biological and physical phenomena to consider the social, economic, political, technological, cultural, historic, moral, and aesthetic aspects of environmental issues; 
  • Recognition that understanding the feelings, values, attitudes, and perceptions at the center of environmental issues is essential to analyzing and resolving those issues; 
  • Critical thinking and problem-solving skills for personal decisions and public action.” (Disinger & Monroe, 1994) 

“EL should aim for: 

  • Developing inquiry, investigative, and analytical skills; 
  • Acquiring knowledge of environmental processes and human systems; 
  • Developing skills for understanding and addressing environmental issues; 
  • Practicing personal and civic responsibility for environmental decisions.” (NAAEE, 1999; Archie, 2003) 

 

Even though all of the definitions above share common attributes, based wholly or in part on the AKASA (awareness, knowledge, attitudes, skills and action) components listed in the Tbilisi Declaration, each arrives at somewhat different aspects and considerations through a different perspective:

Orr’s and Elder’s definitions are very similar (Orr uses the term “ecological literacy” instead of “environmental literacy”). However, Orr clearly emphasizes the importance of intrinsic values and abstract feelings, as do Disinger and Monroe. Disinger and Monroe, as well as the NPEEE, mention “interdisciplinary” in their definitions. The NPEEE standards and others do not include the latest thoughts and advances in EE, such as notions of sustainability or locally based educational issues; Roth takes these notions into consideration when he implies the necessity of understanding change. The NAAEE definition refers not only to personal action but goes further to mention “civic” obligation.

The questions of what environmental literacy is and what it should address at its core are still far from being answered by common agreement among scientists and practitioners in the field. Morrone et al. (2001) reaffirm that the study of environmental literacy is relatively new and that no universally accepted definition has been given, so the attributes of an environmentally literate citizen are still subject to discussion and investigation. However, what has been discussed so far in the literature, and in the thousands of meetings of the “real world of practicing environmental education,” is very important for understanding what environmental literacy should be aiming for, even if a widely accepted definition is never agreed upon.

 

Sorry for the long post! If you are interested in the literature cited here, visit the link below to see my entire thesis.

http://www.iowadnr.gov/portals/idnr/uploads/REAP/files/literacy_thesis.pdf

If you are interested, my next post can be about the applied research in environmental literacy.

Hope I didn’t bore you to death with this. To me it is still a fascinating subject.

Thanks!

Susan

Are you an educator, paid or volunteer, looking for an opportunity to improve your practice through understanding more about learning theory and evaluating exhibits and programs? You may be interested in OSU’s new professional certificate program, starting at the end of this month.

Contacts
Bridget Egan
Bridget.Egan-at-oregonstate.edu

Free-Choice Learning and Informal Education continuing education certificate
program begins April 1st

March 1, 2013

Registration is now open for the Free-Choice Learning Professional Certificate, an
online program offered by Oregon State University.

The program helps professional and volunteer educators in museums, zoos, aquaria
and educational outreach programs discover more about free-choice learning:
the study of what, where and how we choose to learn over the course of our
lifetimes. Participants will study learning theory and practice using theory and
research to build their professional practice within their communities.
Participants will also actively engage in practice evaluations of exhibits,
programs and curricula.

Courses are taught by experienced Oregon State faculty and researchers: Lynn
Dierking, Sea Grant professor and interim associate dean for research in the OSU
College of Education; John H. Falk, Sea Grant professor and interim director of the
Center for Research in Lifelong STEM Learning; Shawn Rowe, marine education and
learning specialist at Oregon Sea Grant Extension; and Jennifer Bachman, instructor
and Free-Choice Learning program coordinator.

Learn more about the Free-Choice Learning Professional Certificate on our website.

OSU Professional and Noncredit Education provides education and training for
businesses, organizations, associations and professionals anywhere throughout the
state and beyond. The majority of professional and noncredit students are focused
on continuing education: licensure recertification, professional development and
personal enrichment. OSU Professional and Noncredit Education is part of the
university’s Division of University Outreach and Engagement.

This innovative program immerses museum, zoo, aquarium and science outreach professionals and volunteers in free-choice learning theories. Participants will work with some of the field’s leading researchers and learn how to apply theories of informal learning environments in real-world educational settings. Participants can earn the Free-Choice Learning Professional Certificate by completing each required course in the program, or take individual courses without enrolling in the certificate program.

SPRING 2013

Designing Learning Environments: Physical dimensions of free-choice learning
Learning is influenced by the interaction of variables within three contexts — personal, socio-cultural and physical. This course focuses on how macro-scale environmental factors, like space, crowding and novelty, and micro-scale environmental factors, like design elements, real objects and different media, support free-choice learning.
Instructors: Shawn Rowe, Ph.D. and Jennifer Bachman, Ph.D.
Launches April 1, 2013

SUMMER 2013
Developing Effective Evaluations
Developing Effective Evaluations is an introductory course that provides a hands-on approach to effectively assessing and evaluating learning and behavior within free-choice learning contexts such as museums, national parks, zoos, aquariums and broadcast media. The design and implementation of an evaluation is used as a lens for understanding the hows and whys of assessment and evaluation. This course is designed to help professionals design their own evaluation and assessment research as well as become informed consumers of others’ research.
Instructors: Marianna Adams, Ph.D. and Jennifer Bachman, Ph.D.
Launches June 24, 2013

FALL 2013

Examining The Learner’s Own Ideas: Personal dimensions of free-choice learning
Investigates the fundamental roles that identity, motivation, interest, prior knowledge and experience, and choice and control play in supporting learning and how learning leaders can build on these dimensions of learning in order to successfully engage lifelong learners.
Instructors: John Falk, Ph.D. and Jennifer Bachman, Ph.D.
Launches September 2013

WINTER 2014
Understanding Cultural Influence: Sociocultural dimensions of free-choice learning
Investigates connections between theories of free-choice learning and the fundamental concepts of sociology, social psychology and anthropology: social stratification, social structure and interaction, social institutions, and cultural background.
Instructors:  Lynn Dierking, Ph.D. and Jennifer Bachman, Ph.D.
This course and more electives launch January 2014

 

I’d like to introduce you to a type of Science Center visitor I call “Fish Stick Boyfriend.” Here’s a common demographic profile, based on my own experience:

-White

-Male

-30-35 years old

-Visiting with a female companion (and sometimes children)

The interaction generally follows a simple pattern. Fish Stick Boyfriend frowns and paces while his companion darts from exhibit to exhibit. I’m siphoning a tank, and she is too engaged with the surrounding interpretive content to notice I’m there. Fish Stick Boyfriend notices me, though. He wants to talk.

“So,” he says, pointing at an equally disinterested rockfish, “can you eat those? What do they taste like?”

He’s being sarcastic—at least that’s what he thinks he’s doing. Fortunately, I’ve seen many Fish Stick Boyfriends before, and I know what’s going on. I tell him what rockfish tastes like and where to get it. Then I tell him why it tastes the way it does, and how that relates to the animal’s life history. Then I show him an animal that tastes different, explain why, and tell him where he can go to buy it.

Fish Stick Boyfriend is now usually smiling and looking at some exhibits, and occasionally we actually start talking. His initial comment reveals some useful things:

1. He feels out of place

2. He’s familiar with fish as food

3. He wants to interact with somebody, but he chose an aquarist over a designated interpreter

On the exhibit floor, I’m “just a guy.” Visitors sometimes feel comfortable talking to me when they might avoid an interpretive volunteer or education staff member. Part of the reason may be that I’m usually facing the same direction they are—a small but significant proxemic distinction. I’m talking with them, rather than at them. I’m having a conversation, rather than giving a lecture. It’s not even much to do with what I say—the visitors’ perception makes all the difference.

When it comes to engaging the peripheral learners in a group, I’ve found that the most effective interpreters are often not interpreters at all. Fish Stick Boyfriend doesn’t think he likes Science Centers, but he’s comfortable talking to “the guy who cleans the tanks.” He sees me as a peripheral figure, too.

Over the past few years, I’ve developed a rough, conversational interpretive plan for just about every object in the Visitor Center. The octopus sculpture at the front desk can be used to talk about anatomy. Laura’s footprint decals can be used to talk about population genetics via variations in calcaneal pitch. Exhibits under construction can be used to talk about interpretation itself.

Whether you’re a trained interpreter or not, it’s important to recognize your relationship to the visitor experience. If you’re not perceived as a representative of the institution, you can use that as a position of power on behalf of the visitor. You’re “just a guy” or “just a girl,” changing a light fixture or measuring a table or feeding a frog or miming the destruction of an uncooperative video player. Some visitors may see you as the only approachable person in the building, and your response is crucial.

Fish Stick Boyfriend is bored, and only you can help him.

This past week has confirmed for me that video coding is an arduous task! Right now I’m continuing to code my video data for my dissertation, and working on the criteria for analysis that will allow me to reduce the data and finish answering my research questions. I’m basically looking at the different modes docents use to interact with visitors (speech, gesture, etc.) and suggesting patterns in how they interpret science to the public. I’m cross-referencing the themes that emerge from this video analysis with my interview data to come up with some overarching outcomes.
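That cross-referencing step can be pictured as a simple set comparison. Here is a minimal sketch; the theme labels are invented stand-ins for the real codes, not anything from the actual study:

```python
# Hypothetical sketch: comparing themes from two qualitative data sources.
# The theme labels below are invented examples, not the actual study codes.
video_themes = {"open_questions", "analogy", "pointing_gestures", "humor"}
interview_themes = {"open_questions", "analogy", "reading_the_audience"}

# Themes corroborated by both the video coding and the interviews
corroborated = video_themes & interview_themes

# Themes that show up in only one source still need a closer look
video_only = video_themes - interview_themes
interview_only = interview_themes - video_themes

print(sorted(corroborated))
```

In practice each “theme” would carry a full codebook entry and per-docent frequencies, but the intersection logic for triangulating the two data sources is the same.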

So far the themes seem fairly clear, which is a nice feeling. Plus, there seems to be a lot of crossover between the patterns in docent interpretation strategies and what the literature deems effective interpretation. What is interesting is that this group of docents has little to no formal interpretive training. So perhaps good communicative practice emerges on its own when you have constant contact with your audience. Food for thought for professional development activities with informal educators…

What’s interesting about this process is how well I know my data, yet how tough it is to get it down on paper. I can talk until I’m blue in the face about my outcomes, but writing them up into structured chapters is like translating an ancient text. Ah, the rite of passage that is the final dissertation.

All this video coding has also got me thinking about our development of an automated video analysis process for the lab. What parameters do we set to have it process the vast landscape of data our camera system can collect, and thereby help reduce the data from the word go? As a researcher, imagining a data set that is already partially reduced puts a smile on my face.
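To make the “parameters” question concrete, here is a minimal sketch of how a few up-front thresholds could prune an automated event log before a human ever codes it. The event fields, zone names, and threshold values are all invented for illustration:

```python
# Hypothetical sketch: reducing automated video-analysis output with a few
# tunable parameters. Event fields and thresholds are invented examples.
events = [
    {"zone": "touch_tank", "duration_s": 4,   "people": 1},   # pass-through
    {"zone": "touch_tank", "duration_s": 95,  "people": 3},
    {"zone": "entrance",   "duration_s": 200, "people": 2},
    {"zone": "octopus",    "duration_s": 30,  "people": 1},
]

# The parameters we would have to choose from the word go
MIN_DWELL_S = 10                               # ignore brief pass-throughs
ZONES_OF_INTEREST = {"touch_tank", "octopus"}  # exhibits under study

reduced = [
    e for e in events
    if e["duration_s"] >= MIN_DWELL_S and e["zone"] in ZONES_OF_INTEREST
]
# Two of the four events survive: the 95 s touch-tank visit and the
# 30 s octopus visit
```

The interesting (and hard) part is choosing those thresholds so the automation drops noise without dropping the brief interactions that turn out to matter.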

So back to coding. I see coded people….

This week at HMSC we have been working on clearing out Dr. Rowe’s old office to make way for a new research office. This office will become the main working area for the FCL research we conduct in the HMSC visitor center. It will also become the office of the new postdoc we will be hiring for the lab in the future.

With the help of the lovely Maureen and Susan, we cleared out old paperwork and moved furniture to create a more open space for collaborative work and equipment storage. We were very happy with the results!

Hopefully the space will simplify project management for research taking place for the lab!

 

Harrison enjoys the extra space in the new research office!

Susan makes our mark in magnets in the new office

A while ago, I promised to share some of my experiences collecting data on visitors’ exhibit use as part of this blog. Now that I’ve been back at it for the past few weeks, I thought it might be time to actually share what I’ve found. As it is winter here in the northern hemisphere, our weekend visitation to the Hatfield Visitor Center is generally pretty low. This means I have to time my data collection carefully if I don’t want to spend an entire day waiting for subjects and maybe only collect data on two people. That’s what happened on a Sunday last month: the weather on the coast was lovely, and visitation was minimal. I have recently been collecting data in our Rhythms of the Coastal Waters exhibit, which poses additional data collection challenges: it is basically the last thing people might see before they leave the center, it’s dim because it houses the projector-based Magic Planet, and there are no animals, unlike just about every other corner of the Visitor Center. So, I knocked off early and went to the beach. Then I rescheduled another day I was going to collect data, because it was a sunny weekend day at the coast.

On the other hand, on a recent Saturday we hosted our annual Fossil Fest. While visitation was down from previous years (only about 650, compared to 900), this was plenty for me, and I was able to collect data on 13 people between 11:30 and 3:30, despite an octopus feeding and a lecture by our special guest fossil expert. Considering that data collection, including recruitment, consent, the experiment, and debrief, probably runs 15 minutes per person, I thought this was a big win. In addition, I got only one refusal, from a group that said they were on their way out and didn’t have time. It’s amazing how much better things go if you a) lead with “I’m a student doing research,” b) mention “it will only take about 5-10 minutes,” and c) don’t record any video of them. I suspect it also helps that it’s not summer: this crowd is more local and thus perhaps more invested in improving the center, whereas summer tourists might be visiting more for the experience, to say they’ve been there, as John Falk’s museum visitor “identity” or motivation research would suggest. That seems like a motivation that would not make you all that eager to participate. Hm, sounds like a good research project to me!

Another reason I suspect things went well was that I am generally approaching only all-adult groups, and I only need one participant from each group, so someone can watch the kids if they get bored. I did have one grandma get interrupted a couple times, though, by her grandkids, but she was a trooper and shooed them away while she finished. When I was recording video and doing interviews about the Magic Planet, the younger kids in the group often got bored, which made recruiting families and getting good data somewhat difficult, though I didn’t have anyone quit early once they agreed to participate. Also, as opposed to prototyping our salmon forecasting exhibit, I wasn’t asking people to sit down at a computer and take a survey, which seemed to feel more like a test to some people. Or it could have been the exciting new technology I was using, the eye-tracker, that was appealing to some.

Interestingly, I also had a lot of folks observe their partners as the experiment happened, rather than wander off and meet up later, which happened more with the salmon exhibit prototyping, perhaps because there was not much to see if one person was using the exhibit. With the eye-tracking and the Magic Planet, it was still possible to view the images on the globe because it is such a large exhibit. Will we ever solve the mystery of what makes the perfect day for data collection? Probably not, but it does present a good opportunity for reflection on what did and didn’t seem to work to get the best sample of your visitorship. The cameras we’re installing are of course intended to shed some light on how representative these samples are.

What other influences have you seen that affect whether you have a successful or slow day collecting exhibit use data?