As I work toward a coherent research question for my dissertation, I find myself challenging assumptions I had never examined before. One is that visitors trust the science being presented in museums. There is a lot of talk about learning science, public understanding of science, public engagement, and so on, but trust is frequently glossed over. When we ask someone what they learned from an exhibit, we don’t also ask how reliable they feel the information is. Much as in the various fields of science themselves, there is an assumption that what is being presented is accurate and unbiased in the eyes of the visitor.

Along the same lines, it is also frequently assumed that visitors know the difference between good science, bad science, pseudoscience, non-science, science in fiction, and science fiction, and that this is reflected in their experiences in science museums. Especially in the internet age, when anyone can freely and widely distribute their thoughts, opinions, and agendas, how do people build their understanding of science, and how do these various avenues of information affect trust in science? Media outlets have been exposed in scandals where false “science” was disseminated, and various groups deliberately distort information to suit their purposes. In this melee of information and misinformation, are science centers still viewed by the public as reliable sources of science information?

Since defending and getting back into our lab duties full time, Katie and I realized today that there are a lot of tasks to tackle for the lab before the summer break! Basically, it’s all about getting the lab’s ducks in a row and putting in place procedures, for both research and equipment, that will help the lab move forward and ease transitions as new students and new scholars join the lab.

First, we’ve hit a space crunch with the visitor center tech storage, so the initial task is to clear out unused or unwanted pieces we have been holding on to, in order to minimize our space needs and make room for tech development and new equipment in the future. Second, we need to inventory everything we have so far. There are so many pieces of cameras, mics, and other gear that we have experimented with and tested that need to be accounted for and documented. This is an essential step because it will help us decide what is still missing from our suite of tools and let us put equipment loan procedures in place to protect the quality and security of the equipment. Third, there are research agendas that need developing. These will determine the next stages for getting research and data collection moving in the visitor center within the overarching lab agenda, help drive technology development for the lab, and help us plan next steps for ongoing IRB applications. Research tools aren’t much good if you’re not sure how you will be using them for research. Lastly, there are the next steps: we will be making data collection and technology development plans for the upcoming months to build research “game plans” and task lists for the future.

All in all, it’s quite an exciting time for the lab, and personally I love all this organizing and “cleaning house.” Knowing where you are in a project and where you are going, in my opinion, makes for great future products.

Last weekend there was a wonderful free-choice learning event in Lincoln City, Oregon: the Remotely Operated Vehicle Competition. It was so much fun to watch and to serve as a judge. The event is sponsored by the Marine Advanced Technology Education Center and numerous local and national sponsors. The most interesting thing to me is the level of excitement that surrounds these events among everyone involved. Today, however, I am going to write about one particular participant from last Saturday’s event. This sophomore chaired his team for the Rovers portion of the competition, which meant they were competing for the only slot to move forward to the international competition, along with prize money to help offset costs. He experienced a series of events on Saturday that would most likely make any person, young or old, walk away from the competition. In my mind, his actions truly embodied not only what it means to be a good sport, but also the spirit of free-choice learning.

First of all, during the debriefing it was clear that another team his team was competing against had not brought all their materials, nor had they read the rules. He instantly offered to share his supplies and printed materials with them, which he was not required to do. When the head judge said he did not have to do that, since the instructions were clear online, he replied, “It’s all for learning and fun, isn’t it? Am I allowed to share?” We said sure. Next, his team members did not show up. This meant he would be instantly disqualified if he did not have at least one more person with him “on deck” for the trials and for the competition, so he enlisted the help of one of his family members. The judges told him that he still most likely would not advance, since the team had changed from the date of submission. He said okay, but asked whether he could still go through the event. Yes was the answer. Next, his ROV did not meet specs. He was given 20 minutes to alter it; he did, and it passed. He proceeded with the trials and placed higher than I thought his ROV could achieve. Impressive driving for the limited machine. But that is not all: he watched other competitors, cheered on the younger ones, walked around and read the various posters that other teams had produced, and encouraged the other teams throughout the event. When chatting with him, he remarked on how much fun this was and how much he was learning. All by his own choice! He didn’t win, and he didn’t make the paper, but his actions stood out enough that he was voted to receive a Spirit Award he didn’t even know existed. Congratulations, “Abandoned Ship”!

Hi all!

I have been doing some readings for my Advanced Qualitative Methods class and ran into some interesting remarks about the challenges of qualitative data analysis. I thought I would share them with you. If you have yet to dive into data analysis for your projects, I think these are good references to have, as they offer many strategies for coping with the challenges of analyzing qualitative data.

The readings brought forth the idea that the steps and rationale of qualitative data analysis are often obscured in research reports. There is no widespread understanding in the field as to how qualitative analysis is to be done. Can there ever be such an understanding? Given the very nature of qualitative analysis, no single cookbook is possible, but various researchers have proposed strategies that have proven helpful in aiding the analysis of data.

Bulmer (1979) discusses concept generation, referring to previous work by other researchers who attempted to address the “categorization paradox” and the problem of validating concepts defined and used in qualitative analysis. The “sensitizing” concepts of Blumer, the “analytical induction” of Znaniecki, and the “grounded theory” of Glaser and Strauss are all, within their limitations, sources of insight for thinking about concept validation, as they bring forth the importance of conceptualizing in a way that is faithful to the data collected. I believe this was important to the development of inductive research in more rigorous ways that allowed for appropriate generalizations.

Since then, other publications have emphasized the practice of qualitative data analysis and strategies to consider along the way (e.g., Emerson et al., 1995; Lofland et al., 1984; Weiss, 1995). Developments have been made in discussing concerns about faithfulness to the data and its interplay with the subjectivity of the researcher. I particularly like the Lofland et al. (1984) definition of analysis as a transformative process, turning raw data into “findings/results.” Here the researcher is a central agent in the inductive analysis process, which is highly interactive, labor intensive, and time consuming, and therefore requires a systematic approach to analyzing data in order to account for the interplay between the data and the researcher-produced theoretical constructs. The authors suggest a few strategies to use while analyzing data, two of which I would like to elaborate on here: normalizing and managing anxiety, and memoing.

I have read many qualitative methods materials, and they all discuss the need for the qualitative researcher to recognize and be aware of his/her subjectivity in the course of preparing for, conducting, and writing about a research problem. Lofland et al. (1984) touch further on a point that I now believe to be key to subjective interference in data analysis: the issue of researcher anxiety. At first it seemed an overstatement, but the more I read the more substance I found in the issue. Understanding a social situation is no easy task; it requires an open-ended approach that can cause much anxiety as the researcher is confronted with the challenge of finding significance in the materials. Ethical and emotional issues come into play in the midst of making sense of and organizing a rapidly growing body of data, and they can negatively affect the research experience if not dealt with properly.

The authors emphasized five anxiety-management principles for researchers to keep in mind: 1) recognize and accept anxiety; 2) start analysis during data collection; 3) be persistent and methodical; 4) remember that accumulating information will, at a minimum, ensure some content to talk about; and 5) discuss with others in the same situation. These strategies really addressed my worries about the process of data analysis. High emotions, fears, and wanting to quit are all anxiety reactions I have been feeling myself. I believe starting early and being methodical and persistent are key strategies for dealing with anxiety, because they ensure you have time to address the challenges and make changes without becoming so frustrated in the process.

If you start early, initial coding can be done before focused coding begins, giving the researcher time away from the data that may be needed to reduce anxiety. Early coding also makes early memos possible, which can help clarify connections along the way and ensure that persistence prevails because progress is observable. I believe memos are the start of the “transformative process” that Lofland et al. (1984) were referring to in defining data analysis. They are the bridge between the data and the researcher’s meanings, a first draft of the completed analysis where the interplay between data and theoretical constructs takes place. Consequently, writing memos becomes necessary rather than optional.

Both Lofland et al. (1984) and Emerson et al. (2011) extensively discuss the memoing process. Operational memos are notes to self about research procedures and strategies. Code memos clarify the assumptions underlying written codes. Theoretical memos record the researcher’s ideas about the codes and their relationships. These are the memos that can be written even before coding starts, and they provide the basis for the “integrative” memoing that Emerson et al. (2011) refer to in discussing how broader analytic themes and arguments are identified, developed, and modified into narrower, focused core themes. Furthermore, while Lofland et al. (1984) explore the art of writing memos, Emerson et al. (2011) emphasize the “reading” of memos and the benefit of reading one’s notes as a whole, in the order they were written, to this integrative process of making meaning. This adds a fourth layer of subjectivity to the layers of observing, deciding, and writing about a phenomenon: the layer of reading one’s notes and making sense of them.

In the course of doing so, the researcher’s assumptions, interests, and theoretical commitments influence analytical decisions. In this sense, data analysis is not just a matter of “discovering” but of giving priority to certain incidents and events in the data materials in order to understand them in a given case or in relation to other events. This idea is interesting to me because I used to think of theoretical constructs as emerging from the data in a process of discovery, and now I see it as a process of immersion. The researcher is not only immersed in the phenomenon being studied during data collection, but is also immersed during data analysis, as these inseparable subjective decisions shape the theoretical constructs. While I still think there is an aspect of discovery, it is somewhat created rather than naturally occurring.

In sum, there have been several methodological attempts to clarify the logic of qualitative data analysis. However, the use of such guidelines and strategies is not very transparent in research reports, and one may be left wondering how the data analysis was actually done and how exactly the concepts came to be in a given study. Nevertheless, these methodological strategies strongly emphasize the interplay between concept use and empirical data observation. Although a logical process does take place in analysis, and it is indeed crucial to the systematization of ideas and the formation of concepts, it seems to me this process is only as logical as the researcher makes it within his/her sociological orientation, the study’s substantive framework, and the nature of the phenomenon under study. In this sense, nothing is really created but rather transformed through a logical theorizing process that is unique to the research in question. Nothing is discovered by chance; qualitative analysis is rather an “analytical” discovery.


References

Bulmer, M. (1979). Concepts in the analysis of qualitative data. Sociological Review, 27(4), 651-677.

Emerson, R. M., Fretz, R. I., & Shaw, L. L. (1995). Writing ethnographic fieldnotes. Chicago, IL: University of Chicago Press.

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. Aldine de Gruyter.

Lofland, J., Snow, D., Anderson, L., & Lofland, L. (2011). Analyzing social settings: A guide to qualitative observation and analysis. Wadsworth.

Weiss, R. S. (1995). Learning from strangers: The art and method of qualitative interview studies. New York, NY: Simon and Schuster.


This post will be a light one, as most of my waking—and non-waking—hours are now occupied by a very small person who emerged from my wife recently. This very small person falls asleep when I play a certain type of music at a low volume, which got me thinking.

What makes a thing or circumstance “metal?” I’m not referring to metal in the material sense, but in the cultural and aesthetic sense. “Metal” as in “Slayer,” not “metal” as in “aluminum.” It’s a tough question I often amuse myself with, but it does have some relevance to my work as I wait to collect data.

The target audience for my game project is adult tabletop gamers, and I’ve observed a significant overlap between the tabletop gamer/metalhead communities of practice. I think it has something to do with an affinity for dragons and medieval imagery, but that’s conjecture on my part. I’m a very enthusiastic but somewhat peripheral participant in both areas.

I’ve had difficulty identifying the exact criteria used to determine whether something is metal, yet it’s fairly easy to reach consensus as to what is or is not metal. It would be easy to say it’s a subjective assessment, but that doesn’t appear to be the case. The criteria are difficult to pin down, yet there’s a high degree of intersubjectivity here nonetheless. This is what intrigues me.

“Metalness” is a valuable—if not strictly necessary—aesthetic attribute to a large potential audience segment for my work. Ian Christe’s “Sound of the Beast” is a good primer on metal music and culture. Sam Dunn has also explored metal as a cultural force and musical form, constructing a handy “heavy metal family tree” and producing several documentaries.

Aquarist Sid defined it rather succinctly: “Metal is black. Metal is contrast.” He elaborated that contrariness is an important aspect of a thing’s metalness. Volunteer coordinator Becca noted the importance of pain, while her husband cited common elements like death, depression, long hair, distorted guitars, double bass drum work and “long Scandinavian winters.”

What do you think? How would you define metal, musically and aesthetically? Can you give an example? What purpose do metal and its meanings serve to the audience(s)?

Let’s talk.

Both Laura and I defend next week, which is why the blog has been a little quiet of late. So, hopefully, it’s the end of our dissertations, and the beginning (or really, continuations) of careers working to create fun and engaging science learning opportunities for all. We both came into the program with a lot of years of actually doing outreach, with a little bit of experience in designing programs and even less in evaluating them. Now we’re set to leave with a great set of tools to maximize these programs and hopefully share the ideas we’ve learned with the broader field as we go.

So that’s set us to thinking about where we go from here. Now I have to build a broader research project that maybe builds off of the dissertation, but the dissertation was so self-contained, and relatively concrete in a way, that the idea of being able to do multiple things again is a bit daunting. I’m almost not sure where to begin! I will have some structure, of course, provided by the grant funding I get, and the partnerships I join. However, it’s important to think about what I want to achieve before I worry about the tools with which to do it – as always, start with the outcomes and work backwards.

It’s fortunate, then, that the lab group has started to discuss our broader research interests in the hope of finding where they intersect, in order to guide future discussions. We’ve been using Prezi, creating frames for each area of focus, then intending to “code” these frames by grouping those with similar topics and ideas. For example, one of my interests at this point is how adult “everyday scientists” keep current with professional science research developments, whether to use that information in their own personal and societal decisions, to keep tabs on how tax dollars are put to work, or for any other purpose they desire. So I’m interested in the hows, whens, and whys of everyday scientists accessing professional science information. This means I overlap with others in the group working with museum exhibits, but also with people interested in public dialogue events and, in general, the affordances and constraints around learning in these ways.

As the leader of the group, Shawn has mentioned that this has been a useful exercise for thinking about his broader research goals as well: simply writing down his areas of focus, looking back at what he’s done over the past few years, and looking forward to where he wants to go. It also helps him see what has matched his previous plans, and how circumstances or opportunities have changed those plans. I’m grateful to have this fortuitously timed example of long-term goal setting and building a broader agenda, especially in such a small field where it’s likely that this is the largest group of collaborators in one place that I’ll have for a while. Hopefully, though, I’ll have my own graduate students before too long, and maybe even other colleagues who focus on outside-of-school learning as well.

What sorts of tools do you use for figuring out long-term, broad, and somewhat abstract research goals?