I finished the edits and all the various fee-paying and archiving that come along with completing a dissertation. My transcript finally reflects that I completed all the requirements … so now what? I have a research position waiting for me to start in July, but as I alluded to before, what exactly do I research?

In some ways, the possibilities are wide open. I can stick with visualizations, sure, and expand on that into animations, or continue with the in situ work in the museum. I may try to do that with the new camera system at HMSC as a remote data collector, since there is no nearby spherical system that I am aware of in my new position.

I could also start to examine modeling, a subject that I danced around a bit during the dissertation (I had to write a preliminary exam question on how it related to my dissertation topic). Modeling, simulation, and representation are big in the Next Generation Science Standards, so there’s likely money there.

Another topic of interest dovetails with Laia’s work on public trust and Katie Woollven’s work on the nature of science: broader questions of what is meant by “science literacy” and just why science is pushed so hard by proponents of education. I want to know how, when, and most importantly why adults search for scientific information. By understanding why people seek information, we can better understand what problems exist in accessing the types of information they need and focus our efforts accordingly. A component of this research could also explore the identity of non-professionals as scientists or as capable consumers of academic science information.

Finally, I want to know how all this push toward outreach and especially toward asking professional scientists to be involved in or at least fund outreach around their work impacts their professional lives. What do scientists get out of this emphasis on outreach, if anything? I imagine there are a range of responses, from sheer aggravation and resentment to pure joy at getting to share their work. Hopefully there exists a middle ground where researchers recognize the value and even want to participate to some extent in outreach but are frustrated by feeling ill-equipped to do so. That’s where my bread and butter is – in helping them out through designing experiences, training them to help, or delivering the outreach myself, while building in research questions to advance the field at the same time.

Either way, it’s exciting! I hope to be able to blog here from time to time in the future as my work and the lab allow, though I will be officially done at OSU before my next turn to post on my research work. Thanks for listening.

As I work towards a coherent research question for my dissertation, I find myself challenging assumptions that I never dealt with before. One is that visitors trust the science that is being presented in museums. There is a lot of talk about learning science, public understanding of science, public engagement, etc., but trust is frequently glossed over. When we ask someone what they learned from an exhibit, we don’t also ask them how reliable they feel the information is. Much as in the various fields of science, there is an assumption that what is being presented is accurate and unbiased in the eyes of the visitor.

Along the same lines, it is also frequently assumed that visitors know the difference between good science, bad science, pseudoscience, non-science, science in fiction, and science fiction, and that this is reflected in their experiences in science museums. Especially in the internet age, where anyone can freely and widely distribute their thoughts, opinions, and agendas, how do people build their understanding of science, and how do these various avenues of information impact trust in science? Media sources have been exposed in scandals where false “science” was disseminated. Various groups deliberately distort information to suit their purposes. In this melee of information and misinformation, are science centers still viewed as reliable sources of science information by the public?

Since defending and getting back into our lab duties full time, Katie and I realized today that there are a lot of tasks to tackle for the lab before the summer break! Basically, it’s all about getting the lab’s ducks in a row and putting in place procedures for both research and equipment that will help the lab move forward and ease transitions as new students and new scholars begin working with the lab.

Firstly, we’ve hit a space crunch with the visitor center tech storage. So the first task is to clear out unused or unwanted pieces we have been holding on to, in order to minimize our space needs and make room for tech development and new equipment in the future. Secondly, we need to inventory everything we have so far. There are so many pieces of cameras, mics, etc. that we have experimented with and tested that need to be accounted for and documented. This is an essential step because it will help us decide what is still missing from our suite of tools, and let us put equipment loan procedures in place to protect the quality and security of the equipment. Thirdly, there are research agendas that need developing. These will determine the next stages for getting research and data collection moving in the visitor center within the overarching lab agenda, and will help drive technology development for the lab in the future. They will also help us plan next steps for ongoing IRB applications; research tools aren’t much good if you’re not sure how you will be using them for research. Lastly, there are the next steps: we will be making data collection and technology development plans for the upcoming months to help build research “game plans” and task lists for the future.

All in all, it’s quite an exciting time for the lab, and personally I love all this organizing and “cleaning house”. Knowing where you are in a project and where you are going, in my opinion, makes for great future products.

Last weekend there was a wonderful free-choice learning event in Lincoln City, Oregon – the Remotely Operated Vehicle Competition. It was so much fun to watch and to serve as a judge. The event is sponsored by the Marine Advanced Technology Education Center along with numerous local and national sponsors. The most interesting thing to me is the level of excitement that surrounds these events from everyone involved. Today, however, I am going to write about one particular participant from last Saturday’s event. This sophomore chaired his team for the Rovers portion of the competition, which meant they were competing to win the only slot to move forward to the international competition, along with prize money to help offset costs. He faced a series of events on Saturday that would most likely make any person, young or old, walk away from the competition. In my mind his actions truly embodied not only what it means to be a good sport, but also the spirit of free-choice learning.

First of all, during the debriefing it was clear that another team his team was competing against had not brought all their materials, nor had they read the rules. He instantly offered to share his supplies and printed materials with them, which he was not required to do. When the head judge said he did not have to do that, since the instructions were clear online, he said, “It’s all for learning and fun, isn’t it? Am I allowed to share?” We said sure. Next, his team members did not show up. This meant he would be instantly disqualified if he did not have at least one more person with him “on deck” for the trials and for the competition. He enlisted the help of one of his family members. The judges told him that he most likely would not advance, as the team had changed from the date of submission. He said okay, but asked if he could still go through the event. Yes was the answer. Next, his ROV did not meet specs. He was given 20 minutes to alter it – he did, and it passed. He proceeded with the trials and placed higher than I actually thought his ROV could achieve. Impressive driving for the limited machine. But that is not all: he watched other competitors, cheered on the younger competitors, walked around and read the various posters the other teams produced, and encouraged the other teams throughout the event. When chatting with him, he remarked on how much fun this was and how much he was learning. All of his own choice! He didn’t win, he didn’t make the paper, but his actions stood out enough that he was voted to receive a Spirit Award he didn’t even know existed. Congratulations – “Abandoned Ship”!

Hi all!

I have been doing some readings for my Advanced Qualitative Methods class and ran into some interesting remarks about the challenges of qualitative data analysis. I thought I would share them with you. If you have yet to dive into data analysis for your projects, I think these are good references to have, as they offer many strategies for coping with the challenges of analyzing qualitative data.

The readings brought forth the idea that the steps and rationale of qualitative data analysis are often obscured in research reports. There is no widespread understanding in the field as to how qualitative analysis is to be done. Can there ever be such an understanding? Given the very nature of qualitative analysis, no single cookbook is possible, but some strategies proposed by various researchers have proven helpful in aiding analysis of data.

Bulmer (1979) discusses concept generation, referring to previous work from other researchers who attempted to address the “categorization paradox” and the problem of validating concepts defined and used in qualitative analysis. The “sensitizing concepts” of Blumer, the “analytical induction” of Znaniecki, and the “grounded theory” of Glaser and Strauss are all, within their limitations, sources of insight for thinking about concept validation, as they bring forth the importance of conceptualizing in a way that is faithful to the data collected. I believe this was important to the development of inductive research in more rigorous ways that allow for appropriate generalizations.

Since then, other publications have emphasized the practice of qualitative data analysis and strategies to consider along the way (e.g., Emerson et al., 1995; Lofland et al., 1984; Weiss, 1995). Developments have been made in discussing concerns about faithfulness to the data and its interplay with the subjectivity of the researcher. I particularly like the Lofland et al. (1984) definition of analysis as a transformative process, turning raw data into “findings/results”. Here the researcher is a central agent in the inductive analysis process, which is highly interactive, labor intensive, and time consuming, and therefore requires a systematic approach to analyzing data in order to account for the interplay between the data and the researcher-produced theoretical constructs. The authors suggest a few strategies to use while analyzing data, two of which I would like to elaborate on here: normalizing and managing anxiety, and memoing.

I have read many qualitative methods materials, and they all discuss the need for the qualitative researcher to recognize and be aware of his/her subjectivity in the course of preparing for, conducting, and writing about a research problem. Lofland et al. (1984) touch further on a point that I now believe to be key to subjective interference in data analysis: the issue of researcher anxiety. At first it seemed to be an overstatement, but the more I read, the more substance I found in the issue. Understanding a social situation is no easy task and requires an open-ended approach that can cause much anxiety as the researcher is confronted with the challenge of finding significance in the materials. Ethical and emotional issues come into play in the midst of making sense of and organizing a rapidly growing body of data, and they can negatively affect the research experience if not dealt with properly.

The authors emphasize five anxiety-management principles for researchers to keep in mind: 1) recognize and accept anxiety; 2) start analysis during data collection; 3) be persistent and methodical; 4) remember that accumulating information will, at a minimum, ensure some content to talk about; and 5) discuss the process with others in the same situation. These strategies really addressed my worries about the process of data analysis. High emotions, fears, and wanting to quit are all anxiety reactions I have been feeling myself. I believe starting early and being methodical and persistent are key strategies for dealing with anxiety, because they ensure you have time to address the challenges and make changes without becoming so frustrated in the course of doing so.

If you start early, initial coding can be done in advance of focused coding, giving the researcher time away from the data that may be needed to reduce anxiety. Early coding also makes early memos possible, which can help clarify connections along the way and keep persistence up through observable progress. I believe memos are the start of the “transformative process” that Lofland et al. (1984) were referring to when defining data analysis. The memo is the bridge between the data and the researcher’s meanings, a first draft of a completed analysis where the interplay between data and theoretical constructs takes place. Consequently, writing memos becomes necessary rather than optional.

Both Lofland et al. (1984) and Emerson et al. (2011) discuss the memoing process extensively. Operational memos are notes to self about research procedures and strategies. Code memos clarify assumptions underlying written codes. Theoretical memos record the researcher’s ideas about the codes and their relationships. These are the memos that can take place even before coding starts, and they provide the basis for the “integrative” memoing that Emerson et al. (2011) refer to when they talk about identifying, developing, and modifying broader analytic themes and arguments into narrower, focused core themes. Furthermore, while Lofland et al. (1984) explore the art of writing memos, Emerson et al. (2011) emphasize the “reading” of memos: reading the notes as a whole, and in the order they were written, is beneficial to this integrative process of making meaning. This adds a fourth layer of subjectivity to the layers of observing, deciding, and writing about a phenomenon – the layer of reading the notes and making sense of them.

In the course of doing so, the researcher’s assumptions, interests, and theoretical commitments influence analytical decisions. In this sense, data analysis is not just a matter of “discovering” but a matter of giving priority to certain incidents and events from the data materials in order to understand them in a given case or in relationship to other events. This idea is interesting to me because I used to think of theoretical constructs as emerging from the data in a process of discovery, and now I see it as a process of immersion. The researcher not only immerses him/herself in the phenomenon being studied during data collection, but he/she is also immersed during data analysis, as these inseparable subjective decisions shape the theoretical constructs. While I still think there is an aspect of discovery, it is somewhat created rather than naturally occurring.

In sum, there are several methodological attempts to clarify the logic of qualitative data analysis. However, the use of such guidelines and strategies is not very transparent in research reports, and one may be left wondering how the data analysis was actually done and how exactly the concepts came to be in a given study. Nevertheless, such methodological strategies strongly emphasize the interplay between concept use and empirical data observation. Although a logical process does take place in analysis, and it is indeed crucial to the systematization of ideas and the formation of concepts, it seems to me this process is only as logical as the researcher makes it within his/her sociological orientation, the study’s substantive framework, and the nature of the phenomenon under study. In this sense, nothing is really created but transformed through a logical theorizing process that is unique to the research in question. Nothing is discovered by chance; qualitative analysis is rather an “analytical” discovery.

 

References

Bulmer, M. (1979). Concepts in the analysis of qualitative data. Sociological Review, 27(4), 651-677.

Emerson, R. M., Fretz, R. I., & Shaw, L. L. (1995). Writing ethnographic fieldnotes. Chicago, IL: University of Chicago Press.

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. Aldine de Gruyter.

Lofland, J., Snow, D., Anderson, L., & Lofland, L. (2011). Analyzing social settings: A guide to qualitative observation and analysis. Wadsworth.

Weiss, R. S. (1995). Learning from strangers: The art and method of qualitative interview studies. New York: Simon and Schuster.