For my blog post today I have been thinking about many different things, but now that it is time, I am going with the topic that has been most on my mind – testing. I know it is not truly a free-choice learning topic, since testing is usually associated with standard school functions, but my point is that the more experiences you have outside of school, the more support you should, in theory, have for succeeding on tests. With that said, I am truly not a fan of standardized testing. Recently I read an article about a teacher who retired in New York after more than 20 years of teaching and claimed that he no longer had a profession. The article struck me and made me think of what we do with our research in the free-choice learning arena. We try to document the various experiences people have and ponder what meaning those experiences hold in their lives. Will an experience help someone understand a particular concept better? Will it expand their thinking on a particular area – environmental issues, for example? Will being in a free-choice learning setting make a participant more "open" to accepting new experiences, such as touching animals in a touch tank or petting zoo? I'm not sure, and as a group we have been looking at various data sets to reflect on these questions.

So how is this related to testing? Well, our free-choice learning environments are tied to formal environments in many ways. Participants typically have had some sort of schooling, which helps shape the background knowledge they bring with them. If what I am hearing from my teacher friends is true, and if the recently retired educator is right, then the experiences students are getting in formal schools are largely focused on standardized testing. UGH! To my mind this is very limiting: it shuts down active conversation between teacher and students and imposes a timeline of pre-planned topics that removes the free flow of ideas.

How can we as educators and researchers in the free-choice arena use this information when planning, and when trying to implement change within the overall educational system? Do we still use any form of testing within our field? How is that testing different from the standardized tests given in the formal setting? Food for thought, and hopefully future conversations.

In light of the recent posts discussing positivism vs. interpretivism, the grounded theory approach, and the challenge of thinking about epistemology and ontology, I decided to use this post to continue the debate and share a few things I have been thinking about and doing that I hope will help me make sense of the paradigmatic views and theoretical approaches that may eventually be a part of my research.

Research design has been a challenging but very meaningful process for me, because I am getting the chance to dig deep into who I am and the personal values, beliefs, and goals I carry with me. To start that reflection I went back to writing exercises, a topic I remember from the first lab meetings I participated in as a member of the group, and one that inspired me to find ways to apply different kinds of exercises to research design. As a result of that, and of the ongoing advanced qualitative class I am taking right now, the folder on my computer entitled "Memos" is growing very quickly as I go through the process of writing my proposal and thinking about my research design.

I am using many forms of memos. I got myself a research "journal" where I record the "brilliant" ideas I come across one way or another during this process – not only ideas for research goals, methods, questions, etc., but also epiphanies about concepts and theories and how I am making sense of them as they apply to my research. I carry it with me everywhere I go because, believe me, ideas pop up unexpectedly in very strange situations. The goal is not to lose track of my thought process as it evolves into a conceptual framework for my research. To say it bluntly, I want to be able to explain clearly why I chose the approach I chose for my design and how I justify it.

To start this search for clarity about where in the world of qualitative research I sit, which I assumed would inform my methodological choices, I wrote my first memo as a class exercise – a "Researcher Identity Memo." It may sound very "elementary" to some of you, but I saw this exercise as opening the door to my own path: understanding why I sit where I sit right now, how I came to be here, and where I can potentially go. The memo was a reflective exercise about past experiences, upbringing, values, and beliefs that I see as connected to the topic I chose to investigate, and about how I predict they will facilitate or challenge my work as a researcher. It turned out to be a six-page document that brought out three personas in me, each of which influenced my decisions equally: the educator, the scientist, and a concerned citizen of this world. The synergies between the values, beliefs, experiences, goals, and interests of each got me to decide on my research topic (family "affordances" for learning at the touch-tank exhibit at HMSC).

This actually made me rethink my research goals to identify personal, practical, and intellectual interests as they combine to answer the "so what?" of my research idea. In fact, "the evolution of my research questions" is another ongoing memo I am working on as my questions emerge, evolve, and change. I also have a mini notebook on a key chain attached to my wallet for when those revealing moments happen as I have dialogues with other professionals like yourselves, or when I want to write a quick reference to look at later. I think the practice of writing these memos is helping me untangle bits of theoretical debates that I am slowly making sense of, and that are helping me see where I sit.

Now, if you are not a fan of writing, if you avoid writing exercises like the plague, Laura suggests using alternative ways of recording these moments. She told me she used her phone to record a voice memo the other day. How you do it is not the key issue, but I think it is important that you find a way that works for you to register the evolution of your thought process. Over a few conversations during our weekly meetings, Shawn articulated the approach he thinks I am sitting in right now for my research. He burst out these big words, which I am still trying to work through but which emerged smoothly and almost instantly out of his mind. He called it a "Neo-Kantian Post-positivist and Probabilistic Theory of Truth". I hope he wasn't tricking me :). Here is the way I see where I stand right now, in my less eloquent philosophical terms:

1. Departing from axiological views, I am interested in explanations and descriptions of real, meaningful events – why and how questions.

2. Therefore, I am moving from "data to theory" through inductive questioning.

3. As for the nature of reality (ontology), I think I compromise somewhere between objectivity and subjectivity – is there a possible inter-objectivity or inter-subjectivity?

4. As for what counts as reality (epistemology), I tend to associate with Social Constructivism.

So, I am using the following schema as a wall decoration in my research room:

Epistemology – Social Constructivism; Theoretical Perspective/Approach – Interpretivism; Suited Methodology – Grounded Theory.

However, I see myself as open to new topics and ideas. Adopting a paradigm does not necessarily mean that I will completely oppose combining aspects of other paradigms. I read in a piece of literature once that "sometimes we need a little constructivism, and sometimes we need a little realism". While I resist thinking about it radically, I do think it is important to use existing theories critically, and if you are to be critical you have to be open to testing them (hermeneutics). Here is where I sit in conflict between objectivity and subjectivity, qualitative and quantitative values, and that is why I intend to use mixed methods.

I don't know if this links perfectly to the definition of the approach Shawn saw me taking, but boy, I am happy to be going through this discovery process right now, and memos are really helping me along the way.

Susan

When you have a new idea in a field as steeped in tradition as science or education, how can you, as a newcomer, encourage discussion, at the very least, while still presenting yourself as a professional member of your new field? This was at the heart of some discussion that came up this weekend after Shawn and I presented his "Better Presentations" workshop. The HMSC graduate student organization, HsO, was hosting the annual exchange with the University of Oregon's Oregon Institute of Marine Biology grad students, who work at the UO satellite campus in Charleston, Oregon, a ways south on the coast from Newport.

The heart of Shawn's presentation is built around learning research that suggests better ways to build the visuals that accompany a professional presentation. For most of the audience, that meant slides or posters for scientific research talks at conferences, as part of proposal defenses, or just within one's own research group. Shawn suggests ways to break out of what has become a pretty standard default: slides crowded with bullet points, figures that are at best illegible and at worst incomprehensible, and in general too much content crammed onto single slides and into the overall presentation.

The students were eager to hear about the research foundations of his suggestions, but then raised a concern: how far could they go in pushing the envelope without jeopardizing their entry into the field? That is, if they used a Prezi instead of a PowerPoint, would they be dismissed as using a stunt and their research work overlooked, perhaps in front of influential members of their discipline? Or, if they don’t put every step of their methodology on their poster and a potential employer comes by when they aren’t there, how will that employer know how innovative their work is?

Personally, my reaction was to think: do you want to work with these people if that’s their stance? However, I’m in the enviable position of having seen my results work – I have a job offer that really values the sort of maverick thinking (at least to some traditional science educators) that our free-choice/informal approach offers. In retrospect, that’s how I view the lack of response I got from numerous other places I applied to – I wouldn’t have wanted to work with them anyway if they didn’t value what I could bring to the table. I might have thought quite differently if I were still searching for a position at this point.

For the grad student, especially, it struck me that it’s a tough row to hoe. On the one hand, you’re new to the field, eager, and probably brimming with new ideas. On the other, you have to carefully fit those ideas into the traditional structure in order to secure funding and professional advancement. However, how do you compromise without compromising too far and losing that part of you which, as a researcher, tells you to look at the research for guidance?

It occurred to me that I will have to deal with this as I go into my new position which relies on grant funding after the first year. I am thinking about what my research agenda will be, ideally, and how I may or may not have to bend that based on what funding is available. One of my main sources of funding will likely be through helping scientists do their broader impacts and outreach projects, and building my research into those. How able I am to pick and choose projects to fit my agenda as well as theirs remains to be seen, but this conversation brought me around to thinking about that reality.

As Shawn emphasized at the beginning of the talk, the best outreach (and honestly, probably the best project in any discipline, be it science, business, or government assistance) is designed with the goals and outcomes in mind first, with the tools and manner of achieving those goals picked only afterwards. We sometimes lament the amazing number of very traditional outreach programs that center around a classroom visit, for example, and wonder if we can ever convince the scientists we partner with that there are new, research-based ways of doing things (see Laura's post on the problems some of our potential partners have with our ways of doing research). I will be fortunate indeed if I find partners for funding who believe the same, or who are at least willing to listen to what may be a new idea about outreach.

While taking a short break during some late-night studying in the library last night, Susan and I got talking about the difficulties in our field of figuring out a theoretical framework for studies. Susan has just begun a new qualitative methodologies class in the Sociology department here at OSU and was showing me some cool insights she had gained from the class about the differences between a positivist and an interpretivist framework for social science studies. She was describing where she felt she belonged, and why it seemed appropriate to take a mixed-methods approach to her upcoming research on family learning in museums.

Here’s a nice little factsheet I found highlighting the differences.

What we realized was that, as graduate students, we had had very few conversations with other researchers about their theoretical and methodological approaches and why they had taken such paths in their academic careers. We were aware of the theoretical and methodological choices we're presented with, but realized that we had somehow avoided those conversations, perhaps for fear of conflict.

I position myself as an interpretivist and a qualitative researcher, and like many others I have had to endure, at one point or another, a skeptical and frowning positivist while explaining my research; I'm sure the reverse case is not uncommon. It got me thinking that, much as we avoid confrontation in education around climate change and evolution, the positivism vs. interpretivism conversation has become just as controversial.

But why? Isn't this discussion simply about the choices we make as researchers? I argue that conversations around our choices as upcoming academics are as important to our research development as the time we spend writing lit reviews, and we must not avoid them simply to escape having to defend our choices outside of our theses. Understanding the choices our peers make is part of the process of understanding our field as a whole, and it allows us to think outside the bubble of our research norms for future work, which of course drives innovation.

As a set of tools for learning research applicable to a variety of theoretical and methodological approaches, the free-choice learning lab has some interesting opportunities in the future for scholars to interact with those outside their "bubble". I'm looking forward to finding out what kind of impact that has on the research that takes place within it.

I’ve been thinking a lot about grounded theory for my dissertation writing lately, and its role in the development of strategies for qualitative coding, so I thought I’d share some resources I found.

Grounded theory, which emerged from the work of Glaser and Strauss, is an approach I took in my work because it allows you to discover theory from data in order to generate hypotheses, rather than test them. I have found it a useful approach in a realm of study where past literature is scarce and more theory is required, and also because it appeals to my belief that free-choice learning research needs more base-level groundwork in some areas regarding teaching and practice.

In terms of qualitative coding, grounded theory has helped frame the constant comparative method. Here, you code descriptive data (i.e., interviews, videos) by the themes that emerge from the data, either describing those themes using the language found in the data itself ("in vivo") or using terminology the researcher applies based on the conceptual framework of the study. In essence, you constantly compare the emerging themes to one another to build larger groups of themes that help you make a claim about the data. I've used this method for data analysis several times, and although it is long-winded, I find it amazingly useful for getting to know your data from the inside out.
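
If it helps to see the bookkeeping behind that comparison made concrete, here is a minimal, purely illustrative Python sketch; the excerpts, codes, and themes are invented, and the real comparative work is researcher judgment, not a script.

```python
# A minimal sketch (hypothetical, not any specific QDA tool) of the bookkeeping
# behind constant comparison: excerpts are grouped under the codes assigned to
# them, and related codes are then compared and grouped into broader themes.
from collections import defaultdict

# Invented excerpts paired with the codes a researcher might assign
# ("in vivo" or conceptual).
coded_excerpts = [
    ("We always come here after the tide pools", "family routine"),
    ("Dad lifted her up so she could reach the urchin", "adult scaffolding"),
    ("We always stop at the octopus tank first", "family routine"),
    ("Mom asked what the sea star eats", "adult questioning"),
]

# First pass: group excerpts under each code.
by_code = defaultdict(list)
for excerpt, code in coded_excerpts:
    by_code[code].append(excerpt)

# Second pass: group related codes into candidate themes. In practice this
# comparison is the researcher's judgment, not string matching.
themes = {
    "parental facilitation": ["adult scaffolding", "adult questioning"],
    "visit rituals": ["family routine"],
}

for theme, codes in themes.items():
    count = sum(len(by_code[code]) for code in codes)
    print(f"{theme}: {count} excerpts across {codes}")
```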

Here is a cute video to help you visualize the constant comparative method: http://goo.gl/RReqN

So, in terms of resources for grounded theory and qualitative coding, I've found some very helpful videos by Graham Gibbs, a researcher from the University of Huddersfield in the UK who specializes in social research and evaluation. He has a YouTube channel of his lectures, which he created as part of the courses he teaches, but also because he believes in providing outreach for other social researchers. Find it at http://goo.gl/T2VhS

What I like about these videos is the clarity of the conversation around the strategies and approaches behind qualitative coding, which I think is incredibly useful for helping to better understand not only how to code, but also the different qualitative approaches that are out there for researchers. In the past, I had been confused by the literature in terms of how specifically to apply grounded theory and coding in practice, but I think these videos really help put the literature together in a meaningful way. They're also a good example of social science outreach, and of helping others understand the mechanics of qualitative research.

I have been coding my qualitative interview data in one fell swoop, trying to get everything done for the graduation deadline. It feels almost like a class project that I've put off, as usual, longer than I should have. In a conversation with another grad student about timelines, and about how I've been sitting on this data (at least a good chunk of it) since, oh, November or so, we speculated about why we don't tackle it in smaller chunks. One reason for me, I'm sure, is just general fear of failure, or whatever drives my general procrastinating and perfectionist tendencies (remember, the best dissertation is a DONE dissertation – we're not here to save the world with this one project).

However, another reason occurs to me as well; I collected all the data myself and I wonder if I was too close to it in the process of collecting it? I certainly had to prioritize finishing collecting it, considering the struggles I had to get subjects to participate, and delays with IRB, etc. But I wonder if it’s actually been better to leave it all for a while and come back to it. I guess if I had really done the interview coding before the eye-tracking, I might have shaped the eye-tracking interviews a bit differently, but I think the main adjustments I made based on the interviews were sufficient without coding (i.e. I recognized how much the experts were just seeing that the images were all the same and I couldn’t come up with difficult enough tasks for them, really). The other reason to have coded the interviews first would have been to separate my interviewees into high- and low-performing, if the data proved to be that way, so that I could invite sub-groups for the eye-tracking. But I ended up, again due to recruitment issues, just getting whoever I could from my interview population to come back. And now, I’m not really sure there’s any high- or low-performers among the novices anyway – they each seem to have their strengths and weaknesses at this task.

Other fun with coding: I have a mix of basically closed-ended questions that I am scoring with a rubric for correctness, and then open-ended "how do you know" semi-clinical interview questions. Since I eventually repeated some of these questions for the various versions of the scaffolded images, my subjects started to conflate their answers, and parsing those apart is truly a pleasure (NOT). Also, I'm up to some 120 codes, and keeping those all in mind as I go is just nuts. Of course, I have only done the first pass, and since I created codes as I went, I have to turn around and re-code the earlier transcripts for the codes that didn't exist yet when I coded them. I am still stressing about whether I'm finding everything in every transcript, especially the more obscure codes. I have one that I've dubbed "Santa" because two of my subjects referred to knowing the poles of Earth are cold because they learned that Santa lives at the North Pole where it's cold. So I'm now wondering if there were any other evidences of non-science reasoning that I missed. I don't think this is a huge problem; I am fairly confident my coding is thorough, but I'm also at that stage of crisis where I'm not sure any of this is good enough as I draw closer to my defense!
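
For anyone who likes to script their bookkeeping, here is a minimal, purely illustrative Python sketch of that re-pass problem; the transcript IDs, codes, and positions are invented, not my actual codebook.

```python
# A hypothetical bookkeeping sketch for the re-coding problem described above:
# codes created partway through the first pass mean every transcript coded
# before that point needs a second look for those codes.

first_pass_order = ["T01", "T02", "T03", "T04", "T05", "T06"]

# Position (1-based, within first_pass_order) at which each code was first created.
code_created_at = {
    "uses color scale": 2,
    "prior schooling": 4,
    "Santa": 5,  # the non-science reasoning code mentioned above
}

# A transcript coded before a code existed needs a re-pass for that code.
for position, transcript in enumerate(first_pass_order, start=1):
    missing = [code for code, created in code_created_at.items() if position < created]
    if missing:
        print(f"{transcript}: re-check for {missing}")
```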

Other fun facts: I also find myself agonizing over what to call codes, when the description is more important. And it’s also a very humbling look at how badly I (feel like I) conducted the interviews. For one thing, I asked all the wrong questions, as it turns out – what I expected people would struggle with, they didn’t really, and I didn’t have good questions ready to probe for what they did struggle with. Sigh. I guess that’s for the next experiment.

The good stuff: I do have a lot of good data about people's expectations of the images and the topics, especially when there are misunderstandings. This will be important as we design new products for outreach, both the images themselves and the supporting info that must go alongside them. I also sort of thought I knew a lot about this data going into the coding, but the number of new codes with each subject is surprising, and it's gratifying that maybe I did get some information out of this task after all. Finally, I'm learning that this is an exercise in throwing stuff out, too – I was overly ambitious in my proposal about all the questions I could answer, and I collected a lot more data than I can use at the moment. So, as is typical of the research process, I have to choose what fits the story I need to tell to get the dissertation (or paper, or presentation) done for the moment, and leave the rest aside for now. That's what all those papers post-dissertation are for, I guess!

What are your adventures with/fears about coding or data analysis? (besides putting it off to the last minute, which I don’t recommend).