Monday, April 21, 2014

Cult of Competition?

Barry, Britten, Barber, Bradley and Stevenson (1999) mention several times the tradition of the individual researcher and the culture of competition among graduate students. While I do see some competition to progress in the program (and for AI-ships and jobs), it has been my fortunate experience to work in a very supportive and collaborative program where academic progression and research are concerned. Professors tend to encourage working on early or individual projects as small groups. With this said, there is very little support for how this early collaborative work should be structured. Without that structure and support, it can be difficult for graduate students to initiate co-research projects.

I was fortunate to work collaboratively with my adviser and a fellow grad student on a small research project. We were only involved in the transcription, data analysis, and writing portions of the project, but it was a great introduction to the qualitative research process.

As such, my experience with collaborative work and coding reflects Sin's (2008) discussion. We worked to establish an a priori coding schema. This was done through early meetings and an initial reading of the transcripts to locate and discuss emergent themes. From there we established the main schema that we used throughout the project. Our adviser walked us through the process of establishing the initial coding schema (or trees) and ensured we worked to establish and maintain intercoder reliability throughout the research process. In retrospect, and understanding more about the structure of NVivo, this makes perfect sense. NVivo would not have supported a more iterative process, and since we were working collaboratively we needed to be on similar footing or the findings might not have made much sense.
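(For anyone curious about the mechanics of intercoder reliability, here is a minimal sketch of how two coders' decisions on the same excerpts could be compared using percent agreement and Cohen's kappa. This is not what our adviser actually had us run; the codes and values below are made up purely for illustration.)

```python
# Minimal sketch (hypothetical data): comparing two coders' code
# assignments for the same set of transcript excerpts.
from collections import Counter

coder_a = ["support", "support", "barrier", "identity", "barrier", "support"]
coder_b = ["support", "barrier", "barrier", "identity", "barrier", "identity"]

# Percent agreement: how often the two coders chose the same code.
agreement = sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)

# Cohen's kappa: agreement corrected for chance, based on each coder's
# marginal code frequencies.
n = len(coder_a)
freq_a = Counter(coder_a)
freq_b = Counter(coder_b)
expected = sum(freq_a[c] * freq_b[c] for c in set(coder_a) | set(coder_b)) / (n * n)
kappa = (agreement - expected) / (1 - expected)

print(f"Percent agreement: {agreement:.2f}")
print(f"Cohen's kappa: {kappa:.2f}")
```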


Learning how to establish codes and how to interpret the findings was a valuable experience as we worked together to write up the findings. However, we did not make room for reflexivity practices. I think it was assumed my graduate partner and I would simply follow in the epistemological and methodological footsteps of our adviser. Also, we really just divided and conquered the writing process; there wasn't much collaboration in that part, which I would like to practice more, as I'm particularly interested in collaborative practices. (I tend to find the individual process overwhelming and find myself far more productive when I'm held accountable by the expectations of others.)

Sunday, April 20, 2014

Drops Microphone...Walks Out

Dr. Paulus’s discussion really helped to tie a lot of things together for me, including how important the right CAQDAS package can be for a researcher’s work. One of the themes I’ve picked up on this semester is the lack of support, and even access, young researchers have when it comes to CAQDAS packages. Many of our readings and discussions have focused on this. So I was not shocked to hear Dr. Paulus comment on the double standard in the trend for universities to hold licenses for traditionally quantitative packages such as SPSS and SAS, but nothing for qualitative researchers. I was, however, (again) validated in my own concerns about that double standard.

The lack of training and support with coding and CAQDAS packages was highlighted for me twice this week. First, I met with a panel of reviewers hired to evaluate the Curriculum Studies program. One of the many questions related to the ways in which grad students are supported in the research process. One of the reviewers (from Wisconsin) was shocked when I mentioned the university-wide lack of access to qualitative research tools. I, in turn, was shocked at how many of the doc students present had no familiarity with any type of software for qualitative data!

Later in the week, during my proposal writing class, my professor was speaking about coding and had us briefly practice the initial stages of the coding process. (The class is taught by my adviser, so I was familiar with much of this.) He mentioned, only briefly, the possibility of using a program like NVivo, and several members of the class had questions. From here I hijacked the discussion and briefly shared some of the insights from our class. (Hopefully, I got it right and did the class justice!) This included the fact that several programs cost money and that Dedoose might be an inexpensive option. A lot of my classmates were not aware of video transcription software, even though it appeared several students planned to use video in their data collection.

What I’m trying to highlight is the intense need for faculty and university support when it comes to modern qualitative data analysis. If one of the top C&I programs in the nation is so far behind, I can only imagine what it is like at other schools. Imagine how the inclusion of courses and support for digital data analysis could enhance the overall transparency of the research process. The little I’ve learned through this class will prove invaluable as I begin my dissertation process, and I actually feel horrible that so many of my peers are completely unaware of the vast resources available.


This is my rant for the night, more to come I’m sure!

Monday, April 14, 2014

Short, Sweet, and Late

I was frustrated reading Taylor, Lewins and Gibbs’s (2005) discussion of the debates about CAQDAS packages. I found myself continually saying “but that was then” and wondering how the debate has advanced. The article is nearly a decade old, and many of the debates presented were over two decades old. I understand the need for historical framing, but I would hope these concerns have since been addressed, that researchers are far less critical, and that CAQDAS packages have worked their way into the mainstream of data analysis.

I did appreciate the recognition that CAQDAS packages allow for multiple coding schemas and allow the researcher to revisit the data, especially with large data sets. Collecting extremely large amounts of data (like those often gathered during the investigative portion of dissertation work) can be both beneficial and overwhelming. CAQDAS packages allow the data to “be reused and returned to with new analytic ideas and objectives.” This lets a researcher continue the data analysis beyond the initial publication. In doing this, I think the researcher and those interested in the area are actually given more in-depth exposure to the overall analysis, and they can see more than the initial interpretation of the data. Utilizing the same data set across several projects, with different perspectives and goals, makes the overall data analysis more thorough. This is an amazing affordance offered by CAQDAS packages that doesn’t surprise me, but that I hadn’t realized existed.


In trying to make up for this being short and late – here’s a Vic Pic! This might put his size in perspective.


Tuesday, April 8, 2014

Front Page of the Daily Prophet

First, I loved the introduction to the Education Policy Analysis Archives. The requirement of a video summary for published work is fantastic! What a great way to make the research accessible to a wider audience. I look forward to investigating the site further. Also, I had no idea TCR required something similar. It’s a good sign that digital tools and multiple means of representation are becoming more prevalent.
I’m really glad to be entering the discussion of digital tools for representing findings. I’ve been mulling over several things the past few weeks, kind of an itchy, scratchy idea in the back of my mind that I’m looking forward to finally sharing, as the readings have helped bring these ideas to the forefront of my thoughts.

I really began thinking about how powerful presenting a story via multiple media can actually be, and how difficult this is to do in a fluid manner. I’ve seen people attempt to incorporate media into presentations, but sometimes it feels piecemeal or clunky. While there is great promise for things like blogs, websites, video ethnographies, etc., I don’t think the traditional academic journal article will completely disappear, but it might become technologically enhanced (beyond a companion piece like the ones from EPAA or TCR). I am sure we will find ways to incorporate technology.

I stumbled across one of the most seamless examples of what this might look like in journalism, from a March 9th article in the New York Times Online by Dan Barry titled “The ‘Boys’ in the Bunkhouse.” At first glance it seems like a traditional piece of investigative journalism, complete with stunning images. However, the images didn’t seem quite right, and then I realized that many of them were moving (almost as if they were taken from the pages of the Daily Prophet). The Times had incorporated Vines throughout the article, and several of these images linked to more in-depth video clips. While this isn’t academic qualitative research, the presentation proved powerful and gut-wrenching, and the images and video only made it more so without pandering.

As journals migrate to online platforms, I believe this type of media integration will emerge and prove a powerful extension of the thick description associated with high-quality qualitative research. However, there are constraints, many related to participant protection. I also wonder about the new set of skills researchers will need to render video that, even when taken out of context, can capture the emotion or symbolism for the average reader/viewer.


But I can’t deny the importance of video analysis in both interpreting and presenting data. I was fortunate to see Barbara Rogoff speak last week about her work on “Learning through Pitching In.” I found her discussion interesting and her descriptions effective; however, when she showed examples of students learning and working together, I was struck by just how significant her observations were. The videos captured participant interaction with researchers in a way that neither the most detailed written descriptions via field notes nor an audio transcript would ever be capable of capturing. (The two most powerful examples of the significance of video and nonverbal communication begin around 44:50 and 1:02:00.)

Sunday, March 30, 2014

"Content Clouds as a Methodology"

I've used content clouds as an instructional methodology but never thought about their use as a research methodology. Cidell's (2010) notion is so simple, and so true to what I've asked my students to do in analyzing primary documents or research, that I have to ask: why did I never think to apply it to my own research?

When I taught M300, the multicultural education course here at IU, I spent some time on how social class might influence teachers' perceptions of their students. One of the most detailed, telling works on this subject is Jean Anyon's "Social Class and School Knowledge." The work is long, and sadly I didn't trust my undergraduates to fully engage with the entire article. So I got creative and divided the reading into segments based on the type of school presented in Anyon's detailed ethnography. With each reading I included a Wordle. We began class by examining unlabeled Wordles and identifying from them the goals, curriculum, and environment of each learning site. We then attempted to identify which Wordle would represent the school we wanted to attend. We shared and discussed the different components of the reading, relating her work back to our earlier analysis of the Wordles. This proved to be a particularly powerful experience, and I can't help but think that stripping the articles bare helped to emphasize the differences between schools. It proved to be an effective tool for analysis. (This worked much better than the time a colleague and I collaborated to create a series of lessons on the Clinton impeachment and decided to include a Wordle of the Starr Report.)


(Which school would you like to attend?)


Cidell provides two main cautions: about commonly used words and about how the word counts are relational (p. 521). I probably should have taken out words like "teacher" and "school," since these were prominent throughout each example above and distracted from the nuances of the Wordles. Analysis of the word cloud should also be detailed: looking at the nouns alone doesn't give much detail (with the exception of "book" or "textbook" in the Wordle on the right). For the examples above, I find it interesting to look at the different verbs present - or even the presence of verbs in each!

Cidell (2010) also suggests the possibility of using word clouds and tying the results not only to geographic locations but also to demographic information. I wonder how Anyon's article might have differed if she had been able to process her interviews and observations through a Wordle. If Anyon had a cloud, would it have enhanced her work? Or possibly diminished it by removing her from the data? Her work is so powerful, and much of that power is due to the thick description of her field sites and participants. I'd like to think that a cloud would have enhanced her description (if that is even possible!). Cidell implies that word clouds are tools, but I think it's important to again emphasize that tools should be an enhancement of the research process, not a substitute for it.

This whole thing has my head spinning with ideas! So, with my interest sparked, I decided to place part of the transcripts I'm currently working with into TagCrowd.com and see whether the trends I've identified in the interviews are picked up by the word cloud... Results to follow.
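(For anyone curious what that TagCrowd step looks like done locally, here is a rough sketch, assuming a plain-text transcript file and a handful of custom stopwords to handle the "teacher"/"school" problem Cidell warns about. The filename and the stopword list are placeholders, not my actual data.)

```python
# Rough sketch: counting the most frequent words in a transcript,
# with common/context words removed (cf. Cidell's caution above).
# "interview_transcript.txt" and the stopword list are hypothetical.
import re
from collections import Counter

CUSTOM_STOPWORDS = {
    "the", "a", "an", "and", "or", "of", "to", "in", "is", "it", "that",
    "i", "you", "we", "was", "were", "like", "um", "uh",
    "teacher", "school",  # context words that would dominate the cloud
}

with open("interview_transcript.txt", encoding="utf-8") as f:
    text = f.read().lower()

# Pull out word tokens and drop the stopwords before counting.
words = re.findall(r"[a-z']+", text)
counts = Counter(w for w in words if w not in CUSTOM_STOPWORDS)

# The top counts are what a tool like TagCrowd or Wordle scales word size by.
for word, count in counts.most_common(25):
    print(f"{word:15} {count}")
```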