I've started to analyse the data from the Learning Event (LE), looking first at the transcripts of the interviews that I held with some participants before they started. I held 8 interviews over Skype and 43 by email, asking the same questions in all cases. The first question was 'What are your expectations for this event?' and, as an experiment, I've coded the replies using Atlas.ti.
The image above shows a section of one participant's reply to the first question, submitted by email. On the right you can see the codes that I've applied, which in this case are 'Web 2.0 and tools' and 'Digital skills and competence' (you will probably need to click on the image to expand it and read the information). The other codes you can see, for example 'Channel', were added automatically by Atlas.ti as I imported the data and represent the headings for each chunk of data - in this case, whether the information was submitted by email or by Skype.
The overall coding results from the 51 participants are as follows:
In other words, there were 25 instances of participants referring to the use of Web 2.0 and its associated tools in their answers to the question 'What are your expectations for this event?'. Similarly, 20 people made reference to using these tools in their teaching practice. Some of the codes could probably be combined - for example, 'Cooperation and collaboration' could be merged with 'eTwinning project', as one is usually done within the context of the other. This is easily achieved with the tool.
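Merging two codes is essentially a relabelling of the coded segments before counting. As a minimal sketch of what Atlas.ti is doing behind the scenes - assuming, purely for illustration, that the coded replies could be exported as simple (respondent, code) pairs; this is not the actual export format:

```python
from collections import Counter

# Hypothetical export: one (respondent_id, code) pair per coded segment.
coded_segments = [
    (1, "Web 2.0 and tools"),
    (1, "Cooperation and collaboration"),
    (2, "eTwinning project"),
    (3, "Web 2.0 and tools"),
]

# Codes to merge: each key is relabelled to its value before counting.
merge_map = {"Cooperation and collaboration": "eTwinning project"}

merged = [(r, merge_map.get(code, code)) for r, code in coded_segments]

# Count respondents per code; set() ensures a respondent is counted
# at most once for each code.
counts = Counter(code for _, code in set(merged))
print(counts["eTwinning project"])  # → 2
```

The point of the sketch is simply that merging loses the distinction between the original labels: once combined, the separate counts can no longer be recovered, which is exactly the trade-off discussed below.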
The results in themselves are not too surprising. They reflect a full range of expectations, with a focus mainly on learning specific Web 2.0 tools and gaining experience of using them in teaching practice. What will perhaps be more interesting is to compare these with the results from the final interviews, where the same questions were asked.
This experience of using content analysis techniques and Computer-Assisted Qualitative Data Analysis Software (CAQDAS) was interesting, but it also raised a number of concerns for me that are to some extent echoed by Enriquez (2009). Analysing dialogue using coding schemes tends to focus your attention on the detail, the individual words, perhaps at the expense of seeing the bigger picture. Indeed, in an attempt to keep one's coding scheme limited, in order to achieve parsimony, one is encouraged to group different expressions together under the same heading. Yet these groupings may hide different latent meanings that would add depth to the analysis if they were surfaced rather than suppressed. I also note that the very process of coding using a CAQDAS leads you to quantitative results - for example, in the case above, it is convenient to report that only 20 of the 51 respondents (i.e. ~40%) referred to learning how to use Web 2.0 tools in their teaching practice as an explicit expectation - but what does this result actually mean?
Enriquez (2009) raises discontent with the prevailing use of content analysis in online discourse, suggesting that the written word reflects only part of the context for knowledge production. In order to have a fuller picture, one should take into account the external environment of the discussion (the situation of the learner), the internal environment (text in chats is of a different nature to that in forums), the temporal structure (asynchronous is different to synchronous), the purpose of the discussion (topic- or project-related, for example) and the characteristics of the members of the group (experience of online collaboration, English skills, etc.), to mention a few. In order to do this, she proposes the use of genres as an alternative to content analysis. While I find her arguments compelling, I do not fully understand how genres could be applied in practice (she refers us to other papers for examples of application). I would also note that I will be comparing two very similar situations, the LE held earlier this year and the recent one, where several of these variables will be largely stable. I therefore feel that it should indeed be useful to analyse the content, though my recent experimentation has highlighted to me the dangers of getting too bogged down in the coding, and of using a tool which makes simplifications so easy to implement.
Enriquez, J. G. (2009) 'Discontent with content analysis of online transcripts', ALT-J: Research in Learning Technology, 17(2), pp. 101-113.
About my research
My research was set in the context of the European Commission’s eTwinning initiative and it looked specifically at the use of eTwinning Learning Events (non-formal learning). It examined how the community influences the development of teachers’ competence in online collaboration and discourse, and it considered the contribution of social aspects and online moderation.
I am very grateful to my supervisor, Dr Julie-Ann Sime from Lancaster University, and to my eTwinning soulmate, Tiina Sarisalmi, for their invaluable support; and to my examiners, Prof. Marilyn Leask from the University of Bedfordshire and Dr Don Passey from Lancaster University, for their valuable advice.