About my research

My research was set in the context of the European Commission’s eTwinning initiative and it looked specifically at the use of eTwinning Learning Events (non-formal learning). It examined how the community influences the development of teachers’ competence in online collaboration and discourse, and it considered the contribution of social aspects and online moderation.

I am very grateful to my supervisor, Dr. Julie-Ann Sime from Lancaster University, and to my eTwinning soulmate, Tiina Sarisalmi, for their invaluable support. I am also grateful to my examiners, Prof. Marilyn Leask from the University of Bedfordshire and Dr. Don Passey from Lancaster University, for their valuable advice.
Keywords: online learning communities; community of inquiry; online collaboration; content analysis; social presence; social ties; teacher training

Friday 24 December 2010

Teachers' expectations

I've started to analyse the data from the Learning Event (LE), looking firstly at the transcripts of the interviews that I held with some participants before they started. I held 8 interviews over Skype and 43 by email, asking the same questions in all cases. The first question was "What are your expectations for this event?" and, as an experiment, I've coded the replies using Atlas.ti.


The image above shows a section of a reply to the first question from a participant, submitted by email. On the right you can see the codes that I've applied, which on this occasion are Web 2.0 and tools and Digital skills and competence - you will probably need to click on the image to expand it and read the information. The other codes you can see, for example Channel, were added automatically by Atlas.ti as I imported the data and represent the headings for each chunk of data - in this case, whether the information was submitted by email or by Skype.

The overall coding results from the 51 participants are as follows:


In other words, there were 25 instances of participants including reference to the use of Web 2.0 and its associated tools in their answers to the question "What are your expectations for this event?". Similarly, 20 people made reference to using these tools in their teaching practice. Some of the codes could possibly be combined, for example Cooperation and collaboration could be joined with eTwinning project, as one usually takes place within the context of the other. This is easily achieved with the tool.
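As a rough illustration of the tally behind such a table (the codes and participant IDs below are made up for the example, not the actual Atlas.ti output), the counting step amounts to something like:

```python
from collections import Counter

# Hypothetical coded segments: one (participant_id, code) pair per coded
# quotation, standing in for an export of the real coding.
coded_segments = [
    (1, "Web 2.0 and tools"),
    (1, "Digital skills and competence"),
    (2, "Web 2.0 and tools"),
    (3, "Use in teaching practice"),
    (3, "Web 2.0 and tools"),
    (4, "Cooperation and collaboration"),
]

# Count distinct participants per code, so a participant repeating a
# theme in the same reply is still only counted once.
per_code = Counter(code for pid, code in set(coded_segments))
for code, n in per_code.most_common():
    print(f"{code}: {n}")
```

The deduplication via `set` matters: without it, the figures would count coded quotations rather than people, which is a different claim.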

The results in themselves are not too surprising. They reflect a full range of expectations, with a focus mainly on learning specific web 2.0 tools and gaining experience of using them in teaching practice. What will perhaps be more interesting is to compare these with the results obtained from the final interviews, where the same questions were asked.

This experience of using content analysis techniques and Computer-Assisted Qualitative Data Analysis Software (CAQDAS) was interesting but also raised a number of concerns for me that are to some extent echoed by Enriquez (2009). Analysing dialogue using coding schemes tends to focus your attention on the detail, the individual words, perhaps at the expense of seeing a bigger picture. Indeed, in an attempt to keep one's coding scheme limited, in order to achieve parsimony, one is encouraged to group different expressions together under the same heading. Yet they may hide important latent differences in meaning that would add depth to the analysis if they were surfaced rather than suppressed. I also note that the very process of coding using a CAQDAS leads you towards quantitative results - for example, in the case above, it is convenient to report that only 20/51 respondents (i.e. ~40%) referred to learning how to use web 2.0 tools in their teaching practice as an explicit expectation, but what does this result actually mean?

Enriquez (2009) raises discontent with the prevailing use of content analysis in online discourse, suggesting that the written word only reflects part of the context for knowledge production. In order to have a fuller picture, one should take into account the external environment for the discussion (the situation of the learner), the internal environment (text in chats is of a different nature to that in forums), the temporal structure (asynchronous is different to synchronous), the purpose of the discussion (topic or project related, for example) and the characteristics of the members of the group (experience of online collaboration, English skills, etc.), to mention a few. In order to do this, she proposes the use of genres as an alternative to content analysis. While I find her arguments compelling, I do not fully understand how genres could be applied in practice (she refers us to other papers for examples of application). I would also say that I will be comparing two very similar situations, the LE held earlier this year and the recent one, where several of these variables will be largely stable. I therefore feel that it should indeed be useful to analyse the content, though my recent experimentation has highlighted to me the dangers of getting too bogged down in the coding, and of using a tool that makes simplifications so easy to implement.

Brian.

Enriquez, J. G. (2009) 'Discontent with content analysis of online transcripts', ALT-J: Research in Learning Technology, 17(2), pp. 101-113.

Wednesday 22 December 2010

More than just a Learning Event

I've recently been interviewing, via Skype, teachers who took part in the Learning Event (LE). In particular, those whom I interviewed before they started and who managed to complete the event and contribute to the final reflection.

A few things have struck me from these discussions. Firstly, the enthusiasm they convey about their experience. For several of them, this was more than just an LE addressing web 2.0 tools; it was a life-changing experience that has opened up new avenues in their teaching practice. I've heard from teachers who used such tools for the first time, yet were able to try them out with fellow teachers in their schools and saw a marvellous reaction from their pupils. For them, this is the start of a new adventure. It is gratifying to see how happy the teachers are when they can provide something new for their pupils, something that engages them and increases their personal kudos as teachers.

Secondly, it is interesting to see the extent to which their initial expectations were met. Many expressed their original goals in terms of learning about new web 2.0 tools. They learned about these, but more importantly they also learned how to use them in their teaching practice, they shared concrete examples with their peers and they developed personally in terms of their own competence.

So everything sounds perfect? Well, not exactly. The experience of these pioneering few is not necessarily representative of the majority. As I said in my previous post, this transformation generally only happened for those who managed to complete the course and invested time and effort in the activities. The others still learned, but perhaps not at quite such a deep level; they learned about the individual tools but not necessarily about how to use them in their teaching practice, and the lack of collaboration, of 'learning by doing', meant that they didn't develop their own competence to quite the same degree.

So I've also been asking those teachers who didn't complete why they thought this was. Their answers reflect a complex picture: busy teachers with difficult personal schedules, who had insufficient time to invest in the time-consuming collaboration; teachers so new to online collaboration that they simply felt left behind by the experience (to the extent that the LE could have a negative impact on their motivation); and some who simply did not expect this type of LE, having anticipated being more autonomous, independent learners, following the LE at their own pace.

I also noted that for some teachers the learning philosophy of reflection-in-practice is not something they have previously experienced. Indeed, their own teaching style is more instructional, and they in turn expected the tutors/facilitators to be more instrumental in summarising the learning outcomes of the group.

This run of the LE has, I feel, managed to achieve a level of collaboration that was missing, or was less evident, in the previous run earlier this spring. As such, needs have arisen that we had not anticipated. These included, at a certain point, a need for Tiina and me to raise the issue of netiquette and awareness of what might be considered inappropriate behaviour in an online community. On reflection, I feel there was a need for the small groups that we had set up (the Round Tables) to openly discuss and agree what they expected from each other in terms of contribution, timing, etc. (this may have helped address the legitimate concerns raised by Daniela in her comment on the way the groups were established). As it was, the absence of such an agreement led to some groups experiencing frustration - examples from my analysis include:
- some participants contributed whilst others didn't, leading to a sense of inequality or even resentment for the effort invested;
- several natural leaders emerging within a single group who (in retrospect) might have been rather dominant in their approach to setting up blogs, Google docs, etc as places for the group to collaborate;
- perhaps unsympathetic replies (or certainly less supportive messages) to peers who arrived late in the group to find ideas had already been "decided", etc.

Hindsight is a wonderful thing.

Certainly my feeling is that for every innovation we employ in learning, important disadvantages may emerge. These may be uncomfortable power shifts within the group; different starting levels among participants (experienced collaborators alongside inexperienced novices), leading to unequal opportunities for growth and feelings of inadequacy; or reinforced teaching presence (such as clearer guidelines) for some, leading to the loss of a valuable learning experience for others (who might have learned more through initial failure). So each new innovation leads to a rebalancing of the pros and cons, and to the need for us to reconsider our teaching practice. Nothing can be taken for granted.

Food for thought, and there is certainly plenty of fodder for me in the data from this experience!

Brian.

Saturday 11 December 2010

What an experience

The revised Learning Event (LE) on web 2.0 tools and collaboration finished recently and what an experience it was. We tried out some of the ideas that emerged from the first LE earlier in the year and I participated as a facilitator in the Staff Room and in the final reflection. I was again impressed by the level of enthusiasm and commitment of the teachers involved.

It will take me several months to analyse all the data - and boy is there lots. So what can I say from what I see so far? Well, I think we can consider it a success as far as the participants were concerned. The final questionnaire conducted by European Schoolnet shows 66% indicating that the event was excellent and 30% very good (n=127). There was a terrific response to my final questionnaire, with 87 replies - that is, 58% of those who started the event. Here is a summary that I added to my presentation at Online Educa:

I need now to follow this up with more interviews and with an analysis of the discourse in the forums. However, from what I have seen so far, it appears that those teachers who persevered until the end - trying out what they had learned in their own teaching practice and then sharing their experience with their peers in the final reflection - learned not only about the tools but also about how to apply them in teaching, and about the consequences for their own professional development. Whereas those who went no further than the first 12 days of activity tended to learn only about the tools. If this can be confirmed in my analysis, it will be a significant result, as this is precisely what we were aiming to improve.

39% of the teachers who started the LE completed the final activities, that is 59% of those who were still active after the first 12 days. This is a really good result; I was talking to a friend of mine who delivers face-to-face training courses for HR professionals in the UK, and she remarked that it is always a challenge to convince participants to come back to a course after a period away. Indeed, given that these figures only reflect postings to the forums (contributions), the number actually involved will have been higher, as I am sure there were some who read the postings and benefited from the experience of others without posting themselves (lurkers). Such vicarious learning is surely valuable.
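As a quick sanity check on how these percentages fit together (an illustration only; the absolute numbers below are inferred from the figures quoted in these posts, not taken from the actual dataset):

```python
# "87 replies, that is 58% of those that started" implies roughly how
# many participants started the LE; the other percentages then give
# approximate headcounts for completion and for activity after day 12.
replies = 87
reply_rate = 0.58
starters = round(replies / reply_rate)  # ≈ 150 participants started

completed = round(starters * 0.39)              # 39% completed the final activities
active_after_12_days = round(completed / 0.59)  # completers were 59% of these

print(starters, completed, active_after_12_days)
```

One consequence worth noting: these figures imply that roughly two thirds of starters were still active after the first 12 days, so most of the attrition happened early rather than at the final, collaborative stage.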

The quantitative results are useful in terms of offering immediate feedback. However, my research is primarily qualitative in nature, and so I must now press on with the time-consuming task of working through the interview transcripts and forum dialogues, coding and analysing. Onwards we go ...

Brian

Friday 10 December 2010

Busy, busy, busy

I realise it has been a while since I posted a message; I have been so busy and am only now finding the time.

I've just been visiting the campus at Lancaster University and I was struck again by the positive feeling one gets from being there. The intellectual discussions in the bars, the students with their heads in books and the serendipitous meetings with interesting people. I was able to have a very useful meeting with my supervisor Julie-Ann and a discussion with Maria, another tutor on the course. Both chats helped me to refocus my thoughts. It was also great to meet John, a fellow student, and to exchange references, ideas and tips. One of the reasons for my visiting the campus was to attend a short course on Atlas.ti. It was really useful as a reminder of what the tool can offer and how to take advantage of its powerful functionality. I am now keen to get on and use the tool to help analyse my data.

I recently gave a presentation at a workshop (PED74) at Online Educa. It was good to give a public airing to my work. There were several teachers in the audience and I saw reassuring nods of approval as I spoke. A very useful and rewarding experience.

Incidentally, for my full-time job I participated in a couple of workshops on assessing learning in a digital world (AP18 & AP33). The first involved an insightful discussion on the need for a change in assessment approaches for online learning in a web 2.0 environment. After my opening presentation there was one offered by Thomas Ryberg (presented by me, as he was unfortunately stuck in snow in Denmark) and an intervention by Kiran Trehan. Both did an excellent job of highlighting some of the challenges associated with this new way of learning. Thomas explained how learning with web 2.0 implies much more than just a new environment; it means a change in culture towards participative, active learning, involving such possibilities as contributing to the design of the learning and the definition of the assessment criteria. Kiran reminded us of the expectations of online learning in communities and, by referring to some concrete examples from a course run at Lancaster, was able to highlight some of the darker elements associated with power, inequality and the ubiquitous search for consensus. The second session introduced some relevant EU-funded projects under the Lifelong Learning Programme that are faced with these challenges and are looking at practical ways forward. This was the first time we had brought academics together with practitioners, and it really worked. The presentations should appear on our Agency's web site in the near future and I will add a link here when they do.

Last but not least, I've been very busy following and facilitating the revised Learning Event with Tiina, but this warrants a separate posting, so I shall stop here for now.

Brian