Wednesday 29 February 2012

Everyone's a critic

Everyone's a critic, they say.

Until it comes to academic writing, that is. Many students fail to realise their full potential when it comes to essay writing, usually because they can't seem to find their way out of the descriptive cul-de-sac they make for themselves. If they could only find it within themselves to write critically, they would earn higher grades. So why do some find critical writing such a problem?

Firstly, knowing your field of study is an important factor in academic writing, and some students simply don't trawl deeply enough. If you know one theory, but are unaware that it has been challenged by another theory, you only have half the story, and you then find yourself on the periphery of the discourse. Knowing the weaknesses of a particular theory will only come from gaining an insight into how that theory came about, and understanding how it can be applied in particular contexts. So to be able to write critically, you need to have read around your subject - you need to have seen the 'big picture'.

Knowledge of your field is not enough, though. Critical thinking is crucial to the process. You can think without writing, but you cannot write without thinking. It follows that critical writing comes from critical thinking. If you can't think critically, you won't be able to write critically. Students need to learn to think in a particular mode to be able to do this. One of the top tips I can give to anyone who wishes to write critically is first to think critically about what they are reading, and to learn to ask questions of the text. It is a kind of conversation the reader has with the author. The best questions to ask are ones such as 'How does this writer justify what s/he is saying?' or 'What support does this writer have for their ideas?' You may like to dig deeper and find out how their evidence was obtained. Were the data they used obtained from a particular sample, and were they biased, or contrived in some way? Is the writer being totally objective, or is there some hidden agenda in there?

Academic writing has the capability to generate a great deal of angst. For example, students often get hung up on whether they should use the personal pronoun in their essays and projects. My view is that there is nothing wrong with it, provided the writer is not expressing their own unsupported opinion. Writing 'I reflected upon this experience and subsequently adjusted my professional practice...' is justifiable, but simply writing 'I believe that...' is not enough.

In his blog post 5 ways to develop critical thinking in ICT, Terry Freedman offers some great advice on how teachers can probe understanding by repeatedly asking 'why?', or 'how do you know that?' If students can do this during the process of writing up their assignments, many of their descriptive, lacklustre passages could be transformed into dynamic, critical, reflective and analytical pieces of writing. One aspect of marking assignments I find particularly unpalatable is when students churn out the same old, bland writing which merely reproduces what has been covered in the module, and not what they have learnt and critically applied to their practice.

Another pet hate I have is disjointed essay writing. Some students seem to think that they will impress the marker if they pepper their writing with copious direct quotations from the set reading lists. All they end up achieving is a series of unconnected quotations with no particular thread of reasoning running through them. Better by far is the art of paraphrasing key points from published authors and then applying these to support an argument that you are developing. Better still is the ability to counterpoise these paraphrased elements to form a finely balanced discussion that shows you have thought deeply about all perspectives associated with your argument, and can logically organise them. Whichever way you examine essay writing though, it all tends to come back to the ability to think critically.

Probably the best form of critical thinking emerges from dialogue within the community of practice. Carr and Kemmis (1997) highlighted the importance of dialectical thinking. Drawing on Hegel's dialectical philosophy, Carr and Kemmis propose that the tensions between opposing perspectives, where opponents take the stance of 'thesis' and 'antithesis', usually result in some kind of 'synthesis' of ideas. Although this can often be a compromise between the two opposing perspectives, more often than not it is also a merging of the strengths of both arguments to form an even stronger, newer thesis. The development of such world views is the basis of all critical learning, and requires the student to be open to new ideas, and open to being challenged in their own beliefs, values and thought processes.

Reference
Carr, W. and Kemmis, S. (1997) Becoming Critical: Education, knowledge and action research. London: Falmer Press.

Image by Enrique Sanabria


Creative Commons Licence
Everyone's a critic by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

Tuesday 28 February 2012

On Aer

I spent two great days in Ireland this weekend, and would like to thank my hosts at the Computer Education Society of Ireland for inviting me. It's a beautiful country, the culture is rich and the people are all so friendly. Ahead of my keynote speech at the CESI 2012 conference in Portlaoise, I managed to do an interview for Dublin City FM 103.2. The programme Inside Education, presented by Seán Delaney, is a regular radio programme and podcast offering an Irish perspective on news and stories from the world of education. We sat outside in the early spring sunshine of Portlaoise and discussed blogging, social media, the state of education, ideal schools of the future, innovation and technology, and a whole host of related topics.

I emphasised the importance of blogging as a means of teacher professional development and best uses of technology in education (social media, interactive whiteboards, VLEs, videoconferencing, iPads), and we discussed choice and adoption of new technologies in education. It was a wide ranging interview in which we also explored the use of Twitter as a rich communication backchannel and social networking media, discussed personal learning networks and communities of practice, and talked at length about the idea of classrooms without walls, BYOD and open educational resources. We also touched on youth culture, txt speak and digital literacies. The most important thing, I emphasised, is for teachers to consider the pedagogy, the potential learning gain and the student experience before they decide on the purchase of any technology.

Seán was a very good interviewer because he listened to what I had to say and then followed up my statements with useful questions that delved deeper into my ideas. The programme was broadcast on Sunday evening, and the podcast featuring the first part of the interview can be found here. The interview lasts approximately 20 minutes - listen carefully and you can actually hear the crows in the background! - do have a listen.

Image source

Creative Commons Licence
On Aer by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

Wednesday 22 February 2012

The Commons touch

Many people assume that because the web is open, any and all content is open for copying and reuse. It is not. Use some content and you could well be breaking copyright law. Many sites host copyrighted material, and many people are confused about what they can reuse or copy. My advice is this - assume that all content is copyrighted unless otherwise indicated.

In the last few years, the introduction of Creative Commons licensing has ensured that a lot of web based content is now open for reuse, repurposing and even commercial use. The Stanford University law professor Lawrence Lessig is one of the prime movers behind this initiative. Essentially, Creative Commons has established a set of licences that enables content creators to waive their right to receive any royalties or other payment for their work. Many are sharing their content for free, in the hope that if others find it useful, they will feel free to take it and use it. Creative Commons is a significant part of the Copyleft movement, which seeks to use aspects of international copyright law to offer the right to distribute copies and modified versions of a work for free, as long as it is attributed to the creator. Any subsequent reiterations of the work must also be made available under identical conditions. In keeping with similar open access agreements, Copyleft promotes four freedoms:

Freedom 0 – the freedom to use the work,
Freedom 1 – the freedom to study the work,
Freedom 2 – the freedom to copy and share the work with others,
Freedom 3 – the freedom to modify the work, and the freedom to distribute modified and therefore derivative works.

Finding free for use images on the web is now fairly easy. Normal search will unearth lots of images. But these are not necessarily free images. Many will have copyright restrictions. To find the free stuff go to Google and click on the cog icon at the top right of the screen. Select the Advanced Search option. Next, scroll down the screen until you find the drop down box labelled 'usage rights'. You will be presented with four options:

Free to use or share
Free to use or share, even commercially
Free to use, share or modify
Free to use, share or modify, even commercially

Whatever option you choose, you will be presented with a reduced collection of images that still meet the requirements of the search, but under the conditions of that specific licence. Now you have a collection of images you can use under the agreements of Creative Commons. Use them for free under these agreements and you are complying with international copyright law. Don't forget to attribute the source!
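For anyone curating a large local collection of images, the same kind of usage-rights filter can be expressed in a few lines of code. The sketch below is purely illustrative - the record format and the licence names are my own invented shorthand, not part of any real search API - but it captures the logic of the four options above: ND (no derivatives) licences rule out modification, and NC (non-commercial) licences rule out commercial use.

```python
# Illustrative sketch: filtering image records by Creative Commons licence.
# The licence codes and record format are invented for this example.

ALLOWS_COMMERCIAL = {"cc-by", "cc-by-sa"}
ALLOWS_MODIFICATION = {"cc-by", "cc-by-sa", "cc-by-nc", "cc-by-nc-sa"}

def filter_images(images, commercial=False, modify=False):
    """Return only the images whose licence permits the requested uses."""
    results = []
    for image in images:
        licence = image["licence"]
        if commercial and licence not in ALLOWS_COMMERCIAL:
            continue  # NC licences forbid commercial use
        if modify and licence not in ALLOWS_MODIFICATION:
            continue  # ND licences forbid derivative works
        results.append(image)
    return results

images = [
    {"title": "Harbour at dawn", "licence": "cc-by"},
    {"title": "Old town wall", "licence": "cc-by-nc-nd"},
    {"title": "Spring market", "licence": "cc-by-nc-sa"},
]

# 'Free to use, share or modify' excludes the ND-licensed image:
print([i["title"] for i in filter_images(images, modify=True)])
# → ['Harbour at dawn', 'Spring market']
```

Whatever the mechanism, the obligation to attribute the source remains in every case.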

So why would people wish to give away their content for nothing? I have previously written about my own personal and professional reasons for doing so in 'Giving it all away', but just for the record, I will summarise:

Giving away your content for free under a CC licence ensures that anyone who is interested in your work does not have to pay for it or worry about whether they are licensed under copyright law to use your content. In today's uncertain economic climate, it makes sense to be equitable and to give away content that others have a need to see and can make good use of. It also means that users will do some of your dissemination for you. Your ideas will spread farther if you give them away for free than they will if you ask people to pay a copyright fee or royalty. If you allow repurposing of your content, the rewards can be even greater. Some of my slideshows have been translated into other languages. Having your content translated into Spanish, for example, opens up a huge new audience not only in Spain, but also in most of the continent of South America. Many are now licensing their work under CC because they know it makes sense. Much of the content on Wikipedia, for example, is licensed under a Creative Commons Attribution-ShareAlike licence. So look out for Creative Commons licensing - it's going to be very big news indeed for all web users in the near future.

Image source


Creative Commons Licence
The Commons touch by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

Sunday 19 February 2012

Learning pathways

I recently heard a story about the building of a new university campus. Unusually, the architect hadn't designed any pedestrian paths into his plan. When asked why there were no pathways between the buildings, he replied cryptically that he was waiting to see what happened. Over a period of time, as students and staff walked between the buildings, they made their own tracks or 'desire lines' through the grass. Once these tracks had become established as the most natural and preferred routes, the architect ordered the builders in to pave over the tracks. 'Better they create their own pathways', he said, 'than for me to build them, and then for them not to be used'. Instead of imposing his own ideas onto the community, the architect had crowdsourced his design.

How often do we impose pathways upon students which do not meet their needs, or fit their expectations? How many times have we invested in technology, environments and curricula that are simply a waste of time and resources? The institutional learning platform - the VLE - is a classic case of decisions made about learning without consulting the learner. How can we reach a place in education where students find their own level and make their own pathways through learning?

Deleuze and Guattari's 1980 publication A Thousand Plateaus might offer us some clues. It was hailed by some as a masterpiece of post-modernist 'nomadic' writing. Others criticised it for its dense, pseudo-scientific prose. Whichever way you view this book however, it was notable for introducing rhizome theory as a metaphor for knowledge representation. According to Deleuze and Guattari, rhizomes are unlike any other kind of root system, having no beginning and no end. Rhizomes don't follow the rules of normal root systems, because they resist organisational structure and chronology, 'favouring a nomadic system of growth and propagation.' In plain English, the authors are attempting to describe the way ideas spread out naturally to occupy spaces like water finding its level. The rhizome is not linear, but planar they argue - and therefore can spread out in any and all directions, connecting with other systems as it goes. The same might be said about the way communities form, create their preferred ways of communication and decide their priorities.

Rhizome theory is also a useful framework for understanding self-determined learning - the heutagogy described by Hase and Kenyon. Hase and Kenyon contextualise heutagogy with reference to complexity theory, and suggest a number of characteristics including 'recognition of the emergent nature of learning' and 'the need for a living curriculum'. The self-determined pathway to learning is fast becoming familiar to learners in the digital age, and is also the antithesis to the formal, structured learning found in traditional education.

Dave Cormier - one of the foremost contributors to rhizomatic learning theory - takes this concept deeper into digital territory by equating rhizomatic learning to 'community as curriculum'. The advent of social media, mobile communications and digital media facilitate large, unbounded personal learning networks that mimic the characteristics of rhizomes. If we accept that there is a need for a living curriculum, it would be logical to also accept that a self-determined community generates and negotiates its own knowledge, thereby forming the basis of what its members learn. Rhizomatic learning is also premised on an extension of community as curriculum, where: 'knowledge can only be negotiated, and the contextual, collaborative learning experience shared by constructivist and connectivist pedagogies is a social as well as a personal knowledge-creation process with mutable goals and constantly negotiated premises'.

Students can, and do, create their own personalised learning pathways. There is also evidence that learning communities informally decide their own priorities, often observed in the emerging folksonomies that result when digital content is organised, shared and curated. These processes often occur in spite of the strictures and rules imposed upon students by the institution. Most are the result of informal learning, achieved outside and beyond the walls of the traditional education environment. Self-determined learning pathways are crucial for individual learners as well as learning communities and they are by their very nature beyond the control of universities and schools. Schools and universities cannot (and should not attempt to) harness these processes, but they can facilitate them. Just like the architect, institutions can refrain from imposing structures and pre-determined tools, wait to see what their students prefer and then provide them with the best possible conditions to support self-determined learning.

Image by justpeace


Creative Commons Licence
Learning pathways by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

Thursday 16 February 2012

Never mind the quality

While waiting for my flight home from Cyprus last week, I did an impromptu interview for some colleagues from Pakistan in the departure lounge. They quizzed me about my views on quality in education, and recorded my responses on video. They intend to share the video online once all the airport public address announcements have been edited out. In the meantime, here's the essence of the interview:

My view on quality in primary education is that it cannot solely be measured through standardised testing or other performance related metrics. These are used by governments as measures of whole school compliance with policies rather than as measures of how individual children are learning. Standardised testing is a device to control schools and systems. It has never been about learning. The quality of personal learning gain can only be measured through authentic forms of assessment, and the more individualised these are, the better. I suggest ipsative assessment, which involves measuring a student's learning against their own previous achievements. This is a much fairer method, and has the potential to inspire learners rather than show them how big a failure they are. The Assessing Pupil Progress (APP) schemes already practised in some UK schools are exploiting this potential, and it's a more equitable method of assessment than the old norm- or criterion-referenced forms that are still being used by many schools throughout the world.
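For readers unfamiliar with the term, the core of ipsative scoring is simple to express. The sketch below is only an illustration of the principle - the marks and the 'personal best' baseline are invented for this example, and are not drawn from the APP scheme itself:

```python
# Illustrative sketch of ipsative scoring: each pupil is measured against
# their own previous achievements, not against a class norm or criterion.

def ipsative_gain(history, current):
    """Gain relative to the pupil's own personal best."""
    if not history:
        return None  # no baseline yet, so no gain can be reported
    return current - max(history)

# Two pupils with very different absolute marks:
print(ipsative_gain([42, 48, 51], 58))  # → 7 (a big personal gain)
print(ipsative_gain([88, 90, 91], 89))  # → -2 (slipped below a personal best)
```

The point of the comparison is that a norm-referenced ranking would still place the first pupil near the bottom of the class, while an ipsative view shows exactly the kind of progress that can inspire a learner.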

How do we ensure quality learning in education? The best way I know how to do this is to provide space for children to express themselves creatively. Children need to be given licence to ask questions, no matter how ridiculous or bizarre they are, to explore outrageous possibilities, to exercise their imagination and to create something they can be proud of. The lack of expressive subjects such as art and music in the English Baccalaureate (EBacc) is a travesty, and should be redressed as quickly as possible.

Children also need to be given space to make mistakes without any condemnation. Alvin Toffler once declared: “The illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn.” Too often 'success' culture has been so deeply ingrained within the fabric of school life, that there is no room for failure from which we can learn.

If children are able to control what they learn and create things, their interest will grow, and if they are interested in the subject they will learn. They don't always have to be happy or comfortable for quality learning to occur. Sometimes discomfort, dissatisfaction or a lack of closure will spur them on to achieve even more in learning. Children need to be given tools to help them to learn, and then they need to be left alone to use the tools in the best ways they can find toward deeper learning. Better still, allow them to use the tools they are already familiar with.

Standardised curricula are bad news for schools. More trust needs to be invested in young people to be responsible for their own choices. Too often when teachers are pressured, they tend to revert to methods they are most familiar with. Often, these methods bear no resemblance to the needs of contemporary society, because it has moved on from the time they were themselves in school. Often we forget that teaching today is about the children, not the teachers. It's not our learning, it's theirs, because as the Indian poet Rabindranath Tagore once warned: 'Do not limit a child to your own learning, for he was born in a different time'.

Image source


Creative Commons Licence
Never mind the quality by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

Wednesday 15 February 2012

Bear pit pedagogy

In our digital literacy teacher training programme at Plymouth University we create environments that encourage critical thinking. My colleague Peter Yeomans (AKA @ethinking on Twitter) says we create the 'bear pits' for our students. In other words, we enable digital and physical learning spaces in which they can freely explore ideas, argue with each other (and us) over concepts and theories and in so doing, develop their reasoning and thinking skills.

In order to develop key critical thinking skills, learners need to be able to argue effectively. They need to be aware that there are alternative perspectives and they need to be able to defend a position from attack. They must also investigate theories critically, because if they simply accept a theory as 'truth', they may be leading their entire classroom down a blind alley. Too much bad theory has crept into the classroom in recent years, as I have previously commented, and we want to ensure that our trainee teachers are aware of flaws, counter-arguments and alternatives to all theories. That's why we encourage our students to critically engage with course material, and then to extend their knowledge by creating their own additional content around it.

We encourage them to develop their own Personal/Professional Learning Networks (PLNs) so they can lock into and exploit the vast communities of practice that already exist out there in the rapidly expanding Blogosphere and Twitterverse. They are quite adept at using the tools at their disposal to create these connections, but first they need to be convinced. Once they realise the benefits of blogging or tweeting, and can see how much they learn as a result of engaging with remote peers, they engage with it enthusiastically. When students are given projects to complete - blogs, videos, podcasts - they are expected to organise their ideas, form their arguments and present them in seminar or digital format - and then they must defend them. You see, when students are required to present something they have learnt to an audience, they need to know it well before they can present it convincingly. It's not the easiest route for learning, but it invariably turns out to be deep learning. The bear pit approach is more akin to dropping them in the deep end, and it can be a little uncomfortable at times.

One final point: We also give students the licence to challenge us, and sometimes, if we feel it necessary, tutors may even debate each other in front of the students. Academics don't (and can't) always agree on everything, so why not model critical discussion for the benefit of the students? I would be interested to hear from other teacher educators about what approaches you use and whether you see any value in what we are doing with our bear pits.

Image source


Creative Commons Licence
Bear pit pedagogy by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

Friday 10 February 2012

Damascus Road

My keynote presentation yesterday to the Cyprus International Conference on Educational Research had a mixed reception. Some delegates agreed with the points I made, others were more sceptical. It's interesting when you present what are considered radical ideas to a rather conservative audience and see the reactions around the room. It's like watching the surf hitting the rocks outside the window of my beach hotel room. An unstoppable force against an immovable object, and all that.

The majority of the delegates present were from Asia, the Middle East and North Africa. When asked, around 75% admitted that they had no involvement whatsoever with any of the social media or networking tools I was talking about. I had to pause at this point to rub my eyes. How can you expect to understand what your students are doing if you don't yourself engage with these tools? was my challenge. I think some were quite appalled when I suggested that even if they ban mobile devices and social networks in their classrooms (which many are doing), the students will still continue to use them, probably under the tables. There were some worried glances when I suggested that the reason students are using mobile devices and social media in the classroom might be to check out how accurate and truthful the lecturer's statements are. This kind of challenge to authority may not be palatable for many conservative academics, but it's a plain fact - it happens all the time, and it will grow in its intensity and reach. My message was - get over it - it isn't going away.

I also caused a few ripples on the normally placid pond of academic publishing by showing some recent figures on how successfully the major publishers are exploiting our good will in offering our work to them for free. I called for an end to the enormous profiteering that is currently perpetrated by some publishers, and pointed out that often, public money has funded the research that ends up behind a paywall. That was the main reason, I declared, that I resigned from my job as Co-editor of a major Taylor and Francis journal late last year. I could not, in good conscience, continue to help the publishers line their pockets off the back of free labour and publicly funded research that ended up behind a paywall, read only by the few who had the means to pay for it.

I cited figures from two of my own papers, both published around the same time (in the slideset above), which showed the unacceptable editorial and review lead times of many closed journals in comparison to open online journals. Paper based journals suffer from editorial backlogs and there is little they can do to alleviate this problem. Some have established online 'early' publishing systems that host accepted papers prior to full publication, but they remain behind the paywalls. The most stunning comparison I offered was between the citation metrics of my two papers. The closed journal paper had received 19 citations against 511 for the open journal publication in the same time period. This alone, I argued, shows that open journals have the edge over closed journals, with many, many more people reading the free to view articles. If we want widespread dissemination of our findings, we need to look to the open journals, with their vast readerships.

During the question time, objections were voiced. I expected it. One delegate claimed that the review processes for open journals were not as rigorous. Well, that's just your perception, I countered, and it's a very challengeable statement. I pointed out that in some open journals, review processes are even more rigorous - my open access journal article, for example, was reviewed by three separate reviewers. The fact that they were unblinded (they knew our names as authors and we knew theirs) and that the reviews and our responses were posted online alongside the paper created a higher quality and more transparent review than the traditional closed, double-blind reviews could ever hope to achieve. Well, I did my best, and hopefully, some delegates will have a Damascus Road experience before they submit their next journal article. Perhaps some will think twice about banning mobile devices and social media in their classrooms in the future - and, hope against hope, some may even take the plunge and subscribe to a social network or two. We live in hope.


Creative Commons Licence
Damascus Road by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

Thursday 9 February 2012

A dangerous game

There's a dangerous game they play in Cyprus. It's called Meze, and it's far more brutal than the Spanish tapas equivalent. The game goes like this: There are two teams: the eating team and the waiter team. The waiter team tries to beat the eating team into submission by delivering a constant supply of small dishes, containing far more food than they are ever likely to need in a full calendar month. It begins innocuously, with a few plates of pitta bread, hummus and tzatziki. The eating team is lulled into a false sense of security. This is nice, they think, we can do this. Then more dishes begin to arrive at an alarming rate.

As the eating team finishes one dish, it is removed and three more replace it. The goal of the waiter team is to fill the table so completely with food that there is no room left, and the eating team has no choice but to eat their way out to safety. But the game is a fix. No matter how much the eating team consume, there are always more dishes arriving. Kebabs, eggplants, grilled cheese, prawns, skewered meat, fried octopus - you name it, it all arrives far too quickly. There is a sadistic streak in the waiter team. Even when the eating team has had enough, the waiter team continue to deliver knockout blows, placing even more food directly onto their plates. Eventually, and inevitably, the eating team are writhing in extreme agony on the floor clutching their stomachs and yelling 'Enough! We surrender!' The end of the game is signalled by the waving of a white napkin, and then you can observe the smug grins on the faces of the waiter team, who look at each other and nod knowingly. Yes, we have defeated yet another group of tourists with our clever food manoeuvres. Our job is done.

This got me thinking that many of the world's education systems are a little like the eating game of Meze. We pile the students' plates high with content. Content of every kind is presented to be consumed, and the poor students don't stand a chance. Many are overwhelmed by the amount of content they need to learn, and the pace at which they have to learn it. Even while they are struggling their way through an overburdened 'just in case' curriculum, still more content continues to arrive at an alarming pace. Some learners cry out for mercy, but they are still compelled to consume the content, because later they are required to regurgitate it in an examination to obtain their grades. The examinations bear no resemblance to what will be required of them in the real world. No wonder so many wish to leave the table early.

What can teachers do to obviate this problem? Some are making a difference, reinterpreting the curriculum they are given by enabling activities and creating resources that facilitate student centred learning. Learning at one's own pace, and in a manner that suits the individual, will overcome some of the problems of overload, but more needs to be done. Things are changing, but slowly - too slowly for many people's tastes. It's a dangerous game we are playing in education. Isn't it about time we stopped?

Image source


Creative Commons Licence
A dangerous game by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

Monday 6 February 2012

My mouse is dead

No, we haven't had a pet bereavement ... and there is no tiny rodent lying legs up in its little cage (although ... there's a thought. My daughter's pet mouse sometimes keeps us awake at night with its irritating noises....). No, I'm referring to that wonderful old computer peripheral device that was first introduced in the last century. The mouse has served its purpose, and has been a faithful servant to all those using computers. But the fact is, the mouse is going the way of 5.25 inch disk drives (remember those?), CRT screens and dot matrix printers. Many of us wondered what we would do when the floppy disk drive was phased out. But how many of us miss it now? The same goes for the bulky visual display units and the clunky, noisy printers.

The computer mouse is old technology, and for a growing number of users has recently been superseded by touch screen devices (and the soon to be widely used non-touch devices). The only time I ever use a mouse now is when I am at my desk at the university, and am compelled to use a desktop computer. Most of the time I'm out and about using my iPod Touch, iPhone and a touch-pad laptop. Personally I haven't needed computer rodentia for several years. My mouse is dead. It is an ex-mouse. It has gone to join the squeaky choir of rodents in the sky.

I'm wondering how many other people are of the same opinion. I have often watched young children trying hard to control a mouse, particularly when their hands are small and they can't quite grip it correctly. I have also watched children with fine motor control problems struggling to use them. Perhaps it's about time the more intuitive touch screen interfaces were introduced widely in schools. Hand-eye co-ordination is also required to control a computer with a mouse. Most of us can do it easily, but it's not everyone's experience. I have watched some older people struggle to get the mouse pointer in the correct position to execute a click. For older people with motor control difficulties or reduced visual acuity, the more intuitive touch screen, voice activation and gesture control may also be a clear advantage over mouse driven computers.

I think the mouse has had its day. The cat can have it. Now it's time for the next generation of intuitive interfaces. What do you think?

Image by Nik Hewitt


Creative Commons Licence
My mouse is dead by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

Sunday 5 February 2012

Tim flies

Tomorrow I head off to Nicosia to keynote the Cyprus International Conference on Educational Research. The event, hosted by the Middle East University (North Cyprus campus), will feature four keynote speakers and presentations of papers, workshops, posters, seminars and virtual presentations on a wide range of pedagogical research themes. In total, it looks as though there are over 400 presentations accepted into the three day programme.

The conference aims to "bring together educational scientists, administrators, counsellors, education experts, teachers, graduate students and civil society organizations and representatives together, to share and to discuss theoretical and practical knowledge in a scientific environment".

The three other keynote speakers are Janet Parker (Open University, UK) who will speak on the topic of 'Encouraging Early Career Researchers to become Expert Published Writers', Lejf Moos (NTNU Trondheim, Norway) whose theme is 'European Educational Research Today', and local academic Mehmet Çağlar (Near East University, Cyprus).

My own keynote will cover the proposition that social media, mobile technologies and the Web are together changing the way we perceive knowledge, learning and education. I'm going to propose that we are witnessing a radical shift in the way knowledge is represented, consumed, created and shared, and that as a result, we need to reappraise the way we conduct research and disseminate our findings. I'm going to talk about adopting open access journal publishing as the best way forward for widespread and effective publication of research, and I'm going to champion open scholarship. Let's see how that will be received.

I'm going to blog again about the conference once I'm there and it's in full swing. Cyprus is a wonderful country to visit at any time of the year, but doubly so at the moment, with the inclement weather here in the UK assailing the senses. The island's temperate Mediterranean climate will be very welcome, and Cypriot culture and history are rich. It's a dirty job, but somebody has to do it.


Creative Commons Licence
Tim flies by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

Friday 3 February 2012

Five tools for global educators

Recently I have been considering the changing role of teachers who are adopting technology to extend the walls of the classroom. These are a new breed of teachers who do not necessarily accept that the classroom is contained within four walls. In effect, through the use of social media and telecommunication technologies, these teachers are becoming global educators. I consider myself a global educator and have tried to articulate my ideas on why this is a different role to traditional teaching. We are connected educators, linked in to a number of powerful global communities of practice, and we have access to resources, dialogue and audiences we would not enjoy in a traditional learning and teaching role. But what tools do we use to enable us to connect with these communities, resources, audiences around the globe? Here are my top five tools:

Webinar: There are a number of ways to teach and present live from beyond the classroom. I regularly present live (synchronous) webinars, or web seminars, and other teaching sessions from my home office, from a hotel room, and conceivably just about anywhere else there is connectivity to the internet. I have presented from Australia to the USA (strange timezone differences there) and from Europe to the USA, and even, in such events as the Reform Symposium, presented to a worldwide audience of educators. Webinar tools include Elluminate (now known as Blackboard Collaborate), WebEx and Adobe Connect, all of which have similar screen topographies and perform similar functions, but all have an associated cost. All of the above tools support live audio (you should use a headset to maintain quality) and video communication (a webcam or internal camera on a laptop is needed for this), slideshow presentation tools and text communication. Webinars can also be conducted on Skype, which is currently free, but quality may be more variable using this tool.

Blog: Blogging is arguably one of the most powerful tools for global education. I have already written a great deal about the power of blogging, so I won't elaborate too much here. What I will say is that by following a few simple guidelines, teachers can write and present content in accessible formats, and can incorporate images (pictures, diagrams), videos, audio and hyperlinks, all of which can help students to investigate a topic in greater detail if they wish. The comments boxes below each post support dialogue, and the tagging feature on most blogs enables easier search for content.

Twitter: This social networking tool is deceptively simple, but deeply sophisticated and versatile due to its inherent filtering facilities. It is also an excellent connecting tool - retweets are not repetition, they are amplification of content. The power of Twitter lies not only in its simplicity, but also in its accessibility. Whether used as a backchannel to amplify an event, or as a closed channel to converse between small groups, Twitter has an appeal that enables a great deal more expression than one would expect from a 140 character limit. Hyperlinks and other media links can be shared, and with the addition of a URL shortener, there is also more space for a few annotations. Used in conjunction with the other tools showcased on this page, it is indeed a very powerful tool for the global educator.

Video: Social media tools such as YouTube are maturing into sophisticated tools that enable all kinds of visual media sharing. Over 24 hours of video footage is uploaded to the YouTube servers every minute. Most of it can be disregarded, but some content found on YouTube is gold dust for teachers. It is now possible to create your own personal channel on the service, simply by clicking a few buttons. There is an editing facility available that allows teachers to select specific sequences of video and create new versions for showing to students. The comments box at the foot of each video clip enables dialogue between presenter and students. It's asynchronous, but can still be a highly effective way of sending quality content to distributed student groups.

Slideshare: If you have a PowerPoint presentation or a document and you want to share it with a wider audience, then Slideshare is probably your first port of call. Several of my recent presentations have gone viral simply because the tool is easy to access and is being used by large numbers of people every day. You can see at a glance how many views your slideshow has received, how many favourites, downloads and embeds, and most importantly, you can respond to comments to create dialogue with your remote students.

These are just a few of the vast array of tools that are currently available to the global educator, and they are my preferences. I am sure others will have different preferences or recommendations to make. Please feel free to share your expertise and ideas below in the comments box.

Image source


Creative Commons Licence
Five tools for global educators by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

Wednesday 1 February 2012

Human 2.0

Post human. Such a strange concept, and one that many people struggle to understand. At its simplest, being post-human is a state closely aligned to the cyborg, or cybernetic organism - part human, part machine. In other words, the post-human condition emerges when humankind and technology merge to the point where they become a part of each other. We can understand cyborgs - and for many, the idea of a half-man, half-machine evokes deep seated fears about how far technology can go. Donna Haraway (2004) makes a point of singling out Rachel - a replicant character in the sci-fi movie Blade Runner - as 'the image of a cyborg culture's fear, love, and confusion.' We have seen many other popular culture examples of the cyborg, from the Six Million Dollar Man to Robocop - and each is endowed with superhuman strength, or enhanced senses. We recognise them because they are on their own, isolated, lost in a world of otherwise normality, unnatural, freaks of non-nature.

No-one really knows exactly if or when a post-human phase emerged; it is all theory and supposition. But we can trace the history of prosthetics and reflect on the incorporation of various kinds of technology into the human body. Replacement limbs may not strictly be accepted as a merging of technology and humanity, unless they are robotic limbs. Heart pacemakers, valves and other forms of technology implant or merger might be. Computer scientist and philosopher Andy Clark, in his 2003 book Natural Born Cyborgs, argues that humankind has an innate need to interface with technology: 'What the human brain is best at is learning to be a team player in a problem-solving field populated by an incredible variety of nonbiological props, scaffoldings, instruments and resources' (p 26). Essentially, when wetware (the biological entity) meets hardware, the software can be interoperable. Clark sees the merging of mind and machine as unstoppable and inevitable. He believes it's not a matter of if, but when. Some would argue that the transitional phase leading to post-humanism is the non-invasive but just as powerful welding together of human and computer, as seen in the addictive video game playing of geeks, or the smartphone ultra-dependency of our current youth generation.

So are we now on the verge of a new phase in human development? Are we on the cusp of the incorporation of technology into the human body because we have such a desire to enhance our senses, increase our physical and mental performance, or otherwise extend the capabilities of what is considered to be 'natural'? Are we about to embark on a post-human phase in human development? Some would affirm this, citing several notable 'real examples' of cyborgs in recent years. Meet Kevin Warwick, a professor at the University of Reading, probably the world's first true cyborg. Professor Warwick is interested in how technology can enhance human senses and improve performance. In the foreword to his book I, Cyborg, he writes: 'Humans have limited capabilities. Humans sense the world in a restricted way, vision being the best of the senses. Humans understand the world in only 3 dimensions and communicate in a very slow, serial fashion called speech. But can this be improved on? Can we apply technology to the upgrading of humans?' In essence, Warwick is asking: Can we become Human 2.0?

In a famous experiment in 1998 Warwick had a chip transponder surgically implanted into his arm. A computer was then able to track Warwick as he moved around the university campus, and allowed him to open doors, turn on lights, and operate computers without touching them. Other phases of the experiment involved more advanced transponder implants that monitored Warwick's internal condition, such as his emotional responses, stress levels and even thoughts. The speculation was that if others also had similar transponders implanted, people might then be able to communicate their thoughts and emotions to each other via computer mediation.

More recently, Tanya Vlach made headlines with her plans for a new prosthetic eye. She has a dream to transform herself into an 'enhanced human being' after being involved in a serious car accident in which she lost her left eye. She is now planning to have an 'eye-cam' - installed inside her prosthetic eye, complete with zoom control, infra-red and ultra-violet capabilities and the facility for face recognition. The eye-cam would interface with a custom made app, housed in a standard smartphone. She is currently waiting for technology to catch up with her vision, and one day soon, hopes to be able to hard wire the eye-cam directly to the vision centre in her brain, and in so doing become a truly enhanced human being - a cyborg - a post human. Scary, fascinating, challenging stuff - the cyborg becomes the iBorg.

Computer scientist Jaron Lanier's recent keynote speech at Learning Technologies in London served to illustrate several of the dangers and caveats of the post-human condition. Lanier vehemently rejects Ray Kurzweil's vision of a future where computers can exceed the capabilities of humans. 'You have to be somebody before you can share yourself', he warns. He suggests that we already have expanded memories (the search engines of the web) and remote ears and eyes (mobile phones and webcams). Lanier sees no techno-utopia in the future, but warns instead that we are in danger of dystopia. Indeed, he advised the makers of the movie Minority Report on what might be expected from a technology dominated future in which people were manipulated like chess pieces. The data mining capabilities of the social networks alone can enslave us by owning our purchasing habits, internet search preferences and all other personal data, he suggests. He sees Facebook and other social networks undermining and devaluing friendships. The technology should work for us, not us for the technology. Lanier is a contentious, thoughtful character. In just a few minutes of conversation with him in the speakers' lounge, my impression was that he opposes anything that involves a 'hive mind'. 'Why are you wearing a Creative Commons badge?' he asked me as we gazed out over West London. I explained that I believe in giving all my content away for free and that to me, that is the essence of the future of learning. 'I'm going to speak against that today', he warned. It's clear that generally, Jaron Lanier holds a somewhat more pessimistic view of our possible cyborg future.

In the final analysis though, it is mind amplification that is the ultimate goal for humankind's future enhancement. The ability to distribute knowledge beyond the confines of the human brain, and the capability to extend the mind through and across networks, does not demand or require any conjoining of human and computer. We have already achieved much of this through mind tools such as social media, which according to Karen Stephenson enable us to store our knowledge with our friends. Do we really need a post-human future? iThink not.


Top image by Elif Ayiter

References

Clark, A. (2003) Natural Born Cyborgs: Minds, Technologies and the Future of Human Intelligence. New York: Oxford University Press.

Haraway, D. J. (2004) A Manifesto for Cyborgs: Science, Technology, and Socialist Feminism in the 1980s. New York: Routledge.

Lanier, J. (2010) You are not a Gadget. London: Penguin.


Creative Commons Licence
Human 2.0 by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.