Saturday 30 January 2010

Barking at ants

On Wednesday 3rd February I will be giving a presentation to staff at the University of Brighton, at the invitation of my old mate Asher Rospigliosi. His blog on learning using the Internet is worth a read. Below is the abstract of my presentation to staff in the Brighton Business School:

New Pedagogies for the Digital Age

The rapid emergence of new, participative and social media in higher education has caused teachers to question what they have previously believed about university teaching. Students also have different expectations when they arrive on the university campus. The proliferation of handheld and mobile devices, smart phones, ubiquitous computing and broadband networked technologies, interactive whiteboards, touch screen and wireless technologies is bewildering, and on the back of these developments, many academics also find it difficult to come to terms with the new digital cultures and values their students bring to the lecture room. In this presentation I will explore these tensions, discuss the potential and actual applications of new technologies, and examine how they are changing and challenging our traditional notions of pedagogy. I will speculate on how the digital age is causing us to reappraise and re-examine our own personal and professional values, and approaches to the business of learning and teaching. I will offer some simple, practical examples of how to integrate new social media into formal learning contexts and provide some insight into these processes from my own reflections on professional practice in teacher education.

Here is my slideshow:
It has been a long time since I was in Brighton. When I was last there I was an undergraduate student of psychology, studying at Lancaster House, the cognitive science centre of the University of Sussex. I remember vividly visiting the funfair on the end of Brighton Pier one evening and taking a ride on the waltzers. I had just enjoyed a rather large dinner at a local Tex-Mex. It wasn't a smart thing to do. Spicy food, cider and fast whirling things are not a comfortable combination. The inevitable happened. I experienced a technicolour yawn - I won't go into any more details. This time I plan to avoid barking at ants by steering clear of all the fast whirling things.

Thursday 28 January 2010

Lost in translation (again)

Travelling up on the National Express bus from Plymouth to London was rather tedious. I had a guy sat behind me who insisted on speaking very loudly into his mobile phone for most of the six-hour journey. It couldn't have been an iPhone, because the battery wouldn't have lasted. What was worse, I couldn't even listen in to the gossip, because he was gabbling away in some far-eastern language...

I was in London to speak at the Learning Technologies Conference at Olympia. I was invited to speak by the urbane and witty Donald Taylor, who really does know how to organise a smart and glittering event. In the speakers' lounge (yes, they have one) Don introduced me to a smiling, grey-haired, bearded man who turned out to be Lord David Puttnam, our first keynote speaker. His Lordship looked into my eyes and it was as if we were old friends. He has that effect on people. Some may remember him as the force behind some of the greatest movies in British cinema, producing such classics as The Mission, Local Hero and Chariots of Fire. And here we were, talking about how he had recently been down to Plymouth to speak at a prizegiving at one of our local primary schools. He really has a way of putting you at your ease.

Lord Puttnam's keynote speech will no doubt be covered much more eruditely elsewhere on the web, but I want to capture just the essence of his talk here. He argued for education as the answer to all the world's needs, including climate change, and gave the example of climate change simulation games where the first thing children do is destroy the world. Later they learn how to save it. He said that no education system can be better than the professionals it employs. My favourite soundbite was that 'good teachers should be able to walk into your head and turn on the lights'. The bottom line for Lord Puttnam was this - only through engaging with digital media are we likely to nurture a generation of smart learners, who are agile and flexible enough to cope with the world's changes.

The first day of LT10UK was mainly about technology used in corporate training and development contexts, and it had a distinctly HR feel about it. My own session on smart technologies was in the main auditorium, complete with stage, large screen, coloured theatre lighting, and a video camera that captured everything for playback later. There were around 200 in my audience, and boy, were they polite! I braced myself for a barrage of 'what ifs?' and 'so whats?' - the type of feedback you get from a savvy academic audience. But no, this lot were on their best behaviour and the discussion was very placid indeed. It was almost as if we were talking in two different languages. I suspect that the worlds of training and education (which collided a few times at the conference) are not converging as fast as many think they are, and LT10UK really was dominated by training and development - and of course that's why most people were there. After my presentation, I had several approaches from people who wanted to quiz me more about my talk, most of them blown away by the idea that handheld devices could be used in the ways I had described in my slideshow. I had to convince some over coffee by doing live demos on my iPhone of Navigator and other GPS-based tools.

I paid a brief visit downstairs to the main exhibition, where 200 stands reached out and tried to grab five minutes of your time. Most appeared to be VLE vendors. Many were chancing it, in a market that is probably burgeoning for corporate training, but which, in my opinion, is already 10 years out of date for schools, colleges and universities. Again, the VLE companies were speaking in another language - this one more akin to Latin or ancient Greek - so I quickly left the arena.

The conference party that evening over at the Kensington public bar was outrageous. All the drinks were free, and plates of food kept circulating, as did a team of street magicians doing card tricks and close-up illusions. I withdrew politely just before midnight after an excellent Italian-style meal, but some of the hardier souls went on to other venues and stayed out as late as 4 am. They may be a placid lot at Learning Technologies, but they certainly know how to party.

Day 2 was interesting simply because so many of the corporate speakers seemed to be reading from the same script. Social learning is a good thing, they said. Do collaborative stuff and your employees will learn better, they advised. It sounded suspiciously like lip service though. I suspect that many are 'on the verge of beginning to think about considering it', but haven't yet taken the plunge. After all, user generated content seems a little risky for those businesses who want to protect their secrets from their competitors and maintain their unique branding. One of the more vocal delegates did manage to ask a sticky question of one of the presenters in the mobile learning session. The response was: 'Why do you ask? Are you one of our competitors?'

On Day 2 I enjoyed a very interesting 30 minutes over lunch with Professor Stephen Heppell, whom I had never had the pleasure of talking with before. I had heard him speak several times at events such as Handheld Learning and ALT-C, but this was a real opportunity to speak to him on a one-to-one basis and hear his ideas. We covered everything from MPENSA, to schooling in post-conflict areas, to disaster relief, and one of his most recent ideas - to take over vacated high street store spaces of the likes of Burger King and Marks and Spencer and turn them into learning centres. 'They're already DDA compliant,' Stephen said, and convinced me it is an opportunity too good to miss. I said to him, 'Stephen, you're a busy man.' His reply, with a twinkle in his eye, was 'I'm having the time of my life!'

All too soon the conference was over. I got to meet several people I had only ever met on Twitter, including Barry Sampson and Jane Hart, and had several very interesting conversations with a number of the delegates, mainly from Scotland, for some reason... The bus trip back home was horrendous, as Chiswick flyover was closed and we were diverted across town in heavy traffic. A man behind me was talking loudly and incessantly into a mobile phone for most of the journey home. And guess what? It was in a far-eastern language and I didn't understand a word...


Wednesday 27 January 2010

Can we talk?

The World Wide Web may have connected many people together and made the world a smaller place in the short time it has existed, but it also has the potential to cause division and social upheaval. The web wars are hotting up. At a macro level, we have a nation state taking on one of the giants of the Internet. In the red corner, the People's Republic of China, and in the multicoloured corner, Google. When one of the largest icons of free speech clashes headlong with the largest single totalitarian force on the planet, there is going to be collateral damage. Google says it is pulling out of its operations in China because of the Chinese government's attempts to suppress free speech. So is Google big enough to take on the world's largest nation state?

Daniel Lyons, writing in this week's Newsweek, says 'the Internet is bigger than any one country - even a country as big as China. Calling out China as someone who "doesn't get it" is a way of putting the rest of the world on notice.' The Chinese people are finding their own ways around the 'Great Firewall of China', sidestepping filters and using proxy servers to gain access to the sites and ideas their government is trying to suppress. China may be big, and its economy one of the most successful worldwide, but it won't win this war.

At the personal level the web war is also causing casualties. In this week's Spectator, Rod Liddle recounts the story of yet another victim of social network indiscretion. This time it's the hapless Paul Chambers who, frustrated with a decision by his local airport to shut down due to the recent snow, made a seemingly innocuous remark on Twitter that he was going to 'blow the airport sky high' if they didn't get their act together. Someone reported it as a terrorist threat. The police were called in. Chambers was arrested, and his laptop, iPhone and home computer were confiscated. He has subsequently been banned from flying from any UK airport, and has even been suspended from his job. In his commentary, Liddle asks, reasonably, whether any genuine terrorist would actually announce his plans over a public channel like Twitter. Liddle condemns the British authorities for over-reacting, for grotesque officiousness, self-righteous posturing and a 'complete and utter lack of common sense and natural justice'. He'll probably be suspended from his job now for stating his opinion. One wonders how far the UK is from becoming just a little bit like China in its suppression of free speech.


Monday 25 January 2010

Spinning the Web

This is the final part in my 11-part series on the history and impact of distance education. I have taken a British perspective on this but, of course, other views are available. In this final part, another great Briton makes his impact with a contribution to the World Wide Web.

ENQUIRE (named after the Victorian almanac 'Enquire Within Upon Everything') was an obscure computer program written in 1980 by a young software consultant called Tim Berners-Lee. The program may have been obscure, but it was also ground breaking, as it encapsulated the ideas that would eventually enable Internet users to link directly from their personal computer to any information they required.
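
To see why the idea was so powerful, it helps to notice how little machinery it needs. Here is a toy sketch in Python of an ENQUIRE-flavoured store of 'cards' joined by typed links. It is entirely illustrative, I should stress: the functions, names and structure are my own invention, not a reconstruction of Berners-Lee's actual program:

```python
# A toy hypertext store: named cards carrying some text and typed links.
# Purely illustrative - not how ENQUIRE itself was implemented.
cards = {}

def add_card(name, text):
    cards[name] = {"text": text, "links": []}

def link(source, target, relation):
    cards[source]["links"].append((relation, target))

add_card("Tim Berners-Lee", "software consultant")
add_card("CERN", "European research facility in Switzerland")
link("Tim Berners-Lee", "CERN", "works at")

# Following a link takes you directly from one piece of information
# to another - the kernel of what became the Web's hyperlink.
for relation, target in cards["Tim Berners-Lee"]["links"]:
    print(f"Tim Berners-Lee --{relation}--> {target}: {cards[target]['text']}")
```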

Tim Berners-Lee was born in London in 1955. In 1976 he graduated from The Queen's College, Oxford, before working at CERN, the European nuclear research facility in Switzerland. Whilst working there as a computer software consultant, Berners-Lee began to consider the problem of how to communicate and access information via computer across the emerging worldwide phenomenon known as the Internet. In 1989, Berners-Lee proposed a global hypertext project which he called the World Wide Web. Two years later, his ideas had crystallised into the first websites on the Internet, and by 1993 Mosaic, a browser built on his principles, was being championed by the University of Illinois. A year later, in 1994, Berners-Lee joined MIT, where he headed up the fledgling W3 Consortium.

The World Wide Web is a truly unique and all pervasive innovation - without it the Internet would not be as successful as it evidently is. Browsers make accessing information ‘friendlier’, and pages more navigable. Berners-Lee has campaigned tirelessly to keep the World Wide Web open and free, and this is possibly one reason why it remains largely an un-policed, imaginatively fertile and unpredictable aspect of distance education. For many commentators, the Internet was inevitable - the World Wide Web simply made it easier for millions to use it.


Sunday 24 January 2010

University of the second chance

This is part 10 of my series on the history and impact of distance education. On Friday in part 9 we looked at how satellite technology has impacted upon global communication and e-learning. Today's post is all about the Open University model of distance education.

Under the Labour government of Harold Wilson, the UK's Ministry of Education decided upon the ambitious plan of establishing a university that would confer degrees delivered entirely at a distance. It was higher education for all, regardless of age, or social or economic status. Wilson's government advisors proposed the name 'University of the Air' to acknowledge the institution's proposed primary delivery method - broadcast television and radio. It was not long, however, before the UK government realised that the correspondence tools first established in Victorian times were still very valuable. Eventually, in 1969, the Open University (OU) was born, opening its 'doors' to students two years later.

With the OU came a whole new set of benchmarks for quality in distance education. Yet the British Open University was not the first open university. That honour probably belongs to the University of South Africa (UNISA), which began teaching at a distance some two decades before the British OU. However, under the guidance of several luminaries from the world of distance education, including the late Charles Wedemeyer (University of Wisconsin), the OU flourished and established a model of best practice that many subsequent open universities emulated. Several open universities around the world that deliver degrees predominantly via distance education can now each boast over 1 million students, earning them the title of 'mega-universities'. Indira Gandhi National Open University (IGNOU) in India is the largest, with a staggering 2.5 million students enrolled.

The OU's current foray into electronic forms of learning, such as web-based learning and computer-mediated communication, is an extension of its tried and tested model of blended distance learning. Many OU courses have face-to-face tutorial contact and week-long summer schools built into their structure, but most of the learning process is still conducted away from the parent institution, which is based in Milton Keynes. Regular television and radio broadcasts are still used, as are a range of other methods including online delivery and mailouts, and the OU maintains a close partnership with the publicly funded British Broadcasting Corporation, with a regular schedule of programmes broadcast on radio and television.

On a personal note, as an OU graduate myself (BSc (Hons) Psychology, 1995 - 1st Class) I would like to pay tribute to the OU and all that it does. It really is the University of the Second Chance. I blew it at school, and left with few academic qualifications. I simply wasn't interested in study at the time. My teacher told my parents 'Steve's a very sociable lad, but he'll never be an academic!' Well, the OU gave me my second chance when I needed it (and teachers don't know everything). When I met Sir John Daniel (then the OU Vice Chancellor) over a few drinks during a conference in Ankara back in 1998, we talked at length about the history of the OU and how it had changed so many lives, including mine. Sitting with us were a number of other pioneers of distance education, namely Tony Bates and Michael Moore (no, not that one), and I'm still in touch with them to this day. They had a lot of stories to tell about the early days of the mega-universities, but that's for another blog post....

Tomorrow: Part 11: Spinning the Web


Thursday 21 January 2010

1945 and all that

This is part 9 in my series on the history and impact of distance education. Yesterday in Part 8 we saw how the television was conceived and invented. However, before the introduction of geosynchronous satellite technology, global telecommunication was problematic, and global distance education continued at the pace of snail mail, whilst radio and audioconferencing remained the mainstay distance communication media.

1945 was a momentous year in the history of the development of distance education technology. It was not only the year in which we saw the back of the Second World War; it was also the year in which a young English scientist published a seemingly outlandish article in the magazine Wireless World.

The article, entitled 'Extra-Terrestrial Relays', speculated that if three radio transmitters were placed at equidistant points at a precise altitude above the Earth's equator, they would be able to achieve global communication coverage. This is a facsimile of the original article. The author was none other than the now celebrated science fiction writer Arthur C. Clarke (author of 2001: A Space Odyssey and other stories), and the article was instrumental in opening the debate about the feasibility of global communication satellites. Just 12 years later, on 4th October 1957, the USSR succeeded in launching the world's first artificial satellite, Sputnik, into orbit - and the Space Race began.

The most important aspect of Clarke's theory was the placement of the satellite in a precise orbit around 22,300 miles above the Equator. At this altitude, Clarke speculated, the satellite would have exactly the same angular velocity as the rotating Earth, and it would therefore appear to be stationary in the sky. This technique is now well established, enabling satellite users to dispense with expensive tracking devices. Communication satellites are placed into geosynchronous orbit, and this zone of optimum distance above the Equator is now referred to as the Clarke Belt. If you are in the Northern hemisphere, you will see that satellite dishes tend to point south toward the Equator. In the Southern hemisphere the opposite applies. In Equatorial regions, it's a common sight to see satellite dishes pointing straight up at the sky, and some have holes drilled in them to drain the rain water out! Queen Elizabeth II knighted Clarke for his services to science in 2000. Sir Arthur C. Clarke retired to the island of Sri Lanka, where he died in 2008.
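
For the curious, Clarke's figure drops straight out of Kepler's third law: an orbit is geostationary when its period equals one sidereal day. Below is a back-of-envelope sketch in Python; the constants are standard textbook values (my assumptions, not figures from Clarke's article):

```python
from math import pi

# Standard textbook values (assumed here, not taken from Clarke's article)
GM = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
T = 86164.0905         # one sidereal day (Earth's rotation period), s
R_EARTH = 6378137.0    # Earth's equatorial radius, m

# Kepler's third law, T^2 = 4*pi^2*r^3/GM, solved for the orbital radius r
r = (GM * T ** 2 / (4 * pi ** 2)) ** (1 / 3)

altitude_km = (r - R_EARTH) / 1000
altitude_miles = altitude_km / 1.609344
print(f"Geostationary altitude: {altitude_km:,.0f} km ({altitude_miles:,.0f} miles)")
```

This prints roughly 35,786 km, or about 22,236 miles - the 22,300-mile figure quoted in older accounts is simply a round number for the same orbit.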

In 1965, Clarke's dream was realised when the first commercial geosynchronous communication satellite was positioned in orbit above the Atlantic Ocean, launched by NASA. By 1969, three satellites had been linked to achieve the first fully global satellite coverage. For more on the uses of satellite technology in distance education, visit here. Today, a great deal of distance education provision is dependent upon geosynchronous communication satellites, and we take for granted the ability to talk to people on the other side of the world via telephone, video or other means. Watching live events from around the world on television is not something we think of as particularly special. We are so used to the idea that satellites are there, we give them no second thought. If they suddenly disappeared, though, I think we would all know it.

On Monday: Part 10: University of the second chance


Man of vision

This is part 8 in my series on the history and impact of distance education. In part 7 yesterday we traced the history of telecommunications and the contribution of the telephone. We continue today with what many of us do just about every evening - taking a look at the television.

Another Scot, by the name of John Logie Baird, also made a huge impact on telecommunication and, indirectly, on modern distance education. Baird is celebrated as a man of great vision - television. In fact, Baird was the inventor of many new technologies, including early fibre optics, a technology that looms almost as large as TV in the distance education hall of fame. Born in 1888 in Helensburgh, Scotland, a coastal town about 25 miles to the northwest of Glasgow, Baird was the fourth child of the local church minister. Even as a young boy he was known for his home experiments, one of which literally left him with his fingers burnt! Baird eventually left Helensburgh to seek work in the capital, London, and lived in the South of England for much of the remainder of his life. Much of the early research that defined his lifetime of innovation took place on the south coast, in the small towns of Hastings and Folkestone.

Although the term 'television' (literally 'to see from a distance') was coined by the scientist Constantin Perskyi at a conference in Paris in 1900, it is Baird who is credited with the creation of the first operational device that could transmit pictures. Baird successfully tested the prototype of his mechanically scanned disc television in the laboratory in 1925, and it was later demonstrated in public in London in 1926.


However, it was not long before Baird's mechanical version was supplanted by electronic television, which laid the foundation for today's television broadcasts, interactive television and video conferencing technologies. Nevertheless, Baird's pioneering achievements, including his involvement in the first transatlantic television transmission, were important scientific accomplishments. Baird's far-reaching innovation is exactly that - an invention that enables us to reach far across distances to hear and see each other, and to learn together no matter where we are located. The computer and television together provide the basis upon which visual communication and global information access are achieved. There is just one more component needed to achieve global telecommunication though... which we will discuss in tomorrow's post.

Tuesday 19 January 2010

Ringing the changes

This is part 7 in my series on the history and impact of distance education. Yesterday in Part 6, we examined the impact computers have made on pedagogy. Another innovation, as ubiquitous and influential as the computer, was invented by a Briton long before the Second World War. This invention is also of great importance to the practice of distance education as we understand it today.

We have a Scot to thank for one of the most taken for granted technologies in the modern world. Alexander Graham Bell was born on March 3, 1847 in Edinburgh, Scotland. In 1875, along with his assistant Thomas A. Watson, Bell constructed instruments that transmitted speech. In 1876 Bell invented the forerunner of the modern telephone, a device which today forms the basis of many communications technologies from the cellular phone to the Internet.


Bell received his official patent for the telephone on March 7, 1876. Three days later he and Watson, located in different rooms, tested the new type of transmission device described in his patent. As they were setting up the experiment, Watson suddenly heard Bell's voice through the earpiece saying, "Mr. Watson, come here. I want you." Bell had had an accident with a battery, and had spilled acid over his clothes. He had inadvertently used the telephone to speak to Watson, but when he realised what he had achieved, the accident was soon forgotten!


The first telephone company, the Bell Telephone Company, was established in 1877 to exploit the potential of Bell's new invention. During his productive career, Alexander Graham Bell invented several other devices, although none were as useful as the telephone. He died on August 2, 1922, in Nova Scotia, Canada. Technology-supported distance education owes a lot to this Scottish inventor, who changed the concept of what it meant to communicate with others over great distances. Today we take for granted the fact that we can punch a number into a keypad and, somewhere in the world, a corresponding telephone will ring, connecting us to a person whom we can hear in 'real time'. The social presence of the telephone (the perception that you are connected to the other person) is very high, and many prefer it to so-called richer media such as videoconferencing. We often forget that telecommunication methods are the backbone upon which the Internet and other global communication methods have been built. Tomorrow we will take a look at another technology. Can you guess what it is yet?

Tomorrow: Part 8: Man of vision


Come the revolution...

This is part 6 of my series on the history and impact of distance education. In part 5, we saw how programmable computing was first proposed.

When Charles Babbage first conceived the 'Difference Engine' in Victorian England, he could have had no conception of the far-reaching effects of his invention. As we have already seen, Babbage's first attempt at creating a hand-cranked machine to mechanically manipulate arithmetic functions became the blueprint for the earliest programmable computers.

Since the end of the 1980s the computer has entered the world's collective consciousness as a ubiquitous electronic device that affects every aspect of our daily lives. Computers are everywhere - in offices, in homes, in our hands.
Few could be in any doubt that the computer is now influencing the way we live, work, communicate and spend our leisure time. The computer is at the very heart of what some have called 'the information revolution' - if indeed a revolution it is. When connected to the global telecommunications network - the Internet and all its convergent features - the computer is a very powerful tool, providing distance learners with opportunities to access learning experiences they would otherwise have missed.

Babbage's invention is now all grown up, and offers us a multitude of destinations, enabling us to explore previously unseen worlds, which neither he nor any of his Victorian contemporaries could ever have conceived. Computers now enable us to work and communicate flexibly and enjoy unprecedented access to information. But freedom of this kind comes with a price tag for educators.

History has shown us that most revolutions have a dictatorship waiting in the wings. The 'computer revolution' also exudes an air of tyranny. The way computers are employed has for some time tended to dictate the way teachers conceptualise and develop courses, design learning materials, manage the virtual learning environment, assess learning and communicate with their students. We have all experienced 'death by PowerPoint' and we all are aware of the stranglehold that software companies such as Microsoft have on our computers. Perhaps I'm painting things a little too black here, but we need to be aware of all the implications.


David Jonassen and his colleagues (1999, p. 219) were not slow in responding to the trends in e-learning, arguing that in order for students to learn effectively from new technology, it will first be necessary for their teachers to accept a new model of learning. This new model is premised upon educators rejecting the role of 'knowledge provider' and instead adopting the role of facilitator. Some teachers may not like this. Time militates against them, as does, for some, a fear of losing control. Others are rushing with open arms to embrace new technologies. Some are going too far, using technology simply because it's there and it's cool. I suspect a lot of teachers will be ambivalent, gazing on with a gimlet eye, because they know what we know - change is the one thing that is always certain in education.

Digital technologies have been responsible for some of the most radical changes of the last few years in schools. Computers brought the world to the classroom. Now smart mobile versions are taking the classroom out into the world. Distance education is going through changes, just like traditional education - and a lot of the changes are being driven by the introduction of new technologies. The pace is relentless, and will not slow down. We know this: The sage on the stage is rapidly becoming the guide on the side - mainly due to the impact and influence of digital technologies. And it all started with the humble calculating machine.

Tomorrow: Part 7: Ringing the changes.

Reference: Jonassen, D. H., Peck, K. L. and Wilson, B. (1999) Learning with Technology: A Constructivist Perspective. Upper Saddle River, NJ: Prentice Hall.


Monday 18 January 2010

Catching a code

This is part 5 of my series on the history and impact of distance education. In part 4, we saw how Charles Babbage developed his ideas to create one of the first computers - the Difference Engine.

One of Charles Babbage's associates was a member of Britain's aristocracy. Ada Byron, also known as Lady Lovelace, was the daughter of the romantic poet Lord (George) Byron, and she seems to have had a great deal of time on her hands. Some accounts suggest that she wished to become 'an analyst and a metaphysician' and that from a young age she had developed a passion for science - an aspiration that women were generally discouraged from following in 19th-century Britain. She didn't seem fazed by these restrictions though - and tended to follow her own ideals.

Ada was still in her teens when she heard of Charles Babbage's idea for the Analytical Engine - an automatic calculating system, and the successor to his earlier invention, the Difference Engine. Babbage had conjectured that a calculating engine might not only predict, but could also act on that prediction. Ada was very impressed by these ideas and began to speculate about her own contribution to the development of the calculating machine. Correspondence between Lady Lovelace and Babbage was, by all accounts, filled with a heady mixture of fact and fantasy, as they both began to speculate on how such a calculating device might be used. Lady Lovelace eventually published an article in which she predicted that Babbage's machine might be put to both scientific and domestic use. This visionary account of the machine's potential was uncanny in its accuracy, predicting its capacity to perform a multitude of tasks such as playing music, creating pictures and composing letters. It's a pity we don't have someone of her calibre in the meteorological office today, predicting our weather for us.

Lady Lovelace suggested to Babbage that a plan might be formulated to enable the Analytical Engine to calculate Bernoulli numbers (look, just follow the link). This suggestion is now seen by many as the earliest example of computer programming. It wasn't exactly C++ but it worked. Lord Byron's daughter, in her collaboration with the genius Charles Babbage, gave the world the second part of the computer equation - the knowledge that it was possible not only to create a computing device, but to write instructions for it to follow so that it could produce a defined result. The modern computer is based upon this premise. In 1979, the U.S. Department of Defense named a computer programming language 'Ada' in honour of her pioneering ideas.
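
As an aside, the calculation that Lovelace set out longhand in her famous 'Note G' table would now fit in a few lines of a modern language. Here is a minimal sketch in Python - not a transcription of her program, I should stress, but the standard modern recurrence for the same numbers, with function and variable names of my own choosing:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return Bernoulli numbers B_0..B_n as exact fractions, using the
    classical recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1."""
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        B.append(-sum(comb(m + 1, k) * B[k] for k in range(m)) / (m + 1))
    return B

print([str(b) for b in bernoulli(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
# Lovelace's Note G table worked toward what we now write as B_8 = -1/30
# (she numbered the Bernoulli sequence differently).
```

Her real achievement, of course, was seeing that a machine could follow such a recipe at all.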

Tomorrow: Part 6: Come the revolution...


Friday 15 January 2010

New smart devices

I was asked some time ago if I was interested in giving a presentation at Learning Technologies 2010. To be honest, I knew very little about this conference, which will be held this month at Olympia in London. And yet it seems that, with over 4000 delegates pre-registered for the event, it is one of the largest learning technology training and development events in Europe. I had a long telephone conversation with the conference chair, Donald H Taylor, discussing the possible content of my presentation, and we eventually agreed that I should talk about new smart devices. Don wanted me to move the discussion on from smart phones, toward an area where augmented reality tools are being introduced into training and development contexts.

And so here it is: my slideset which I will present on 27 January in London at Learning Technologies 2010. I hope to meet as many people as possible in my session at Learning Technologies, but if you can't be there, at least you will be able to see my slides. And of course, as ever, any constructive feedback is very welcome.


For those who take their slides with a little music, here is the same slideshow on Animoto (with an unplugged version of Shine by Take That) courtesy of @james_wilding. Thanks James!

Making a difference

This is Part 4 - a continuation in my series on the history and impact of distance education. Yesterday in Part 3, we saw how the correspondence course could be adapted to deliver a full degree. In Part 4 we start to examine the technology behind distance education.

Considering its relatively small size and population (and this is my personal view), the United Kingdom has contributed disproportionately to the rise of technology supported distance education over the last two centuries (Wheeler, 2005). But I would say that, wouldn't I? I'm a Brit, after all. The computer, one of the most vital distance education tools of the last 30 years, is generally agreed to owe most to the British mathematician Charles Babbage, who conceived his Difference Engine in 1821. Yes, I know that other Europeans such as Blaise Pascal and Konrad Zuse pioneered their own versions of calculating machines, but Babbage's method of calculation through the Difference Engine - which later evolved into the design for a programmable machine - was the innovation that provided the template on which modern computing is based.

Charles Babbage was raised in a well-to-do English family, and was a child prodigy. Historical accounts suggest that he taught himself algebra when very young, and developed a great passion for all things numerical. So, before he could be numbered with the greats, he had to be great with numbers (Stop it - Ed.). We even have a building named in his honour here at the University of Plymouth, which of course houses our school of computing and the open access computing suites.

It was inevitable that he would eventually follow a career in mathematics, and in 1811 he enrolled at Trinity College, Cambridge. He became a greatly respected scientist and was honoured for his work when he was invited to become a member of the Royal Society. The story goes that one day Babbage was sitting in his study, holding his head in his hands, as he pored over reams of statistics. A colleague came in, saw him and enquired, 'What are you dreaming of, Babbage?' 'I was thinking,' replied Babbage, 'that many of these calculations could be performed mechanically!' They must have thought him a nut job, but Babbage was serious. Soon he began to take an interest in the notion of building a 'calculating machine'.

He eventually succeeded in building a prototype of his Difference Engine, but his work stalled due to lack of interest, limited funding from the British government and little support from his peers. Sadly, he died a bitter and disappointed man, having invested much of his life and personal fortune in an ambitious and ground breaking engineering project that showed few positive results during his own lifetime. His legacy and influence on modern life, however, are profound, and Charles Babbage is today acknowledged as the 'Father of Computing'.

The computer has extended its influence exponentially in the past few decades, and has advanced unrecognisably beyond the original notion of being a mere ‘calculating machine.’ It is now a very sophisticated tool for the development, storage, retrieval, delivery and transformation of data - it has the potential to enrich and extend educational experiences, and can provide students with a truly time and space independent portal to education. We must remember though that good pedagogy does not just happen because technology is being used. Good pedagogy takes place when teachers use technology appropriately and creatively. That is what can make the difference. We also need to know this: Such sophisticated and far reaching functions would never have been possible without the ability to issue instructions, or ‘program’ the computer. In Part 5 we will begin to explore this.

Reference: Wheeler, S. (2005) British Distance Education: A Proud Tradition. In Y. Visser, L. Visser and M. Simonson (Eds.) Trends and Issues in Distance Education: An International Perspective. Greenwich, CT: Information Age Publishing.

On Monday: Part 5: Catching a code


Thursday 14 January 2010

First degree burns

This is Part 3 - a continuation of my series on the history and impact of distance education. Yesterday in Part 2, we saw how correspondence courses were established in Britain and the USA.

Setting up short vocational courses seemed to be no problem. Academic programmes were an entirely different prospect, though. When Cambridge scholar Richard Green Moulton attempted to establish an entire degree via correspondence, he met with a wall of opposition. Moulton's plan was to deliver a degree course managed along similar lines to the correspondence school techniques made so successful by Isaac Pitman. Pitman had used printed cards mailed out to students through the Penny Post service. Students sent their work back via mail, where it was then graded. Students then received their grades, along with the next instalment of their studies, in the following post.

Moulton's colleagues at Cambridge University were sceptical and dismissive about these processes and blocked his progress. Unfortunately, his innovative ideas could go no further at Cambridge - they crashed and burned. It's probable that Moulton's colleagues were concerned about issues such as quality assurance and the means through which assessment of learning would be achieved and authenticated. They may also have been appalled at the incredible logistics that would be involved. It is not known how Moulton planned to address such issues. I'm also wondering how many of these issues remain a concern today in our digital world.

Like most pioneers and trailblazers, however, Moulton refused to lie down. He persevered, and realised that his future resided elsewhere than in Cambridge, England. He subsequently emigrated to the United States, where he took up a post on the faculty of the University of Chicago. Here he eventually realised his dream, and in 1892 was able to establish the first degree programme delivered via correspondence course. I guess we owe Moulton a lot for his tenacity.

Tomorrow: Part 4: Making a difference


Wednesday 13 January 2010

Short hand, long distance

Here's the second in my series on the history of distance education. Yesterday's post examined some conceptual issues of 'distance'. Today we look at the roots of distance education.

Arguably the first distance education course was delivered in the first century, in Asia Minor. The writings of St Paul (known as epistles) were in effect a form of instruction delivered to remote groups of people (early Christian churches) distributed by courier across what is now Israel, Turkey, Greece and Italy (more here). Yet this was very much a didactic, one-way mode of knowledge transmission. There was no latitude for interaction, and therefore no dialogue occurred between student and teacher.
In an organised format, one of the earliest occurrences of distance education emerged in Victorian England. When Isaac Pitman established the first organised correspondence course in England in 1840, he achieved it on the back of two technologies - the printing press and the newly arrived national Penny Post service.

Pitman's correspondence school taught shorthand to a distributed nationwide audience, predominantly of office workers. Pitman's use of the nationwide postal service advanced the work of previous correspondence courses, giving educators the ability to engage in two-way communication with their students wherever they were located in the country. This was an asynchronous (time-delayed) form of communication, and the process took time, but the Victorians were not afflicted by the impatience and clock-watching habits we now see in contemporary society. Life was much more sedate. Within a few short years of commencing distance delivery, Pitman's correspondence school had enrolled over 100,000 students. Even by today's standards, this was a phenomenal number of students. In 1894, Pitman was knighted by Queen Victoria for his services to education and his visionary plan to 'educate one and all'.

This early success prompted many others to attempt similar feats, and soon the organised correspondence course was burgeoning. In the US, Anna Eliot Ticknor set up the 'Society to Encourage Studies at Home', which was predominantly aimed at women (for more on this story follow this link). Other similar organisations soon began to spring up. Geographical distance had been breached, and students were able to glean feedback on their progress from their instructors wherever they were. It was not so much the time spent waiting that was an issue for students in correspondence courses - rather, it was the depth and richness of the feedback they received that made all the difference between success and failure. Such two-way interaction over distance via correspondence became the basis for much of what was to follow. Even today, with the advent of digital technology, ubiquitous communications and web-based learning, the vast majority of distance education is still reliant on mailed-out, paper-based material and the humble correspondence course.

Tuesday 12 January 2010

The space between us all

In this new series I will discuss how distance education has developed, and the influences it has had on our current education provision. Comments are most welcome. Here's the first instalment:

A few years ago I heard a funny remark at an e-learning conference in Germany. Someone suggested that geographically small nations such as the United Kingdom have no need for distance education, because they have no 'distance'. I laughed at the time and replied that if we followed this line of reasoning, there would be no need for any education either. More laughter. Of course the UK has distance education! I have already made the case for a significant British contribution to the development of distance education, both in terms of its conceptualisation and in terms of its innovation of technologies such as telephony (Alexander Graham Bell), television (John Logie Baird), correspondence courses (such as Sir Isaac Pitman's shorthand courses), the World Wide Web (Sir Tim Berners-Lee) and of course the British Open University model (Wheeler, 2005).

Although light-hearted, the conversation at the German conference led me to re-examine the notion of 'distance', and in fact ultimately launched me into seven years of study culminating in a research degree in the field. A key question for distance educators to ask, then, is: what is distance? Distance is almost always conceived of as being geographical in nature. In class I often ask my students 'what is the distance between you and me?' Their first answer is always an approximate measurement in feet, yards or (if they live in continental Europe) metres. I then ask them to reconsider their response. I ask them what other distances there are between us. After a little consternation and head scratching, the light comes on and they begin to respond in terms of other 'distances'.

There may be an age gap, or a gender gap. These distances are based on the premise that people of different age groups tend to see things in different ways and have different values - which leads to a 'distance' being perceived between them - what was once called 'the generation gap'. This may have been the basis for the controversial assumptions behind Marc Prensky's 'Digital Natives and Immigrants' theory. The gender gap may be a little more subtle, but the distance between males and females can be just as tangible. Ask anyone who is married. Then there is the intellectual distance experienced between students and their instructors. This perception often leads to a power differential between the two, and (some would say appropriate) distancing. Other distances may also come into play, including cultural and particularly language distances. These may lead to misunderstandings or misconceptions about the motives or intentions of people, and may create a psychological distance. I go on to tell the students that there are always 'distances' between each of us, no matter what the nature of the transaction.

In distance education, geographical distance does not have as much influence as it once had, as interactive technologies are now quite sophisticated. Beatle George Harrison once wrote 'We were thinking about the space between us all... and the people who hide themselves behind a wall of illusion...' One of the most important distances to overcome is the perceived distance between each of us and those we attempt to communicate with. Michael Moore (no, not that one) once theorised that the distance between us and others is transactional in nature. My theory is that, depending on how a technology is used, it has the potential either to amplify or to reduce such transactional distances (Wheeler, 2007). As educators we need to address many of these issues, particularly if we are operating within a distance education context... but it also applies in face-to-face teaching and learning contexts.

Distance education is of course best conceived of as a method for delivering and supporting learning opportunities for students who can't be present on campus or in a classroom. It is an ideal strategy for the promotion of inclusive education, where those who cannot travel to a university or college for some reason can still participate in a community of learning. In an organised format, one of the earliest occurrences of distance education emerged in England, in the Victorian era....

References: Wheeler, S. (2005) British Distance Education: A Proud Tradition. In Y. Visser, L. Visser and M. Simonson (Eds.) Trends and Issues in Distance Education: An International Perspective. Greenwich, CT: Information Age Publishing.
Wheeler, S. (2007) The Influence of Communication Technologies and Approaches to Study on Transactional Distance in Blended Learning. ALT-J: Research in Learning Technology, 15 (2), 103-117.


Monday 11 January 2010

Very social software

A veritable feast of articles on social media is appearing in the academic press at the moment.

Two articles that have caught my eye deal with social software in education, and both have landed on my desk in the last few weeks.

If you're interested in the impact blogs, wikis and social networking can have on the social and cultural dimensions of education, then take a read of them. Below are the abstracts, references and links to both full papers.




Social Software: new opportunities for challenging social inequalities in learning? by Gwyneth Hughes

Enthusiasts for new social software do not always acknowledge that belonging to e-learning communities depends upon complex and often unresolved identity issues for learners. Drawing on the author's previous research on belonging in social learning, the paper presents a theory of identity congruence in social learning and brings to the foreground the importance of identities which arise from expressions of gender, class, ethnicity, age, etc. in the social, operational and, in particular, the knowledge-building aspects of learning. These three dimensions of identity congruence are used to evaluate the potential of new social software. While social software might encourage some learners to engage in social and operational identity work, there are disadvantages for others, and learner-generated knowledge and e-assessment practices can be divisive. Inclusive e-learning depends upon pedagogies and assessments which enable learners to shift and transform identities, and not solely on widening the range of technologies available. Such caution should underpin future research.

Keywords: social software; Web 2.0; inclusive; inequality; belonging; identity.

Reference: Hughes, G. (2009) Social software: new opportunities for challenging social inequalities in learning? Learning, Media and Technology, 34 (4), 291-306.

Culture, learning styles and Web 2.0, by Bolanle A. Olaniran

This article explores Web 2.0 in interactive learning environments. Specifically, the article examines Web 2.0 as an interactive learning platform that holds potential, but is also limited by learning styles and cultural value preferences. The article explores the issue of control from both teacher and learner perspectives, and in particular the cultural challenges that impact learner control. From the control perspective, the issue of access to Web 2.0 technologies from both cost affordability and government censorship is also addressed. Finally, the article concludes with implications and recommendations for Web 2.0 learning environments.

Keywords: Web 2.0; interactive learning; cultural preferences; learning styles; learner control.

Reference: Olaniran, B. A. (2009) Culture, learning styles and Web 2.0, Interactive Learning Environments, 17 (4), 261-272.


Sunday 10 January 2010

The long and the short of it

We've had the Oscars, the Tonys, and of course, for education bloggers, the Eddies. What next? Well, you'll just love the Shorty Awards. It's a celebration of all that is great, whacky or downright weird on the real-time service Twitter.

This is the second year the Shorties have run, and frankly it's all a lot of great fun. The strapline on the Shorty Awards Website says: 'Honoring the best producers of short and real-time content'. There are several 'official' categories including Arts, Music, Humor, Celebrity, Education and Tech. I'm particularly interested in the last two as they relate to the themes of my blog. But it's a lot of fun watching Twitter users scrabbling and canvassing to try to gain votes from their fellow twits, so they can edge their way to the top of the 'leader board' in each category. There is even an unofficial category emerging, I see. It's the 'egocentric' category and seems to be awarded to those twits who think a little too highly of themselves, or who are perhaps just a little too enthusiastic in trying to solicit votes.

Here's the hilarious bit: anyone can devise a new category to nominate someone. If enough people do the same thing, that category then becomes 'official', and an award is made for it. I can see people's imaginations running wild now. How about categories for 'boring', 'drivesbadly', 'lunatic', 'mostlyasleep' or 'drunkard'? I gave @davecormier a nomination in 'convolution' for his meandering but brilliant theory on rhizomatic learning. I don't think anyone has picked up on the irony of it yet. I also gave Tom Caswell (@tom4cam - one of the tallest men I have ever encountered) a nomination in the 'tall' category, which for the Shorties is... well, appropriate, I think. He got it. :-)

I'm flattered to have received 8 votes for @timbuckteeth so far in the 'education' category, which puts me in joint 18th place alongside such luminaries as @ollie_otter. How good is that? I'm on a par with a river-dwelling mammal, but not half as cute. Excellent.

I don't stand a cat's chance in hell of winning anything, of course. There are just too many great people (and some awesome ones too). Currently heading the leaderboard in the education category, a mere 1750 votes ahead of second place, is some bloke called Brother Eli Soriano, who should perhaps more fairly be placed in a 'religious' or 'preacher' category to give the teachers a chance (he also appears to have at least 7 accounts - check him out, Shorty Awards). Some nutcase who will remain nameless (@simfin) has just nominated me in the 'humor' category. That's a laugh, that is. How about several of you nominating me in a new category - let's call it 'obscurity'. If enough of you vote for me, it may turn out to be the most ironic award of the whole Shorties. There's also an interview for you to complete, which is displayed next to your buddy pic. Do have fun with the Shorties - 'cos that's exactly what they are.


Friday 8 January 2010

The song remains the same

In the 1960s and 1970s several musicians tried to change the world through their music. The 'protest songs' of Bob Dylan, Joan Baez, Neil Young and The Byrds gained a lot of air play, particularly on the pirate radio stations, as they sought to raise awareness of the dangers of nuclear weapons, the futility of the Vietnam War, or the need for greater equality, justice and freedom.

The late 1970s saw the emergence of an even angrier genre of music - punk rock exploded onto the scene, grabbing attention, causing controversy and piquing our social awareness through the raw energy of the Sex Pistols, The Clash and Patti Smith, and political awareness flowed freely from the lyrics of Billy Bragg, UB40 and a number of other home-grown UK bands in the 1980s. They railed against corporate greed, and called out for equality, justice and freedom (again). If we were cynical, we would point out that they also became wealthy off the back of their record sales. More recently, hip hop outfits such as Public Enemy have tried to tap into the power of the beat to challenge the status quo, change people's minds and drag them from their comfortable lethargy into political activism, but they have a limited audience. Pink's 'Dear Mr President' held currency for a while, but a new president with a new agenda may have drained some of its potency.

Political movements have risen from songs. Remember 'Stand Down Margaret' and 'Free Nelson Mandela', or Peter Gabriel's 'Biko'. But ultimately, what has been the legacy of these music movements? Did they really change much at all? And what is the alternative, now that music seems to have lapsed into its own lethargic morass of X Factor-style 'karaoke' culture?

Gil Scott-Heron sang 'The Revolution Will Not Be Televised', but the times they are a-changing, and people now have mobile TV and radio studios and newspaper presses in their pockets. We have witnessed the evidence that the use of mobile phones by citizen journalists has raised public awareness of the injustices, tragedies and disasters occurring across the globe - as they happen, and at the scene, sometimes several hours ahead of the major media channels. Instant messages, texting and other live networking have raised our awareness that there are riots on the streets of Iran, or that a major incident is happening somewhere in the world. YouTube videos, Flickr photographs and blog reports provide us with the content that informs, circumventing the mainstream media and undermining the repressive control methods of those in power.

In the next few years there will be a rise in the use of citizen journalism, just as there will be a rise in the number of free internet channels that will be open for all to use. Perhaps we don't need the protest song anymore. The song remains the same, but the tools have changed. And they may be a lot more effective. Government ministers and those who wield the power will be looking over their shoulders with increasing regularity, as citizen observers with powerful links to the world ensure that they do the best for their country.

And that can only be a good thing, can't it?


Monday 4 January 2010

The future is the Web

I'm putting together my invited presentation for Learning Technologies 2010 later this month. I have been asked to talk about the new smart technologies that we will be using in the next year or two. These include touch screen technologies (iPhone, Touchtables, etc.), augmented reality (AR) devices (SixthSense, Layar, etc.) and other emerging technologies such as QR tagging, GPS and wearable computers. I will talk about how I think these might be used to support, enhance and extend pedagogy in the future. It's quite a tall order, but if you know me well, you also know I relish a challenge.

I'm going to use Web 2.0 (and the emergence of Web 3.0) as my starting point. I know a lot of people don't like these terms, but as I have previously commented (see Lost in Translation), even if a name isn't the right one, it evokes meaning and makes the concept comprehensible. At least most of us will know what I mean by Web 2.0 (the read/write Web - connecting with others) and Web 3.0 (the semantic Web - connecting with knowledge content through 'intelligent' search and filtering). I'm starting with them because whatever technology we use in the future for teaching and learning, we can be assured it will have a social element, and that we will increasingly rely upon intelligent (agent) systems to get us from where we are to where we want to be. The future is digital, and the future is the Web.
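
To make the Web 3.0 idea slightly more concrete, here is a minimal sketch of software filtering content by meaning (metadata) rather than by raw keywords. Everything in it - the triples, the topics, the function name - is invented purely for illustration:

```ts
// A minimal sketch of the semantic Web idea: content described as
// subject-predicate-object triples that software can filter on.
type Triple = { subject: string; predicate: string; object: string };

// Hypothetical resource descriptions, invented for this example.
const triples: Triple[] = [
  { subject: "podcast-42", predicate: "hasTopic", object: "mobile learning" },
  { subject: "podcast-42", predicate: "suitableFor", object: "undergraduates" },
  { subject: "slides-7", predicate: "hasTopic", object: "augmented reality" },
];

// An 'intelligent' agent answers "find me resources on mobile learning"
// by matching on described meaning rather than on text strings.
function findByTopic(topic: string): string[] {
  return triples
    .filter(t => t.predicate === "hasTopic" && t.object === topic)
    .map(t => t.subject);
}

console.log(findByTopic("mobile learning")); // ["podcast-42"]
```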

The future will be interactive. The way we interact with the Web will be different from the way we interact now. We are already seeing signs of shifts in use, moves away from mouse and keyboard toward natural hand and finger movements, such as multi-touch and pinch gestures. The iPhone was one of the first handheld devices to capture this, but we already had interactive whiteboards in classrooms that could operate on a touch basis. It's now one small step from there to interaction in 3D, where we move around in physical space, and the computer tracks and responds to our hands, eyes and voice. AR tools will recognise what we are doing and then enhance it by adding information to the world around us.
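
For anyone curious about what pinch detection actually involves, here is a rough sketch using the standard browser touch events; the element id is a placeholder, and a real application would smooth and clamp the values:

```ts
// A rough sketch of two-finger pinch detection with the TouchEvent API.
// "touch-surface" is a placeholder element id, invented for this example.
const surface = document.getElementById("touch-surface")!;
let startDistance = 0;

// Distance between two touch points on the screen.
function distanceBetween(t1: Touch, t2: Touch): number {
  return Math.hypot(t1.clientX - t2.clientX, t1.clientY - t2.clientY);
}

surface.addEventListener("touchstart", (e: TouchEvent) => {
  if (e.touches.length === 2) {
    startDistance = distanceBetween(e.touches[0], e.touches[1]);
  }
});

surface.addEventListener("touchmove", (e: TouchEvent) => {
  if (e.touches.length === 2 && startDistance > 0) {
    const scale = distanceBetween(e.touches[0], e.touches[1]) / startDistance;
    // scale > 1: fingers spreading (zoom in); scale < 1: pinching (zoom out).
    surface.style.transform = `scale(${scale})`;
  }
});
```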

The future will also be mobile. I have already elaborated on why we need to be more mobile and agile in a previous post, but I will emphasise here that learning and working on the move are going to increase rapidly in the coming years. They will increase because we are more mobile as a society than ever before, and we need to maximise the limited time we have each day. The demands on an individual's time are increasing to the point that many people now need to learn how to work smarter, not harder, by doing more in the down-time that is otherwise wasted waiting or travelling. Increasingly we will see people using handheld smart devices to augment personal reality, access information, communicate, navigate their way through physical space, and create and share content while on the move. Many of us are already starting to do this.

Finally, the future will be sensory. One of the words we will hear increasingly in 2010 is 'haptic'. Haptic - pertaining to the sense of touch - describes the manner in which some new technologies provide tactile feedback to users through handheld controllers. When it's your turn on Wii Sports, your handset makes a small sound, a blue light flashes, and it vibrates. The handset alerts several of your senses simultaneously. Mobile phones also have a built-in vibration alert for when you want to go silent but still want to know a call or text is incoming. We are witnessing the early stages of haptic consumer technology. It will get better as it becomes more widely available, and more control will become possible using haptic tools. Humans have more than the five traditional senses Aristotle talked about. At least three - equilibrioception (balance), proprioception (the sense of where our limbs are) and kinaesthesia (motion) - are waiting to be tapped into, to make our interaction with digital media more natural, responsive and effective.
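
As a small taste of how simple basic haptic feedback can be to trigger, here is a sketch using the browser's Vibration API. Device support varies, so the call is guarded; the button id and the vibration pattern are invented for the example:

```ts
// A minimal sketch of haptic feedback on the web via navigator.vibrate.
// Support varies by device and browser, hence the feature check.
function hapticAlert(): void {
  if ("vibrate" in navigator) {
    // Vibrate 200ms, pause 100ms, vibrate 200ms - an illustrative pattern.
    navigator.vibrate([200, 100, 200]);
  } else {
    console.log("Haptic feedback not supported on this device.");
  }
}

// "notify-button" is a placeholder id for this example.
document.getElementById("notify-button")?.addEventListener("click", hapticAlert);
```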

I'm going to try to make some sense of all this for my audience. Together, we will discuss ways in which new smart technologies can enhance education and training, and make learning more exciting and relevant for our students.

Learning Technologies 2010 will be held at Olympia 2, London, on January 27-28.

Image source

Saturday 2 January 2010

Wisdom of clouds

This is the first time I've done this, but it's a new year and a new broom. Here is the first Learning with 'E's guest blog post - it's by Manish Malik, who is based at the University of Portsmouth. He has some interesting ideas, and I am happy to host these on my own blog. You can find Manish's poll on his Edublend blog. I think it's an important discussion - so please read his post and then make your vote!

I have been meaning to write this for quite some time now. To be honest, at some point in the run-up to the ALT-C 2009 conference, I got this idea. There is a shorter version of this post too. There were many people at a session titled 'The VLE is Dead', hosted by James Clay, Josie Fraser, Graham Attwell, Nick Sharratt and Steve Wheeler, aka Timbuckteeth :). Martin Weller blogged about the death of the VLE/LMS too, in November 2007.

Scott Leslie coined the term 'loosely coupled teaching' a month before that. Martin's prediction about a move towards loosely coupled teaching tools has examples in practice today. However, there is more to it. Let me explain:

PLE...a set of tools over which learners enjoy full control and choice. The tools within a PLE are most likely not used for the formal education of all learners within an educational institution; each learner may use a different set of tools to support and enhance their informal learning.

VLE...a set of tools over which learners enjoy very little control or choice, if any. It is an institutional system, most likely used for formal education. Academics and the institution have the most control over this learning environment; learners may have a say in it to some extent.

Loosely coupled...to quote Scott, "courses taught using contemporary social software/web 2.0 tools outside a course management system." Again, the learner may have little control over these tools, but the academic is the owner and has most of the control and choice. As it is a non-institutional learning environment, it is most likely to support informal teaching and learning, though it may be used for formal teaching and learning too. I have blogged about this type of tool as my own 'personal teaching environment'.

CLE or Cloud Learning Environment...the cloud can be seen as one big autonomous system not owned by any educational institution. Let academics or learners be the users of some cloud-based services, where all equally share privileges such as control, choice and the sharing of content. This is different from a PLE, a VLE and a PTE. For example, Google Apps for universities is hosted on the cloud, is not fully controlled by any educational institution, and is certainly not owned by one. The tools on it are to a great extent academic- or learner-controlled. Each 'Google Site', for example, can be owned by an academic or a learner, and both users can be given the same rights and control by one another (depending on who creates it first). Likewise, Google Docs can be owned and shared between learners themselves, or between learners and academics, under their own control.

This gives all parties the same rights over the same set of tools, and it clearly has the potential to enable and facilitate both formal and informal learning. Both the academic and the learner are free to use the tools the way they want, and to share and collaborate with anyone they want. This would not have been possible if the academics, the learners or, for that matter, the institution had designed and developed the set of tools, or bought them from any one supplier. Google Apps was not designed just for institutions or for individuals; it was designed for collaboration both within and across institutions.
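
To illustrate the 'same rights on the same tools' idea in code, here is a rough sketch of a document owner granting a collaborator equal editing rights, written against Google's Drive API Node.js client; the file ID and email address are placeholders, and authentication setup is omitted for brevity:

```ts
// A sketch of granting a collaborator the same "writer" rights on a
// shared document, via the Google Drive API v3 Node.js client.
// fileId and email are placeholders; auth setup is omitted.
import { google } from "googleapis";
import type { OAuth2Client } from "google-auth-library";

async function shareAsEqual(auth: OAuth2Client, fileId: string, email: string) {
  const drive = google.drive({ version: "v3", auth });
  // After this call, owner and collaborator can both edit the document.
  await drive.permissions.create({
    fileId,
    requestBody: {
      type: "user",
      role: "writer", // can edit, not just view
      emailAddress: email,
    },
  });
}
```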

CLEs also make it very easy to generate content and share it with the rest of the world in a DERPable (Discoverable, Editable, Repurposable and Portable) manner, in the spirit of the UKOER programme. With a bit of search engine optimisation, this could work magic in terms of making the educational material that sits on a CLE visible and usable by the rest of the world.

Lastly, students at my institution love the Google Apps interface, which makes it very easy to get them to engage with their work using online tools. This can be seen from the remarkable usage statistics of the Google Sites pages where I now host my Examopedia.

Image source