During one of my tutorials we considered the affordances of touch screen tools such as Apple's iPad, iPhone and iPod Touch. Regular readers of this blog may remember a post I wrote last month on natural gesture interfaces entitled It's only natural. In it I reported that there are now a number of ways to interface with a computer, including touch screen, non-touch (e.g. the Xbox 360 Kinect), touch surface (e.g. MIT's SixthSense wearable), voice activation, and a number of other operation modes, many of which are spin-offs of adaptive technologies developed to support users with physical disabilities. Even facial feature recognition has been mentioned as a future interface mode.
But it was the Apple iPad tablet and other touch screen tools such as Dell's Latitude laptop that were our focus today. (A review of the new Latitude 2110 will feature on this blog in the near future.) I speculated that it is not only the tactile characteristics of the touch screen that are important, but that haptics could also be a key factor. Non-touch interfaces will no doubt become popular in time, as the rapid rise of the Xbox Kinect has already shown. But the Nintendo Wii remains a popular gaming technology, possibly because of the haptic feedback system built into its handset. If you hit a golf ball too strongly, for example, not only do you hear the fateful sound of an overhit golf ball and see the ball overshooting the green, you also feel the vibration in the handset, which convinces your nervous system that you have made a mistake. Although the iPad screen doesn't vibrate, it nevertheless provides pressure resistance feedback to the user. It is a sort of middle ground between the flexible 'give' of the conventional keyboard or mouse and the 'nothingness' of the Xbox 360 Kinect. Haptics, I think, will have a big role to play in the future acceptance of natural gesture interfaces, and may influence which system ultimately becomes the 'Killer App' replacement for the keyboard and mouse. People may not yet be ready for completely non-touch interfaces.
A second point we discussed was that natural gestures such as pinching, flicking and swiping are intuitive, offering students a tactile, transparent window onto the manipulation of content and quicker learning. Transparent technologies are those that require learners to invest a minimum of thought and effort in navigating and operating a system, leaving them more cognitive processing capacity for learning. Conversely, an opaque technology (some institutional VLEs fall into this category) forces students to concentrate more on using the tools than on actual learning. The former is clearly more desirable than the latter, and iPad and iPhone type interfaces provide this transparency. Students 'see through' the technology to find, organise and assimilate content more easily.
The third important aspect of touch screen interfaces is their capability to support learning, communication and interaction with one's surroundings while on the move. New and emerging applications such as Augmented Reality, GPS and 3D visualisation also have a lot of appeal, particularly for those who find themselves having to navigate unfamiliar neighbourhoods. We will probably see many new developments in computer interfaces over the next few years, but I think Apple have nailed it with the iPad touchscreen, for a while at least.
Very touching by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.