Natural user interfaces (NUIs) just happened to be the theme of an invited presentation by Microsoft's Head of Research, Professor Andrew Blake (pictured), this morning at LWF. Although, as some of the audience tweeted, non-touch interfaces are already here, it was interesting to see how Microsoft and other large corporations plan to incorporate the technology into their products in the near future. According to Professor Blake, we are clearly headed for a world of intuitive computer use.

He began by demonstrating how technology captures and recognises the shape and movement of the human body, and discussed the challenges involved in achieving this. Different body shapes and sizes, and people standing in groups, can all confuse cameras, because they 'see' us as flat images rather than in 3D, he said. How, for example, does a camera pick out human limbs from their background surroundings? The use of a depth camera achieves this. Using the example of the special effects in the Hollywood movie Titanic and its extensive use of matte technology (what we used to call colour separation overlay in television studio work), Professor Blake deftly applied a similar approach to visual manipulation for practical applications. This technology builds on previous research into object recognition and is used, for example, in gaming technologies such as the Xbox 360 Kinect.
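To make the depth-camera point concrete, here is a minimal sketch (my own toy illustration, not Microsoft's or Kinect's actual pipeline, which relies on machine-learned body-part classification): because a depth camera records a distance for every pixel rather than a flat colour image, a person standing in front of a wall can be separated from the background with a simple distance cutoff.

```python
import numpy as np

def segment_foreground(depth_mm, max_distance_mm=2000):
    """Return a boolean mask: True where a pixel is nearer than the cutoff.

    depth_mm: 2D array of per-pixel distances in millimetres, as a depth
    camera would report. The cutoff value is an illustrative assumption.
    """
    return depth_mm < max_distance_mm

# Synthetic 4x4 depth frame: a 'person' at about 1.5 m in the middle,
# against a wall at about 3 m.
frame = np.array([
    [3000, 3000, 3000, 3000],
    [3000, 1500, 1400, 3000],
    [3000, 1600, 1500, 3000],
    [3000, 3000, 3000, 3000],
])

mask = segment_foreground(frame)
print(mask.sum())  # prints 4: the four near pixels belong to the person
```

A thresholded mask like this is exactly the kind of cut-out that matte or colour separation overlay work achieves in a television studio, except that the separation comes from measured distance rather than from a coloured backdrop.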
Computers can even recognise emotions now, said Blake, an unexpected spin-off from autism research, where facial feature recognition software was first developed. Blake acknowledges that all this important research is leading somewhere significant, but admits he doesn't really know what shape it will take yet. His summing-up remark was that with new non-touch interfaces, it is likely that computers will learn from us, just as we learn from computers.
It's only natural by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.