Wednesday, 1 February 2012

Human 2.0

Post-human. Such a strange concept, and one that many people struggle to understand. At its simplest, being post-human is a state closely aligned to the cyborg, or cybernetic organism - part human, part machine. In other words, the post-human condition emerges when humankind and technology merge to the point where they become part of each other. We can understand cyborgs - and for many, the idea of a half-man, half-machine evokes deep-seated fears about how far technology can go. Donna Haraway (2004) makes a point of singling out Rachel - a replicant character in the sci-fi movie Blade Runner - as 'the image of a cyborg culture's fear, love, and confusion'. We have seen many other popular culture examples of the cyborg, from the Six Million Dollar Man to RoboCop, and each is endowed with superhuman strength or enhanced senses. We recognise them because they are on their own: isolated, lost in an otherwise normal world, unnatural, freaks of non-nature.

No one really knows exactly if or when a post-human phase emerged; it is all theory and supposition. But we can trace the history of prosthetics and reflect on the incorporation of various kinds of technology into the human body. Replacement limbs may not strictly count as a merging of technology and humanity, unless they are robotic limbs; heart pacemakers, artificial valves and other implanted technologies might. Philosopher and cognitive scientist Andy Clark, in his 2003 book Natural Born Cyborgs, argues that humankind has an innate need to interface with technology: 'What the human brain is best at is learning to be a team player in a problem-solving field populated by an incredible variety of nonbiological props, scaffoldings, instruments and resources' (p. 26). Essentially, when wetware (the biological entity) meets hardware, the software can be made interoperable. Clark sees the merging of mind and machine as unstoppable and inevitable - not a matter of if, but when. Some would argue that the transitional phase leading to post-humanism is the non-invasive but just as powerful welding together of human and computer, as seen in the addictive video game playing of geeks, or the smartphone ultra-dependency of our current youth generation.

So are we now on the verge of a new phase in human development? Are we at the cusp of incorporating technology into the human body because we have such a desire to enhance our senses, increase our physical and mental performance, or otherwise extend the capabilities of what is considered 'natural'? Are we about to embark on a post-human phase? Some would affirm this, citing several notable 'real examples' of cyborgs in recent years. Meet Kevin Warwick, a professor of cybernetics at the University of Reading and probably the world's first true cyborg. Professor Warwick is interested in how technology can enhance human senses and improve performance. In the foreword to his book I, Cyborg, he writes: 'Humans have limited capabilities. Humans sense the world in a restricted way, vision being the best of the senses. Humans understand the world in only 3 dimensions and communicate in a very slow, serial fashion called speech. But can this be improved on? Can we apply technology to the upgrading of humans?' In essence, Warwick is asking: can we become Human 2.0?

In a famous experiment in 1998, Warwick had a chip transponder surgically implanted into his arm. A computer was then able to track Warwick as he moved around the university campus, allowing him to open doors, turn on lights and operate computers without touching them. Later phases of the experiment involved more advanced implants that monitored Warwick's internal state, such as his emotional responses, stress levels and even thoughts. The speculation was that if others also had similar implants, people might then be able to communicate their thoughts and emotions to each other via computer mediation.
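To picture the idea - and nothing more - here is a minimal sketch of how a building system might respond when it detects a recognised implanted transponder. The IDs, locations and actions below are entirely hypothetical illustrations and are not taken from Warwick's actual experiment.

```python
# Illustrative sketch only: a toy model of a building reacting to a recognised
# implanted transponder. All IDs, locations and actions are hypothetical.

KNOWN_IMPLANTS = {
    "RFID-0042": "Kevin Warwick",   # hypothetical transponder ID
}

ACTIONS_BY_LOCATION = {
    "lab_entrance": ["unlock_door"],
    "corridor_3": ["switch_on_lights"],
    "office": ["wake_computer"],
}

def on_transponder_detected(tag_id: str, location: str) -> list[str]:
    """Return the actions the building would trigger for a recognised implant."""
    if tag_id not in KNOWN_IMPLANTS:
        return []  # unknown tag: do nothing
    return ACTIONS_BY_LOCATION.get(location, [])

if __name__ == "__main__":
    # Simulate the professor walking past a reader at the lab entrance.
    print(on_transponder_detected("RFID-0042", "lab_entrance"))  # ['unlock_door']
```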

More recently, Tanya Vlach made headlines with her plans for a new prosthetic eye. She has a dream to transform herself into an 'enhanced human being' after being involved in a serious car accident in which she lost her left eye. She is now planning to have an 'eye-cam' installed inside her prosthetic eye, complete with zoom control, infra-red and ultra-violet capabilities and the facility for face recognition. The eye-cam would interface with a custom-made app housed in a standard smartphone. She is currently waiting for technology to catch up with her vision and hopes, one day soon, to be able to hard-wire the eye-cam directly to the vision centre of her brain, and in so doing become a truly enhanced human being - a cyborg - a post-human. Scary, fascinating, challenging stuff - the cyborg becomes the iBorg.
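To make the idea a little more concrete, here is a purely hypothetical sketch of the kind of settings such an eye-cam might expose to a companion smartphone app. The names, capabilities and behaviour are assumptions for illustration only, not a description of Vlach's actual design.

```python
# Illustrative sketch only: a toy model of eye-cam settings controlled from a
# smartphone app. Every name and value here is hypothetical.

from dataclasses import dataclass

@dataclass
class EyeCamSettings:
    zoom_level: float = 1.0          # 1.0 = no zoom
    infrared: bool = False           # infra-red imaging mode
    ultraviolet: bool = False        # ultra-violet imaging mode
    face_recognition: bool = False   # tag faces in the video stream

class EyeCamApp:
    """Hypothetical smartphone-side controller for the prosthetic eye-cam."""

    def __init__(self) -> None:
        self.settings = EyeCamSettings()

    def zoom(self, level: float) -> None:
        # Never zoom out below the natural field of view.
        self.settings.zoom_level = max(1.0, level)

    def toggle(self, mode: str) -> None:
        # Flip one of the boolean imaging modes by name.
        if hasattr(self.settings, mode):
            setattr(self.settings, mode, not getattr(self.settings, mode))

app = EyeCamApp()
app.zoom(2.5)
app.toggle("face_recognition")
print(app.settings)
```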

Computer scientist Jaron Lanier's recent keynote speech at Learning Technologies in London served to illustrate several of the dangers and caveats of the post-human condition. Lanier vehemently rejects Ray Kurzweil's vision of a future in which computers exceed the capabilities of humans. 'You have to be somebody before you can share yourself', he warns. He suggests that we already have expanded memories (the search engines of the web) and remote ears and eyes (mobile phones and webcams). Lanier sees no techno-utopia in the future, warning instead that we are in danger of dystopia. Indeed, he advised the makers of the movie Minority Report on what might be expected from a technology-dominated future in which people are manipulated like chess pieces. The data-mining capabilities of the social networks alone can enslave us, he suggests, by owning our purchasing habits, internet search preferences and all our other personal data. He sees Facebook and other social networks undermining and devaluing friendships. The technology should work for us, not us for the technology. Lanier is a contentious, thoughtful character. In just a few minutes of conversation with him in the speakers' lounge, my impression was that he opposes anything that involves a 'hive mind'. 'Why are you wearing a Creative Commons badge?' he asked me as we gazed out over West London. I explained that I believe in giving all my content away for free, and that to me this is the essence of the future of learning. 'I'm going to speak against that today', he warned. It is clear that Jaron Lanier holds a somewhat more pessimistic view of our possible cyborg future.

In the final analysis, though, it is mind amplification that is the ultimate goal of humankind's future enhancement. The ability to distribute knowledge beyond the confines of the human brain, and to extend the mind through and across networks, does not require any conjoining of human and computer. We have already achieved much of this through mind tools such as social media, which, according to Karen Stephenson, enable us to store our knowledge with our friends. Do we really need a post-human future? iThink not.


Top image by Elif Ayiter

References

Clark, A. (2003) Natural Born Cyborgs: Minds, Technologies and the Future of Human Intelligence. New York: Oxford University Press.

Haraway, D. J. (2004) A Manifesto for Cyborgs: Science, Technology, and Socialist Feminism in the 1980s. New York: Routledge.

Lanier, J. (2010) You are not a Gadget. London: Penguin.


Creative Commons Licence
Human 2.0 by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.
