Obviously you've explored the body's limitations in many ways, and looked to extend the spectrum of bodily experience. A work like Exoskeleton, a six-legged, pneumatically powered walking machine, seems particularly about translating movements of the human form into another, mechanical anatomy. Muscle Machine involved a somewhat similar approach, though this time with a more direct linkage that made it less clear whether the man moved the machine or vice versa. That seems to complicate the operator's relationship with his own body, and perhaps, by extension, the spectator's sense of embodiment. What intrigues you about these kinds of feedback loops?

Both of these projects translate human bipedal gait into six-legged, insect-like locomotion, but the control systems and feedback loops are different, as are the experiences of symbiosis. With Exoskeleton, the leg movements are selected by the artist's arm gestures. The robot can walk forwards and backwards with a ripple gait and sideways with a tripod gait; it can sit, stand and turn on the spot. The body and walking chassis of the Muscle Machine are more directly linked, as encoders on my hip joints make the machine respond to my leg motions. So lifting one leg up lifts three robot legs and swings them forward, and by alternating my left and right legs the machine moves. Sensors embedded within the chassis detect the direction the body faces, and the machine always walks in that direction. The performances are simply about taking the robot for a walk.
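The hip-encoder mapping described here (lifting one human leg lifts an alternating tripod of three robot legs; with both feet down the machine stands) can be sketched roughly as follows. The function names, leg numbering and threshold value are hypothetical, chosen only to illustrate the alternating-tripod idea, not the machine's actual controller:

```python
# Illustrative sketch of the Muscle Machine's hip-to-tripod mapping.
# All names and values are hypothetical.

LIFT_THRESHOLD = 0.2  # radians of hip flexion treated as "leg lifted" (illustrative)

# Two alternating tripods of three legs each, numbered 0-5 around the chassis.
TRIPODS = {"left": (0, 2, 4), "right": (1, 3, 5)}

def legs_to_swing(left_hip, right_hip):
    """Return the robot legs to lift and swing forward, given the
    operator's hip-joint encoder readings (radians of flexion)."""
    if left_hip > LIFT_THRESHOLD and left_hip >= right_hip:
        return TRIPODS["left"]
    if right_hip > LIFT_THRESHOLD:
        return TRIPODS["right"]
    return ()  # both feet down: the machine stands still
```

Alternating the operator's legs then alternates the two tripods, which is what produces forward motion; the chassis direction sensors only decide which way that motion points.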

But yes, it’s strange navigating the performance space with six legs instead of two, and with the large space the robots inhabit: Exoskeleton is three meters in diameter and the Muscle Machine is five meters in diameter, making them difficult to maneuver in limited spaces. Both machines are also sound systems, with their mechanical sounds, pneumatic hissing and solenoid clicks amplified. So you not only look where you are walking the robot, you also listen to the sounds you are generating. To compose the sounds, you choreograph the movements of the machine.

The Prosthetic Head project dispensed with a body altogether, projecting a head that looked much like your own onto a large screen and allowing visitors to interact with it through a keyboard, “speaking” to it through text. This seems like the ultimate disembodiment — a Max Headroom-style simulation of a human face, with (nearly) all its expressive potential, but none of its corporeality. Yet all of that “expression” came about in part thanks to lines of computer code, suggesting that human bodies too are simply “running code.”

The Prosthetic Head is an embodied conversational agent. It has a database and a conversational strategy that enables it to speak, with facial expressions, to the person who interrogates it.

There were several reasons why it was only a disembodied head without a body. Firstly, I felt that for a conversational system, speaking and facial expressions were adequate. Secondly, to emulate a whole body, with the problems of limb movements and appropriate hand gestures, would be difficult to achieve with any fidelity. The Head was not meant to be illustrative of an AI but to be simply a conversational system. That is, coupled to a human-like head, it’s able to generate plausible and interesting responses.

Certainly the notion of bodies simply running code is seductive; that we can adequately operate as code, or that we might be able to upload or download bodies as code, are interesting ideas. They are also highly problematic, as a body is what it is not only because of its DNA code and carbon chemistry but also because of its physiology and its social interactivity. To be an intelligent agent you need to be both embodied and embedded in the world, not to mention massively augmented with instruments and machines to achieve what we consider human potential.

But as part of the Thinking Head Project (led by the MARCS Labs at the University of Western Sydney), the Prosthetic Head has generated alternate physical embodiments. The Articulated Head has an industrial robot arm as a torso (or a six-degree-of-freedom articulated neck) that enables real-world behavior, augmenting its virtual movements. An attention model was also developed that made it a more seductive interactive agent. The Floating Head (a collaboration with NXI Gestatio, Montreal) was a flying robot embodied with the Prosthetic Head, and the Swarming Heads (MARCS Robotics Lab) are a cluster of smaller robots that interact with each other and can be guided by gestures via an onboard Kinect sensor.
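The "database and conversational strategy" behind agents of this kind is often a pattern-matching rule base in the style of ALICE/AIML chatbots. A toy version, with an entirely made-up rule set (this is an illustration of the technique, not the Prosthetic Head's actual database):

```python
# Toy AIML-style pattern matcher: each rule pairs an uppercase pattern
# (with "*" wildcards) with a canned response. Rules here are invented.

RULES = [
    ("HELLO *", "Hello there. What would you like to ask me?"),
    ("WHAT ARE YOU", "I am an embodied conversational agent."),
    ("*", "That is an interesting thought."),  # catch-all default
]

def match(pattern, words):
    """Recursively match a tokenized pattern against tokenized input."""
    if not pattern:
        return not words
    if pattern[0] == "*":  # wildcard consumes zero or more words
        return any(match(pattern[1:], words[i:]) for i in range(len(words) + 1))
    return bool(words) and pattern[0] == words[0] and match(pattern[1:], words[1:])

def respond(user_input):
    """Return the response of the first rule whose pattern matches."""
    words = user_input.upper().strip("?!. ").split()
    for pattern, template in RULES:
        if match(pattern.split(), words):
            return template
```

Because the last rule is a bare wildcard, the agent always has something to say, which is one way such systems stay "plausible and interesting" even when interrogated off-script.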

Articulated Head. Photographer: C. Kapor