Despite the chaos and carnage of three nights of live punk at the Institute of Contemporary Arts, punters would still be hard pressed to miss the three pogo-dancing robots in their midst.

The machines, created by a collaboration of artists and scientists, have been designed to fall in love with punk music and to show their appreciation through dance.

The robot punks take pride of place in the mosh pit at a series of gigs called Neurotic. Standing 2m tall, padded and dressed in leather, they are no ordinary concertgoers.

Professor McOwan, from Queen Mary University and one of the creators of the robots, said they were built because of his fascination with human-computer interaction.

"I'm a computational neuroscientist and my interest is in trying to build mathematical and computational models for the way the brain processes sensory information, such as visual or auditory information.

A robot "dances" in front of punk band the Fumadores

"I work out how human beings do that, build a computer model to test how it works and then hopefully, if it works well, you understand more about humans but also you have software for use in robotic systems.

"The idea is to look at the information processing strategies that have taken billions of years to develop through evolution, steal them and put them into computers."

The robots use neural networks, collections of computer processors that function in a similar way to a simple animal brain. Neural networks are popular in the field of artificial intelligence because of their ability to recognise patterns in sensory input, much like a human brain.

The robots have been trained to like punk, explained Professor McOwan.
Appreciate patterns

"The robot brain, for want of a better word, was played lots of punk, reggae, disco and classical, and over a period of time the robot has learned to recognise and appreciate the patterns of sound in punk music," he said.

The neural network understands the music in a similar way to a human brain, breaking the sound down into a series of frequency bands.

Programmer Jons Jones Morris said: "Breaking down the sound produces a map of the audio over time, which is turned into an image. That image is submitted to one of the neural networks."

Punk fans can dance alongside the pogoing robots

Using this "adaptive resonance theory", the neural networks begin to build up a history of different patterns relating to different sounds.

The teaching then shifts to supervised learning, in which a more advanced neural network is used to statistically analyse the occurrence of these patterns in a song, said Mr Jones Morris.

"We tell it when the pattern is punk and soul, or whatever, so it has examples and counter-examples.

"When the robot is listening to live music, it is basically pattern matching against the statistics from other types of music it has listened to previously."

During a gig, the robot reacts in real time to music it has never encountered before.

Flicks between

The robots can decide whether a song is punk or not within 30 seconds.

"It depends on the form at the beginning of the song. It flicks between thinking a song is punk and not punk at the start, and then becomes convinced," said Mr Jones Morris.

Professor McOwan added: "If you look at the auditory cortex in the brain and the cochlea in the ear, you find that's exactly how the human system does it."

The robots move up and down together

The robot reacts to the level of "punk" in the song. The more punk it believes the song is, the more it pogos in a "happy and frenzied way", said Professor McOwan.
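The pipeline Mr Jones Morris describes - break the sound into frequency bands, build a "map of the audio over time", then pattern-match a new song against statistics of previously heard genres - can be sketched roughly as follows. This is a minimal illustration, not the Neurotic system's code: the band-pooling, the nearest-centroid matcher and the toy training signals are all assumptions standing in for the neural networks the article mentions.

```python
import numpy as np

def band_energy_map(signal, frame=256, n_bands=8):
    """Frame the signal, FFT each frame and pool the spectrum into a few
    coarse frequency bands: a rough analogue of the frequency-band
    breakdown described in the article (parameters are illustrative)."""
    window = np.hanning(frame)
    rows = []
    for start in range(0, len(signal) - frame + 1, frame):
        spectrum = np.abs(np.fft.rfft(signal[start:start + frame] * window))
        rows.append([chunk.sum() for chunk in np.array_split(spectrum, n_bands)])
    return np.asarray(rows)  # (time frames, bands): the "map of the audio over time"

def genre_profile(signal):
    """Summarise a clip as its average band-energy pattern (unit length)."""
    mean_bands = band_energy_map(signal).mean(axis=0)
    return mean_bands / np.linalg.norm(mean_bands)

def classify(signal, centroids):
    """Pattern-match a new clip against statistics of previously heard genres."""
    p = genre_profile(signal)
    return min(centroids, key=lambda genre: np.linalg.norm(centroids[genre] - p))

# Toy "training": broadband noise stands in for punk, a low-frequency tone
# for classical. Both signals are invented purely for illustration.
rng = np.random.default_rng(0)
t = np.arange(16384)
centroids = {
    "punk": genre_profile(rng.standard_normal(t.size)),
    "classical": genre_profile(np.sin(2 * np.pi * 0.01 * t)),
}

print(classify(rng.standard_normal(t.size), centroids))    # a fresh noisy clip
print(classify(np.sin(2 * np.pi * 0.012 * t), centroids))  # a fresh tonal clip
```

The real system uses trained neural networks rather than a centroid match, but the shape of the computation - spectral image in, genre statistics out - is the same.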
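The "flicks between" behaviour - an early verdict that flips back and forth before the robot "becomes convinced" - is what you would expect from any running estimate built up frame by frame. A hypothetical sketch, with the smoothing constant and the score sequence invented for illustration:

```python
def running_verdicts(frame_scores, alpha=0.3):
    """Exponentially weighted running confidence over per-frame punk scores
    (1 = looks punk, 0 = doesn't). The verdict can flip on the first few
    frames and settles once the evidence is consistent."""
    confidence, verdicts = 0.5, []
    for score in frame_scores:
        confidence = (1 - alpha) * confidence + alpha * score
        verdicts.append(confidence > 0.5)
    return verdicts

# Noisy opening frames, then a consistently punk-looking run.
print(running_verdicts([1, 0, 1, 1, 0, 1, 1, 1, 1, 1]))
```

Early frames push the confidence back and forth across the threshold; once the pattern is steady, later contradictory frames no longer change the verdict.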
During the gigs - which run from 3-5 July - the hope is that the audience will interact with the robots.

Professor McOwan said: "Is that level of performance within the robot something that human beings empathise with?

"Do they think the robots like punk? It's almost like a Turing test - producing something which makes people believe there is an intelligence, making them empathise with the robot."

It is also a real-world test of the technology, he said.

"It's real-time signal processing and robotic control in a fairly hostile environment - in a mosh pit with lots of sweaty punks.

"Also, hopefully people will become interested in the science behind it."

Neurotic is at the ICA from 3-5 July and is funded by the Wellcome Trust.


