Artificial intelligence has proved itself incredibly capable of analysing images; now it's getting rhythm in the form of a four-armed, marimba-playing robot.

The robot, named Shimon, was fed a vast amount of musical data by researchers at Georgia Institute of Technology: more than 5,000 complete songs and two million motifs, riffs and short passages of music. It was then asked to compose and perform its own music.

It has been in development for some years, but this is the first time it has composed its own music. Once it had been fed the data, it used deep learning techniques to create two 30-second pieces of original music.

Georgia Tech researchers say the pieces sound like a cross between jazz and classical music. We'd call them delicately soothing. As well as deep learning, Shimon uses computer vision, via a camera on its robo-head, to detect which notes it should be playing.

"The robot analyses a large dataset of music (including pop, classical, jazz and more) in an effort to identify patterns that appear in all songs and genres in the dataset," Gil Weinberg, the director of the Center for Music Technology at the University, tells WIRED. "It then uses what it learned (which can include melodic, harmonic and rhythmic patterns) to generate its own personal music based on a musical seed".

Before the robot starts to compose a piece it is given a starting point to work from. For the first piece of music it was given eight notes, the second was based on 16. "Since we are looking at units such as bars, riffs, and motifs, we can gain a more structural understanding of musical compositions than previous efforts, which only learned probabilistic note-to-note transitions," Weinberg says.
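The idea of extending a short seed by chaining learned musical units, rather than single note-to-note probabilities, can be sketched roughly as follows. This is a hypothetical toy illustration, not Shimon's actual system: the motif table and pitch values are invented, and a real deep learning model would learn these transitions from thousands of songs rather than a hand-written dictionary.

```python
import random

# Toy "learned" model: each motif (a tuple of MIDI pitches) maps to the
# motifs that were observed to follow it in a training corpus. In Shimon's
# case these units and their transitions are learned by deep learning,
# not hand-coded as they are here.
MOTIF_TRANSITIONS = {
    (60, 62, 64, 65): [(67, 65, 64, 62), (64, 65, 67, 69)],
    (67, 65, 64, 62): [(60, 62, 64, 65)],
    (64, 65, 67, 69): [(67, 65, 64, 62)],
}

def compose(seed, length, rng=None):
    """Extend a seed motif into a melody of `length` notes by repeatedly
    sampling a follow-on motif from the transition table."""
    rng = rng or random.Random(0)
    melody = list(seed)
    current = tuple(seed)
    while len(melody) < length:
        nxt = rng.choice(MOTIF_TRANSITIONS[current])
        melody.extend(nxt)
        current = nxt
    return melody[:length]

# A short melodic seed, analogous to the eight or 16 notes Shimon starts from.
piece = compose((60, 62, 64, 65), 16)
```

Because the model's vocabulary is whole motifs, every step of the generated piece preserves a multi-note pattern seen in training, which is the structural advantage Weinberg describes over note-to-note transition models.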

It isn't the first time artificial intelligence has been used to make music. Daddy's Car, a song composed by AI in the style of The Beatles, has had more than 1.6 million views on YouTube. Back in June 2016, Google's AI group Brain created a basic 90-second melody. The tech giant has also used machine learning to let users duet with artificial intelligence.

Weinberg explains that the developments could lead to longer musical pieces being created by AI: "Think a symphony or a concept album," he says. "Deep learning musical modelling that uses larger and larger musical units (which will also require larger and larger datasets to learn from) could lead to robotic music that can tell a long musical story".