Louis Armstrong's music will be used to "teach" AI how to play jazz. Artificial intelligence (AI) can paint hallucinatory images, shut down internet trolls, and critique the most creative paintings in history.

Now, with help from the Defense Advanced Research Projects Agency (DARPA), AI is coming for your saxophones and pianos, too. Jazz musician and computer scientist Kelland Thomas is building an AI program that can learn to play jazz and jam with the best of them, under a DARPA-funded project that aims to improve how we communicate with computers.

"A jazz musician improvises, given certain structures and certain constraints and certain basic guidelines that musicians are all working with," Thomas told Tech Insider. "Our system is going to be an improvisational system. So yeah, it will be able to jam."

Thomas and his team will first build a database of thousands of transcribed musical performances by the best jazz improvisers, including Louis Armstrong, Miles Davis, and Charlie Parker. Then, using machine learning techniques, they'll "train" the AI system with this database.

Eventually the AI will learn to analyze and identify musical patterns from the transcriptions, such as those in Miles Davis's performance of "So What":

The AI could use that knowledge to compose and play live, original music.
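The article doesn't say which machine learning techniques the team will use, but the idea of learning patterns from transcriptions and then generating new phrases can be illustrated with a toy sketch. The snippet below trains a simple note-transition (Markov) model on made-up "transcriptions" and samples a new phrase from it; the data, function names, and the choice of a Markov model are all illustrative assumptions, not the project's actual method.

```python
import random
from collections import defaultdict

# Toy "transcriptions": each solo is just a sequence of note names.
# A real system would also model rhythm, harmony, and phrasing.
transcriptions = [
    ["C", "E", "G", "A", "G", "E", "C"],
    ["C", "E", "G", "B", "A", "G", "E"],
    ["E", "G", "A", "G", "E", "C", "E"],
]

def train(solos):
    """Count note-to-note transitions across all transcriptions."""
    transitions = defaultdict(list)
    for solo in solos:
        for cur, nxt in zip(solo, solo[1:]):
            transitions[cur].append(nxt)
    return transitions

def improvise(transitions, start, length, seed=0):
    """Sample a new phrase by randomly walking the learned transitions."""
    rng = random.Random(seed)
    phrase = [start]
    for _ in range(length - 1):
        choices = transitions.get(phrase[-1])
        if not choices:
            break  # dead end: no observed continuation for this note
        phrase.append(rng.choice(choices))
    return phrase

model = train(transcriptions)
print(improvise(model, "C", 8))
```

Even this crude model captures the basic loop Thomas describes: absorb many performances, extract statistical patterns, and use them to produce something new rather than replay the originals.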

"A human musician also builds a knowledge base by practicing and by listening and by learning and studying," Thomas said. "So the thing we're proposing to do is analogous to the way a human learns, but eventually it will be able to do this on a much larger scale. It can scour thousands of transcriptions instead of dozens or hundreds."

Many people might not consider music a form of communication, but Paul Cohen, an AI researcher and head of the Communicating with Computers project, thinks music shares many qualities with the spoken and written word.

"Jazz, as with conversation, is sort of an interesting mixture of creativity and very tightly, ruled-down behavior," Cohen told Tech Insider. "There are strict rules about improvisation, following particular harmonic lines and making sure your timing is right. You can't end a phrase at the wrong place. It has to be done at exactly the right time and place."

Thomas thinks that making computers as convincingly creative as humans will make collaborations between humans and computers smoother and more efficient. For Thomas, jazz is the best way to model human creativity.

"In my mind, jazz and improvisation in music represent a pinnacle of human intellectual and mental achievement," Thomas said. "The ability to, on the fly and in the moment, create melodies that are goal-directed, that are going somewhere, doing something and evincing emotion in the listener, is really, really amazing."

Within five years, Thomas hopes to build an AI system that can improvise an electronic jazz number alongside a human musician. Following that: a robot that can manipulate musical instruments and accompany human musicians on stage.

But you don't have to wait five years to watch intelligent machines play music. Engineers from Japan to Germany are already building robots you can program to play pre-written songs.

Then there's Mason Bretan, a PhD student from Georgia Tech. He's been jamming alongside "Shimi" robots, which can partially improvise.

In the video below, Bretan provided the arrangement of the robots' parts, along with recordings of their tracks and cues. "But in between," including the mallet solo, the robots are "doing their own thing based on his chord progressions," according to the Washington Post.