Machine learning is enabling some brilliant things in art and music. The latest example, from Google’s creative research team Magenta, is the Piano Genie — an AI program that lets you improvise fluently on the piano by simply bashing away at eight buttons.

The team behind Piano Genie was inspired by Guitar Hero, a game that also simplifies how to play an instrument. They didn’t want users to just tap along to prewritten songs, but to make up pieces of melody on the fly instead. To enable this, they trained an AI program on a huge dataset of classical piano music, teaching it to predict what notes follow each other the same way your phone’s predictive text function guesses what you’ll write next. (You can also try out a web version for yourself here.)

“I really wanted to design a tool that we could give to someone who doesn’t know how to play, and they’d be able to create music with some kind of intention,” Chris Donahue, an intern at Google Magenta and one of the trio that created Piano Genie, tells The Verge.


Donahue explains that a lot of AI musical projects generate entire melodies from a single starting note or chord. Piano Genie is different in that it improvises note by note, giving the user a greater feeling of control. Minimizing latency so that each note is ready the instant a button is pressed is a technical challenge, but it also creates a unique feeling for the player, says Donahue, who’s been playing the piano for 20 years.

“When you’re playing it, it’s this really awesome experience where, occasionally, it will feel like it’s sort of reading your mind and play the exact note you’re intending to,” he says. “And then other times, it will completely disobey you but still do something reasonable.”

This is why it’s called the Piano Genie, says Donahue: as with a genie, you can wish for whatever you want, but what you get isn’t always what you asked for.

The machine learning side of the Piano Genie is built from a few common AI elements. The main component is a recurrent neural network, a type of program that’s particularly good at learning to mimic sequential data, such as writing and music. This neural net was fed a dataset of piano music taken from an international competition. The data was particularly useful because the competition records all performances in a file format that preserves not only the notes, but also their velocity (which translates to volume and timbre).

This was the main training data used to build a predictive model of which piano notes follow one another. It also means that the notes the Genie produces stick to certain keys and scales, although this behavior can be tweaked. Donahue adds that the data was useful precisely because it came from a competition, meaning “people were playing appropriately flashy things.”
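To get a feel for what a next-note predictive model does, here is a toy sketch in Python. It replaces the real recurrent neural network with a simple frequency count over note pairs, and the tiny corpus of note names is made up for illustration — but the core idea of learning which note tends to follow which is the same.

```python
from collections import Counter, defaultdict

def train(melodies):
    # Count which note follows which across a training corpus.
    # Piano Genie uses a recurrent neural network for this step;
    # a bigram count just illustrates the "predictive text" idea.
    successors = defaultdict(Counter)
    for melody in melodies:
        for current, nxt in zip(melody, melody[1:]):
            successors[current][nxt] += 1
    return successors

def predict_next(successors, note):
    # Return the most frequent successor of a note, if any.
    if note not in successors:
        return None
    return successors[note].most_common(1)[0][0]

# Tiny made-up corpus of note names (not real training data).
corpus = [["C", "E", "G", "E", "C"], ["C", "E", "G", "C"]]
model = train(corpus)
print(predict_next(model, "E"))  # prints "G" — the most common note after "E"
```

A real model would predict a probability distribution over all 88 notes conditioned on the entire history, not just the previous note, which is what makes recurrent networks a good fit.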

The Genie team, which also included Google’s Ian Simon and DeepMind’s Sander Dieleman, then had to design a pair of encoders that could squeeze this output into a format suited to their Guitar Hero-like controller. In other words, they had to shrink 88 notes (the standard number of keys on a piano) down to just eight buttons. The last part of the process was hooking all of this up to a self-playing piano like the one you see in the videos.
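The compression from 88 keys to eight buttons can be sketched very crudely. The version below just buckets the keyboard into eight contiguous zones; the actual system learns this mapping (and its inverse) from data, conditioning the decoding on musical context, so treat this as a sketch of the compression alone, not the real encoder.

```python
NUM_KEYS = 88      # standard piano keyboard
NUM_BUTTONS = 8    # Piano Genie's controller

def encode(key):
    # Crude stand-in for the learned encoder: map a key (0..87)
    # to one of eight contiguous zones (buttons 0..7).
    return key * NUM_BUTTONS // NUM_KEYS

def decode(button):
    # Decoder sketch: pick a key inside the button's zone. The real
    # decoder chooses a musically plausible note given the context;
    # here we simply take the middle of the zone.
    low = button * NUM_KEYS // NUM_BUTTONS
    high = (button + 1) * NUM_KEYS // NUM_BUTTONS
    return (low + high) // 2

print(encode(0), encode(87))  # lowest and highest keys land on buttons 0 and 7
```

The interesting part of the real system is that the round trip isn’t fixed: pressing the same button twice can yield different notes depending on what the model thinks should come next.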

Donahue says programs like Piano Genie show that AI can work to augment human creativity. It turns humans into cyborgs of sorts, pairing our instinctive knowledge of when notes should be played with a computer’s ability to say which notes should come next.

“I think it’s a powerful combination I hope to see a lot more of,” says Donahue. He says that when newcomers try Piano Genie, they’re quickly delighted. “They have a tendency to timidly press a couple of keys here and there at first, but then if I say, ‘Imagine you’re a concert pianist onstage at Carnegie Hall,’ they get it more and really go for it.”