Consider this hypothetical scenario: Bob and Alice are playing a game of Magic: The Gathering. It's normal game play at first, as, say, Filigree robots from Kaladesh face off against werewolves and vampires from Innistrad. But then Alice draws just the right card from her customized deck, and suddenly Bob finds himself caught in the equivalent of a Turing machine, the famed abstract device that can simulate any computer algorithm. Thanks to the peculiarities of the rules of Magic, Bob can now only finish the game when he meets whatever condition Alice has programmed her in-game algorithm to accomplish—for example, to find a pair of twin primes greater than one million.

It may be a highly unlikely scenario, but a recent paper posted on the physics arXiv proves that it's possible in principle to build a simple computer within this massively popular tabletop game using just the right combination of Magic cards. While the inputs must be pre-programmed, "Literally any function that can be computed by any computer can be computed within a game of Magic," said co-author Alex Churchill, a longtime Magic fan who has been working on the problem for several years.

Furthermore, he and his co-authors—Stella Biderman of the Georgia Institute of Technology and Austin Herrick of the University of Pennsylvania—have concluded that Magic might be as computationally complex as it's possible for any tabletop game to be. In other words, "This is the first result showing that there exists a real-world game [of Magic] for which determining the winning strategy is non-computable," the authors write.

A touch of Magic

For the uninitiated, Magic: The Gathering is a tabletop trading card game, invented in 1993 by mathematician Richard Garfield while he was completing his PhD. Players can build customized decks of 60 cards chosen from the massive collection available. They then use those cards to cast spells, use artifacts, and summon various magical creatures to defeat their opponents by draining them of "life points."

Magic shares some thematic similarities with tabletop games like Dungeons and Dragons—except, of course, Magic relies on thousands of cards (some 20 billion Magic cards were produced between 2008 and 2016 alone) and a dizzying array of rules governing game play. The authors of this latest paper prefer to think of it as a "two-player, zero-sum stochastic card game with imperfect information, putting it in the same category as games like poker and hearts."

Based in Cambridge, England, Churchill is a software engineer by day and an avid designer of games on the side. He grew up playing canasta, bridge, mahjong, and Scrabble, among others, and has well over 250 board games on his home shelves. He first discovered Magic: The Gathering while at college some 20 years ago. He's been an avid player ever since, one of the more than 20 million ardent players drawn to the expansive world-building within the game. (It's so popular that regular World Cup competitions have been held; Ars UK attended the 2016 edition in Barcelona.)

"It's so many things to so many people," Churchill said of the game's enduring appeal. "Some people love to play to prove themselves the best; some love to play as a social experience; some love to play the game for the swinging with giant dragons and angels. It's got a vast amount of lore and storyline, and the artwork is incredible." It's the creative element of building one's own deck that most appeals to Churchill. "There's a lot of strategic and tactical depth, of course, but there's also a self-expression element in choosing which cards to put together," he said.

Churchill proposed the possibility of assembling a universal Turing machine from Magic cards several years ago as a means of proving that the game is "Turing complete." (You can read all the gory details at his website.) This latest work is a culmination of those earlier findings.

First proposed by Alan Turing in the 1930s, a Turing machine is an abstract concept, as opposed to a physical object, that laid the conceptual groundwork for the invention of the modern computer and basic programming techniques. As Matt Ford wrote for Ars back in 2007,

Turing machines are simple logic devices that can be made to simulate the logic of any standard computer that could be constructed. They consist of an infinite number of cells on a tape (the memory) and an active cell that is referred to as the "head." Each cell can be one of a set number of colors, and the head can have a fixed number of states. A set of rules determines how the combination of cell color and head state dictates what color should be written to the tape, what state the head should be placed in, and what direction it should move (left or right).
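The mechanics described above are simple enough to sketch in a few lines of code. The following is a minimal illustration, not anything from the paper itself: the tape is a dictionary of cells, the head carries a state, and a rule table maps each (state, color) pair to what to write, which state to enter, and which way to move. The particular rules here are the classic two-state "busy beaver" machine, chosen only because it halts quickly.

```python
# A minimal Turing-machine simulator (illustrative only).
# Rules map (head state, cell color) -> (color to write, next state, move).
# These specific rules are the 2-state "busy beaver" machine.
RULES = {
    ("A", 0): (1, "B", +1),
    ("A", 1): (1, "B", -1),
    ("B", 0): (1, "A", -1),
    ("B", 1): (1, "HALT", +1),
}

def run(rules, max_steps=1000):
    tape = {}            # sparse tape: unwritten cells read as color 0
    head, state = 0, "A"
    for step in range(max_steps):
        if state == "HALT":
            return step, tape
        color = tape.get(head, 0)
        write, state, move = rules[(state, color)]
        tape[head] = write   # write a color to the current cell
        head += move         # move the head left (-1) or right (+1)
    raise RuntimeError("machine did not halt within max_steps")

steps, tape = run(RULES)
print(steps, sum(tape.values()))  # halts after 6 steps with four 1s written
```

Churchill et al.'s construction does essentially this with game objects instead of Python objects: the tape, the head state, and the rule table are all encoded in cards and creatures on the battlefield.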

A universal Turing machine is one capable of running any algorithm, while "Turing completeness" is a term "used to indicate that a system has a particular degree of complexity," said Churchill. "Any Turing-complete system is theoretically able to emulate any other." Being able to determine whether a given problem can be solved in principle is a key task in computer science. If Magic is Turing complete, then there should exist within the game a scenario where it's impossible to determine a winning strategy—equivalent to the famous "halting problem" in computer science.

One way to demonstrate that a system is Turing complete is to create a Turing machine within it, and that's just what Churchill et al. have done with their work. Three fundamental elements are needed: a tape that encodes the computation; a controller that determines what action to take next based on the current state of play; and a read/write head directed by that controller.

Listing image by Flickr user: Max Mayorov