There’s something magical about complex algorithms. They take a hard problem and build a strong fence around it. Once you’ve got an implementation working, you don’t have to worry about what’s inside any more. Take the specifics of the problem at hand, drop them into the algorithm, and out pops an answer. It’s like a cartoon machine with lights and wheels on the outside and magic on the inside.

I’ve been poking around looking at Python implementations of solvers. Roughly speaking, these tools take a system of constraints and produce the values that satisfy them. I’m sure there are more, but these are what I’ve turned up in a quick search.

¶ Simple Gauss-Jordan Elimination is a classic math technique for solving a system of linear equations. This works great, but is crying out to be implemented as a C extension.
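Even without a C extension, the technique fits in a few lines of Python. Here’s a minimal sketch of Gauss-Jordan elimination with partial pivoting, using exact `Fraction` arithmetic (the function name and the choice of Fractions over floats are mine, not from any particular library):

```python
from fractions import Fraction

def gauss_jordan(a, b):
    """Solve a*x = b by Gauss-Jordan elimination with partial pivoting.

    a is a list of rows, b the right-hand side.  Uses Fractions for
    exact arithmetic; returns the solution vector, or None if the
    system is singular.
    """
    n = len(a)
    # Build the augmented matrix [a | b].
    m = [[Fraction(x) for x in row] + [Fraction(bi)]
         for row, bi in zip(a, b)]
    for col in range(n):
        # Partial pivoting: pick the row with the largest entry here.
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        if m[pivot][col] == 0:
            return None  # singular system
        m[col], m[pivot] = m[pivot], m[col]
        # Scale the pivot row so the pivot becomes 1.
        p = m[col][col]
        m[col] = [x / p for x in m[col]]
        # Eliminate this column from every other row.
        for r in range(n):
            if r != col:
                factor = m[r][col]
                m[r] = [x - factor * y for x, y in zip(m[r], m[col])]
    return [row[n] for row in m]

# 2x + y = 5, x - y = 1  →  x = 2, y = 1
print(gauss_jordan([[2, 1], [1, -1]], [5, 1]))
# → [Fraction(2, 1), Fraction(1, 1)]
```

The Fractions keep the answers exact, which is nice for small systems; a real numerical library would use floats and worry much harder about conditioning.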

¶ SMAWK (an awk-like acronym of the authors’ initials) finds the maxima in each row of a large matrix of values, provided the matrix is totally monotone. This sounds esoteric, but has applications in computational geometry and other areas. David Eppstein used it to implement Knuth’s paragraph wrapping algorithm, and if an algorithm lets you re-implement Knuth-Plass in 65 lines of Python, I’m guessing you have a powerful algorithm. David has a bunch of other algorithmic stuff for Python too.
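SMAWK itself is subtle (a clever recursion of reducing and interpolating rows), so I won’t sketch it here — but the problem it solves is easy to show with brute force. This toy version (mine, not David’s) finds the column of each row’s maximum; in a totally monotone matrix those columns never move left as you go down the rows, which is the structure SMAWK exploits to do the job in O(rows + columns) evaluations instead of visiting every cell:

```python
def row_maxima_positions(matrix):
    """Brute-force the column index of each row's maximum.

    This is the problem SMAWK solves: for a totally monotone n x m
    matrix, SMAWK needs only O(n + m) entry evaluations instead of
    the O(n * m) used here.  Ties go to the leftmost column.
    """
    return [max(range(len(row)), key=lambda j: row[j])
            for row in matrix]

# A small monotone matrix: the maxima columns (0, 1, 2, 3) never
# move left as we go down the rows.
m = [
    [10,  8,  5,  3],
    [ 9, 12,  7,  6],
    [ 4, 11, 13, 10],
    [ 2,  6, 14, 15],
]
print(row_maxima_positions(m))  # → [0, 1, 2, 3]
```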

¶ logilab-constraint and python-constraint are two Python implementations of constraint solvers. These seem to be good at discrete problems like solving Sudoku or eight queens. BTW: Roman Barták maintains a list of constraint system implementations which could be useful.
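To get a taste of what these libraries do under the hood, here’s a hand-rolled backtracking search for eight queens — place one queen per column, pruning any placement that attacks an earlier queen. The libraries let you state the constraints declaratively instead of writing the search yourself; this sketch is my own, not code from either package:

```python
def queens(n=8):
    """Solve n-queens by backtracking; returns all solutions.

    Each solution is a tuple giving the queen's row in each column.
    A minimal example of constraint search; real constraint solvers
    add propagation and smarter heuristics on top of this idea.
    """
    solutions = []

    def place(rows):
        col = len(rows)
        if col == n:
            solutions.append(tuple(rows))
            return
        for row in range(n):
            # Conflict if the new queen shares a row or a diagonal
            # with any queen already placed.
            if all(row != r and abs(row - r) != col - c
                   for c, r in enumerate(rows)):
                place(rows + [row])

    place([])
    return solutions

print(len(queens(8)))  # → 92 solutions
```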

Beyond these, there are tons of fascinating techniques: simulated annealing (with some Python implementations), genetic algorithms (Python implementations), and optimization algorithms of all sorts.
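Simulated annealing in particular is short enough to sketch. This is a bare-bones version of the idea — the function names, the geometric cooling schedule, and all the parameter defaults are placeholders I made up, which you’d tune per problem:

```python
import math
import random

def anneal(cost, neighbor, state,
           t0=10.0, cooling=0.995, steps=5000, seed=42):
    """A bare-bones simulated annealing loop (illustrative sketch).

    Repeatedly propose a neighboring state; always accept
    improvements, and accept worse states with probability
    exp(-delta/t), where the temperature t cools geometrically.
    """
    rng = random.Random(seed)
    best, best_cost = state, cost(state)
    t = t0
    for _ in range(steps):
        candidate = neighbor(state, rng)
        delta = cost(candidate) - cost(state)
        # Worse moves are sometimes accepted, so the search can
        # climb out of local minima while t is still high.
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            state = candidate
            if cost(state) < best_cost:
                best, best_cost = state, cost(state)
        t *= cooling
    return best

# Toy use: minimize a bumpy one-dimensional function.
f = lambda x: (x - 3) ** 2 + 2 * math.sin(5 * x)
result = anneal(f, lambda x, rng: x + rng.uniform(-0.5, 0.5), 0.0)
```

The appeal is the same fence-around-the-problem magic: you supply a cost function and a way to jiggle a state, and the loop does the rest.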