Inference in Discrete Bayesian Networks with Haskell

I am learning the algorithms used for inference in discrete Bayesian networks. I decided to write an implementation in Haskell to give myself an opportunity to code a bit in this beautiful language (I don't get many opportunities at the office).

The first version of the hbayes library is very preliminary but already supports junction trees.

You can find more information in the hbayes package on Hackage.

Here are a few examples.

1. Creating a network

example :: (DVSet, SBN CPT)
example = runBN $ do
    winter <- variable "winter" (t :: Bool)
    sprinkler <- variable "sprinkler" (t :: Bool)
    wet <- variable "wet grass" (t :: Bool)
    rain <- variable "rain" (t :: Bool)
    road <- variable "slippery road" (t :: Bool)

    proba winter ~~ [0.4, 0.6]
    cpt sprinkler [winter] ~~ [0.25, 0.8, 0.75, 0.2]
    cpt rain [winter] ~~ [0.9, 0.2, 0.1, 0.8]
    cpt wet [sprinkler, rain] ~~ [1, 0.2, 0.1, 0.05, 0, 0.8, 0.9, 0.95]
    cpt road [rain] ~~ [1, 0.3, 0, 0.7]

    return [winter, sprinkler, rain, wet, road]
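To make the numbers concrete, here is a plain-Haskell sketch of the joint distribution this network encodes, written independently of hbayes. The CPT entries are transcribed by hand under the assumption that False is listed before True and that the first half of each list holds the child's False values; all function names here (pWinter, joint, ...) are mine, not part of the library.

```haskell
bools :: [Bool]
bools = [False, True]

-- P(winter), read off [0.4, 0.6] assuming False comes first
pWinter :: Bool -> Double
pWinter w = if w then 0.6 else 0.4

pSprinkler :: Bool -> Bool -> Double  -- P(sprinkler | winter)
pSprinkler s w = case (s, w) of
  (False, False) -> 0.25; (False, True) -> 0.8
  (True,  False) -> 0.75; (True,  True) -> 0.2

pRain :: Bool -> Bool -> Double       -- P(rain | winter)
pRain r w = case (r, w) of
  (False, False) -> 0.9; (False, True) -> 0.2
  (True,  False) -> 0.1; (True,  True) -> 0.8

pWet :: Bool -> Bool -> Bool -> Double  -- P(wet | sprinkler, rain)
pWet t s r = case (t, s, r) of
  (False, False, False) -> 1;   (False, False, True) -> 0.2
  (False, True,  False) -> 0.1; (False, True,  True) -> 0.05
  (True,  False, False) -> 0;   (True,  False, True) -> 0.8
  (True,  True,  False) -> 0.9; (True,  True,  True) -> 0.95

pRoad :: Bool -> Bool -> Double       -- P(road | rain)
pRoad d r = case (d, r) of
  (False, False) -> 1; (False, True) -> 0.3
  (True,  False) -> 0; (True,  True) -> 0.7

-- the chain-rule factorisation encoded by the DAG
joint :: Bool -> Bool -> Bool -> Bool -> Bool -> Double
joint w s r t d =
  pWinter w * pSprinkler s w * pRain r w * pWet t s r * pRoad d r

-- sanity check: the joint should sum to 1 over all 2^5 assignments
total :: Double
total = sum [ joint w s r t d
            | w <- bools, s <- bools, r <- bools, t <- bools, d <- bools ]
```

Note that each pair of columns in the conditional tables sums to one, which is a quick way to check that the transcription is consistent.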



2. Inference with variable elimination

First, we need the graph and the variables:

let ([winter, sprinkler, rain, wet, road], exampleG) = example



Then we can start doing some inference:

print "Prior Marginal : probability of rain"
print $ priorMarginal exampleG [winter, sprinkler, wet, road] [rain]
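As a cross-check of what this query should return: for the prior of rain, the variables sprinkler, wet and road are barren (they sum out to one), so the marginal reduces to summing winter out of P(winter) * P(rain | winter). A minimal hand computation, with the CPT numbers transcribed from the example (the helper names are mine, not hbayes functions):

```haskell
-- P(winter), assuming the False entry is listed first
pWinter :: Bool -> Double
pWinter w = if w then 0.6 else 0.4

pRainGivenWinter :: Bool -> Bool -> Double  -- P(rain | winter)
pRainGivenWinter r w = case (r, w) of
  (False, False) -> 0.9; (False, True) -> 0.2
  (True,  False) -> 0.1; (True,  True) -> 0.8

-- eliminate winter: P(rain) = sum over winter of P(winter) * P(rain | winter)
priorRain :: Bool -> Double
priorRain r = sum [ pWinter w * pRainGivenWinter r w | w <- [False, True] ]
-- priorRain True = 0.4 * 0.1 + 0.6 * 0.8 = 0.52
```

So the prior probability of rain should come out as 0.52, and of no rain as 0.48.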



And with some evidence:

print "Posterior Marginal : probability of rain if grass wet"
print $ posteriorMarginal exampleG [winter, sprinkler, wet, road] [rain] [wet =: True]
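This posterior can also be computed by hand, again independently of hbayes: multiply the factors, fix wet = True, sum out winter and sprinkler (road is barren here), and normalize. The CPT transcription and all names below are my own assumptions, not library code:

```haskell
bools :: [Bool]
bools = [False, True]

pW :: Bool -> Double                  -- P(winter)
pW w = if w then 0.6 else 0.4

pS :: Bool -> Bool -> Double          -- P(sprinkler | winter)
pS s w = case (s, w) of
  (False, False) -> 0.25; (False, True) -> 0.8
  (True,  False) -> 0.75; (True,  True) -> 0.2

pR :: Bool -> Bool -> Double          -- P(rain | winter)
pR r w = case (r, w) of
  (False, False) -> 0.9; (False, True) -> 0.2
  (True,  False) -> 0.1; (True,  True) -> 0.8

pT :: Bool -> Bool -> Bool -> Double  -- P(wet | sprinkler, rain)
pT t s r = case (t, s, r) of
  (False, False, False) -> 1;   (False, False, True) -> 0.2
  (False, True,  False) -> 0.1; (False, True,  True) -> 0.05
  (True,  False, False) -> 0;   (True,  False, True) -> 0.8
  (True,  True,  False) -> 0.9; (True,  True,  True) -> 0.95

-- unnormalised weight of rain = r together with the evidence wet = True
-- (road is barren and sums out to one, so it is omitted)
weight :: Bool -> Double
weight r = sum [ pW w * pS s w * pR r w * pT True s r | w <- bools, s <- bools ]

-- normalise over both values of rain
posteriorRain :: Bool -> Double
posteriorRain r = weight r / (weight False + weight True)
-- posteriorRain True = 0.4349 / 0.6995, roughly 0.62
```

Seeing the grass wet raises the probability of rain from the prior 0.52 to about 0.62.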



3. Inference with factor elimination (junction tree)

First, we need to create the junction tree. The tree that is created depends on the cost function used during triangulation.

let jt = createJunctionTree nodeComparisonForTriangulation exampleG
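To illustrate the kind of cost function involved, here is one common choice, the greedy min-neighbours heuristic, sketched in plain Haskell on the moral graph of the example network. This is only an illustration of the idea; I am not claiming it is what nodeComparisonForTriangulation actually does, and the graph representation below is my own.

```haskell
import Data.List (delete, minimumBy)
import Data.Ord (comparing)

type Graph = [(String, [String])]

-- moral graph of the example network: parents of wet are "married"
-- (sprinkler-rain edge) and edge directions are dropped
moral :: Graph
moral =
  [ ("winter",    ["sprinkler", "rain"])
  , ("sprinkler", ["winter", "rain", "wet"])
  , ("rain",      ["winter", "sprinkler", "wet", "road"])
  , ("wet",       ["sprinkler", "rain"])
  , ("road",      ["rain"])
  ]

-- greedy min-neighbours elimination: repeatedly remove the node with the
-- fewest neighbours, connecting its remaining neighbours (fill-in edges)
eliminationOrder :: Graph -> [String]
eliminationOrder [] = []
eliminationOrder g = v : eliminationOrder g'
  where
    (v, ns) = minimumBy (comparing (length . snd)) g
    g' = [ (u, update u us) | (u, us) <- g, u /= v ]
    update u us
      | u `elem` ns = foldr addEdge (delete v us) (delete u ns)
      | otherwise   = us
    addEdge x xs = if x `elem` xs then xs else x : xs
```

On this small graph the heuristic eliminates the leaf road first and the densely connected rain late; the cliques created along the way become the nodes of the junction tree, which is why the cost function shapes the resulting tree.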



Once the junction tree is available, it can be used to compute several marginals:

print "Prior Marginal : probability of rain"
print $ posterior jt rain



The function is named posterior even though we are computing a prior here. This is intentional: the same function is used in both cases, and the only difference is whether evidence has been loaded into the tree.

To use some evidence, the junction tree has to be "loaded" with it:

let jt' = updateEvidence [wet =: True] jt



This new tree can now be used to compute posterior marginals:

print "Posterior Marginal : probability of rain if grass wet"
print $ posterior jt' rain



4. Problems

It is a very preliminary library. I have only tested it on a small number of Bayesian networks and queries, so I don't know how it will behave on other networks. I have implemented a few QuickCheck tests (including one checking the key junction tree property), but that is not enough. I also need to make the algorithms clearer and more elegant; then it will be easier to discover problems.

The library has not been optimized: I tested it on a large network and it was extremely slow.

I have not yet provided any additional tools for building networks: soft evidence, extra nodes for more complex logical queries, noisy-OR, etc. These will come in a future release.

Also, the Hugin importer has only been used to help me import a few networks for testing the algorithms; it is not a real Hugin parser.

So, in conclusion, it is still a toy library, but you can already have some fun with it.