One of the major changes for version 3.0 of the inference collection is a rule compiler that generates code for (portions of) the match/join process. This will be done at expansion (compile) time by the rule language macros themselves. The rule language macros include define-ruleset, define-facts, and define-rule. The rule compilation itself takes place in the define-rule macro.

So far, I have the rule parsing, rule validation, and rule normalization phases pretty much complete. The rule normalization phase converts Boolean expressions within the pattern clauses to one of two standard forms:

(and <pattern-clause> ...)

(or (and <pattern-clause> ...) ...)

The first form is a simple rule, while the second is a sequence of simple rules. For example:

(define-rule (some-rule some-ruleset)
  (p1 ?a ?b ?c)
  (p2 ?a ?b ?c)
  (or (and (p3 ?a)
           (p4 ?b))
      (p3 ?c))
  ==>
  (printf "a = ~a, b = ~a, c = ~a~n" ?a ?b ?c))

would expand into the equivalent of the two rules:

(define-rule (some-rule-a some-ruleset)
  (p1 ?a ?b ?c)
  (p2 ?a ?b ?c)
  (p3 ?a)
  (p4 ?b)
  ==>
  (printf "a = ~a, b = ~a, c = ~a~n" ?a ?b ?c))

(define-rule (some-rule-b some-ruleset)
  (p1 ?a ?b ?c)
  (p2 ?a ?b ?c)
  (p3 ?c)
  ==>
  (printf "a = ~a, b = ~a, c = ~a~n" ?a ?b ?c))

This can simplify writing complex rulesets.

I am now coding the actual rule compiler. This will generate procedures for matching pattern elements and joining matches - applying join constraints. For example, the pattern

(p1 ?a 12 ?b : (> ?b 10) (c . ?c))

would compile into:

(p1
 (#:variable #f ?a #f (lambda (?a bindings)
                        (vector-immutable
                         (unsafe-vector-ref bindings 0) ?a)))
 (#:literal #f #f 12 (lambda (x bindings)
                       (if (= x 12) bindings #f)))
 (#:variable #f ?b (> ?b 10) (lambda (?b bindings)
                               (let ((?a (unsafe-vector-ref bindings 1)))
                                 (if (> ?b 10)
                                     (vector-immutable
                                      (unsafe-vector-ref bindings 0) ?a ?b)
                                     #f))))
 (#:variable c ?c #f (lambda (?c bindings)
                       (let ((?a (unsafe-vector-ref bindings 1))
                             (?b (unsafe-vector-ref bindings 2)))
                         (vector-immutable
                          (unsafe-vector-ref bindings 0) ?a ?b ?c)))))

where each pattern element is a five element list (type key variable constraint match), where type is the pattern element type (e.g., #:variable or #:literal), key is the key for association list matching, variable is a variable name, constraint is a constraint expression, and match is a procedure to match the pattern element. These are used at execution time to build the rule network.

The structure of the pattern is maintained in the compiled form and is used to structure the match nodes and the data flow between them. The type, key, variable, and constraint fields are used to determine when nodes can be shared among rules. Finally, the procedure in the match field does the actual matching and variable binding.

Data Structures for Matching and Joining

Immutable vectors are used to represent matches. For example, the pattern

(p1 ?a 12 ?b : (> ?b 10) (c . ?c))

matched against the asserted fact

(p1 10 12 14 (c . 16) (d . 18))

would produce a match of

#(<assertion> 10 14 16)

where <assertion> is the assertion object for the specified fact and 10, 14, and 16 are the values matching the variables ?a, ?b, and ?c, respectively.

Immutable vectors are also used to represent joined matches. For example, the patterns

(p1 ?a ?b ?c)
(p2 ?a ?b ?c)
(no (p3 ?a))

matched against the asserted facts

(p1 1 2 3)
(p2 1 2 3)

with no asserted (p3 1) fact, would produce a match of

#(#(<assertion> 1 2 3)
  #(<assertion> 1 2 3)
  #(#t))

A major bottleneck in the current inference engine is the amount of work done in propagating (or unpropagating) assertions and deletions through the join (and rule) nodes of the rule network. For version 3.0, I'm planning on two (eq?) hash tables for each join node to index left and right matches and a double doubly-linked list of joined matches. This will allow efficient insertion and deletion of both left and right matches during join processing (for propagating and unpropagating).

More to come.
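As an aside, the control flow that the compiled match procedures implement - threading a bindings vector through each pattern element, failing fast on a literal or constraint mismatch - can be sketched outside of Racket. Below is a minimal Python sketch (Python rather than Racket, purely for illustration; the function names are mine, and tuples stand in for immutable vectors) of matching the pattern (p1 ?a 12 ?b : (> ?b 10) (c . ?c)):

```python
# Hand-translated sketch of the compiled match procedures for the
# pattern (p1 ?a 12 ?b : (> ?b 10) (c . ?c)). A tuple plays the role
# of the immutable bindings vector; bindings[0] is the assertion.

def match_a(value, bindings):
    # ?a: an unconstrained variable always binds, extending the bindings
    return bindings + (value,)

def match_12(value, bindings):
    # literal 12: passes the bindings through unchanged, or fails
    return bindings if value == 12 else None

def match_b(value, bindings):
    # ?b with the constraint (> ?b 10)
    return bindings + (value,) if value > 10 else None

def match_c(value, bindings):
    # (c . ?c): matched against the value stored under the key c
    return bindings + (value,)

def match_fact(assertion, positional, assoc):
    """Run the element matchers in pattern order, failing fast."""
    bindings = (assertion,)
    for matcher, value in zip((match_a, match_12, match_b), positional):
        bindings = matcher(value, bindings)
        if bindings is None:
            return None
    return match_c(assoc['c'], bindings)

# Matching (p1 10 12 14 (c . 16) (d . 18)) yields the analogue of
# #(<assertion> 10 14 16):
# match_fact('<assertion>', (10, 12, 14), {'c': 16, 'd': 18})
#   -> ('<assertion>', 10, 14, 16)
```

The point of the sketch is the data flow: each element's procedure either extends the bindings, passes them through, or aborts the match, exactly as in the compiled Racket form above.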

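The join-node indexing plan can also be sketched. The Python below (again, illustration only - the class and method names are mine, and a dict of lists plus a set stand in for the eq? hash tables and the double doubly-linked list of the actual design) shows why indexing left and right matches by join key makes insertion and deletion touch only the joined matches a given match participates in:

```python
class JoinNode:
    """Sketch of a join node with indexed left and right matches."""

    def __init__(self, key_fn):
        self.key_fn = key_fn      # extracts the join key from a match
        self.left_index = {}      # join key -> list of left matches
        self.right_index = {}     # join key -> list of right matches
        self.joined = set()       # joined (left, right) pairs

    def add_left(self, match):
        # propagate: join only against right matches with the same key
        key = self.key_fn(match)
        self.left_index.setdefault(key, []).append(match)
        for right in self.right_index.get(key, []):
            self.joined.add((match, right))

    def add_right(self, match):
        key = self.key_fn(match)
        self.right_index.setdefault(key, []).append(match)
        for left in self.left_index.get(key, []):
            self.joined.add((left, match))

    def remove_left(self, match):
        # unpropagate: drop only the joined matches involving this match
        key = self.key_fn(match)
        self.left_index[key].remove(match)
        self.joined = {(l, r) for (l, r) in self.joined if l is not match}

# Joining matches of (p1 1 2 3) and (p2 1 2 3) on the value bound to ?a:
node = JoinNode(key_fn=lambda m: m[1])
left = ('<assertion>', 1, 2, 3)
right = ('<assertion>', 1, 2, 3)
node.add_left(left)
node.add_right(right)
```

In the real design the joined matches would additionally sit on doubly-linked lists so that deletion is O(1) per joined match; the set comprehension in remove_left is the simplification that replaces it here.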
Labels: inference