Last week, I visited a couple of my professors from Northwestern to chat and help out with an interesting engineering “bootcamp” class they are teaching. Beyond the nostalgia, those interactions brought me back to my days in college, when I regularly felt a mixture of lighthearted fun and overwhelming awe at learning new things I didn’t even have the cognitive machinery to imagine before.

Maybe I’m blocking out the midterm anxiety, but still.

Switching to now: I’ve been using generators a bit more than usual lately and got curious about how they work. I’d skirted this rabbit hole in the past, but hadn’t dug further, so I decided to give it a go without setting a practical goal. Sometimes, an exploration is a worthy end in itself.

Imagine being able to abstract control flow directly. What would that mean?

Let’s find out by learning about continuations! We’ll start with generators and dig deeper until we find fun, new stuff.

Simple Example: Exceptions

There’s a common situation where normal control flow gets subverted: exceptions.

```ruby
# ruby
> def buggy_code
>   (0..10).each { |item| if item == 7; raise "oh no it's a 7!"; else; print item; end }
> end
=> :buggy_code
> begin
>   buggy_code
> rescue => e
>   print e.inspect
> end
0123456 #<RuntimeError: oh no it's a 7!>
=> nil
```

Above, the buggy_code method is being executed, but then the Ruby interpreter yields control to the error handler that wraps the buggy_code invocation, via the special raise keyword. This can happen regardless of how deep down the buggy_code invocation is.

This seems like a really interesting way to think about control flow: it’s like passing the ball to a different player who’s in a totally different part of the field, without going through with your previously intended actions.

More Involved Example: Generators

Now, an example of using generators to obtain even numbers:

```python
# python 3.x
>>> import itertools
# infinite generator of incrementing even numbers
# note: the full range is unbounded -- it's realized one item at a time
>>> gen = (i for i in itertools.count() if i % 2 == 0)
# grab the first couple of even numbers
>>> next(gen)
0
>>> next(gen)
2
# then grab the next 10 in one go
>>> list(itertools.islice(gen, 10))
[4, 6, 8, 10, 12, 14, 16, 18, 20, 22]
```

Above, we use an already defined basic generator, itertools.count(), which just keeps counting up, filter it down to even numbers, and take values as we need them. The underlying range is infinite, but we’re able to do our work in a lazy way, one item at a time.

That (i for i in ...) expression is a generator expression, similar to a list comprehension (which is denoted by [...] instead). Semantically, a generator expression can be thought of as making an anonymous generator function and calling it. And a generator function is a function that calls yield in the body somehow.

Here’s the implicit anonymous generator function that gen = (i for i in itertools.count() if i % 2 == 0) would make:

```python
def __gen(exp):
    for i in exp:
        if i % 2 == 0:
            yield i

gen = __gen(itertools.count())
```

That yield behaves differently from a return: it essentially makes the invocation of the function produce a generator object, and that generator object produces new values for consumers like next(...) or list(...).
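One way to see that difference is that the function body doesn’t run at all until a consumer asks for a value. A small sketch of my own to demonstrate:

```python
def counter():
    print("started")
    yield 1
    yield 2

g = counter()            # no "started" printed yet: calling the function
                         # only produced a generator object
print(type(g).__name__)  # generator
print(next(g))           # runs the body up to the first yield: prints "started", then 1
print(next(g))           # resumes right after the first yield: prints 2
```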

Isn’t it interesting how the control flow seemingly switches back and forth as you keep calling next(gen)? It’s like a conversation. A boring one, but still.

```
<- The generator does some work and yields control to the console.
-> We type some code and return it to the console with an 'enter'.
<- Then, the generator does some more of the remaining work and yields control to the console again.
```

This is different from our usual experience with function invocations. Usually, invocations intuitively seem like they’re “one and done,” with some effects or returned values being the result.

What’s the generator doing?

If we were to make that implicit generator object for real (for pedagogical reasons), it might look like this pseudo code:

```python
## FAKE CODE
class MyGenerator:
    def __init__(self, expression):
        self.expression = expression

    def my_next_value(self):
        value, remaining_expression = $execute_until_yield(self.expression)
        self.expression = remaining_expression
        return value

my_gen = MyGenerator(lambda: $some_code_that_has_a_yield)
```

This fake generator object provides step-by-step, resumable execution of an expression (meaning any piece of computation: imagine a function being passed in as that argument that itself calls a whole host of other functions). It needs to know how to execute until the next yield somehow… and it likely needs to hold onto the “remaining computation.”

When the consumer calls my_next_value repeatedly, the my_gen generator object resumes execution of the original code, returns immediately with a value, and pauses the remaining computation again.
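We can’t write $execute_until_yield in ordinary Python, but for one fixed computation we can fake the pause-and-resume by shrinking the “remaining computation” down to explicit saved state. Here’s a hand-rolled sketch of my own (names hypothetical), for the even-number case:

```python
class HandRolledEvens:
    """Resumable even-number producer: the 'remaining computation'
    has been reduced to a single integer of saved state."""

    def __init__(self):
        self.i = 0  # where the paused computation left off

    def my_next_value(self):
        # "resume": run until we hit the next yield point (an even number)
        while self.i % 2 != 0:
            self.i += 1
        value = self.i
        self.i += 1  # "pause": remember where to resume next time
        return value

gen = HandRolledEvens()
print([gen.my_next_value() for _ in range(5)])  # [0, 2, 4, 6, 8]
```

This only works because we compiled the “remaining computation” by hand into one counter; the interesting question is how to do it for arbitrary code.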

But this is… something we don’t have a direct handle on. How would $execute_until_yield even work?

How do we get this kind of control over pausing-and-resuming computation?

And since it needs to be generalized for all possible code you could write, what does “remaining computation” really mean?

Enter: Continuations

A continuation is “remaining computation,” essentially. It makes some sense, then, that a generator would be dealing with continuations. But how do we get more specific?

More and more, I turn to Scheme when building intuition.

Racket’s docs describe their evaluation model so well, I’ll quote them:

Racket evaluation can be viewed as the simplification of expressions to obtain values. For example, just as an elementary-school student simplifies

```
1 + 1 = 2
```

Racket evaluation simplifies

```
(+ 1 1) -> 2
```

Computation can be understood as simply repeated reduction of expressions down to a level where we’re left with just values, a value being anything that can’t be reduced further.

Paraphrasing some other resources here – but basically, operational semantics tries to answer “how is a program executed?”. There have been many presentations of operational semantics, and one of them is reduction semantics (introduced by Felleisen and Hieb in ‘92).

The operational semantics for a programming language describe how a valid program is interpreted as sequences of computational steps. With the reduction semantics presentation, the description is a series of reductions.

Back to understanding basic evaluation:

Some simplifications require more than one step. For example:

```
(- 4 (+ 1 1)) → (- 4 2) → 2
```

An expression that is not a value can always be partitioned into two parts: a redex, which is the part that changed in a single-step simplification, and the continuation, which is the evaluation context surrounding an expression. In (- 4 (+ 1 1)), the redex is (+ 1 1), and the continuation is (- 4 []), where [] takes the place of the redex. That is, the continuation says how to “continue” after the redex is reduced to a value.

Continuations are described as the evaluation context surrounding some expression. But that’s just prose!

What does (- 4 []) look like? It’s some kind of closed form that has a hole in it. It looks like a lambda (anonymous function)!

So, continuations can be thought of as just functions!
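We can act this out in Python (my own restatement of the docs’ example, not something from the Racket docs themselves): for 4 - (1 + 1), the redex is 1 + 1, and the continuation (- 4 []) is literally a function with a hole for the redex’s value.

```python
redex_value = 1 + 1               # reduce the redex to a value...
continuation = lambda v: 4 - v    # ...and the context (- 4 []) is just a function
print(continuation(redex_value))  # 2, same as evaluating (- 4 (+ 1 1)) directly
```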

Let’s apply this idea to understand the even number generators a bit more:

```racket
#lang racket
(require racket/generator)

(define evens
  (generator ()
    (let loop ([x 1])
      (if (zero? (modulo x 2))
          (begin (yield x) (loop (+ 1 x)))
          (loop (+ 1 x))))))

;; and in the console:
> (evens)
2
> (evens)
4
```

Here’s one way to read the above if you’re unfamiliar with LISP-like languages:

The literal values look like 1, 0.5, 'c, #t, my-func, #hash((a . 1) (b . 2)).

Function applications look like (foo bar ...). foo here is in the function application position (the head of the form), and the parens indicate that this is an s-expression. bar ... are one or many arguments.

Alright, we see the familiar yield here, inside an infinite iteration counting up from 1.

Let’s say we invoke the generator once: (evens) .

What happens is that the generator evaluates some code, produces some effects, and then yields control. In the background, there is a continuation, a function, representing the rest of the computation. Each time this generator is called, it’s then calling this implicit continuation and yielding control.

But how do you implement a generator?

Introducing shift and reset

Racket provides a lot of firepower for us. I enlisted the help of my friend Max (a PhD student at Northeastern) to put all this power to good use. I had started the exploration by just reading about call/cc on Wikipedia, and after enough experimentation to figure that out, I shared it with Max, who suggested something even better: shift and reset.

What are shift and reset ?

They are control operators that work together to allow you to do things like “non-local return”. “The reset operator sets the limit for the continuation while the shift operator captures or reifies the current continuation up to the innermost enclosing reset” (from Wikipedia, haha). But let’s unpack that.

When you have an expression you’re trying to compute, you essentially go from the leaves to the root of the tree. For example, (+ (+ 2 3) 4) is a tree with + at the root, and with (+ 2 3) and 4 as its children. So, the “remaining computation” roughly corresponds to the outer parts of the tree. As in the explanation about redex and continuation, the redex here would be (+ 2 3) (4 is already a value and can’t be reduced, and the only other child not already a value is (+ 2 3)). And the continuation would be (lambda (x) (+ x 4)).

So, reset marks the node of the tree until which the continuation is delimited, and then shift captures the continuation up to that point.

Let’s try to use shift and reset to figure them out:

```racket
;; some basic examples
> (require racket/control)
> (reset 3)
3
> (reset (shift k 1))
1
> (reset (shift k (k 1)))
1

;; now, the same buggy_code from above, but with shift/reset!
> (define (buggy-code)
    (for-each
     (lambda (x)
       (and (= x 7)
            (shift k "oh no it's a 7")))
     (range 0 10)))
> (reset (print (string-append "Runtime Error: " (buggy-code))))
"oh no it's a 7"
```

This is interesting! We’re able to mimic the behavior of raising exceptions with shift and reset.
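For the escape-only pattern above (where the shift body ignores k entirely), we can sketch the same behavior in Python with an exception that smuggles out a value. This is my own analogy, not a general shift/reset implementation; it only handles the case where the continuation is discarded:

```python
class Shift(Exception):
    """Carries the value produced by the 'shift' body."""
    def __init__(self, value):
        self.value = value

def reset(thunk):
    # run the thunk; if it "shifts", its value replaces the whole reset
    try:
        return thunk()
    except Shift as s:
        return s.value

def buggy_code():
    for x in range(0, 10):
        if x == 7:
            raise Shift("oh no it's a 7")  # discard the rest of the computation
        print(x, end="")

result = reset(lambda: "Runtime Error: " + str(buggy_code()))
print()
print(result)  # oh no it's a 7
```

Note that the string-append (here, string concatenation) never happens, just like the Racket version: the continuation up to the reset boundary is thrown away.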

Instead of prose about shift and reset, here are the reduction rules for reset :

```
1. (reset val) => val
2. (reset E[(shift k expr)]) => (reset ((lambda (k) expr)
                                        (lambda (v) (reset E[v]))))
   ; where E has no reset
```

Rule 1 is easy. If the argument that reset is being applied to doesn’t have shift in it, we just reduce the reset form down to that value.

Rule 2 tells us a lot about how this works.

Notice that the reduction removes the shift form in E , and that there’s a sort of switching of things happening. The expr inside the shift form is now the function body of a lambda abstraction, with k as the parameter. Also, what is being provided as the argument to this lambda with the k parameter? It seems like it’s E but with the shift form punched out by the parameter v , inside another lambda abstraction with v as the parameter.

(lambda (v) (reset E[v])) is basically the current continuation. What reset does is to set the outer boundary up to which the continuation is delimited, and what shift does is to “capture” that delimited continuation.

For example, in (reset (print (string-append "Runtime Error: " (shift k "oh no it's a 7")))) , the continuation is (lambda (v) (reset (print (string-append "Runtime Error: " v)))) .

Continuing that example, the reset would reduce to:

```racket
((lambda (k) "oh no it's a 7")
 (lambda (v) (reset (print (string-append "Runtime Error: " v)))))
```

So, when shift captures the delimited current continuation and binds it to the parameter k, it hands the shift body control over the rest of the computation; here the body never calls k, so control flow switches away from the print entirely.

Implementing Generators

Turns out, we can implement generators easily with shift and reset .

Here is one approach: make our own yield that does a non-local return of a value:

```racket
> (define (yield x)
    (shift k (cons x (k (void)))))
> (reset
   (for-each
    (lambda (x)
      (and (zero? (modulo x 2))
           (yield x)))
    (range 1 10)))
(2 4 6 8 . #<void>)
```

Above, for-each goes through a list and applies the supplied function to each item. What’s happening is that the supplied function captures the (for-each ...) continuation, since it’s beneath the reset .

As the interpreter starts to reduce this reset form, it follows its reduction rules down to for-each , which essentially becomes a series of function applications that each get applied one by one. For example, here’s the application of the function to 2 , the first even number in the list.

```racket
((lambda (x)
   (and (zero? (modulo x 2))
        (yield x)))
 2)
```

Which reduces a bit further to…

```racket
((lambda (x) (shift k (cons x (k (void))))) 2)
; and then...
(shift k (cons 2 (k (void))))
```

And that shift reduces according to reset’s reduction rules above to:

```racket
(lambda (k) (cons 2 (k (void))))
```

…called with the current continuation, which includes the rest of the for-each . That rest reduces to more applications of yield (which only gets called for even numbers), just with different values!

And that’s how we end up with:

```
(2 4 6 8 . #<void>)
```

It comes down to straightforward reduction of expressions!

Also, here is yet another implementation of a simple “generate one at a time” generator. Max noodled on this and provided this quick example.

```racket
(define (generate-one-at-a-time lst)
  (define k #f)
  (define (go)
    (if k
        (k)
        (reset
         (let ()
           (for-each
            (lambda (x)
              (shift cur-k
                (let ()
                  (set! k cur-k)
                  x)))
            lst)
           'done))))
  go)

;; some usage:
> (define my-gen (generate-one-at-a-time '(1 2 3 4)))
> (my-gen)
1
> (my-gen)
2
> (my-gen)
3
```

The interesting thing about this one is that it’s all enclosed and feels much more like the generator from the very top of this post. It similarly has the for-each inside the reset , so the continuation contains the full remaining traversal, but it also stashes the captured continuation in k via set! each time, so every call to go knows exactly where to “resume.”
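For comparison, here’s the closest Python analogue I can come up with, leaning on Python’s built-in generators to do the continuation bookkeeping that shift/reset and the saved k do by hand (names are my own):

```python
def generate_one_at_a_time(lst):
    # Python's generator machinery plays the role of the saved
    # continuation k: each next(...) resumes right where we paused
    def gen():
        for x in lst:
            yield x       # the "shift": pause and hand back a value
        yield "done"      # analogous to the 'done at the end of the reset
    g = gen()
    return lambda: next(g)

my_gen = generate_one_at_a_time([1, 2, 3, 4])
print(my_gen())  # 1
print(my_gen())  # 2
print(my_gen())  # 3
```

The symmetry suggests why generators and delimited continuations keep showing up together: a generator is a delimited continuation that the runtime saves and restores for you.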

A nice way to sort of fill in the pseudo/fake code we had from before!

End Notes

For more, check out Danvy and Filinski’s Abstracting Control, from 1990. Really good one; I’m still reading it, honestly.

Also, check out PyPy’s source code instead of CPython’s if you are itching to dig into the Python implementation. It uses continuations and seemed easier to read to me.

Thoughts? Suggestions for what I should explore next? Find me as @gnarmis on Twitter.

Also, a big thank you to Max New for his super thorough review. Another thank you to Hillel Wayne for his overall feedback.