Usually, pygame shows its output in a specially created window. Instead of creating this window, saving a sequence of images from it, and finally feeding those images to a tool like ffmpeg, I'd like to pipe pygame's output directly to ffmpeg.

Does what I want make sense?

If yes, how can I redirect pygame's output so I can pipe it? From the documentation, I am aware of methods like pygame.Surface.get_view and pygame.Surface.get_buffer, but I don't know what the difference between them is, or whether either is quite what I need.
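Here is my understanding so far, as a minimal sketch (the 32-bit surface and the "dummy" SDL driver are my assumptions; the driver is only set so this runs without opening a window):

```python
import os
os.environ.setdefault("SDL_VIDEODRIVER", "dummy")  # render off-screen, no window

import numpy as np
import pygame

pygame.init()
surf = pygame.Surface((4, 4), depth=32)
surf.fill((10, 20, 30))

# get_buffer: a BufferProxy over the surface's raw pixel memory,
# in the surface's native byte order and layout (including any row padding).
raw = surf.get_buffer().raw

# get_view("3"): a buffer shaped as a (width, height, 3) RGB array,
# which numpy can wrap without copying (this is how pygame.surfarray
# itself builds its arrays).
view = surf.get_view("3")
arr = np.array(view, copy=False)

print(arr.shape)   # (4, 4, 3)
print(arr[0, 0])   # the fill colour as an RGB triple
```

So get_buffer seems to hand back the surface's bytes as-is, while get_view exposes a structured array-style view over the same memory.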

In this tutorial, a raw numpy-array-based RGB representation of the images is fed to ffmpeg. I figure I could do something similar, except feeding in an RGB representation obtained from pygame instead.
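Roughly, what I have in mind looks like this (an untested sketch: the ffmpeg flags, the frame layout, and the "dummy" driver are my assumptions, not something from the tutorial):

```python
import os
import subprocess

os.environ.setdefault("SDL_VIDEODRIVER", "dummy")  # render off-screen, no window

import pygame

WIDTH, HEIGHT, FPS, FRAMES = 320, 240, 30, 60

pygame.init()
screen = pygame.display.set_mode((WIDTH, HEIGHT))

# ffmpeg reads raw RGB24 frames from stdin ("-i -") and encodes out.mp4.
ffmpeg = subprocess.Popen(
    ["ffmpeg", "-y",
     "-f", "rawvideo", "-pix_fmt", "rgb24",
     "-s", f"{WIDTH}x{HEIGHT}", "-r", str(FPS),
     "-i", "-",
     "-pix_fmt", "yuv420p", "out.mp4"],
    stdin=subprocess.PIPE,
    stderr=subprocess.DEVNULL,
)

for i in range(FRAMES):
    screen.fill((0, 0, 0))
    pygame.draw.circle(screen, (255, 0, 0), (4 * i, HEIGHT // 2), 20)
    # array3d is (width, height, 3); ffmpeg expects row-major (height, width, 3),
    # so transpose before serialising (tobytes() emits a C-ordered copy).
    frame = pygame.surfarray.array3d(screen).transpose(1, 0, 2)
    ffmpeg.stdin.write(frame.tobytes())

ffmpeg.stdin.close()
ffmpeg.wait()
```

That would skip the intermediate image files entirely: each frame goes straight from the surface into ffmpeg's stdin.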

I know that on Linux it's possible to output pygame to a framebuffer for display in the console, though I'm not sure whether that's related. In any case, that effect is achieved by changing pygame's video driver.
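As far as I can tell, the driver is selected through SDL's environment variables before the display is initialised. This sketch uses the "dummy" driver (off-screen, no window at all); on older SDL builds, "fbcon" would target the Linux framebuffer instead (that last part is my assumption):

```python
import os
# Must be set before pygame.display is initialised.
os.environ["SDL_VIDEODRIVER"] = "dummy"

import pygame

pygame.init()
screen = pygame.display.set_mode((320, 240))
pygame.draw.circle(screen, (0, 255, 0), (160, 120), 50)
pygame.display.flip()

print(pygame.display.get_driver())  # → "dummy"
```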

So, I have some dots, but need to connect them.