Functional stream libraries let us easily build stream-processing pipelines by composing sequences of simple transformers such as map or filter with producers (backed by an array, a file, or a generating function) and consumers (reducers). The purely applicative approach of building a complex pipeline from simple immutable pieces simplifies programming and reasoning: the assembled pipeline is an executable specification. To be practical, however, a library has to be efficient: at the very least, it should avoid creating intermediate structures -- especially structures like files and lists whose size grows with the length of the stream. Even bounded-size intermediate structures can slow down processing significantly, by up to two orders of magnitude. Eliminating intermediate structures is the central problem in stream processing, known as stream fusion.
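As a minimal sketch of the idea (in Python, with generators standing in for lazy streams; the names are illustrative, not from any particular library): each stage hands elements downstream one at a time, so no intermediate list is ever materialized. Genuine stream fusion goes further, collapsing the stages into a single loop at compile time.

```python
# A pipeline assembled from simple pieces: a producer, two transformers,
# and a consumer. Generator expressions are lazy, so elements flow through
# the whole pipeline one at a time -- no intermediate list is built.
def pipeline(n):
    produced = range(1, n + 1)                    # producer
    evens = (x for x in produced if x % 2 == 0)   # transformer: filter
    doubled = (2 * x for x in evens)              # transformer: map
    return sum(doubled)                           # consumer (reducer)

print(pipeline(10))  # sums the doubled even numbers 4, 8, 12, 16, 20 -> 60
```

The composed pipeline reads as an executable specification of the computation: each stage names exactly one transformation.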

Stream fusion has been the subject of intensive research since the late 1950s. By now, the low-hanging fruit in stream processing has all been picked -- although some of it only quite recently (POPL 2017). Stream fusion made it to the front page of CACM (May 2017). Java 8 Streams and Haskell compilers, among others, have implemented some of the earlier research results. We have reached a milestone. What are the further challenges?

That was the topic of many discussions at the workshop. Several main questions came up over and over again:

Push and pull duality, and their expressiveness and efficiency;

Error handling and, more generally, control of stream processing;

Sparsity of data.
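The push/pull duality can be made concrete with a small sketch (again in Python, with illustrative names): in a pull stream the consumer drives, demanding one element at a time; in a push stream the producer drives, feeding each element to a consumer callback.

```python
# Pull: the consumer drives, demanding the next element (iterator protocol).
def pull_source(xs):
    yield from xs

def pull_sum(stream):
    total = 0
    for x in stream:   # each loop iteration *pulls* one element
        total += x
    return total

# Push: the producer drives, calling a consumer callback per element.
def push_source(xs, emit):
    for x in xs:
        emit(x)        # each call *pushes* one element downstream

def push_sum(source):
    total = 0
    def emit(x):
        nonlocal total
        total += x
    source(emit)       # hand the producer our callback and let it run
    return total

assert pull_sum(pull_source([1, 2, 3])) == 6
assert push_sum(lambda emit: push_source([1, 2, 3], emit)) == 6
```

The two styles differ in what they express easily: pull makes data-dependent consumption (such as zip) straightforward, while push makes producer-driven combinators (such as append or flatMap) straightforward -- which is one reason their relative expressiveness and efficiency kept coming up.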

As a tangible outcome, the meeting identified a set of problems -- challenges -- to help drive and evaluate further research: filterMax, sorted merge, multiple appends, and parallel merge. The report of the meeting discusses them in detail.
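To give a flavor of why these are challenges, here is a sketch of sorted merge (in Python; a stand-in illustration, not the benchmark's formulation). Which stream is consumed next depends on the data, which is natural in pull style but awkward to express -- and hard to fuse -- in push style, where each producer drives its own consumer.

```python
# Sorted merge: interleave two already-sorted streams into one sorted stream.
# At each step we take from whichever side has the smaller head -- a
# data-dependent consumption pattern. (The None sentinel assumes the
# streams themselves contain no None values.)
def merge(xs, ys):
    it_x, it_y = iter(xs), iter(ys)
    x, y = next(it_x, None), next(it_y, None)
    while x is not None and y is not None:
        if x <= y:
            yield x
            x = next(it_x, None)
        else:
            yield y
            y = next(it_y, None)
    while x is not None:        # drain whichever side remains
        yield x
        x = next(it_x, None)
    while y is not None:
        yield y
        y = next(it_y, None)

assert list(merge([1, 3, 5], [2, 4])) == [1, 2, 3, 4, 5]
```

(Python's standard library provides `heapq.merge` for this operation; the hand-written version above just makes the consumption pattern visible.)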

The workshop was organized together with Aggelos Biboudis and Martin Odersky. The Shonan seminar series is sponsored by Japan's National Institute of Informatics (NII).