Since async/await was announced for VB/C#, developers have been asking about an asynchronous version of IEnumerable. But until C# 7 and ValueTask, that was potentially challenging from a performance standpoint.

In earlier versions of C#, each time the developer used await, a memory allocation was implied. If you are enumerating 10,000 items, that's 10,000 Task objects that could be allocated. Even with task caching, that's a lot. With a ValueTask, which only allocates memory under certain circumstances, the idea of an IAsyncEnumerable<T> seems much more feasible.

With that in mind, we’ll go back and look at the September 2015 proposal for Asynchronous Streams.

IAsyncEnumerable<T> and IAsyncEnumerator<T>

These interfaces are the asynchronous complement of IEnumerable<T> and IEnumerator<T>. They are somewhat simplified, though:

public interface IAsyncEnumerable<T>
{
    public IAsyncEnumerator<T> GetEnumerator();
}

public interface IAsyncEnumerator<out T>
{
    public T Current { get; }
    public Task<bool> MoveNextAsync();
}

As you can see, IEnumerator's Reset and Dispose methods are missing. Reset only exists for COM compatibility, and it would be surprising if more than a handful of enumerators actually implemented it. Dispose was probably removed because many consider it a mistake to assume that all enumerators are necessarily disposable.

By making only MoveNext asynchronous, we see two benefits:

It is much easier to cache a Task<bool> than a Task<T>, meaning the number of memory allocations can be reduced.

Classes that already support IEnumerator<T> only need to add one additional method.

As mentioned above, memory allocation is a major concern for “chatty” asynchronous libraries. For the class implementing IAsyncEnumerator, this isn’t necessarily going to be a problem:

Let's assume that you are foreach'ing over such an asynchronous sequence, which is buffered behind the scenes, so that 99.9% of the time an element is available locally and synchronously. Whenever a Task is awaited that is already completed, the compiler avoids the heavy machinery and just gets the value straight out of the task without pause. If all awaited Tasks in a given method call are already completed, then the method will never allocate a state machine, or a delegate to store as a continuation, since those are only constructed the first time they are needed. Even when the async method reaches its return statement synchronously, without the awaits having ever paused, it needs to construct a Task to return. So normally this would still require one allocation. However, the helper API that the compiler uses for this will actually cache completed Tasks for certain common values, including true and false. In summary, a MoveNextAsync call on a sequence that is buffered would typically not allocate anything, and the calling method often wouldn't either.
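To make that fast path concrete, here is a minimal sketch assuming the interface shape from the 2015 proposal. The BufferedAsyncEnumerator class and its cached s_true/s_false tasks are illustrative names of our own, not part of any proposal:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

// The proposed enumerator shape (hypothetical; these interfaces did not
// ship in the BCL at the time of the 2015 proposal).
public interface IAsyncEnumerator<out T>
{
    T Current { get; }
    Task<bool> MoveNextAsync();
}

// A buffered source: when an element is available locally, MoveNextAsync
// hands back one of two pre-allocated completed tasks, so the synchronous
// fast path allocates nothing per element.
public class BufferedAsyncEnumerator<T> : IAsyncEnumerator<T>
{
    private static readonly Task<bool> s_true = Task.FromResult(true);
    private static readonly Task<bool> s_false = Task.FromResult(false);

    private readonly Queue<T> _buffer;

    public BufferedAsyncEnumerator(IEnumerable<T> items) => _buffer = new Queue<T>(items);

    public T Current { get; private set; }

    public Task<bool> MoveNextAsync()
    {
        if (_buffer.Count > 0)
        {
            Current = _buffer.Dequeue();
            return s_true;   // cached; no allocation on the hot path
        }
        return s_false;      // a real implementation would try to fetch more data here
    }
}
```

Because the returned task is already completed, a caller awaiting it takes the compiler's synchronous fast path and reads the result without pausing.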

But what if the data isn’t necessarily buffered?

Speculation: This is where ValueTask or another custom task type may fit in. Theoretically, you could even offer a “resettable task” that clears the completed flag when the enumerator calls MoveNextAsync. This kind of optimization hasn’t been discussed yet, and may not even be possible, but it’s the kind of thing that C# 7 brings to the table.
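As a rough illustration of that speculation, here is what a ValueTask-based enumerator might look like. The interface and class names are invented for this sketch, and it assumes the ValueTask<T> type from the System.Threading.Tasks.Extensions package:

```csharp
using System.Threading.Tasks;

// Hypothetical ValueTask-based variant of the proposed enumerator interface.
public interface IValueTaskAsyncEnumerator<out T>
{
    T Current { get; }
    ValueTask<bool> MoveNextAsync();
}

public class RangeAsyncEnumerator : IValueTaskAsyncEnumerator<int>
{
    private readonly int _count;
    private int _next;

    public RangeAsyncEnumerator(int count) => _count = count;

    public int Current { get; private set; }

    public ValueTask<bool> MoveNextAsync()
    {
        if (_next < _count)
        {
            Current = _next++;
            // Synchronous completion: the bool is stored inside the
            // ValueTask struct itself, so no Task object is allocated.
            return new ValueTask<bool>(true);
        }
        return new ValueTask<bool>(false);
    }
}
```

When the data is available synchronously, no heap allocation occurs at all; only the genuinely asynchronous case pays for a Task.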

Asynchronous LINQ

Continuing our look back at the 2015 proposal, the next issue to consider is LINQ. A major concern for LINQ is the sheer number of combinations between synchronous/asynchronous sources and synchronous/asynchronous delegates. For example, the simple Where function would need four overloads:

public static IEnumerable<T> Where<T>(this IEnumerable<T> source, Func<T, bool> predicate);
public static IAsyncEnumerable<T> Where<T>(this IAsyncEnumerable<T> source, Func<T, bool> predicate);
public static IAsyncEnumerable<T> Where<T>(this IEnumerable<T> source, Func<T, Task<bool>> predicate);
public static IAsyncEnumerable<T> Where<T>(this IAsyncEnumerable<T> source, Func<T, Task<bool>> predicate);

The proposal goes on to say:

So either we'd need to multiply the surface area of Linq by four, or we'd have to introduce some new implicit conversions to the language, e.g. from IEnumerable<T> to IAsyncEnumerable<T> and from Func<S, T> to Func<S, Task<T>>. Something to think about, but we think it is probably worth it to get Linq over asynchronous sequences one way or another.

The 4X method count may even be an underestimate, as some LINQ operations require multiple delegates.

Another question is whether to use Task-based or ValueTask-based delegates. A 2017 document includes the line, "Would want overloads with async delegates (using ValueTask for efficiency)".
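To see what one of these overloads might look like in practice, here is a sketch of the fourth Where overload (asynchronous source, asynchronous predicate) written against the proposed interfaces. All class names here are ours; nothing reflects an official implementation:

```csharp
using System;
using System.Threading.Tasks;

// Hypothetical proposed shapes, repeated so the sketch stands alone.
public interface IAsyncEnumerator<out T>
{
    T Current { get; }
    Task<bool> MoveNextAsync();
}

public interface IAsyncEnumerable<out T>
{
    IAsyncEnumerator<T> GetEnumerator();
}

public static class AsyncLinqSketch
{
    // The fourth overload: asynchronous source, asynchronous predicate.
    // A ValueTask<bool>-returning predicate would additionally avoid a
    // Task allocation whenever the predicate completes synchronously.
    public static IAsyncEnumerable<T> Where<T>(
        this IAsyncEnumerable<T> source, Func<T, Task<bool>> predicate)
        => new WhereEnumerable<T>(source, predicate);

    private sealed class WhereEnumerable<T> : IAsyncEnumerable<T>
    {
        private readonly IAsyncEnumerable<T> _source;
        private readonly Func<T, Task<bool>> _predicate;

        public WhereEnumerable(IAsyncEnumerable<T> source, Func<T, Task<bool>> predicate)
        {
            _source = source;
            _predicate = predicate;
        }

        public IAsyncEnumerator<T> GetEnumerator()
            => new WhereEnumerator(_source.GetEnumerator(), _predicate);

        private sealed class WhereEnumerator : IAsyncEnumerator<T>
        {
            private readonly IAsyncEnumerator<T> _inner;
            private readonly Func<T, Task<bool>> _predicate;

            public WhereEnumerator(IAsyncEnumerator<T> inner, Func<T, Task<bool>> predicate)
            {
                _inner = inner;
                _predicate = predicate;
            }

            public T Current => _inner.Current;

            // Skip elements until the predicate passes or the source ends.
            public async Task<bool> MoveNextAsync()
            {
                while (await _inner.MoveNextAsync())
                {
                    if (await _predicate(_inner.Current))
                        return true;
                }
                return false;
            }
        }
    }
}
```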

Language Support

Obviously the first thing people are going to ask for once they have IAsyncEnumerable<T> is an asynchronous foreach. But what would that look like when you never actually see the Task object in code? Some options discussed include:

foreach (string s in asyncStream) { ... } // implied await
await foreach (string s in asyncStream) { ... }
foreach (await string s in asyncStream) { ... }
foreach async (string s in asyncStream) { ... }
foreach await (string s in asyncStream) { ... }
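Whichever surface syntax wins, the loop would presumably be lowered to the same explicit enumerator pattern the synchronous foreach uses today. A sketch, assuming the proposed interfaces (AsyncForeachSketch and SumAsync are invented names):

```csharp
using System.Threading.Tasks;

// Hypothetical proposed shapes, repeated so the sketch stands alone.
public interface IAsyncEnumerator<out T>
{
    T Current { get; }
    Task<bool> MoveNextAsync();
}

public interface IAsyncEnumerable<out T>
{
    IAsyncEnumerator<T> GetEnumerator();
}

public static class AsyncForeachSketch
{
    // Roughly what the compiler would expand an async foreach into:
    // get the enumerator, await each MoveNextAsync, read Current.
    public static async Task<int> SumAsync(IAsyncEnumerable<int> source)
    {
        var e = source.GetEnumerator();
        int total = 0;
        while (await e.MoveNextAsync())
        {
            total += e.Current;   // the loop body runs here
        }
        return total;
    }
}
```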

Equally problematic is ConfigureAwait, which is important for performance reasons in libraries. If you don't have your hands on the Task, how can you call ConfigureAwait on it? The best answer is to add a ConfigureAwait extension method to IAsyncEnumerable<T> as well. It returns a wrapper sequence whose enumerator's MoveNextAsync returns the result of calling ConfigureAwait on the task that the wrapped enumerator's MoveNextAsync method returns.
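A minimal sketch of such a wrapper enumerator, assuming the proposed interface shape (the ConfiguredAsyncEnumerator name is ours):

```csharp
using System.Runtime.CompilerServices;
using System.Threading.Tasks;

// Hypothetical proposed enumerator shape.
public interface IAsyncEnumerator<out T>
{
    T Current { get; }
    Task<bool> MoveNextAsync();
}

// The wrapper's MoveNextAsync returns ConfiguredTaskAwaitable<bool> rather
// than Task<bool>, which is exactly why the async foreach has to be
// pattern based rather than interface based.
public class ConfiguredAsyncEnumerator<T>
{
    private readonly IAsyncEnumerator<T> _inner;
    private readonly bool _continueOnCapturedContext;

    public ConfiguredAsyncEnumerator(IAsyncEnumerator<T> inner, bool continueOnCapturedContext)
    {
        _inner = inner;
        _continueOnCapturedContext = continueOnCapturedContext;
    }

    public T Current => _inner.Current;

    public ConfiguredTaskAwaitable<bool> MoveNextAsync() =>
        _inner.MoveNextAsync().ConfigureAwait(_continueOnCapturedContext);
}
```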

The proposal continues:

For this to work, it is important that async foreach is pattern based, just like the synchronous foreach is today, where it will happily call any GetEnumerator, MoveNext and Current members, regardless of whether objects implement the official "interfaces". The reason for this is that the result of Task.ConfigureAwait is not a Task.

Returning to our speculation, this means that a class could offer a custom ValueTask-based enumerator and still support IAsyncEnumerable<T>. This would be similar to how List<T> works, with a generic IEnumerable<T> and an alternative struct-based enumerator that offers better performance.

Cancellation Tokens

Fast forward to the January 2017 meeting notes, and the discussion of asynchronous streams resumes. First up is the tricky question of cancellation tokens.

GetAsyncEnumerator could accept an optional cancellation token, but what would that look like? From the notes:

1. have another overload :-(
2. have a default parameter (CLS :-()
3. have an extension method (requires the extension method in scope)
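To make the trade-offs concrete, here is a sketch of the three options, with only the extension-method approach written out in full. All shapes are hypothetical:

```csharp
using System.Threading;
using System.Threading.Tasks;

// Hypothetical proposed shapes for illustration.
public interface IAsyncEnumerator<out T>
{
    T Current { get; }
    Task<bool> MoveNextAsync();
}

public interface IAsyncEnumerable<out T>
{
    IAsyncEnumerator<T> GetAsyncEnumerator();

    // Option 1: another overload on the interface:
    //     IAsyncEnumerator<T> GetAsyncEnumerator(CancellationToken token);
    // Option 2: a default parameter, which is not CLS compliant:
    //     IAsyncEnumerator<T> GetAsyncEnumerator(CancellationToken token = default(CancellationToken));
}

// Option 3: an extension method, which only helps if it is in scope at the call site.
public static class AsyncEnumerableExtensions
{
    public static IAsyncEnumerator<T> GetAsyncEnumerator<T>(
        this IAsyncEnumerable<T> source, CancellationToken token)
    {
        token.ThrowIfCancellationRequested();
        // A real implementation would flow the token into the enumerator
        // so MoveNextAsync calls could observe it; this sketch only
        // checks it up front.
        return source.GetAsyncEnumerator();
    }
}
```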

It is interesting to see that even though C# has supported default parameters for a long time, CLS (Common Language Specification) constraints are still in effect. If you are unfamiliar with the term, the CLS defines the minimum set of functionality that all languages on the .NET platform are supposed to support. The other side of that coin is that most libraries, especially the foundational ones, must be CLS compliant.

API specifics aside, the way an IAsyncEnumerable<T> gets the cancellation token is pretty clear. What's not clear is how an iterator method producing the sequence would get access to it. They are looking at some sort of backdoor method that could pull it out of the state machine, but that would probably require altering the enumerator interface.

TryMoveNext?

Going back to the earlier performance discussion, what if you could check to see whether or not there is data already available, without making an asynchronous call?

This is the theory behind adding a bool? TryMoveNext() method. True and false work as expected, but a null means you need to call MoveNextAsync to find out whether there is any additional data.
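A sketch of how a consumer might use such a method, assuming a hypothetical enumerator interface with TryMoveNext added (the names IAsyncEnumeratorWithTry and CountAsync are ours):

```csharp
using System.Threading.Tasks;

// Hypothetical enumerator shape with the proposed bool? TryMoveNext() added.
public interface IAsyncEnumeratorWithTry<out T>
{
    T Current { get; }
    bool? TryMoveNext();            // true/false: answered synchronously; null: must await
    Task<bool> MoveNextAsync();
}

public static class TryMoveNextSketch
{
    // The consumer stays on the synchronous fast path until TryMoveNext
    // returns null, and only then pays for an asynchronous call.
    public static async Task<int> CountAsync<T>(IAsyncEnumeratorWithTry<T> e)
    {
        int count = 0;
        while (true)
        {
            bool? fast = e.TryMoveNext();
            bool moved = fast ?? await e.MoveNextAsync();
            if (!moved)
                return count;
            count++;
        }
    }
}
```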

Explicit chunking would be another alternative, wherein each step of the asynchronous sequence returns an enumerable representing just the buffered data.

public IAsyncEnumerable<IEnumerable<Customer>> GetElementsAsync();
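Consuming the chunked shape would mean an asynchronous outer loop over buffers and a cheap synchronous inner loop within each buffer. A generic sketch, with hypothetical interface shapes (ChunkedConsumer and CountAsync are our names):

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

// Hypothetical proposed shapes, repeated so the sketch stands alone.
public interface IAsyncEnumerator<out T>
{
    T Current { get; }
    Task<bool> MoveNextAsync();
}

public interface IAsyncEnumerable<out T>
{
    IAsyncEnumerator<T> GetEnumerator();
}

public static class ChunkedConsumer
{
    // Asynchronous outer loop over buffers; synchronous inner loop over
    // each buffer's elements, so only one await is paid per chunk.
    public static async Task<int> CountAsync<T>(IAsyncEnumerable<IEnumerable<T>> chunkedSource)
    {
        int count = 0;
        var chunks = chunkedSource.GetEnumerator();
        while (await chunks.MoveNextAsync())
        {
            foreach (T item in chunks.Current)
                count++;
        }
        return count;
    }
}
```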

Both of these proposals have problems on both the provider and consumer sides, so the decision is still pending.