We believe that the changes proposed here will help keep Python relevant and competitive in the quickly growing area of asynchronous programming, as many other languages have adopted, or are planning to adopt, similar features.

This PEP assumes that asynchronous tasks are scheduled and coordinated by an Event Loop similar to that of the stdlib asyncio.events.AbstractEventLoop. While the PEP is not tied to any specific Event Loop implementation, it is relevant only to the kind of coroutine that uses yield as a signal to the scheduler, indicating that the coroutine will be waiting until an event (such as IO) is completed.

It is proposed to make coroutines a proper standalone concept in Python, and introduce new supporting syntax. The ultimate goal is to help establish a common, easily approachable, mental model of asynchronous programming in Python and make it as close to synchronous programming as possible.

The growth of Internet and general connectivity has triggered the proportionate need for responsive and scalable code. This proposal aims to answer that need by making writing explicitly asynchronous, concurrent Python code easier and more Pythonic.


In CPython 3.7, the old __aiter__ protocol will no longer be supported: a RuntimeError will be raised if __aiter__ returns anything but an asynchronous iterator.

In CPython 3.6, the old __aiter__ protocol will still be supported with a DeprecationWarning being raised.

If the old protocol is used in 3.5.2, Python will raise a PendingDeprecationWarning.

Before 3.5.2, __aiter__ was expected to return an awaitable resolving to an asynchronous iterator. Starting with 3.5.2, __aiter__ should return asynchronous iterators directly.
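The post-3.5.2 protocol can be sketched as follows (the Ticker class is illustrative, and the asyncio.run() driver post-dates this PEP; both are used here only for demonstration):

```python
import asyncio

# Post-3.5.2 protocol: __aiter__ returns the asynchronous iterator
# directly, with no awaitable indirection.

class Ticker:
    def __init__(self, n):
        self.n = n

    def __aiter__(self):
        # Return the asynchronous iterator itself (not an awaitable).
        return self

    async def __anext__(self):
        if self.n <= 0:
            raise StopAsyncIteration
        self.n -= 1
        return self.n

async def drain():
    items = []
    async for i in Ticker(3):
        items.append(i)
    return items

print(asyncio.run(drain()))  # -> [2, 1, 0]
```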

This change was made primarily due to problems encountered while attempting to integrate support for native coroutines into the Tornado web server.

Feedback on the initial beta release of Python 3.5 resulted in a redesign of the object model supporting this PEP to more clearly separate native coroutines from generators: rather than being a new kind of generator, native coroutines are now their own completely distinct type.

Native coroutines and the associated new syntax features make it possible to define context manager and iteration protocols in asynchronous terms. As shown later in this proposal, the new async with statement lets Python programs perform asynchronous calls when entering and exiting a runtime context, and the new async for statement makes it possible to perform asynchronous calls in iterators.

This proposal makes coroutines a native Python language feature, and clearly separates them from generators. This removes generator/coroutine ambiguity, and makes it possible to reliably define coroutines without reliance on a specific library. This also enables linters and IDEs to improve static code analysis and refactoring.

Current Python supports implementing coroutines via generators ( PEP 342 ), further enhanced by the yield from syntax introduced in PEP 380 . This approach has a number of shortcomings:

This proposal introduces new syntax and semantics to enhance coroutine support in Python.

This specification presumes knowledge of the implementation of coroutines in Python (PEP 342 and PEP 380). Motivation for the syntax changes proposed here comes from the asyncio framework (PEP 3156) and the "Cofunctions" proposal (PEP 3152, now rejected in favor of this specification).

From this point in this document we use the word native coroutine to refer to functions declared using the new syntax. generator-based coroutine is used where necessary to refer to coroutines that are based on generator syntax. coroutine is used in contexts where both definitions are applicable.

New Coroutine Declaration Syntax

The following new syntax is used to declare a native coroutine::

    async def read_data(db):
        pass

Key properties of coroutines:

* async def functions are always coroutines, even if they do not
  contain await expressions.

* It is a SyntaxError to have yield or yield from expressions in an
  async function.

* Internally, two new code object flags were introduced:

  - CO_COROUTINE is used to mark native coroutines (defined with the
    new syntax).

  - CO_ITERABLE_COROUTINE is used to make generator-based coroutines
    compatible with native coroutines (set by the types.coroutine()
    function).

* Regular generators, when called, return a generator object;
  similarly, coroutines return a coroutine object.

* StopIteration exceptions are not propagated out of coroutines and are
  replaced with a RuntimeError. For regular generators such behavior
  requires a future import (see PEP 479).

* When a native coroutine is garbage collected, a RuntimeWarning is
  raised if it was never awaited on (see also Debugging Features).

See also the Coroutine objects section.
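The StopIteration replacement rule can be observed directly (asyncio.run() is a modern driver, used here only for demonstration):

```python
import asyncio

# A StopIteration raised inside a native coroutine is never propagated
# as-is; the runtime replaces it with a RuntimeError.

async def bad():
    raise StopIteration('spam')

try:
    asyncio.run(bad())
except RuntimeError as exc:
    print(type(exc).__name__)  # -> RuntimeError
```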

types.coroutine()

A new function coroutine(fn) is added to the types module. It allows interoperability between existing generator-based coroutines in asyncio and native coroutines introduced by this PEP::

    @types.coroutine
    def process_data(db):
        data = yield from read_data(db)
        ...

The function applies the CO_ITERABLE_COROUTINE flag to the generator function's code object, making it return a coroutine object. If fn is not a generator function, it is wrapped. If it returns a generator, it will be wrapped in an awaitable proxy object (see below the definition of awaitable objects).

Note that the CO_COROUTINE flag is not applied by types.coroutine(), to make it possible to separate native coroutines defined with the new syntax from generator-based coroutines.
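The interoperability this provides can be sketched end to end (function names below are illustrative, and the asyncio.run() driver post-dates this PEP):

```python
import types
import asyncio

# A native coroutine...
async def native(x):
    return x * 2

# ...can be awaited via 'yield from' inside a generator-based coroutine
# flagged by types.coroutine()...
@types.coroutine
def generator_based(x):
    result = yield from native(x)
    return result

# ...and the generator-based coroutine can in turn be awaited from a
# native coroutine.
async def main():
    return await generator_based(21)

print(asyncio.run(main()))  # -> 42
```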

Await Expression

The following new await expression is used to obtain a result of coroutine execution::

    async def read_data(db):
        data = await db.fetch('SELECT ...')
        ...

await, similarly to yield from, suspends execution of the read_data coroutine until the db.fetch awaitable completes and returns the result data.

It uses the yield from implementation with an extra step of validating its argument. await only accepts an awaitable, which can be one of:

* A native coroutine object returned from a native coroutine function.

* A generator-based coroutine object returned from a function decorated
  with types.coroutine().

* An object with an __await__ method returning an iterator. Any yield
  from chain of calls ends with a yield. This is a fundamental
  mechanism of how Futures are implemented. Since, internally,
  coroutines are a special kind of generator, every await is suspended
  by a yield somewhere down the chain of await calls (please refer to
  PEP 3156 for a detailed explanation). To enable this behavior for
  coroutines, a new magic method called __await__ is added. In asyncio,
  for instance, to enable Future objects in await statements, the only
  change is to add an __await__ = __iter__ line to the asyncio.Future
  class. Objects with an __await__ method are called Future-like
  objects in the rest of this PEP. It is a TypeError if __await__
  returns anything but an iterator.

* Objects defined with the CPython C API with a tp_as_async.am_await
  function, returning an iterator (similar to the __await__ method).

It is a SyntaxError to use await outside of an async def function (like it is a SyntaxError to use yield outside of a def function). It is a TypeError to pass anything other than an awaitable object to an await expression.

Examples of "await" expressions

Valid syntax examples:

    Expression                      Will be parsed as
    ------------------------------  --------------------------------
    if await fut: pass              if (await fut): pass
    if await fut + 1: pass          if (await fut) + 1: pass
    pair = await fut, 'spam'        pair = (await fut), 'spam'
    with await fut, open(): pass    with (await fut), open(): pass
    await foo()['spam'].baz()()     await (foo()['spam'].baz()())
    return await coro()             return (await coro())
    res = await coro() ** 2         res = (await coro()) ** 2
    func(a1=await coro(), a2=0)     func(a1=(await coro()), a2=0)
    await foo() + await bar()       (await foo()) + (await bar())
    -await foo()                    -(await foo())

Invalid syntax examples:

    Expression                      Should be written as
    ------------------------------  --------------------------------
    await await coro()              await (await coro())
    await -coro()                   await (-coro())
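The yield-at-the-bottom mechanism can be sketched without asyncio at all; the SimpleFuture class and the manual send() driver below are illustrative, not part of the PEP:

```python
# A Future-like object: __await__ returns an iterator whose 'yield'
# suspends the awaiting coroutine; the value later sent back in by the
# scheduler becomes the result of the await expression.

class SimpleFuture:
    def __await__(self):
        value = yield self   # suspend; scheduler supplies the result
        return value

async def read_data():
    fut = SimpleFuture()
    data = await fut
    return data

# Drive the coroutine by hand, playing the role of an event loop.
coro = read_data()
pending = coro.send(None)       # runs until the coroutine yields the future
try:
    coro.send('row-1')          # "complete" the future with a result
except StopIteration as exc:
    print(exc.value)            # -> row-1
```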

Asynchronous Context Managers and "async with"

An asynchronous context manager is a context manager that is able to suspend execution in its enter and exit methods.

To make this possible, a new protocol for asynchronous context managers is proposed. Two new magic methods are added: __aenter__ and __aexit__. Both must return an awaitable.

An example of an asynchronous context manager::

    class AsyncContextManager:
        async def __aenter__(self):
            await log('entering context')

        async def __aexit__(self, exc_type, exc, tb):
            await log('exiting context')

New Syntax

A new statement for asynchronous context managers is proposed::

    async with EXPR as VAR:
        BLOCK

which is semantically equivalent to::

    mgr = (EXPR)
    aexit = type(mgr).__aexit__
    aenter = type(mgr).__aenter__

    VAR = await aenter(mgr)
    try:
        BLOCK
    except:
        if not await aexit(mgr, *sys.exc_info()):
            raise
    else:
        await aexit(mgr, None, None, None)

As with regular with statements, it is possible to specify multiple context managers in a single async with statement.

It is an error to pass a regular context manager without __aenter__ and __aexit__ methods to async with. It is a SyntaxError to use async with outside of an async def function.

Example

With asynchronous context managers it is easy to implement proper database transaction managers for coroutines::

    async def commit(session, data):
        ...

        async with session.transaction():
            ...
            await session.update(data)
            ...

Code that needs locking also looks lighter::

    async with lock:
        ...

instead of::

    with (yield from lock):
        ...
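A runnable sketch of the __aenter__/__aexit__ protocol (the Timer class is illustrative, and asyncio.run() post-dates this PEP):

```python
import asyncio
import time

# A context manager whose enter and exit methods are coroutines and may
# therefore await; returning False from __aexit__ lets exceptions
# propagate, mirroring the regular context manager protocol.

class Timer:
    async def __aenter__(self):
        self.start = time.monotonic()
        return self

    async def __aexit__(self, exc_type, exc, tb):
        self.elapsed = time.monotonic() - self.start
        return False  # do not suppress exceptions

async def main():
    async with Timer() as t:
        await asyncio.sleep(0.01)
    return t.elapsed

print(asyncio.run(main()))
```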

Asynchronous Iterators and "async for"

An asynchronous iterable is able to call asynchronous code in its iter implementation, and an asynchronous iterator can call asynchronous code in its next method. To support asynchronous iteration:

1. An object must implement an __aiter__ method (or, if defined with
   the CPython C API, the tp_as_async.am_aiter slot) returning an
   asynchronous iterator object.

2. An asynchronous iterator object must implement an __anext__ method
   (or, if defined with the CPython C API, the tp_as_async.am_anext
   slot) returning an awaitable.

3. To stop iteration, __anext__ must raise a StopAsyncIteration
   exception.

An example of an asynchronous iterable::

    class AsyncIterable:
        def __aiter__(self):
            return self

        async def __anext__(self):
            data = await self.fetch_data()
            if data:
                return data
            else:
                raise StopAsyncIteration

        async def fetch_data(self):
            ...

New Syntax

A new statement for iterating through asynchronous iterators is proposed::

    async for TARGET in ITER:
        BLOCK
    else:
        BLOCK2

which is semantically equivalent to::

    iter = (ITER)
    iter = type(iter).__aiter__(iter)
    running = True
    while running:
        try:
            TARGET = await type(iter).__anext__(iter)
        except StopAsyncIteration:
            running = False
        else:
            BLOCK
    else:
        BLOCK2

It is a TypeError to pass a regular iterable without an __aiter__ method to async for. It is a SyntaxError to use async for outside of an async def function.

As with the regular for statement, async for has an optional else clause.

Example 1

With the asynchronous iteration protocol it is possible to asynchronously buffer data during iteration::

    async for data in cursor:
        ...

Where cursor is an asynchronous iterator that prefetches N rows of data from a database after every N iterations.

The following code illustrates the new asynchronous iteration protocol::

    class Cursor:
        def __init__(self):
            self.buffer = collections.deque()

        async def _prefetch(self):
            ...

        def __aiter__(self):
            return self

        async def __anext__(self):
            if not self.buffer:
                self.buffer = await self._prefetch()
                if not self.buffer:
                    raise StopAsyncIteration
            return self.buffer.popleft()

then the Cursor class can be used as follows::

    async for row in Cursor():
        print(row)

which would be equivalent to the following code::

    i = Cursor().__aiter__()
    while True:
        try:
            row = await i.__anext__()
        except StopAsyncIteration:
            break
        else:
            print(row)

Example 2

The following is a utility class that transforms a regular iterable to an asynchronous one. While this is not a very useful thing to do, the code illustrates the relationship between regular and asynchronous iterators::

    class AsyncIteratorWrapper:
        def __init__(self, obj):
            self._it = iter(obj)

        def __aiter__(self):
            return self

        async def __anext__(self):
            try:
                value = next(self._it)
            except StopIteration:
                raise StopAsyncIteration
            return value

    async for letter in AsyncIteratorWrapper("abc"):
        print(letter)

Why StopAsyncIteration?

Coroutines are still based on generators internally. So, before PEP 479, there was no fundamental difference between

::

    def g1():
        yield from fut
        return 'spam'

and

::

    def g2():
        yield from fut
        raise StopIteration('spam')

And since PEP 479 is accepted and enabled by default for coroutines, the following example will have its StopIteration wrapped into a RuntimeError::

    async def a1():
        await fut
        raise StopIteration('spam')

The only way to tell the outside code that the iteration has ended is to raise something other than StopIteration. Therefore, a new built-in exception class StopAsyncIteration was added.

Moreover, with semantics from PEP 479, all StopIteration exceptions raised in coroutines are wrapped in RuntimeError.
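The AsyncIteratorWrapper example can be exercised end to end with a small driver (asyncio.run() is a modern addition used only for demonstration):

```python
import asyncio

# The PEP's utility class: adapts a regular iterable to the
# asynchronous iteration protocol.

class AsyncIteratorWrapper:
    def __init__(self, obj):
        self._it = iter(obj)

    def __aiter__(self):
        return self

    async def __anext__(self):
        try:
            value = next(self._it)
        except StopIteration:
            # Translate the end-of-iteration signal: a bare
            # StopIteration would be wrapped into RuntimeError
            # (PEP 479), so StopAsyncIteration is raised instead.
            raise StopAsyncIteration
        return value

async def collect():
    letters = []
    async for letter in AsyncIteratorWrapper("abc"):
        letters.append(letter)
    return letters

print(asyncio.run(collect()))  # -> ['a', 'b', 'c']
```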

Coroutine objects

Differences from generators

This section applies only to native coroutines with the CO_COROUTINE flag, i.e. those defined with the new async def syntax. The behavior of existing *generator-based coroutines* in asyncio remains unchanged.

Great effort has been made to make sure that coroutines and generators are treated as distinct concepts:

* Native coroutine objects do not implement __iter__ and __next__
  methods. Therefore, they cannot be iterated over or passed to
  iter(), list(), tuple() and other built-ins. They also cannot be
  used in a for..in loop. An attempt to use __iter__ or __next__ on a
  native coroutine object will result in a TypeError.

* Plain generators cannot yield from native coroutines: doing so will
  result in a TypeError.

* Generator-based coroutines (for asyncio code, these must be
  decorated with @asyncio.coroutine) can yield from native coroutine
  objects.

* inspect.isgenerator() and inspect.isgeneratorfunction() return False
  for native coroutine objects and native coroutine functions.

Coroutine object methods

Coroutines are based on generators internally, thus they share the implementation. Similarly to generator objects, coroutines have throw(), send() and close() methods. StopIteration and GeneratorExit play the same role for coroutines (although PEP 479 is enabled by default for coroutines). See PEP 342, PEP 380, and the Python Documentation for details.

The throw() and send() methods for coroutines are used to push values and raise errors into Future-like objects.
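The generator/coroutine separation can be observed directly (a minimal sketch):

```python
import inspect

async def coro_fn():
    return 1

c = coro_fn()

# Native coroutines are deliberately not generators and not iterable.
print(inspect.isgenerator(c))   # -> False
print(inspect.iscoroutine(c))   # -> True

try:
    iter(c)
except TypeError:
    print("not iterable")       # -> not iterable

c.close()  # silence the "never awaited" RuntimeWarning
```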

Debugging Features

A common beginner mistake is forgetting to use yield from on coroutines::

    @asyncio.coroutine
    def useful():
        asyncio.sleep(1)  # this will do nothing without 'yield from'

For debugging this kind of mistake there is a special debug mode in asyncio, in which the @coroutine decorator wraps all functions with a special object that has a destructor logging a warning. Whenever a wrapped generator gets garbage collected, a detailed logging message is generated with information about where exactly the decorated function was defined, a stack trace of where it was collected, etc. The wrapper object also provides a convenient __repr__ function with detailed information about the generator.

The only question is how to enable these debug capabilities. Since debug facilities should be a no-op in production mode, the @coroutine decorator makes the decision of whether to wrap or not to wrap based on an OS environment variable, PYTHONASYNCIODEBUG. This way it is possible to run asyncio programs with asyncio's own functions instrumented. EventLoop.set_debug, a different debug facility, has no impact on the @coroutine decorator's behavior.

With this proposal, coroutines become a native concept, distinct from generators. In addition to a RuntimeWarning being raised on coroutines that were never awaited, it is proposed to add two new functions to the sys module: set_coroutine_wrapper and get_coroutine_wrapper. This is to enable advanced debugging facilities in asyncio and other frameworks (such as displaying where exactly a coroutine was created, and a more detailed stack trace of where it was garbage collected).

New Standard Library Functions

* types.coroutine(gen). See the types.coroutine() section for details.

* inspect.iscoroutine(obj) returns True if obj is a native coroutine
  object.

* inspect.iscoroutinefunction(obj) returns True if obj is a native
  coroutine function.

* inspect.isawaitable(obj) returns True if obj is an awaitable.

* inspect.getcoroutinestate(coro) returns the current state of a
  native coroutine object (mirrors inspect.getgeneratorstate(gen)).

* inspect.getcoroutinelocals(coro) returns the mapping of a native
  coroutine object's local variables to their values (mirrors
  inspect.getgeneratorlocals(gen)).

* sys.set_coroutine_wrapper(wrapper) allows intercepting the creation
  of native coroutine objects. wrapper must be either a callable that
  accepts one argument (a coroutine object), or None. None resets the
  wrapper. If called twice, the new wrapper replaces the previous one.
  The function is thread-specific. See Debugging Features for more
  details.

* sys.get_coroutine_wrapper() returns the current wrapper object.
  Returns None if no wrapper was set. The function is thread-specific.
  See Debugging Features for more details.
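A short sketch of the inspect additions in use (the fetch function is illustrative):

```python
import inspect

async def fetch():
    return 'data'

# Coroutine functions and objects are recognized by inspect.
print(inspect.iscoroutinefunction(fetch))  # -> True

coro = fetch()
print(inspect.iscoroutine(coro))           # -> True
print(inspect.getcoroutinestate(coro))     # -> CORO_CREATED

coro.close()
print(inspect.getcoroutinestate(coro))     # -> CORO_CLOSED
```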