How Do The Async And Await Keywords Work, Exactly? What's At The End Of The Await Chain?
Solution 1:
Have you tried looking at the source for asyncio.sleep?
@coroutine
def sleep(delay, result=None, *, loop=None):
    """Coroutine that completes after a given time (in seconds)."""
    if delay == 0:
        yield
        return result

    if loop is None:
        loop = events.get_event_loop()
    future = loop.create_future()
    h = future._loop.call_later(delay,
                                futures._set_result_unless_cancelled,
                                future, result)
    try:
        return (yield from future)
    finally:
        h.cancel()
Basically it uses loop.call_later to schedule setting the future's result, and then waits for the future. Not sure this entirely answers your questions, but it might help.
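As a rough sketch of that pattern, here is what scheduling a call_later callback that sets a future's result, and then awaiting that future, could look like using only the public asyncio API. This is an illustration with made-up names (my_sleep, main), not the actual implementation, which uses internal helpers such as futures._set_result_unless_cancelled:

import asyncio

async def my_sleep(delay, result=None):
    # Create a future, schedule a callback that sets its result after
    # `delay` seconds, then suspend until the future completes.
    loop = asyncio.get_running_loop()
    future = loop.create_future()
    handle = loop.call_later(delay, future.set_result, result)
    try:
        return await future
    finally:
        handle.cancel()

async def main():
    print(await my_sleep(1.0, "done"))

asyncio.run(main())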
Solution 2:
So, I understand a lot better how to make what I was trying to do work. This is how my code should've read:
import types

@types.coroutine
def foo(x):
    yield x
    yield x + 1

async def intermediary(y):
    await foo(y)

def bar():
    c = intermediary(5)
    try:
        while True:
            result = c.send(None)
            print(f"Got {result} from the coroutine.")
    except StopIteration as e:
        print(f"StopIteration exception: {e!r}")
The basic answer is that the endpoint of the chain can be a normal generator decorated with types.coroutine. There are more ways of making this work, and this further modification of my code demonstrates them:
import types
from collections.abc import Awaitable

@types.coroutine
def foo(x):
    sent = yield x
    print(f"foo was sent {sent!r}.")
    sent = yield x + 1
    print(f"foo was sent {sent!r}.")
    return 'generator'

class MyAwaitable(Awaitable):
    def __init__(self, x):
        super().__init__()
        self.x_ = x

    def __await__(self):
        def gen(x):
            for i in range(x - 1, x + 2):
                sent = yield i
                print(f"MyAwaitable was sent {sent!r}.")
            return 'class'
        return iter(gen(self.x_))

async def intermediary(t, y):
    awaited = await t(y)
    print(f"Got {awaited!r} as value from await.")

def runco(chain_end):
    c = intermediary(chain_end, 5)
    try:
        sendval = None
        while True:
            result = c.send(sendval)
            print(f"Got {result} from the coroutine.")
            sendval = sendval + 1 if sendval is not None else 0
    except StopIteration as e:
        print(f"StopIteration exception: {e!r}")
As you can see, anything that defines an __await__ method that returns an iterator can also be awaited upon. What really happens is that the thing being awaited is iterated over until it stops, and then the await expression returns. The reason you do this is that the final thing at the end of the chain may encounter some kind of blocking condition. It can then report on that condition (or ask for a callback to be set, or something else) by yielding or returning a value from the iterator (returning being basically the same thing as yielding). Then the top-level loop can continue on to whatever other thing can be run.
The nature of the whole chain of await calls is that when you then go back and ask for the next value from the iterator (call back into the blocked function, telling it that maybe it isn't blocked now), the entire call stack is reactivated. This whole chain exists as a way to preserve the state of the call stack while the call is blocked: basically a thread that voluntarily gives up control rather than having control wrested from it by a scheduler.
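To make that concrete, here is a minimal toy scheduler (hypothetical names, not asyncio or curio): the endpoint of the chain yields a wake-up time as its "what I'm blocked on" indicator, and the top-level loop resumes whichever coroutine chain is due next:

import heapq
import time
import types

@types.coroutine
def toy_sleep(seconds):
    # The endpoint of the chain: yield an indicator of what we're blocked on
    # (here, simply the time at which we'd like to be resumed).
    yield time.monotonic() + seconds

async def task(name, delay):
    await toy_sleep(delay)
    print(f"{name} woke up after {delay}s")

def toy_loop(coros):
    # Heap of (wake_time, tie_breaker, coroutine); the tie breaker keeps
    # heapq from ever having to compare coroutine objects.
    ready = [(0.0, i, c) for i, c in enumerate(coros)]
    heapq.heapify(ready)
    seq = len(coros)
    while ready:
        wake, _, coro = heapq.heappop(ready)
        time.sleep(max(0.0, wake - time.monotonic()))
        try:
            next_wake = coro.send(None)   # reactivate the whole await chain
        except StopIteration:
            continue                      # this chain has finished
        heapq.heappush(ready, (next_wake, seq, coro))
        seq += 1

toy_loop([task("a", 0.2), task("b", 0.1)])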
The vision in my head of how asyncio worked internally when I asked this question is apparently how something called curio works: it's based on the endpoint routines yielding some sort of indicator of what they're being blocked by, and the top-level loop that's running it all (runco in my example) then putting that in some sort of general pool of conditions to watch, so it can resume the routine as soon as the condition it's blocked by changes. In asyncio, something much more complex happens, and it uses objects with the __await__ method (like MyAwaitable in my example) and some sort of callback mechanism to make it all work.
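One way to see the asyncio side concretely is to drive a coroutine that awaits an asyncio.Future by hand: what gets yielded up the chain is the Future object itself, which is what the event loop hangs its callbacks off of. This is a small illustration, not how you would normally run asyncio code:

import asyncio

async def wait_for_result(fut):
    return await fut

loop = asyncio.new_event_loop()
fut = loop.create_future()
coro = wait_for_result(fut)

# The first send() runs until the Future's __await__ yields; what comes back
# out is the Future itself, telling the driver what to wait on.
print(coro.send(None) is fut)    # True

# Once the Future has a result, resuming the chain completes the coroutine.
fut.set_result(42)
try:
    coro.send(None)
except StopIteration as exc:
    print(exc.value)             # 42
loop.close()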
Brett Cannon wrote a really good article that talks about how generators evolved into coroutines. It goes into far more detail than I can in a StackOverflow answer.
One interesting tidbit I discovered is that when you do this:
def foo(x):
    yield 11

bar = types.coroutine(foo)
Both foo and bar become 'coroutines' and can be awaited on. All the decorator does is flip a bit in foo.__code__.co_flags. This is, of course, an implementation detail and should not be relied upon. I think this is something of a bug actually, and I may report it as such.
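You can check that claim with inspect's documented flag constant: if the decorator really does mutate the original function's code object in place, the flag shows up on foo itself and bar is the very same function object:

import inspect
import types

def foo(x):
    yield 11

bar = types.coroutine(foo)

# CO_ITERABLE_COROUTINE is the flag that marks a generator-based coroutine.
print(bool(foo.__code__.co_flags & inspect.CO_ITERABLE_COROUTINE))
print(bar is foo)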
Solution 3:
There is an example in the documentation that looks almost exactly like what you are trying to do. It contains a sleep call (used in place of real I/O), so that the asyncio aspect makes sense.
import asyncio

async def compute(x, y):
    print("Compute %s + %s ..." % (x, y))
    await asyncio.sleep(1.0)
    return x + y

async def print_sum(x, y):
    result = await compute(x, y)
    print("%s + %s = %s" % (x, y, result))

loop = asyncio.get_event_loop()
loop.run_until_complete(print_sum(1, 2))
loop.close()
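On Python 3.7 and later, the same example can also be driven with asyncio.run, which creates and closes the event loop for you:

import asyncio

asyncio.run(print_sum(1, 2))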
Solution 4:
Going through the code you have supplied above, an async def that includes a yield creates an asynchronous generator:
async def foo(x):
    yield x
    yield x + 1
To consume data from it, use async for:
async def intermediary(y):
    results = []
    async for x in foo(y):
        results.append(x)
    return results
To consume a result from a simple coroutine such as intermediary from a regular function, you will need to create an event loop and use run_until_complete():
import asyncio

loop = asyncio.get_event_loop()
result = loop.run_until_complete(intermediary(5))
print(result)
loop.close()