Issue
I'm toying around with asyncio and I can't achieve what I'm trying to do. Here's my code:
import random
import asyncio

async def my_long_operation(s: str):
    print("Starting operation on", s)
    await asyncio.sleep(3)
    print("End", s)

def random_string(length: int = 10):
    alphabet = \
        [chr(x) for x in range(ord('a'), ord('z') + 1)]
    result = ""
    for i in range(length):
        result += random.choice(alphabet)
    return result

def get_strings(n=10):
    for i in range(n):
        s = random_string()
        yield s

async def several(loop):
    tasks = list()
    for s in get_strings():
        task = asyncio.create_task(my_long_operation(s))
        asyncio.ensure_future(task, loop=loop)
    print("OK")

loop = asyncio.get_event_loop()
loop.run_until_complete(several(loop))
# loop.run_forever()
loop.close()
Now my problem is the following. I'd like to run all of my my_long_operation calls concurrently, waiting for all of them to finish. The thing is that when I run the code, I get the following error:
Task was destroyed but it is pending!
task: <Task pending coro=<my_long_operation() done, defined at test.py:4> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x7f75274f7168>()]>>
Which seems to make sense, since my script ended right after starting these operations (without waiting for them to complete).
So we arrive at my actual question: how am I supposed to do this? How can I start these tasks and wait for them to complete before terminating the script, without using run_forever (since run_forever never exits...)?
Thanks!
Solution
ensure_future is basically the "put it in the event loop and then don't bother me with it" method. What you want instead is to await the completion of all your async functions. For that, use asyncio.wait if you're not particularly interested in the results, or asyncio.gather if you want the results:
tasks = map(my_long_operation, get_strings())
await asyncio.wait(list(tasks), loop=loop)
print('OK')
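A note for newer Python versions: asyncio.wait no longer accepts a loop argument (removed in 3.10) and, since 3.11, no longer accepts bare coroutines, only tasks. Here is a minimal sketch of the same idea using asyncio.gather and asyncio.run, reusing the names from the question:

import random
import asyncio

async def my_long_operation(s: str):
    print("Starting operation on", s)
    await asyncio.sleep(3)
    print("End", s)

def random_string(length: int = 10):
    alphabet = [chr(x) for x in range(ord('a'), ord('z') + 1)]
    return "".join(random.choice(alphabet) for _ in range(length))

def get_strings(n: int = 10):
    for _ in range(n):
        yield random_string()

async def several():
    # gather schedules all coroutines concurrently and waits for every one;
    # it also returns their results as a list if you need them
    await asyncio.gather(*(my_long_operation(s) for s in get_strings()))
    print("OK")

# asyncio.run creates the event loop, runs the coroutine, and closes the loop,
# so the explicit get_event_loop / run_until_complete / close dance isn't needed
asyncio.run(several())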
Answered By - deceze