Issue
I'd like to add several jobs to the Celery queue and wait for the results. I have several ideas about how to accomplish this using some type of shared storage (memcached, Redis, a database, etc.), but I thought this was something Celery could handle automatically — I just can't find any resources on it online.
Code example
def do_tasks(b):
    for a in b:
        c.delay(a)
    return c.all_results_some_how()
Solution
For Celery >= 3.0, TaskSet is deprecated in favour of group.
from celery import group
from tasks import add
job = group([
    add.s(2, 2),
    add.s(4, 4),
    add.s(8, 8),
    add.s(16, 16),
    add.s(32, 32),
])
Start the group in the background:
result = job.apply_async()
Wait:
result.join()
Answered By - laffuste