Issue
I'm creating a Discord bot where one of the commands (I'll call it long_method()) makes a call to the YouTube API and downloads comments. This takes a while, and I want my bot to continue responding to other commands (short_method(), for example) while the comments are downloading.
Using asyncio, the definitions of my methods look like this:
@commands.command()
async def long_method(self, ctx, params):
    await ctx.send("starting long method")
    loop = asyncio.get_event_loop()
    loop.create_task(self.long_method_helper(ctx, params))

async def long_method_helper(self, ctx, params):
    # download YouTube comments
    # ...
    await ctx.send("long method complete")

@commands.command()
async def short_method(self, ctx):
    await ctx.send("short method called")
However, when I send the following to the Discord channel, without waiting for the long method to finish:
+long_method
+short_method
The bot outputs:
long method started
[ takes time to process long method ]
long method finished
short method called
Instead, I want it to output:
long method started
short method called
[ time to process long method or responses to other methods]
long method finished
Is there something I'm missing? Shouldn't the call to create_task(self.long_method_helper(ctx, params)) make it run in the background? Thanks!
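As a sanity check, here is a minimal standalone script (not part of the bot, the names are made up) showing the behaviour I'm expecting from create_task when the task genuinely awaits; asyncio.sleep stands in for the download:

import asyncio

async def long_task():
    print("long task started")
    await asyncio.sleep(5)      # stand-in for the download; yields control back to the loop
    print("long task finished")

async def short_task():
    print("short task called")

async def main():
    asyncio.create_task(long_task())   # same idea as loop.create_task(...)
    await asyncio.sleep(0)             # let the scheduled task start
    await short_task()
    await asyncio.sleep(6)             # keep the loop alive until long_task finishes

asyncio.run(main())

This prints "short task called" in between the two long-task messages, which is the ordering I'm after.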
Edit: the contents of long_method_helper:
async def search_yt_helper(self, query, confirmation_ctx):
    # get video ids and titles from Comments package
    video_ids, video_titles, channels = await Comments.youtube_search_keyword(self.api_key, query, max_results=5)
    n = 0
    new_comments = pd.DataFrame()
    logging.info(f"got {len(video_ids)} results")

    # for each video, get comments from id and add them to new_comments dataframe
    for v_id in video_ids:
        c = await Comments.youtube_get_comments(self.api_key, v_id, scrolls=3)
        n += len(c)
        temp = pd.DataFrame(c, columns=['comment'])
        new_comments = pd.concat([new_comments, temp], ignore_index=True)

    self.comments = pd.concat([new_comments, self.comments], ignore_index=True)
    ctx.send(f"added {n} comments from query")
where Comments.youtube_search_keyword()
is defined as:
youtube = build("youtube", "v3", developerKey=config.youtube_api_key)
search_keyword = youtube.search().list(q=query, part="id, snippet",
                                       maxResults=max_results).execute()
results = search_keyword.get("items", [])

titles = []
ids = []
channels = []
for result in results:
    if result['id']['kind'] == "youtube#video":
        ids.append(result['id']['videoId'])
        titles.append(result["snippet"]["title"])
        channels.append(result["snippet"]["channelTitle"])

return ids, titles, channels
and Comments.youtube_get_comments() is implemented similarly, using build from googleapiclient.discovery.
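For reference, a minimal sketch of what a function like youtube_get_comments typically looks like with googleapiclient (illustrative only, not the exact code used in the bot; treating scrolls as the number of pages to fetch is my assumption):

from googleapiclient.discovery import build

async def youtube_get_comments(api_key, video_id, scrolls=3):
    # illustrative sketch only -- not the exact implementation from the bot
    youtube = build("youtube", "v3", developerKey=api_key)
    comments = []
    page_token = None
    for _ in range(scrolls):                      # treating "scrolls" as pages to fetch
        kwargs = dict(part="snippet", videoId=video_id, maxResults=100)
        if page_token:
            kwargs["pageToken"] = page_token
        response = youtube.commentThreads().list(**kwargs).execute()   # blocking HTTP call
        for item in response.get("items", []):
            comments.append(item["snippet"]["topLevelComment"]["snippet"]["textDisplay"])
        page_token = response.get("nextPageToken")
        if not page_token:
            break
    return comments

The detail that matters for this question is that .execute() is a plain blocking HTTP call, so even though the function is declared async, nothing inside it ever yields to the event loop.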
Solution
OK, so since the process I was trying to offload was a synchronous (blocking) process, the way I solved this was by changing a few things:
import asyncio
import concurrent.futures

async def long_method(self, ctx, params):
    loop = asyncio.get_running_loop()

    with concurrent.futures.ThreadPoolExecutor() as pool:
        await ctx.send("long method started")
        # heavy processing runs in the thread pool, so the event loop stays free
        result = await loop.run_in_executor(pool, self.long_method_helper, params)
        # light processes or other async calls

    await ctx.send("long method finished")
I also changed long_method_helper to be a sync function, since that's essentially what it was before.
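Roughly, the sync version looks like this (a sketch, assuming the Comments helpers were also made sync by dropping the awaits; the return value is just whatever the command wants to report back, since a plain function can't call ctx.send itself):

def long_method_helper(self, query):
    # runs inside the ThreadPoolExecutor, so blocking calls are fine here
    video_ids, video_titles, channels = Comments.youtube_search_keyword(
        self.api_key, query, max_results=5)

    n = 0
    new_comments = pd.DataFrame()
    for v_id in video_ids:
        c = Comments.youtube_get_comments(self.api_key, v_id, scrolls=3)   # now a plain sync call
        n += len(c)
        temp = pd.DataFrame(c, columns=['comment'])
        new_comments = pd.concat([new_comments, temp], ignore_index=True)

    self.comments = pd.concat([new_comments, self.comments], ignore_index=True)
    return n   # can't await ctx.send here, so return the count instead

Back in long_method, that count comes out of run_in_executor as result, so the confirmation message (for example await ctx.send(f"added {result} comments from query")) is sent from the async command after the executor call returns.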
Answered By - nikhilk