Issue
I am building a Django app under Docker. In the views I call a task in which Paramiko makes an SSH connection from one container to another to run a third-party app. During that call the remote command zips the 'results' folder and moves it elsewhere, which takes a while. The code, however, returns to the view and looks for the zip file before it appears where it should be.
tasks.py
@shared_task()
def my_task():
    command2 = f"""sudo cp /.../ibt1.msh /.../ibt1.msh && \n
    until sudo zip -r result.zip ./output/; do sleep 5; done && \n
    until sudo mv result.zip /europlexusData/result.zip; do sleep 5; done && \n
    sudo rm -rf ./output
    """
    host2 = "+"
    port = ++
    username = "+"
    password = "+"
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(host2, port, username, password)
    stdin, stdout, stderror = ssh.exec_command(command2, get_pty=True)
    return 'Done'
views.py
my_task.delay()
file = open('zip/file/to/be/created/in/the/task')
return FileResponse(file)
Solution
Using my_task.delay() means my_task runs in parallel with the rest of your view, so it might not even have started when you call open on the next line. If you remove the call to delay(), the task runs to completion and open happens afterwards.
This is not good practice, though: your request might time out and your server stays busy for nothing. Instead you should:
- Warn your client that the task has been started.
- Once you know it is done (via a messaging service, or by waiting long enough), have your client query another route that fetches the resulting zip.
As mentioned by Martin Prikryl: that is still only one part of the problem. The other is that my_task does not wait for the remote command to finish. For that, see Paramiko SSH exec_command (shell script) returns before completion; for really correct code, see also Paramiko ssh die/hang with big output.
Answered By - Taek