Issue
I have two Python scripts (call them script_1 and script_2). Script_1 scans the network, using subprocess
to call some routines, and updates certain values; its update rate is slow. It could also run indefinitely in its own process rather than being invoked on demand. Script_2 has a main loop that performs a number of different actions, some of which depend on the latest values updated by script_1. Here is how it shouldn't be done:
### script_1 ###
from time import sleep
import random

# This could also be an infinitely running
# independent task, updating a file/database/whatever
def tedious_task():
    sleep(10)  # working hard...
    value = random.random() * 10
    return value
### script_2 ###
from script_1 import tedious_task
from time import sleep

while True:
    value = tedious_task()  # waiting...
    if value > 5:
        print("Do something")
    else:
        print("Do something else")
    print("Do other stuff")
    sleep(1)
As a side note, I do not care about logging the updated values of script_1. I just need the latest value.
I have in mind a few different ways it could be implemented, such as interrupts/callbacks, async, threading, multiprocessing, or even writing to external locations such as files or databases. However, most of them are overkill and the rest are just not ideal.
Threading is a potential candidate. I am comfortable using threads; however, I've read a lot about how threading can destabilize a system if it's not implemented right, and I need this system to be stable for a long time. "Use async IO when you can; use threading when you must", I read somewhere.
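For reference, the threading option the question considers could look something like the sketch below. The `LatestValue` holder class is my own invention, not part of the question; it just keeps the most recent value behind a lock so the main loop never waits on the slow task:

```python
import random
import threading
from time import sleep

class LatestValue:
    """Thread-safe holder for the most recent value only."""
    def __init__(self, initial):
        self._value = initial
        self._lock = threading.Lock()

    def set(self, value):
        with self._lock:
            self._value = value

    def get(self):
        with self._lock:
            return self._value

latest = LatestValue(7.0)

def tedious_task():
    # stands in for the slow network scan from script_1
    while True:
        sleep(10)  # working hard...
        latest.set(random.random() * 10)

# daemon=True: the thread is killed automatically when the program exits,
# so it cannot keep a long-running system from shutting down cleanly
threading.Thread(target=tedious_task, daemon=True).start()

for _ in range(3):  # `while True` in the real script_2
    value = latest.get()  # returns immediately with the latest value
    if value > 5:
        print("Do something")
    else:
        print("Do something else")
    print("Do other stuff")
    sleep(1)
```

Since only the newest value is stored, a slow main loop simply reads fewer values; nothing queues up.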
Async IO feels a bit overkill (not as much as using a database), but I may be mistaken. I haven't used asynchronous tasks in many years (and never in Python), so I have forgotten most of it. My main concern is that since I only need the last value returned, if my main routine slows down for some reason it will lead to a lagging series of returns from an async function. I may be missing something.
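The lagging-returns concern goes away if the background task overwrites a single variable instead of yielding a stream of results. A minimal asyncio sketch of that idea (the names here are illustrative, not from the question):

```python
import asyncio
import random

latest = 7.0  # only the newest value is kept; older results are overwritten

async def tedious_task():
    global latest
    while True:
        await asyncio.sleep(10)        # stands in for the slow scan
        latest = random.random() * 10  # overwrite, never queue

async def main():
    scanner = asyncio.create_task(tedious_task())  # runs alongside the loop below
    for _ in range(3):  # `while True` in the real program
        if latest > 5:
            print("Do something")
        else:
            print("Do something else")
        print("Do other stuff")
        await asyncio.sleep(1)
    scanner.cancel()

asyncio.run(main())
```

One caveat: this only works if the slow part is itself awaitable (e.g. via asyncio.create_subprocess_exec). A blocking subprocess call inside the task would stall the entire event loop.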
So, is there a preferable way to do this? This might pass as an opinion-based question, but what I really need is an answer based on facts.
Solution
For multiprocessing, something like the following should work:
from multiprocessing import Process, Value
from time import sleep
import random

def producer(v):
    while True:
        sleep(10)  # working hard...
        with v.get_lock():
            v.value = random.random() * 10

def worker(v):
    while True:
        value = v.value  # just grab the latest value, no waiting
        if value > 5:
            print("Do something")
        else:
            print("Do something else")
        print("Do other stuff")
        sleep(1)

if __name__ == '__main__':
    # 'd' (double) rather than 'i' (int), since the producer stores floats;
    # see the docs for multiprocessing.Value
    v = Value('d', 7.0)
    producer_process = Process(target=producer, args=(v,))
    worker_process = Process(target=worker, args=(v,))
    producer_process.start()
    worker_process.start()
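One practical addition, not part of the original answer: as written, both children loop forever and keep the parent alive. If the program ever needs to exit cleanly, the processes can be marked as daemons or terminated explicitly. A compressed sketch (one_update is a stand-in for a single slow scan):

```python
from multiprocessing import Process, Value

def one_update(v):
    # stand-in for one slow scan that stores its result
    with v.get_lock():
        v.value = 9.9

if __name__ == '__main__':
    v = Value('d', 7.0)
    # daemon=True: the child is killed when the parent exits, so an
    # endless producer loop cannot keep the program alive forever
    p = Process(target=one_update, args=(v,), daemon=True)
    p.start()
    p.join(timeout=5)  # or p.terminate() to force-stop a looping child
    print(v.value)
```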
Answered By - Iain Shelvington