I know this is a frequent question on the forum, but I couldn't find an answer to my problem.
I need two distinct scripts to communicate via multiprocessing.Pipe.
I have code1.py:
import time
#...

def code_one(pipe):
    for _ in range(100):
        pipe.send("data")
        time.sleep(1)

if __name__ == '__main__':
    code_one(parent_pipe)
and code2.py:
import time
#...

def code_two(pipe):
    while True:
        if pipe.poll():
            data = pipe.recv()
            print(data)

if __name__ == '__main__':
    code_two(child_pipe)
My problem is the following:
I need to share parent_pipe, child_pipe = Pipe() between the two scripts, but I can't figure out how to do it. I tried placing it in a third .py file, but I couldn't import the object.
Can anyone guide me?
Edit:
- Is this even possible in Python?
- What methods other than multiprocessing may help?
- Maybe Pipe isn't the right tool, but Listeners are?
Edit2:
There should be a main.py to start the processes; a minimal version is the following:
import multiprocessing
import code1
import code2

if __name__ == '__main__':
    parent_pipe, child_pipe = multiprocessing.Pipe()
    p1 = multiprocessing.Process(target=code1.code_one, args=(child_pipe,))
    p2 = multiprocessing.Process(target=code2.code_two, args=(parent_pipe,))
    p1.start()
    p2.start()
I don't know whether this solves my problem, because I want to run my program in the background with two different arguments that share one resource:
nohup python3 my_package.py --send &
nohup python3 my_package.py --recieve &
To that end, the following code is used:
import multiprocessing
from code1 import code_one
from code2 import code_two
import argparse

if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('-s', '--send', action='store_true')
    parser.add_argument('-r', '--recieve', action='store_true')
    args = parser.parse_args()

    # Create a pipe
    parent_pipe, child_pipe = multiprocessing.Pipe()

    # Spawn process for code_one
    p1 = multiprocessing.Process(target=code_one, args=(parent_pipe,))
    if args.send:
        p1.start()

    # Spawn process for code_two
    p2 = multiprocessing.Process(target=code_two, args=(child_pipe,))
    if args.recieve:
        p2.start()

    # Join the processes (optional, based on your use case)
    #p1.join()
    #p2.join()
First, you need to create connections from your code1 and code2 scripts, either as the parent or the child end of the communication channel. Otherwise, there is no bridge between them to communicate over. Modify them like this:
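When the two scripts are launched as separate programs (as with the nohup commands above), a plain Pipe() object cannot be shared between them; multiprocessing.connection provides Listener and Client for exactly that case. Below is a minimal sketch of this idea. The port 6000 and the authkey b'secret' are arbitrary assumptions (pick your own); in practice the receiver half goes in code2.py and the sender half in code1.py, but they are shown in one file with a thread so the example runs as-is:

```python
import threading
from multiprocessing.connection import Client, Listener

ADDRESS = ('localhost', 6000)  # assumption: any free local port
AUTHKEY = b'secret'            # assumption: secret shared by both scripts

received = []
ready = threading.Event()

def receiver():
    # code2.py side: bind, accept one connection, read until the sender closes
    with Listener(ADDRESS, authkey=AUTHKEY) as listener:
        ready.set()  # only needed in this single-file demo
        with listener.accept() as conn:
            while True:
                try:
                    received.append(conn.recv())
                except EOFError:  # sender closed its end
                    break

t = threading.Thread(target=receiver)
t.start()
ready.wait()

# code1.py side: connect to the listener and send messages
with Client(ADDRESS, authkey=AUTHKEY) as conn:
    for _ in range(3):
        conn.send("data")

t.join()
print(received)  # → ['data', 'data', 'data']
```

The Connection objects returned by Client() and listener.accept() support the same send()/recv()/poll() interface as the ends of a Pipe(), so code_one and code_two can stay unchanged.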
Then you can run both of these files, connected through the multiprocessing pipe, with a third script:
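Such a third script could look like the sketch below. It assumes code1.py and code2.py expose code_one(pipe) and code_two(pipe) as in the question; the two functions are inlined here (with a shortened loop and a None sentinel, both my additions) so the example is self-contained and terminates:

```python
# main.py sketch -- in the real project, replace the two inlined functions
# with: from code1 import code_one; from code2 import code_two
import multiprocessing
import time

def code_one(pipe):        # sender, as in code1.py
    for _ in range(3):     # assumption: 3 instead of 100 to keep the demo short
        pipe.send("data")
        time.sleep(0.1)
    pipe.send(None)        # assumption: sentinel so the receiver can stop

def code_two(pipe):        # receiver, as in code2.py
    while True:
        data = pipe.recv() # blocking recv avoids the busy-wait poll() loop
        if data is None:   # stop on the sentinel
            break
        print(data)

if __name__ == '__main__':
    parent_pipe, child_pipe = multiprocessing.Pipe()
    p1 = multiprocessing.Process(target=code_one, args=(parent_pipe,))
    p2 = multiprocessing.Process(target=code_two, args=(child_pipe,))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
```

Note that this only works because main.py creates the pipe itself and hands one end to each child it spawns; two processes started independently (e.g. by two separate nohup invocations) never share the Pipe object, which is why Listeners are the alternative for that case.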