Interprocess Communication with multiprocessing.Pipe


I know that this is a frequent question on the forum, but I couldn't find an answer to my problem.

I need two distinct scripts to communicate via multiprocessing.Pipe.

I have code1.py:

import time
#...

def code_one(pipe):
    for _ in range(100):
        pipe.send("data")
        time.sleep(1)

if __name__ == '__main__':
    code_one(parent_pipe)

and code2.py:

import time
#...

def code_two(pipe):
    while True:
        if pipe.poll():
            data = pipe.recv()
            print(data)

if __name__ == '__main__':
    code_two(child_pipe)

My problem is the following:

I then need to create parent_pipe, child_pipe = Pipe() somewhere and share its two ends between the scripts, but I can't work out how. I tried placing it in a third .py file, but I couldn't get importing the object to work.

Can anyone guide me?

Edit:

  1. Is this even possible in Python?
  2. What methods other than multiprocessing may help?
  3. Maybe Pipe isn't the right tool, but Listeners are?
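(For context on question 2: one method that does work between independently started scripts is a POSIX named pipe, a FIFO, because it lives on the filesystem and both sides open it by path. A minimal sketch, POSIX-only, with the path chosen arbitrarily and a thread standing in for the second script so it runs in one file:)

```python
import os
import tempfile
import threading

# A named pipe (FIFO) is a filesystem object, so two scripts started
# independently can both open it by path -- unlike multiprocessing.Pipe,
# whose ends only exist inside one parent and the children it spawns.
fifo_path = os.path.join(tempfile.mkdtemp(), "demo_fifo")
os.mkfifo(fifo_path)  # POSIX-only

def sender():
    # In the real setup this would live in code1.py as its own process.
    with open(fifo_path, "w") as fifo:
        fifo.write("data\n")

# A thread stands in for the second script so this sketch is runnable.
t = threading.Thread(target=sender)
t.start()

# Receiver side (code2.py in the real setup): open() blocks until a
# writer opens the other end, so no polling is needed.
with open(fifo_path) as fifo:
    received = fifo.readline().strip()
t.join()
print(received)
```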

Edit 2: There should be a main.py to start the processes; a minimal case is the following:

import multiprocessing

import code1
import code2

parent_pipe, child_pipe = multiprocessing.Pipe()

if __name__ == '__main__':
    p1 = multiprocessing.Process(target=code1.code_one, args=(child_pipe,))
    p2 = multiprocessing.Process(target=code2.code_two, args=(parent_pipe,))

    p1.start()
    p2.start()

I don't know whether this solves my problem, because I want to run my program in the background with two different arguments sharing one resource:

nohup python3 my_package.py --send &
nohup python3 my_package.py --receive &

In that case, the following code is used:

import multiprocessing
from code1 import code_one
from code2 import code_two

import argparse

if __name__ == '__main__':

    parser = argparse.ArgumentParser()
    parser.add_argument('-s', '--send', action='store_true')
    parser.add_argument('-r', '--receive', action='store_true')
    args = parser.parse_args()

    # Create a pipe
    parent_pipe, child_pipe = multiprocessing.Pipe()

    # Spawn process for code_one
    p1 = multiprocessing.Process(target=code_one, args=(parent_pipe,))
    if args.send:
        p1.start()

    # Spawn process for code_two
    p2 = multiprocessing.Process(target=code_two, args=(child_pipe,))
    if args.receive:
        p2.start()

    # Join the processes (optional, based on your use case)
    #p1.join()
    #p2.join()
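(Note that the two nohup commands start two unrelated interpreters, so a Pipe created inside either of them cannot reach the other. What does work between independently started processes is multiprocessing.connection's Listener and Client, which rendezvous over a socket address both sides agree on. A minimal sketch; the socket path and authkey are arbitrary choices, and a thread stands in for the second interpreter so it runs in one file:)

```python
import os
import tempfile
import threading
from multiprocessing.connection import Client, Listener

# Both sides must agree on these; the path and key here are arbitrary.
# On Windows, use a ('localhost', port) tuple as the address instead.
ADDRESS = os.path.join(tempfile.mkdtemp(), "my_package.sock")
AUTHKEY = b"secret"

received = []
ready = threading.Event()

def receiver():
    # Receiver side -- would be code2.py, started with `--receive`.
    with Listener(ADDRESS, authkey=AUTHKEY) as listener:
        ready.set()  # the listener is now bound and accepting
        with listener.accept() as conn:
            received.append(conn.recv())

# A thread stands in for the second interpreter so the sketch is runnable.
t = threading.Thread(target=receiver)
t.start()
ready.wait()

# Sender side -- would be code1.py, started with `--send`.
with Client(ADDRESS, authkey=AUTHKEY) as conn:
    conn.send("data")
t.join()
print(received[0])
```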

There are 2 best solutions below

Musabbir Arrafi (Best Answer)

First, you need to create the pipe connections, either the parent or the child end, in your code1 and code2 scripts; otherwise there is no bridge between them to communicate. Modify them like this:

  • for the parent (code1.py):

import multiprocessing

if __name__ == '__main__':
    # Create a connection end (either parent or child)
    parent_pipe, child_pipe = multiprocessing.Pipe()
    code_one(parent_pipe)

  • for the child (code2.py):

import multiprocessing

if __name__ == '__main__':
    # Create a connection end (either parent or child)
    parent_pipe, child_pipe = multiprocessing.Pipe()
    code_two(child_pipe)

Then you can run both of these files connected through the multiprocessing pipe with a third script:

import multiprocessing
from code1 import code_one
from code2 import code_two

if __name__ == '__main__':
    # Create a pipe
    parent_pipe, child_pipe = multiprocessing.Pipe()

    # Spawn process for code_one
    p1 = multiprocessing.Process(target=code_one, args=(parent_pipe,))
    p1.start()

    # Spawn process for code_two
    p2 = multiprocessing.Process(target=code_two, args=(child_pipe,))
    p2.start()

    # Join the processes (optional, based on your use case)
    p1.join()
    p2.join()
KamilCuk

Can anyone guide me?

It is impossible for unrelated processes to communicate through a multiprocessing.Pipe.

Is it ever possible in python?

No. It is also not possible in anything else: an anonymous pipe can only be shared between a process and the children it spawns.

What other methods than multiprocessing may help?

Wikipedia lists methods of IPC: https://en.wikipedia.org/wiki/Inter-process_communication. Pick one and use it.

For example, tmux uses a socket file at /tmp/tmux-<UID>/here.
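(One entry from that list, a Unix-domain socket like the tmux one above, can be used directly from Python's socket module. A minimal sketch; the socket path is an arbitrary choice, and a thread stands in for the second process so it runs in one file:)

```python
import os
import socket
import tempfile
import threading

# Any path both processes know works; tmux-style paths live under /tmp.
SOCK_PATH = os.path.join(tempfile.mkdtemp(), "app.sock")

messages = []
ready = threading.Event()

def server():
    # One process binds the socket file and waits for a peer.
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as srv:
        srv.bind(SOCK_PATH)
        srv.listen(1)
        ready.set()  # the socket file now exists and is accepting
        conn, _ = srv.accept()
        with conn:
            messages.append(conn.recv(1024).decode())

t = threading.Thread(target=server)  # stands in for the second process
t.start()
ready.wait()

# The other process connects by the same path and sends bytes.
with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as cli:
    cli.connect(SOCK_PATH)
    cli.sendall(b"data")
t.join()
print(messages[0])
```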

Maybe Pipe isn't the tool but Listeners are?

No. Multiprocessing, as https://docs.python.org/3/library/multiprocessing.html says, is "a package that supports spawning processes". Your processes are separate, not spawned from one parent process, so this package is unrelated to your problem.