How to broadcast with distributed TensorFlow


I want to broadcast some values from the chief to all workers with distributed TensorFlow, like MPI's bcast: https://mpi4py.readthedocs.io/en/stable/tutorial.html#collective-communication
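For reference, this is the kind of pattern I want to reproduce, roughly adapted from the mpi4py tutorial linked above (the payload dictionary here is just a placeholder):

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    # The root (chief) owns the data to be broadcast.
    data = {"step": 0, "learning_rate": 0.001}
else:
    data = None

# After this call, every rank holds the root's copy of the data.
data = comm.bcast(data, root=0)
```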

I guess broadcast_send or tf.raw_ops.CollectiveBcastSend is the right operation, but I could not find any examples in the official TensorFlow documentation.
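Here is a minimal sketch of what I imagine the calls would look like; the group_size, group_key, and instance_key values are placeholders I made up, and I am not sure this is the intended usage:

```python
import tensorflow as tf

# Placeholder values -- I do not know the intended conventions for these.
GROUP_SIZE = 4      # total number of participating workers
GROUP_KEY = 1       # identifier shared by all members of the group
INSTANCE_KEY = 100  # identifier for this particular broadcast

def broadcast(value, is_chief):
    if is_chief:
        # The chief sends its tensor to the whole group.
        return tf.raw_ops.CollectiveBcastSend(
            input=value,
            group_size=GROUP_SIZE,
            group_key=GROUP_KEY,
            instance_key=INSTANCE_KEY,
            shape=value.shape)
    # Every other worker receives the chief's tensor.
    return tf.raw_ops.CollectiveBcastRecv(
        T=value.dtype,
        group_size=GROUP_SIZE,
        group_key=GROUP_KEY,
        instance_key=INSTANCE_KEY,
        shape=value.shape)
```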

Is there a good example of using such low-level distributed operations?
