Reading the TensorFlow text summarization model's documentation, it states: "The results described below are based on model trained on multi-gpu and multi-machine settings. It has been simplified to run on only one machine for open source purpose."
Further on in the guide, this command is invoked:

```
bazel build -c opt --config=cuda textsum/...
```
Does this command not relate to CUDA/GPU? And why does the command appear truncated?
This is a bazel command: `--config=cuda` means "use the CUDA build configuration (and generate GPU-compatible code)", and `textsum/...` means "all targets under the directory `textsum`". In other words, the command isn't truncated, and you should type the literal `...` when entering it.
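For illustration, here is a sketch of how the `...` wildcard behaves (the individual target names shown in the comments are hypothetical examples, not an actual listing of the textsum package):

```sh
# Build every target under textsum/ using the CUDA build configuration,
# with optimizations enabled (-c opt). This is the command from the guide.
bazel build -c opt --config=cuda textsum/...

# The wildcard is roughly equivalent to naming each target explicitly,
# e.g. something like (illustrative target names only):
#   bazel build -c opt --config=cuda //textsum:some_binary //textsum:some_library

# To see which targets the pattern actually expands to, you can query it:
bazel query 'textsum/...'
```

The `--config=cuda` flag only selects how the code is compiled; whether a GPU is used at run time still depends on your local CUDA installation and the hardware available on the machine.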