How to run a pyspark script without showing the running process in the window


I have a pyspark file test.py on a server and I want to run this file with an argument (teo).

spark-submit --driver-memory=32g --executor-memory=32g test.py teo 2>&1 test.logs

When I run it, the whole process output is printed in the terminal window, but I do not want that. Instead, I want to store all of the output in a .logs file.



User12345 (best answer)

Please try the following:

spark-submit --driver-memory=32g --executor-memory=32g test.py teo >> test.logs 2>&1 

This redirects both standard output and standard error to test.logs, so nothing is printed to the terminal.
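
Two details worth noting. First, in your original command test.logs was not preceded by a redirection operator, so the shell passed it to the script as a second argument and nothing was written to the file. Second, the order matters: the file redirection (>> test.logs) must come before 2>&1 so that stderr follows stdout into the file. As a minimal sketch, assuming a bash-like shell, you can also add nohup and a trailing & (these are my additions, not part of the answer above) so the job keeps running in the background after you close the terminal:

nohup spark-submit --driver-memory=32g --executor-memory=32g test.py teo >> test.logs 2>&1 &

Use > instead of >> if you prefer to overwrite test.logs on each run rather than append to it.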