Suppressing pyspark warnings before creating session


I am having trouble suppressing pyspark warnings when working locally on my computer. I get the following warnings and would like to suppress them:

WARN Utils: Your hostname, [HOSTNAME] resolves to a loopback address: 127.0.1.1; using 192.168.26.41 instead (on interface [INTERFACE])
WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
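
For the first warning, I know the message itself suggests a fix: setting SPARK_LOCAL_IP before the session is created makes it go away. A minimal sketch (127.0.0.1 is just an example address, not a recommendation):

import os

# Set SPARK_LOCAL_IP before the JVM is launched, as the first warning suggests.
# 127.0.0.1 is an example; use whatever address Spark should bind to.
os.environ["SPARK_LOCAL_IP"] = "127.0.0.1"

from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()

That only avoids one message, though; I want to silence all of them.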

Here is what I have tried to suppress the warnings, but none of these methods works:

from pyspark.sql import SparkSession

method_number = 1  # or 2, or 3

def get_spark_object():
    return SparkSession.builder.getOrCreate()

if method_number == 1:
    # Attempt 1: redirect stdout to a throwaway file around session creation.
    import sys
    save_stdout = sys.stdout
    sys.stdout = open('trash', 'w')
    spark = get_spark_object()
    sys.stdout.close()
    sys.stdout = save_stdout

elif method_number == 2:
    # Attempt 2: silence everything the warnings module can catch.
    import warnings
    warnings.filterwarnings('ignore')
    warnings.simplefilter('ignore')
    spark = get_spark_object()

elif method_number == 3:
    # Attempt 3: the third-party shutup package.
    import shutup
    shutup.please()
    spark = get_spark_object()
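
My suspicion is that these attempts fail because the messages are not produced by the Python process at all: pyspark launches a JVM child process that inherits the real stderr file descriptor, so rebinding sys.stdout or filtering Python warnings never touches its output. The one thing I would expect to intercept them is an OS-level redirect of file descriptor 2. A rough sketch (POSIX-only, and it also hides genuine errors printed during startup):

import os
from pyspark.sql import SparkSession

# Duplicate the real stderr so it can be restored afterwards.
saved_stderr = os.dup(2)
devnull = os.open(os.devnull, os.O_WRONLY)
os.dup2(devnull, 2)  # fd 2 now points at /dev/null; the JVM child inherits it
try:
    spark = SparkSession.builder.getOrCreate()
finally:
    os.dup2(saved_stderr, 2)  # restore the original stderr
    os.close(saved_stderr)
    os.close(devnull)

Is there a cleaner way, e.g. via $SPARK_HOME/conf/log4j2.properties (log4j.properties on Spark < 3.3), that suppresses these startup warnings without swallowing stderr entirely?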
