I have a problem setting up custom environment variables in Dataproc Serverless Interactive PySpark sessions. Following the documentation, I set my environment variable using spark.dataproc.driverEnv.MY_VAR=my_value. In batch mode the variable is accessible as expected; in interactive mode, however, it is not. Any ideas what might be wrong, or is there a workaround for interactive sessions?
I set them up with the spark.dataproc.driverEnv.MY_VAR=my_value config in the session template, then read them with:
import os
os.getenv("MY_VAR")
This is because Dataproc Serverless Sessions does not support the spark.dataproc.driverEnv.* properties.
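Given that, one possible workaround for interactive sessions is to set the variable yourself at the top of the notebook, since code in a notebook cell runs inside the driver process. This is a sketch, not a documented Dataproc feature; the hard-coded value here stands in for wherever you actually source the configuration (a GCS file, Secret Manager, etc.):

```python
import os

# Workaround sketch: the session template does not export
# spark.dataproc.driverEnv.* as environment variables in interactive
# mode, so set the variable directly in the driver process.
# "my_value" is a placeholder for however you obtain the real value.
os.environ.setdefault("MY_VAR", "my_value")

print(os.getenv("MY_VAR"))
```

Because notebook cells execute in the driver, anything that calls os.getenv("MY_VAR") afterwards in that session will see the value; executors, however, will not inherit it this way.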