Zeppelin Spark Interpreter "scoped" mode - close Spark SQL session?


In "scoped" binding mode for Zeppelin notebook of Spark Interpreter, from the docs here and here, each notebook is supposed to share the same Spark application, but create a new Spark SQL session.

Now, when a notebook finishes, we cannot simply restart the interpreter, because other notebooks may still be running against the same application. What needs to be done to release the resources used by that specific notebook once it has finished?

Should the notebook's Spark SQL session be closed or stopped explicitly, or does Zeppelin automatically know when and how to release those resources? I did not see anything about this in the Zeppelin docs.
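
For context, this is the kind of manual cleanup I have been considering in a notebook's final paragraph (just a sketch; `bigDf` and `staging` are placeholder names from my own code), but I am not sure whether it is sufficient, or even necessary, in scoped mode:

```scala
// Last paragraph of the notebook: release what this notebook allocated explicitly.

// Free memory/disk held by DataFrames this notebook cached.
bigDf.unpersist()

// Drop session-scoped temp views created by this notebook.
spark.catalog.dropTempView("staging")

// Deliberately NOT calling spark.stop() (or spark.close(), which as far as I can tell
// just delegates to stop()): in scoped mode the SparkSession wraps the SparkContext
// shared by all notebooks, so stopping it would kill the whole application for everyone.
```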

Thanks!
