Fail to open SparkInterpreter in Apache Zeppelin


I've encountered an issue while deploying Apache Zeppelin on Rancher using a Helm chart. The error appears as shown in this screenshot: Error

Steps to Reproduce the Issue:

  1. Restart the Spark interpreter.
  2. Execute a Spark job.
  3. While the Spark job is running, click 'Add Paragraph', enter any text into the new paragraph, and repeat this one or more times.
  4. Upon completion of the Spark job, the error shown in the screenshot appears.

Notably, the Spark job runs without issues if it is left undisturbed during its execution.

Here are the logs from `kubectl logs` for the Spark service: Log on spark service

I have searched for solutions online but have not found any relevant information.

Below is a snippet of some relevant properties I configured in the Spark interpreter: Spark Properties

I have tried increasing the Spark core and memory allocations, but the issue persists.
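Since the properties screenshot is not reproduced here, a minimal sketch of the kind of interpreter settings involved (the values below are illustrative, not the ones actually configured):

```properties
# Illustrative values only -- the real values are in the
# "Spark Properties" screenshot referenced above.
spark.driver.memory=2g
spark.executor.memory=2g
spark.executor.cores=2

# "Fail to open SparkInterpreter" can surface when the interpreter
# process takes longer than this timeout (in milliseconds) to start;
# raising it is a common first thing to check.
zeppelin.interpreter.connect.timeout=600000
```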
