Almond sh Scala kernel: error creating NotebookSparkSession on Windows

I set up the almond sh kernel for jupyter-lab on Windows. It works well when running Scala code, but when I create a NotebookSparkSession, I hit an issue like this:

val spark = {
    NotebookSparkSession.builder().master("yarn").getOrCreate()
}

Error java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: jar:file:/D:/Users/<user-name>/AppData/Roaming/jupyter/kernels/scala/launcher.jar!/coursier/bootstrap/launcher/jars/ammonite-compiler-interface_2.12.10-2.5.6-4-4a07420b-sources.jar!

The first jar file is the launcher: D:/Users/<user-name>/AppData/Roaming/jupyter/kernels/scala/launcher.jar. The second jar file is a jar nested inside launcher.jar: /coursier/bootstrap/launcher/jars/ammonite-compiler-interface_2.12.10-2.5.6-4-4a07420b-sources.jar
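
For reference, here is a minimal sketch of why I think the URI is rejected (the shortened path is hypothetical, and I am assuming Spark/Hadoop rebuilds the URI through one of java.net.URI's multi-argument constructors, which require the path component to begin with '/' once a scheme is present):

import java.net.{URI, URISyntaxException}

// Hypothetical shortened form of the failing URI, for illustration only.
val nestedPath = "file:/D:/launcher.jar!/inner/sources.jar"

// With scheme "jar" present, a path beginning with "file:" instead of "/"
// counts as a relative path inside an absolute URI, so this throws
// URISyntaxException: "Relative path in absolute URI".
try new URI("jar", null, nestedPath, null)
catch { case e: URISyntaxException => println(e.getMessage) }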

Versions:

  • almond 0.13.4
  • Scala 2.12.10
  • Spark 3.2.2

Steps to set up the kernel:

  • install Java JDK 1.8.0
  • install jupyter-lab
  • set up the almond sh kernel (a possible workaround is sketched after these steps):
bitsadmin /transfer downloadCoursierCli https://git.io/coursier-cli "%cd%\coursier"
bitsadmin /transfer downloadCoursierBat https://git.io/coursier-bat "%cd%\coursier.bat"
echo -n | openssl s_client -showcerts -connect repo1.maven.org:443 | sed -ne '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/p' > maven.crt
set JAVA_HOME=C:\jdk1.8.0_241
"%JAVA_HOME%\bin\keytool.exe" -import -trustcacerts -keystore maven.crt -storepass changeit -noprompt -alias maven -file maven.crt

coursier bootstrap --standalone almond:0.13.4 --scala 2.12.10 -o almond 
almond --install --force
  • configure the kernel (kernel.json):
{
  "argv": [
    "java",
    "-jar",
    "D:\\Users\\<user-name>\\AppData\\Roaming\\jupyter\\kernels\\scala\\launcher.jar",
    "--connection-file",
    "{connection_file}"
  ],
  "display_name": "Scala",
  "language": "scala",
  "env": {
    "JAVA_HOME": "C:\\jdk1.8.0_241",
    "HADOOP_HOME": "D:\\DATA\\Environment\\hadoop-2.7.2",
    "HADOOP_CONF_DIR": "D:\\DATA\\Environment\\hadoop-2.7.2\\etc\\hadoop",
    "SPARK_HOME": "D:\\DATA\\Environment\\spark-3.2.2-bin-hadoop2.7",
    "COURSIER_CACHE": "D:\\Users\\<user-name>\\AppData\\Local\\Coursier\\cache\\v1",
    "JVM_OPT": "-Dhttp.proxyHost=http://proxy.xxx.com.vn -Dhttp.proxyPort=8080 -Dhttps.proxyHost=http://proxy.xxx.com.vn -Dhttps.proxyPort=8080"
  }
}
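
(An aside, not related to the URI error: as far as I know the standard Java networking properties http.proxyHost and https.proxyHost expect a bare hostname without the http:// prefix, and I have not verified that the launcher actually reads JVM_OPT. If it does, the value would presumably need to look like this:)

"JVM_OPT": "-Dhttp.proxyHost=proxy.xxx.com.vn -Dhttp.proxyPort=8080 -Dhttps.proxyHost=proxy.xxx.com.vn -Dhttps.proxyPort=8080"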
  • run the kernel in jupyter-lab:
import $ivy.`org.apache.spark::spark-sql:3.2.2`
import $ivy.`sh.almond::almond-spark:0.13.4`
import org.apache.spark.sql._
val spark = {
    NotebookSparkSession.builder().master("yarn").getOrCreate()
}
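
Since the error points at a jar nested inside launcher.jar, and the --standalone flag is what packs every dependency into the launcher, one workaround I am considering (an assumption on my part, not a confirmed fix) is to regenerate the launcher without --standalone, so the dependency jars stay on disk as plain files:

coursier bootstrap almond:0.13.4 --scala 2.12.10 -o almond
almond --install --force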

How can I fix this? Thank you.

Related GitHub issue: https://github.com/almond-sh/almond/issues/1082
