Error importing Spark in a Scala 2.12.10 project in IntelliJ IDEA


I am trying to run Scala Spark code in IntelliJ IDE.

I have created the following build.sbt file in my project root directory:

name := "SimpleProject"

version := "0.1"

scalaVersion := "2.12.10"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.8",
  "org.apache.spark" %% "spark-sql" % "2.4.8"
)
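
For reference, the %% operator tells sbt to append the Scala binary version (_2.12 here) to the artifact name, so the spark-core line above resolves to the artifact spark-core_2.12. Written out explicitly, the equivalent form would be:

libraryDependencies += "org.apache.spark" % "spark-core_2.12" % "2.4.8"

Spark 2.4.8 is published for Scala 2.12, so this version pairing is valid.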

I am using Scala 2.12.10 and Spark 2.4.8.

The build is failing with the error "object apache is not a member of package org".


This is the object I have created in my Scala file:

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

object Main {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)

  }
}
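
For reference, once the dependency resolves, running this object directly from IntelliJ will also need a master URL at runtime. A minimal local-mode sketch, assuming the job should run in-process rather than be submitted to a cluster:

import org.apache.spark.{SparkConf, SparkContext}

object Main {
  def main(args: Array[String]): Unit = {
    // "local[*]" runs Spark inside this JVM using all available cores,
    // which is what running from the IDE needs
    val conf = new SparkConf()
      .setAppName("Simple Application")
      .setMaster("local[*]")
    val sc = new SparkContext(conf)
    // ... job logic goes here ...
    sc.stop() // release resources when done
  }
}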

I tried reloading the project in the sbt tab, and the build seems to be successful.


Then I tried sbt clean package, but I am still getting the same error:

[error] object apache is not a member of package org
[error] import org.apache.spark.SparkContext

I have been stuck here for a while now. My assumption was that sbt would resolve Spark automatically once it is declared as a dependency in build.sbt, but it doesn't seem to work.
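
One way to verify whether sbt actually resolved the Spark artifacts (assuming sbt 1.x syntax, run from the project root) is to print the compile classpath and check that spark-core_2.12-2.4.8.jar appears in it:

sbt "show Compile / dependencyClasspath"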

I would appreciate it if someone could help me get this working with Spark in Scala.

There is 1 solution below.

joshuaeisberg answered:

Have you tried clearing the IntelliJ cache?

-> Select File | Invalidate Caches from the main menu
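
If invalidating caches does not help, a common follow-up is to rebuild from the command line to confirm the dependency resolves outside the IDE, and then re-import the sbt project:

sbt clean update compile

If that compiles cleanly, the remaining problem is IntelliJ's project model rather than the build itself.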