I wrote an example with Spark Maven support in IntelliJ IDEA.
The Spark version is 2.0.0, the Hadoop version is 2.7.3, and the Scala version is 2.11.8. The system environment and the IDE use the same versions. The application then fails with this error:
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;
    at org.apache.spark.ui.jobs.StagePage.<init>(StagePage.scala:44)
    at org.apache.spark.ui.jobs.StagesTab.<init>(StagesTab.scala:34)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:62)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:215)
    at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:157)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:443)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:149)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:185)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:92)
    at com.spark.test.WordCountTest.main(WordCountTest.java:25)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
Spark 2.0.0 is built with Scala 2.11 by default, so a NoSuchMethodError on scala.Predef$.$scope() almost always means a Scala binary compiled for a different version (typically 2.10) is on the classpath. Make sure the Scala framework support (SDK) you add in IDEA and every Scala-suffixed Maven artifact use the same 2.11.x version.
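One common way the mismatch happens is a Maven dependency whose artifact-ID suffix does not match the project's Scala version (e.g. spark-core_2.10 with a 2.11 SDK). A minimal sketch of a consistent dependency section for the versions stated in the question (the property names are illustrative, not required by Maven or Spark):

```xml
<properties>
  <!-- Keep one Scala version for the SDK, the library, and all _2.xx artifacts -->
  <scala.version>2.11.8</scala.version>
  <scala.binary.version>2.11</scala.binary.version>
</properties>

<dependencies>
  <!-- The _2.11 suffix must match the Scala SDK configured in IDEA -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.0</version>
  </dependency>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.8</version>
  </dependency>
</dependencies>
```

To find a stray 2.10 artifact pulled in transitively, `mvn dependency:tree -Dincludes=org.scala-lang` shows which scala-library versions end up on the classpath.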