I wrote an example with Spark Maven support in IntelliJ IDEA. The Spark version is 2.0.0, the Hadoop version is 2.7.3, and the Scala version is 2.11.8. The environment in the system and in the IDE is the same version. The application then fails with this error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;
	at org.apache.spark.ui.jobs.StagePage.<init>(StagePage.scala:44)
	at org.apache.spark.ui.jobs.StagesTab.<init>(StagesTab.scala:34)
	at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:62)
	at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:215)
	at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:157)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:443)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:149)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:185)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:92)
	at com.spark.test.WordCountTest.main(WordCountTest.java:25)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)

There's a version mismatch somewhere. Check your cluster, sbt, etc. – Reactormonk Oct 9 '16 at 10:50

Spark 2.0.0 is built against Scala 2.11 by default, so the Scala version your project compiles against must match the one your Spark artifacts were built for. A NoSuchMethodError on scala.Predef$.$scope is the typical symptom of mixing Scala 2.10 and 2.11 binaries: make sure the Scala framework support you add in IDEA and the _2.1x suffix on your Spark Maven artifacts agree (for Spark 2.0.0, use the _2.11 artifacts with Scala 2.11).
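For illustration, a minimal sketch of the relevant pom.xml dependencies, assuming the standard Maven coordinates for Spark 2.0.0 and Scala 2.11.8 (adjust versions to your setup):

```xml
<dependencies>
  <!-- Scala library matching the Spark artifact's Scala suffix -->
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.8</version>
  </dependency>
  <!-- Note the _2.11 suffix: it must match the Scala version above -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.0</version>
  </dependency>
</dependencies>
```

If the suffix were _2.10 here while the project compiles with Scala 2.11 (or vice versa), you would see exactly this kind of NoSuchMethodError at runtime.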
