I don't know what is wrong here. When I run it, I keep getting "Error: Could not find or load main class com.sundogsoftware.spark.RatingsCounter" in my Scala IDE.
This is my Scala code:
package com.sundogsoftware.spark

import org.apache.spark._
import org.apache.spark.SparkContext._
import org.apache.log4j._

/** Count up how many of each star rating exists in the MovieLens 100K data set. */
object RatingsCounter {

  /** Our main function where the action happens */
  def main(args: Array[String]) {

    // Set the log level to only print errors
    Logger.getLogger("org").setLevel(Level.ERROR)

    // Create a SparkContext using every core of the local machine, named RatingsCounter
    val sc = new SparkContext("local[*]", "RatingsCounter")

    // Load up each line of the ratings data into an RDD
    val lines = sc.textFile("../ml-100k/u.data")

    // Split each line by tabs and extract the third field.
    // (The file format is userID, movieID, rating, timestamp)
    val ratings = lines.map(x => x.split("\t")(2))

    // Count up how many times each value (rating) occurs
    val results = ratings.countByValue()

    // Sort the resulting map of (rating, count) tuples
    val sortedResults = results.toSeq.sortBy(_._1)

    // Print each result on its own line
    sortedResults.foreach(println)
  }
}
Here is my project structure.
Here is my run configuration.
Here is the Scala compiler option I selected.
I've been trying to debug this for a few hours now, and nothing seems to be working.
Any pointers would help.



Check out https://wiki.eclipse.org/Eclipse.ini. I had to change the -vm argument in my eclipse.ini file, and for the JRE option I selected 'Use default JRE (currently 'Java SE 8 [1.8.0_172]')' when I created the Scala project. That fixed this error for me.
I am using OS X, so I had to add the -vm entry above -vmargs; see the sketch below.
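For reference, here is a minimal sketch of that part of eclipse.ini. The JDK path is an assumption based on the 'Java SE 8 [1.8.0_172]' JRE mentioned above, so substitute wherever your JDK actually lives; per the Eclipse wiki page linked above, -vm and its path go on separate lines and must appear before -vmargs (any lines already under -vmargs stay where they are):

    -vm
    /Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/bin
    -vmargs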