Getting error while compiling Spark Application in Eclipse



I am getting the below error when compiling my Scala object in Eclipse.

"17/10/15 18:21:18 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Could not parse Master URL: 'Local'
    at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2735)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:522)
    at org.text.spark.wordcount$.main(wordcount.scala:13)" 

It is not able to create the Spark context.

Below is the Scala object I am compiling:

import org.apache.spark.{SparkConf, SparkContext}

object wordcount {
  def main(args: Array[String]): Unit = {
    val host = "LocalHost"
    val conf = new SparkConf()
    val sc = new SparkContext(conf)
    val test = sc.textFile(args(0))
    test.flatMap(line => line.split(" "))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
  }
}


Could you please help me solve this error? Please refer to the attached screenshot for more details.

Also, could you please provide details on how to "Submit a Spark Application from Eclipse IDE to Hadoop Cluster in fully distributed mode"?

I have a fully distributed Spark cluster and want to submit my application from the IDE.


Thanks in advance !!



Ankit Mishra





3 Answer(s)


Hi Ankit,

You need to set the master URL correctly. Spark expects a lowercase master string such as "local[*]", not "Local". Below is an example for you:

val conf = new SparkConf().setAppName("Wordcount").setMaster("local[*]")


Hope this helps.
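
For reference, a complete version of the word-count object that runs with a local master might look like the sketch below. The "local[*]" master, app name, and the use of a second argument as an output path are assumptions for local testing, not taken from the original post:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object wordcount {
  def main(args: Array[String]): Unit = {
    // "local[*]" runs Spark in-process using all cores; note the lowercase "local"
    val conf = new SparkConf().setAppName("Wordcount").setMaster("local[*]")
    val sc = new SparkContext(conf)

    val counts = sc.textFile(args(0))    // input path from the first argument
      .flatMap(line => line.split(" "))  // split each line into words
      .map(word => (word, 1))            // pair each word with a count of 1
      .reduceByKey(_ + _)                // sum the counts per word

    counts.saveAsTextFile(args(1))       // write results to the second argument
    sc.stop()
  }
}
```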



The issue was related to the build. It's working fine now.

Thanks !!


Thanks Abhijit !!

It was my mistake. I didn't build my current project before executing the wordcount object, so it could not create the Spark context object.



Ankit Mishra 
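
On the second part of the original question: the usual workflow is to package the Eclipse project into a jar (e.g. with sbt or Maven) and submit it to the cluster with spark-submit. The sketch below assumes a standalone Spark master at spark://master-host:7077; the host name, jar path, and HDFS paths are all placeholders to adjust for your cluster:

```shell
# Package the project first (e.g. `sbt package`), then submit the jar.
# master-host, the jar path, and the HDFS paths are placeholders.
spark-submit \
  --class org.text.spark.wordcount \
  --master spark://master-host:7077 \
  --deploy-mode client \
  target/scala-2.11/wordcount_2.11-0.1.jar \
  hdfs:///user/ankit/input.txt \
  hdfs:///user/ankit/output
```

For a YARN-based Hadoop cluster, `--master yarn` (typically with `--deploy-mode cluster`) is the usual alternative.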
