Getting error "value read is not a member of org.apache.spark.sql.SQLContext" while creating Dataframe




Hello,

I am getting the error highlighted below while compiling.

package org.text.spark.dataSource

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import org.apache.spark.sql.SQLContext

object CsvFile {
  def main(args: Array[String]) {

    val conf = new SparkConf()
      .setAppName("CsvFile")
      .setMaster("local")
    val sc = new SparkContext(conf)

    val sqlContext = new SQLContext(sc)

    // ---> Getting error "value read is not a member of org.apache.spark.sql.SQLContext"
    val salesDf = sqlContext.read
      .format("com.databricks.spark.csv")
      .option("header", "true")
      .load(args(0))
  }
}

 

Could you please help me solve this issue?

Thanks in advance!

Regards,

Ankit K Mishra

 



3 Answer(s)



Hi Ankit,

Try this:

import sqlContext.implicits._

Add this to your code and let me know whether it works for you.
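
For placement, here is a minimal sketch (reusing the sc from your code above). The import has to come after the sqlContext value exists, because implicits is a member of the SQLContext instance, not of the class:

    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._  // brings toDF() and other implicit conversions into scope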

Thanks.

 



Hello Abhijit, 

Thanks for your quick response!

I am now getting the error below when importing sqlContext.implicits._:

import sqlContext.implicits._  -->  not found: object sqlContext

So I tried importing with the fully qualified package name, but I get this error:

import org.apache.spark.sql.SQLContext.implicits._ 

object SQLContext is not a member of package org.apache.spark.sql. Note: class SQLContext exists, but it has no companion object.

Do I have to include some dependency in pom.xml to fix this import issue?

My pom.xml is attached.

 

Regards,

Ankit Mishra 




Hello,

The issue is fixed by including these dependencies in pom.xml:

<dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>1.4.1</version>
    </dependency>
    <dependency>
        <groupId>com.databricks</groupId>
        <artifactId>spark-csv_2.10</artifactId>
        <version>1.2.0</version>
    </dependency>

Regards,

Ankit Mishra