sqlContext not found?


Did anyone encounter the error "sqlContext not found" when starting spark-shell?

I know it's probably an authorization issue. I tried changing the ownership of every folder we use, but I'm still having the problem. I've googled every possible solution, but nothing worked.

Let me know if you know how to solve it.




4 Answer(s)


Hey Kai,

Try this:

Go to the spark-1.4.1-bin-hadoop2.6/conf directory

and add

export SPARK_LOCAL_IP=""

to the end of the spark-env.sh file. Hope this fixes the problem.
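In shell form, the steps above look roughly like this (the install path comes from the answer; adjust it to wherever your Spark distribution is unpacked):

```shell
# Sketch, assuming Spark is unpacked in the current directory
conf_dir="spark-1.4.1-bin-hadoop2.6/conf"
mkdir -p "$conf_dir"   # the directory already exists in a real install

# Append the setting to the end of spark-env.sh
echo 'export SPARK_LOCAL_IP=""' >> "$conf_dir/spark-env.sh"

# Verify the last line of the file
tail -n 1 "$conf_dir/spark-env.sh"
```

Restart spark-shell afterwards so the new environment variable is picked up.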



Thanks Prakash, but it didn't work; now I get a "connection refused" error.

If I log in as root and run spark-shell, I don't see that error. But as root I can't access the folders whose ownership I changed, so I suspect it's a permission issue; there was also a message in the log saying "failed to create metastore_db".
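For what it's worth, metastore_db (and derby.log) are created by Derby in whatever directory spark-shell is launched from, so an earlier run as root can leave them unwritable for a normal user. A minimal sketch of that check (directory names are the Derby defaults, not taken from this thread):

```shell
check_metastore() {
    # metastore_db is created by Derby in the directory where spark-shell starts.
    # An earlier run as root can leave it owned by root and unwritable.
    if [ -d metastore_db ] && [ ! -w metastore_db ]; then
        echo "metastore_db not writable; fix ownership, e.g. sudo chown -R \$USER metastore_db derby.log"
    else
        echo "ok"
    fi
}
check_metastore
```

Run it from the same directory you normally start spark-shell in.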


Hi Kai,

Could you please share the complete error, including the command you ran in the terminal?
metastore_db is the default metadata folder created by the database server in the default directory.

Please share the following:

1. The complete error, including the command.

2. Your operating system name and the database used, with its version.

I look forward to your response.


Abhijit Kumar


Thanks all! I'm now able to run Spark successfully.

However, when I tried to run Spark on YARN using "spark-shell --master yarn-client", it ran for several seconds and printed logs to my screen, but then the system suddenly shut off and rebooted. I've tried several times and restarted the virtual machine, and the same thing happens.

Please let me know if you have any solution to it.