Has anyone completed the setup to run Spark under YARN?



I followed the instructions in the Session 6 notes, and when I run the command to start the Spark shell, the Scala prompt does come up at the end, but the Spark context gets created and then closed during startup.

spark-shell --master yarn-client

I see this in the startup output:

"17/03/14 22:23:06 ERROR cluster.YarnClientSchedulerBackend: Yarn application has already exited with state FINISHED!"

I tried various things, including increasing memory, to no avail.

I am using Spark 1.6.3, prebuilt for Hadoop 2.6.
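For reference, the memory-related settings I tried were along these lines (the exact values here are only illustrative, not the ones I actually used):

spark-shell --master yarn-client \
  --driver-memory 2g \
  --executor-memory 2g \
  --conf spark.yarn.am.memory=1g \
  --conf spark.yarn.executor.memoryOverhead=512

None of these made the error go away.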


3 Answer(s)



Hi Tejus,

You can force YARN to skip these memory checks by setting the following properties in yarn-site.xml (restart the NodeManagers afterwards for the change to take effect):

<property>
    <name>yarn.nodemanager.pmem-check-enabled</name>
    <value>false</value>
</property>

<property>
    <name>yarn.nodemanager.vmem-check-enabled</name>
    <value>false</value>
</property>
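If you want to first confirm that the container really is being killed for exceeding a memory limit, you can pull the application logs; the application id is printed in the spark-shell startup output (the id below is just a placeholder):

yarn logs -applicationId <application-id>

Near the end of the container log you should see a message along the lines of "is running beyond virtual memory limits ... Killing container".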

Hope this helps.

Thanks.



Cool, that works!

So we are just bypassing the physical and virtual memory checks? Does that mean one of them was previously hitting some default limit?

Thank you, Abhijit.



Hi Tejus,

Yes, your assumption is correct. By default, YARN kills a container whose virtual memory usage exceeds yarn.nodemanager.vmem-pmem-ratio (default 2.1) times its requested physical memory, and a JVM's virtual footprint can trip that limit even when its actual memory usage is modest.
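If you would rather not switch the checks off entirely, a gentler alternative is to raise the virtual-to-physical ratio in yarn-site.xml instead (2.1 is the default; the value below is only an example):

<property>
    <name>yarn.nodemanager.vmem-pmem-ratio</name>
    <value>4</value>
</property>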

Thanks.
