Scala/Spark Developer

Company Name: Experis
Location: San Jose, CA
Date Posted: 27th Jan, 2017

1. This senior engineer should be comfortable with and well informed about the Hadoop ecosystem, with a minimum of 5 years' experience in Big Data implementations.
2. Experience in Scala programming is a MUST. A minimum of 3 years of programming in Scala (both object-oriented and functional), handling data processing (ETL with Spark), is required.
3. Must have 3+ years of experience developing programs in Scala using the Apache Spark framework (Core & SQL).
4. Must be comfortable extending Hadoop framework components, with experience managing data (format/structure), including performance tuning.
5. Experience with Hive, HBase (NoSQL), Sqoop, SQL, Git, Unix, and Bash is required.
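To illustrate the kind of Scala the posting asks for, here is a minimal sketch mixing the object-oriented and functional styles in an ETL-like transformation. It uses plain Scala collections rather than Spark so it runs standalone; the `Event` record and field names are hypothetical, and in Spark the same shape would appear as a Dataset pipeline.

```scala
object EtlSketch {
  // Object-oriented side: a case class models one input record.
  case class Event(user: String, bytes: Long)

  // Functional side: a pure transformation chain, ETL-style.
  // Filter out invalid rows, then aggregate bytes per user.
  def totalsByUser(events: Seq[Event]): Map[String, Long] =
    events
      .filter(_.bytes > 0)                                // drop invalid rows
      .groupBy(_.user)                                    // group by key
      .map { case (u, es) => u -> es.map(_.bytes).sum }   // aggregate per user

  def main(args: Array[String]): Unit = {
    val sample = Seq(Event("a", 10L), Event("b", 5L), Event("a", -1L), Event("a", 7L))
    println(totalsByUser(sample))
  }
}
```

In Spark, `filter`, `groupBy`, and the per-group aggregation map directly onto Dataset/DataFrame operations, which is why fluency with this collection style transfers to Spark Core and Spark SQL work.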


Top 5 must-have skills:
(1) Scala (3 years) 
(2) Spark (3 Years) 
(3) Hive (4+ Years) 
(4) HBase (4+ Years) 
(5) SQL (5+ Years)