Hadoop Developer

Company Name: Saama Technologies Inc
Location: Campbell, CA
Date Posted: 24th May, 2016
Responsibilities:
  • Execute Hadoop development projects independently as well as lead project teams
  • Analyze complex distributed production deployments and make recommendations to optimize performance
  • Deliver programming results that meet specs and functional requirements on time
  • Align project team and client as requirements change
  • Use Saama methods and IP to effectively contribute high quality work to Client projects and Saama Solutions
  • Drive innovation across practice in project management, technology and client relationships
  • Manage offshore teams and serve as the liaison between the local and offshore teams
Requirements:
  • BS/MS in Computer Science or a related discipline
  • 8-10 or more years of programming experience, with at least 2 years of Hadoop and a minimum of 2 years of Java experience
  • Proven understanding of and related experience with Hadoop, HBase, Hive, Pig, Sqoop, Flume, and/or MapReduce
  • Experience with the Unix OS, Core Java programming, and shell scripting
  • Excellent RDBMS (Oracle, Netezza, SQL Server) knowledge for development using SQL/PL-SQL
  • Hands-on experience with Oozie job scheduling, Storm, or other similar technologies
  • Previous work on at least one successful production implementation of Hadoop clusters
  • Solid knowledge of Java design patterns
  • Working knowledge of predictive modeling using R or a related programming language
  • Willingness to travel 25-50% of the time