Senior Hadoop Developer

Company Name: Cypress HCM
Location: San Diego, California
Date Posted: 1 February 2017
Responsibilities:
  • Design and develop end-to-end Hadoop data ingestion processes.
  • Design and develop Hadoop data integration in both real-time and scheduled modes, leveraging appropriate open integration frameworks such as Storm, Flume, Sqoop, and Oozie.
  • Manage Hadoop clusters and guide the operational best practices for Hadoop cluster configuration.
  • Build libraries, user-defined functions (UDFs), and frameworks around Hadoop.
  • Manage performance and storage allocation in the Hadoop Distributed File System (HDFS).
  • Implement standards-based monitoring and tuning practices for a 20+ node Hadoop ecosystem.
  • Translate requirements from business and engineering teams into Hadoop technology solutions.
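The processing duties above rest on Hadoop's MapReduce model. As a minimal illustration (not the Hadoop API — this is a dependency-free sketch with illustrative class and method names), the classic word-count job can be expressed as a map phase emitting (word, 1) pairs and a reduce phase summing them:

```java
import java.util.*;
import java.util.stream.*;

// Dependency-free sketch of the MapReduce word-count pattern.
// Names are illustrative; a real Hadoop job would subclass Mapper/Reducer.
public class WordCountSketch {

    // "Map" phase: emit a (word, 1) pair for each word in a line.
    static Stream<Map.Entry<String, Integer>> map(String line) {
        return Arrays.stream(line.toLowerCase().split("\\s+"))
                     .filter(w -> !w.isEmpty())
                     .map(w -> Map.entry(w, 1));
    }

    // "Reduce" phase: group the emitted pairs by word and sum the counts.
    static Map<String, Integer> run(List<String> lines) {
        return lines.stream()
                    .flatMap(WordCountSketch::map)
                    .collect(Collectors.groupingBy(
                        Map.Entry::getKey,
                        Collectors.summingInt(Map.Entry::getValue)));
    }

    public static void main(String[] args) {
        Map<String, Integer> counts =
            run(List.of("hadoop stores data", "hadoop processes data"));
        System.out.println(counts); // e.g. {data=2, hadoop=2, stores=1, processes=1}
    }
}
```

In a real cluster, the framework shuffles the map output across nodes so that all pairs for one key reach the same reducer; the in-memory `groupingBy` above stands in for that shuffle step.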
Requirements:
  • Extensive Hadoop development experience.
  • Strong Java application development experience.
  • Experience setting up J2EE application build, deploy, and promote configurations.
  • Strong knowledge of SQL databases (Oracle, SQL Server) and NoSQL databases.
  • Solid background in the fundamentals of computer science and the development process.
  • Experience with open source frameworks and tools.
  • Familiarity with MapReduce, Hive, and other big data open source projects.
  • Understanding of virtualization and cloud computing concepts and technologies.
  • Experience with Unix-like operating systems.
  • Experience in setting up, optimizing and sustaining Hadoop ecosystem at mid-to-large scale.
  • Familiarity with AWS technologies a plus.
  • Greenplum/Teradata database development experience for large-scale data warehouses.
  • Experience with distributed systems, persistence, caching and concurrent programming.
  • Experience with Storm/Kafka is a plus.
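At its core, the Storm/Kafka experience asked for above is about producer/consumer pipelines with back-pressure. A minimal sketch of that pattern using only `java.util.concurrent` (no Kafka dependency; `MiniTopic` and its methods are illustrative names, not a real API):

```java
import java.util.*;
import java.util.concurrent.*;

// Illustrative only: a bounded in-memory "topic" mimicking the
// producer/consumer flow that Kafka-based pipelines are built on.
public class MiniTopic {
    private final BlockingQueue<String> queue;

    MiniTopic(int capacity) {
        this.queue = new ArrayBlockingQueue<>(capacity);
    }

    // Producer side: blocks when the "topic" is full (back-pressure).
    void publish(String record) throws InterruptedException {
        queue.put(record);
    }

    // Consumer side: blocks until a record is available.
    String poll() throws InterruptedException {
        return queue.take();
    }

    public static void main(String[] args) throws Exception {
        MiniTopic topic = new MiniTopic(16);
        ExecutorService pool = Executors.newFixedThreadPool(1);

        pool.submit(() -> {            // producer thread
            for (int i = 0; i < 3; i++) topic.publish("event-" + i);
            return null;
        });

        List<String> seen = new ArrayList<>();
        for (int i = 0; i < 3; i++) seen.add(topic.poll()); // consumer side
        pool.shutdown();
        System.out.println(seen); // [event-0, event-1, event-2]
    }
}
```

A real Kafka topic adds durable, partitioned, replicated logs and consumer groups on top of this idea; the bounded queue here only demonstrates the blocking hand-off between producer and consumer threads.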