Company Name: HCSC
Location: San Jose, CA
Date Posted: October 26, 2016
This position is responsible for developing, integrating, testing, and maintaining existing and new applications. It requires proficiency in one or more programming languages and familiarity with one or more development methodologies / delivery models.
- Bachelor's degree and 2 years of Information Technology experience, OR technical certification and/or college courses and 4 years of Information Technology experience, OR 6 years of Information Technology experience.
- Possess the ability to sit and perform computer entry for the entire work shift, as required.
- Possess the ability to manage workload, manage multiple priorities, and manage conflicts with customers/employees/managers, as applicable.
- Experience with rapid prototyping.
- Must have hands-on experience designing, developing, and maintaining software solutions on a Hadoop cluster.
- Must have strong experience with UNIX shell scripting, Sqoop, Eclipse, and HCatalog.
- Must have experience with NoSQL databases such as HBase, MongoDB, or Cassandra.
- Must have experience developing Pig scripts, HiveQL, and UDFs for analyzing structured, semi-structured, and unstructured data flows.
- Must have working experience developing MapReduce programs in Java or Python that run on a Hadoop cluster.
- Must have knowledge of cloud computing infrastructure (e.g., Amazon Web Services EC2) and of design considerations for scalable, distributed systems.
- Must demonstrate Hadoop best practices.
- Demonstrates broad knowledge of technical solutions, design patterns, and code for medium-to-complex applications deployed to production Hadoop environments.
- Must have working experience with data warehousing and business intelligence systems.
- Participate in design reviews, code reviews, unit testing, and integration testing.
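For candidates gauging the MapReduce requirement above, the canonical pattern can be sketched as a minimal Hadoop Streaming-style word count in Python (function names here are illustrative, not part of the posting):

```python
#!/usr/bin/env python
# Illustrative sketch of the MapReduce pattern: a mapper emits
# (word, 1) pairs and a reducer sums the counts per word, mirroring
# what Hadoop Streaming does with stdin/stdout between phases.
import sys
from itertools import groupby


def map_line(line):
    """Map phase: emit a (word, 1) pair for every word on a line."""
    return [(word, 1) for word in line.split()]


def reduce_pairs(pairs):
    """Reduce phase: sum counts per word. Sorting stands in for the
    shuffle step, which guarantees keys arrive grouped together."""
    return [
        (word, sum(count for _, count in group))
        for word, group in groupby(sorted(pairs), key=lambda kv: kv[0])
    ]


if __name__ == "__main__":
    # When run as a streaming mapper, read lines from stdin and write
    # tab-separated key/value pairs to stdout for the framework to shuffle.
    for line in sys.stdin:
        for word, count in map_line(line):
            print(f"{word}\t{count}")
```

In a real deployment these two functions would run as separate mapper and reducer scripts passed to the `hadoop-streaming` jar; the pure-function form above just makes the data flow easy to follow.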