Hadoop Developer

Company Name: Enquero
Location: San Jose
Date Posted: 22nd Jan, 2018
Description:

Responsibilities:

· Work across all phases of the SDLC, including understanding business goals, architecture, system analysis, design, build, debugging, and documentation

· Analyze and assess big data cluster effectiveness and translate findings into recommendations

· Review big data market trends and recommend improvements to business models

· Follow security market trends and new technologies

· Handle multiple tasks simultaneously

Qualifications:

Must have:

· Bachelor's degree in Computer Science or a related field

· Four or more years of experience in the architecture, design, and development of Hadoop-based solutions

· Seven or more years of experience in software development, with a focus on backend architecture

· Two or more years of hands-on experience with big data technologies: MapReduce, Elasticsearch, Pig, Hive, HBase, Apache Spark, YARN, Kafka, Storm

· Experience working with commercial HDFS distributions (Hortonworks / Cloudera): HDFS, Oozie, Flume, ZooKeeper

· Hands-on experience with programming languages: Java, Scala, Python, R

· Knowledge of Kibana, Logstash, Linux

· Ability to communicate clearly with both business and technical resources

· Ability to perform in a fast-paced, dynamic, and innovative work environment and meet aggressive deadlines

· Ability to quickly adapt to changes, enhancements, and new technologies

· Good analytical skills

· Experience with Apache Impala

· Experience working with multi-terabyte data sets