Big Data Developer

Company Name: Hewlett Packard Enterprise
Location: Herndon, VA
Date Posted: 17th Sep, 2016
Responsibilities:
  • Participate as a member of development teams as a Hadoop Administrator.
  • Experienced in building and supporting large-scale Hadoop environments, including design, capacity planning, cluster setup, performance tuning, and monitoring.
  • Provide Hadoop architecture consulting to customers in support of solution design activities.
  • Hadoop development and implementation.
  • Loading from disparate data sets.
  • Pre-processing using Spark, Hive and/or Pig.
  • Designing, building, installing, configuring and supporting Hadoop.
  • Translate complex functional and technical requirements into detailed design.
  • Perform analysis of vast data stores and uncover insights and data strategies.
  • Maintain security and data privacy.
  • Create high-speed querying and alerting from different data streams.
  • Participate in POC efforts to help build new Hadoop capabilities.
Requirements:
  • Bachelor's degree in Computer Science or equivalent experience
  • 5+ years of total experience in DBA or application DBA activities
  • Strong understanding of the Hadoop ecosystem, including HDFS, MapReduce, HBase, ZooKeeper, Pig, Hadoop Streaming, Sqoop, Oozie, and Hive
  • Experience in installing, administering, and supporting operating systems and hardware (CentOS/RHEL) in an enterprise environment
  • Expertise in typical system administration and programming skills, such as storage capacity management and performance tuning
  • Proficient in Bash and shell scripting (e.g., ksh)
  • Knowledge of programming and/or scripting, specifically Java, Python, Scala, Pig Latin, HiveQL, Spark SQL/RDDs, and others
  • Experienced with MapReduce and YARN
  • Experience in the setup, configuration, and management of security for Hadoop clusters using Kerberos, with integration with LDAP/AD at an enterprise level
  • Experienced in designing and architecting Hadoop-based solutions
  • Knowledge of proper development methodologies, e.g., Agile
  • Knowledge of workflow/schedulers like Oozie
  • At least one year of experience managing a Hadoop cluster