Hadoop Developer

Company Name: CloudBigData Technologies
Location: Dallas, Texas
Date Posted: 2 Apr 2017

Responsibilities:

  • Selecting and integrating Big Data tools and frameworks required to provide requested capabilities
  • Implementing data ingestion and ETL processes on Hadoop
  • Monitoring performance and advising on any necessary infrastructure changes
  • Defining data retention policies
  • Designing and building data processing pipelines for structured and unstructured data using tools and frameworks in the Hadoop ecosystem
  • Developing applications that scale to handle millions of events/records
  • Designing and launching scalable, reliable, and efficient processes to move, transform, and report on large amounts of data
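The ingestion and ETL responsibilities above can be sketched in miniature. The following is a hypothetical, pure-Python illustration (no actual Hadoop involved; record fields and names are invented) of an extract-transform-load step: parse delimited records, drop malformed ones, and aggregate by key.

```python
import csv
import io
from collections import defaultdict

# "Extracted" raw records: user_id, event, amount (some malformed on purpose)
raw = """u1,click,1
u2,purchase,20
u1,purchase,5
bad_row_without_enough_fields
u2,click,
"""

def etl(raw_text):
    """Tiny ETL step: parse (extract), validate/clean (transform),
    and aggregate into a load-ready structure."""
    totals = defaultdict(int)
    for row in csv.reader(io.StringIO(raw_text)):
        if len(row) != 3:
            continue  # drop records with the wrong number of fields
        user, event, amount = row
        if not amount.isdigit():
            continue  # drop records with a non-numeric amount
        totals[(user, event)] += int(amount)
    return dict(totals)

print(etl(raw))
```

In a real Hadoop pipeline the same validate-then-aggregate shape would typically run inside a distributed framework rather than a single process; the sketch only shows the logic.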

Qualifications:

  • Bachelor's degree and 8+ years of relevant experience, or Master's degree and 6+ years of relevant experience
  • 4+ years of industry experience implementing big data solutions on Hadoop
  • Proficient understanding of distributed computing principles
  • Proficiency with Hadoop v2, MapReduce, HDFS
  • Experience building stream-processing systems using solutions such as Storm, or Kafka with Spark Streaming
  • Good knowledge of Big Data querying tools, such as Pig, Hive, Phoenix
  • Experience with Spark
  • Experience with integration of data from multiple data sources
  • Experience with one or two NoSQL/graph databases, such as HBase, Cassandra, MongoDB, or Neo4j
  • Proficiency in programming languages such as Scala, Java, or Python
  • Experience with Linux and shell scripting
  • Experience with relational databases (SQL)
  • Experience in working with real-time data feeds
  • Experience in working with unstructured data
  • Experience implementing Sqoop jobs to import/export data to and from Hadoop
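For the MapReduce proficiency named above, a minimal in-process sketch may help make the model concrete. This is plain Python with no Hadoop dependency; the function names are hypothetical and only mirror the three framework phases (map, shuffle, reduce) for a word count.

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    """Map: emit a (word, 1) pair for each word in one input line."""
    return [(word.lower(), 1) for word in record.split()]

def shuffle_phase(pairs):
    """Shuffle: group values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Reduce: sum the emitted counts for one word."""
    return key, sum(values)

def word_count(records):
    pairs = chain.from_iterable(map_phase(r) for r in records)
    groups = shuffle_phase(pairs)
    return dict(reduce_phase(k, v) for k, v in groups.items())

print(word_count(["big data on Hadoop", "data pipelines on Hadoop"]))
```

On a real cluster the map and reduce functions run distributed across HDFS blocks and the shuffle is performed by the framework; the control flow here is only a single-machine analogy.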