Big Data Hadoop Consultant
YOUR ROLE: Analytics Delivery – Big Data Consultant
· Deliver large-scale programs that integrate processes with technology to help clients achieve high performance.
· Design, implement and deploy custom applications on Hadoop.
· Implement complete Big Data solutions, including data acquisition, storage, transformation, and analysis.
· Design, implement and deploy ETL processes to load data into Hadoop.
- Minimum 1 year of experience building and deploying Java applications
- Minimum 1 year of experience building and coding applications using at least two Hadoop components, such as MapReduce, HDFS, HBase, Pig, Hive, Spark, Sqoop, Flume, etc.
- Minimum 1 year of coding experience in at least one of the following: Python, Pig, Hadoop Streaming, HiveQL
- Minimum 1 year of experience implementing relational data models
- Minimum 1 year of experience with traditional ETL tools and RDBMSs
- Minimum of a Bachelor's degree or 3 years of IT/programming experience