Big Data Hadoop Consultant
Company Name: Accenture
Location: San Francisco, CA
Date Posted: February 20, 2017
Key Responsibilities
- Deliver large-scale programs that integrate processes with technology to help clients achieve high performance.
- Design, implement, and deploy custom applications on Hadoop.
- Implement complete Big Data solutions, including data acquisition, storage, transformation, and analysis.
- Design, implement, and deploy ETL processes to load data into Hadoop.
Basic Qualifications
- Minimum 1 year of experience building and deploying Java applications
- Minimum 1 year of experience building applications using at least two Hadoop ecosystem components, such as MapReduce, HDFS, HBase, Pig, Hive, Spark, Sqoop, or Flume
- Minimum 1 year of coding experience in at least one of the following: Python, Pig, Hadoop Streaming, or HiveQL
- Minimum 1 year of experience implementing relational data models
- Minimum 1 year of experience with traditional ETL tools and RDBMSs
- Bachelor's degree, or a minimum of 3 years of IT/programming experience
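For context on the Hadoop Streaming skill listed above: Streaming lets any executable act as a mapper or reducer by reading lines on stdin and writing tab-separated key/value pairs to stdout. Below is a minimal word-count mapper sketch in Python; it is the canonical illustrative example, not code from this posting, and the function names are hypothetical.

```python
import io
import sys

def map_lines(lines):
    """Yield (word, 1) for every whitespace-separated word."""
    for line in lines:
        for word in line.split():
            yield word, 1

def run_mapper(instream=sys.stdin, outstream=sys.stdout):
    """Emit tab-separated key/value pairs, the intermediate
    format Hadoop Streaming expects from a mapper."""
    for word, count in map_lines(instream):
        outstream.write(f"{word}\t{count}\n")

# Local smoke test on an in-memory stream (no cluster needed).
out = io.StringIO()
run_mapper(io.StringIO("big data\nbig wins\n"), out)
# out.getvalue() == "big\t1\ndata\t1\nbig\t1\nwins\t1\n"
```

In a real job, Hadoop would pipe input splits to the script's stdin and feed its stdout into the shuffle phase, with a companion reducer script summing the counts per word.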
SET YOURSELF APART: Preferred Qualifications
- Full life-cycle development experience
- Minimum 1 year of experience developing REST web services
- Industry experience (financial services, resources, healthcare, government, products, communications, high tech)
- Experience leading teams
- Data Science and Analytics (machine learning, analytical models, Apache Mahout, etc.)
- Data Visualization