• Strong hands-on experience with Java, web services, and APIs
• Strong knowledge of, and hands-on programming experience in, the Hadoop ecosystem
• Experience in data ingestion (batch and real-time), data encryption, and reconciliation
• Should have worked on large data sets, with experience in performance tuning and troubleshooting
• Experience in all phases of the project life cycle
• Should be a strong communicator, able to work independently with minimal involvement from client SMEs
• Should be able to work in a team in a diverse, multi-stakeholder environment
• Experience in NoSQL Databases is preferred
• Experience in the financial domain is preferred
• Experience in, and desire to work in, a global delivery environment
The job entails sitting as well as working at a computer for extended periods of time. Candidates should be able to communicate by telephone, email, or face to face. Travel may be required per job requirements.
• Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
• At least 4 years of experience in Information Technology
• At least 4 years of design and development experience in Java and Big Data technologies
• At least 3 years of hands-on design and development experience with Big Data technologies: Hive, Impala, Kafka, Spark, Java, MapReduce, HDFS, HBase, and shell scripting
Mandatory Technical Skills: