Senior Big Data Application Developer-160057852
The candidate will be responsible for application development supporting the Client Service Workspace program, which is an Investment Bank-wide strategic initiative to improve the operational and customer service capabilities of the firm.
She/he will be responsible for the design, implementation, and testing of a highly scalable data layer using HDFS, Hadoop, HBase, Phoenix, HortonWorks, Spark, and related tools. The candidate must have a thorough understanding of both the tools and the concepts that underlie Big Data, such as MapReduce, Massively Parallel Processing (MPP), and HDFS job scheduling.
The candidate must have a sound grasp of development best practices and system architecture. She/he will be expected to produce high-quality deliverables that can pass critical peer review, and to work in a high-pressure, timeline-driven environment. In addition, the candidate must be highly proficient in Java and able to demonstrate deep expertise in core Java technologies.
The role requires hands-on Java coding as part of the Hadoop/HBase/Phoenix build-out. Deep technical knowledge and the ability to communicate ideas are integral to the role; the successful candidate will be required to demonstrate proficiency in the technical areas required for the project and should possess good verbal and written communication skills.
•Minimum 5 years' experience building mission-critical enterprise applications, with a proven delivery track record.
•Minimum 2 years implementing Big Data tools in a production environment (proof-of-concept and lab work do not qualify).
•Bachelor of Science in Computer Science or equivalent degree.
•Exceptional analytical and problem-solving skills.
•Strong communication, organizational, and collaboration skills.
•Ability to follow complex design and development standards.
•Experience working in a multi-time-zone development team.
•Experience with Agile development.
•Experience building high-volume systems with real-time read and write performance.
•Hands-on experience with installing, configuring, developing on, and tuning HDFS, Hadoop, HortonWorks, and optionally Spark.
•Strong understanding of data modeling, and experience interfacing with Oracle, Sybase, or another major RDBMS.
•Experience with real-time, event-driven systems and service-oriented architectures.