
Big Data Solutions Architect

Company Name: Domino’s
Location: Ann Arbor, MI
Date Posted: 29th Nov, 2016
Description:

· Lead the design and development of highly scalable, optimized data models, utilizing modeling software to document and maintain versions that support Data Marts, Cubes, Data Warehouses, and Operational Data Stores (ODS).
· Establish data standards in terms of nomenclature, storage, design and deployments.
· Work in concert with a team of ETL developers to ensure efficient and accurate data transfer across the entire EDW ecosystem, including Big Data platforms.
· Ensure optimized source-system replication models and operations.
· Lead design and maintenance of the enterprise metadata solution to communicate data definitions to the BI audience.
· Act as a DW liaison to our Infrastructure Engineering teammates and coordinate initiatives with the other database administration groups in that sister team.
· Ensure appropriate technical standards and procedures are defined. Manage the development of centers of excellence around key storage sub-system technologies.

  • Collaborate in planning initiatives in Application Development, System Architecture, Future Roadmaps, Operations and Strategic Planning
  • Work with business teams and technical analysts to understand business requirements. Determine how to leverage technology to create solutions that satisfy the business requirements.
  • Present solutions to the business, project teams, and other stakeholders, communicating effectively with both technical and non-technical audiences.
  • Create architecture and technical design documents to communicate solutions that will be implemented by the development team.
  • Work with development, infrastructure, test, and production support teams to ensure proper implementation of solution.
  • Ability to assess the impact of new requirements on an existing suite of complex applications
  • Educate organization on available and emerging toolsets.
  • Drive the evolution of infrastructure, processes, products, and services by convincing decision makers
  • Develop proofs-of-concept and prototypes to help illustrate approaches to technology and business problems.
  • Experience in building Business Intelligence platforms in an enterprise environment.
  • Data integration (batch, micro-batch, real-time data streaming) across Hadoop, RDBMSs, and data warehousing (SQL Server 2016 preferred).
  • Build real-time data pipelines using technologies such as Apache Kafka, Spark, Storm, and Flume (see the sketch after this list).
  • Analyze data using technologies such as Python, R, Scala, Pig, and Hive.
  • Build consumption frameworks on Hadoop (RESTful services, self-service BI and analytics).
  • Optimize the Hadoop environment (MapReduce, Spark, and HDFS footprints), including Hadoop security, data management, and governance.
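
As a rough illustration of the real-time pipeline and analysis work described above, the sketch below uses Spark Structured Streaming in Python to read order events from a Kafka topic and aggregate per-store counts. The broker address, topic name, and event schema are hypothetical placeholders, not part of the posting, and running it also requires the Spark Kafka connector package.

```python
# Minimal sketch: ingest order events from Kafka with Spark Structured Streaming,
# parse them, and count orders per store in 5-minute windows.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col, window
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("order-stream-demo").getOrCreate()

# Hypothetical event schema for illustration only.
event_schema = StructType([
    StructField("store_id", StringType()),
    StructField("order_id", StringType()),
    StructField("order_time", TimestampType()),
])

# Read raw JSON events from a Kafka topic (placeholder broker and topic names).
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "orders")
       .load())

# Parse the Kafka value payload and aggregate per-store order counts.
orders = (raw
          .select(from_json(col("value").cast("string"), event_schema).alias("e"))
          .select("e.*"))
counts = (orders
          .withWatermark("order_time", "10 minutes")
          .groupBy(window("order_time", "5 minutes"), "store_id")
          .count())

# Write rolling aggregates to the console; a production pipeline might land
# them in HDFS, Hive, or a downstream serving store instead.
query = (counts.writeStream
         .outputMode("update")
         .format("console")
         .start())
query.awaitTermination()
```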
Qualification:
  • Bachelor's degree in computer science, information technology, engineering, business administration or related field.
  • Knowledge of Hadoop required.
  • Understands the capabilities of key technologies (data modeling, data processing, BI analytics) and can quickly assess the applicability of commercial off-the-shelf technology.
  • Excellent grasp of integrating multiple data sources into an enterprise data management platform and can lead data storage solution design.
  • Strong communication skills (oral and written).
  • Good analytical and problem solving skills.
  • Understanding of the software development lifecycle including agile methodology.
  • Ability to understand business requirements and build pragmatic, cost-effective solutions using Agile project methodologies.
  • Ability to collaborate with business users to understand requirements.
  • Excellent problem solving and analytical skills.
  • Minimum of 8-10 years of enterprise IT application experience, including at least 3 years architecting strategic, scalable BI and Big Data solutions.
  • 6 to 8 years’ experience with relational DBMS technology, SQL Server focused.
  • 6 to 8 years’ experience developing procedures, packages, and functions in a DW environment.
  • Deep experience in ANSI SQL and stored procedures.
  • 2+ years’ experience with MapReduce, Pig, and HiveQL (Hadoop languages) a plus.