Company Name: The New York Times Company
Location: New York, NY
Date Posted: Oct 1, 2016
- Implement complex data projects with a focus on collecting, parsing, analyzing and visualizing large sets of data to turn information into insights across multiple platforms.
- Build fault-tolerant, self-healing, adaptive, and highly accurate ETL platforms.
- Design and develop the data model(s) for the Data Warehouse, Data Marts, or NoSQL solutions, which will be vetted by the team and/or guided by the Senior Data Solutions Architect.
- Responsible for data administration of warehouse solutions.
- Take ownership of the warehouse solutions, troubleshoot issues, and provide production support.
- Document processes and standard operating procedures.
- Generate reports using a variety of reporting tools such as Business Objects, Tableau, and Pentaho.
- Work with a team of developers who are transitioning the current in-house data warehouse solution to the Google Cloud Platform.
- 5+ years of experience building traditional data warehouse solutions, with knowledge of data modeling, data access, and data storage techniques.
- You understand standard ETL methodologies and are proficient in debugging and optimizing pipelines.
- You can easily transition from one ETL tool set to another (e.g., Informatica to Kettle) and to more programmatic approaches such as Python/SQL or Spark/Scala.
- Extensive experience with SQL and an understanding of NoSQL solutions.
- You have significant coding experience in Java and Python and want to apply those skills to processing big data.
- 5+ years of experience with object-oriented design, coding, and testing patterns, as well as experience engineering open source software platforms and large-scale data infrastructures.
- You have an understanding of distributed systems and cloud data warehouse solutions such as Redshift or BigQuery.
- Familiarity with Google Cloud or other cloud providers' products and services.
- You are able to work in teams and collaborate with others to clarify requirements.
- You have a Bachelor’s or Master’s degree in Information Technology or a relevant discipline.
- Experience with automation, build tools, and release engineering.
- Bonus: Experience with Spark/Scala, distributed data systems, and MPP databases.