Data Analyst

Company Name: Funding Circle
Location: San Francisco, CA, United States
Date Posted: 3rd Jun, 2016
  • Be an explorer: our data analysts wade through datasets large and small, internal and external, to bring order to chaos. They partner with business users and data scientists to provide insight for all types of questions: some take 5 minutes, others 5 months.
  • Be a builder: you’ll help build our global data platform, including our data warehouse and front-end tools.
  • Be an owner: you’ll be the go-to person for some of our key data assets. You’ll also work closely with business users and product management to identify which new data assets we should acquire.  
  • Be a sentry: you’ll help architect our reporting framework and be expected to sound the alarm if something looks wrong.
  • Be a collaborator: you’ll be expected to forge deep bonds with your business counterparts, understanding their data needs and sharing ownership of their outcomes. No tossing responses over the fence.
  • Be a data evangelist: lead by example and push for high quality data-driven decision-making throughout the company.
  • Be a teacher: be generous with your time and expertise to teach stakeholders how to answer their own questions with the data platform you build.

We’re looking for:

  • a passion for data… you dabble in APIs on the weekend and dream about left joins.
  • confidence when hacking through large, often messy, data sets.
  • 2–5 years in a highly analytical role, preferably in tech, consulting, or finance.
  • a strong conceptual ability to understand how our key data assets fit together into a global platform. Know your piece, but also how it fits in.
  • data modeling skills for OLTP and OLAP systems.
  • strong SQL query and development skills (PostgreSQL, MySQL, MS SQL), plus query tuning and optimization skills.
  • a good understanding of ETL processes and data pipelines with cloud-based tech (AWS).
  • an understanding of data quality and data quality controls, and the ability to convert them into specific technical requirements and documentation.
  • an understanding of different data formats (JSON, XML, etc.).
  • experience with scripting languages like Python.
  • a relentless drive for answers regardless of obstacles.
  • excellent verbal and written communication skills, and the ability to communicate effectively with non-technical people.
  • an eye for elegant data visualization, and experience with tools like Tableau.
  • confidence in owning process: you’ll identify and execute on high impact projects, triage external requests, and make sure you bring projects to conclusion in time for the results to be useful.
  • a Bachelor’s Degree or equivalent from a top-tier university.
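To make the SQL and "dream about left joins" expectations above concrete, here is a minimal sketch using Python's standard-library sqlite3 module. The loans and repayments tables are purely hypothetical, not Funding Circle's actual schema; the point is the LEFT JOIN keeping every loan, even ones with no repayments yet.

```python
import sqlite3

# Hypothetical example data, for illustration only.
loans = [(1, "Acme"), (2, "Bravo")]
repayments = [(1, 500.0), (1, 250.0)]  # loan 2 has no repayments yet

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (id INTEGER PRIMARY KEY, borrower TEXT)")
conn.execute("CREATE TABLE repayments (loan_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO loans VALUES (?, ?)", loans)
conn.executemany("INSERT INTO repayments VALUES (?, ?)", repayments)

# LEFT JOIN keeps every loan; COALESCE turns the NULL sum for
# loans with no repayments into 0.
rows = conn.execute(
    """
    SELECT l.borrower, COALESCE(SUM(r.amount), 0) AS repaid
    FROM loans l
    LEFT JOIN repayments r ON r.loan_id = l.id
    GROUP BY l.id
    ORDER BY l.id
    """
).fetchall()
# rows == [('Acme', 750.0), ('Bravo', 0)]
```

An INNER JOIN here would silently drop loan 2; spotting that kind of gap is exactly the "sound the alarm if something looks wrong" part of the role.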

Brownie points for:

  • an Engineering or Computer Science degree.
  • agile software development experience (Ruby, Python, Java, etc.).
  • exposure to event-driven architecture (Confluent platform).
  • data remediation experience.
  • experience interacting with third-party data providers.
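For the event-driven architecture point above, a minimal in-memory publish/subscribe sketch in Python shows the core idea; real platforms such as the Confluent platform (Kafka) add durable logs, partitioning, and consumer groups on top. The topic name and event payload here are hypothetical.

```python
from collections import defaultdict

class EventBus:
    """Toy event bus: handlers subscribe to a topic, publishers emit to it."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Deliver the event to every handler registered for this topic.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
seen = []
bus.subscribe("loan.funded", seen.append)
bus.publish("loan.funded", {"loan_id": 42, "amount": 10000})
# seen == [{"loan_id": 42, "amount": 10000}]
```

Producers and consumers are decoupled: the publisher never needs to know who is listening, which is what makes the pattern attractive for data pipelines.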