Numerator is looking for a Senior Data Engineer who is passionate about data quality and collaboration. This person will support our Airflow-based data quality pipelines, which are essential to our team’s success.
As our Senior Data Engineer, you will get the opportunity to democratize data automation by building simple, stable, and user-friendly tools. You will work with a passionate team of data scientists and analysts to solve challenging problems, ensure data consistency, and deliver powerful insights.
Our team uses the latest cloud data warehouse technologies which allow us to build robust and reliable data pipelines.
We take pride in our data engineering and are looking for driven, talented, and curious engineers who love using data to enhance products and tools.
What you’ll get to do
- Develop expertise in our in-house data monitoring systems, and expand and improve them
- Design, develop, and maintain data science and data analytics pipelines
- Collaborate with data and engineering teams to take requirements from prototype to production
- Build data validation testing frameworks to ensure high data quality and integrity
- Write and maintain documentation on data pipelines and schemas
What you’ll bring
- Expertise in SQL, including advanced analytical queries, window functions, CTEs, and query optimization
- Proficiency in Python (data structures, algorithms, object-oriented programming, using APIs)
- Experience administering a cloud data warehouse (e.g., Redshift, Snowflake, or Vertica)
- Knowledge of software engineering best practices across the development lifecycle: coding standards, code reviews, source management, build processes, testing, and operations
- Knowledge of and experience implementing data security and governance best practices
- Experience with a data pipeline scheduling framework (e.g., Airflow)
- Experience with schema design and dimensional data modeling
- Great problem solving and analytical skills combined with the ability to explain concepts to both technical and non-technical audiences
- Curious and interested in learning about the latest in data warehouse technology
- Bachelor’s degree in Computer Science or a related field of study required; Master’s degree preferred
Nice to have
- Airflow – Experience building and monitoring DAGs, developing custom operators, and using script templating solutions
- Amazon Web Services (EC2, DMS, RDS) experience
- Terraform and/or Ansible (or similar) for infrastructure deployment
- Experience supporting production systems and developing on-call/incident management playbooks
- Ability to work with team members located in multiple geographies and time zones
- Interest and willingness to mentor junior team members
What we offer
- An inclusive and collaborative company culture – we work together in an open environment to get things done and adapt to changing needs as they arise.
- An opportunity to have an impact in a data-driven technology company.
- Ownership over the platforms and environments of an industry-leading product.
- Market competitive total compensation package.
- Volunteer time off and charitable donation matching.
- Strong support for career growth, including mentorship programs, leadership training, access to conferences and employee resource groups.
- Regular hackathons to build your own projects and Engineering Lunch and Learns.
- Great benefits package including health/vision/dental, unlimited PTO, 401k matching, travel reimbursement and more.
If this sounds like something you would like to be part of, we’d love for you to apply! Don’t worry if you don’t meet all the qualifications listed here. The tools, technologies, and methodologies we use are constantly changing, and we value talent and interest over specific experience.
We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.