About the Role:
We are looking for a Lead Data Engineer to drive the development of a modern data platform. This role focuses on building scalable, reliable data pipelines with tools such as dbt, Snowflake, and Apache Airflow, and plays a key part in shaping our data architecture and strategy.
As a technical leader, you’ll work closely with cross-functional teams including analytics, product, and engineering to deliver clean, accessible, and trustworthy data for business decision-making and machine learning use cases.
Key Responsibilities:
- Lead the design and implementation of ELT pipelines using dbt, orchestrating workflows with Apache Airflow.
- Design, implement, and maintain robust data models to support analytics and reporting.
- Architect and optimize our cloud data warehouse in Snowflake, ensuring performance, scalability, and cost efficiency.
- Collaborate with data analysts and stakeholders to model and deliver well-documented, production-grade datasets.
- Establish data engineering best practices around version control, testing, CI/CD, and observability.
- Build and maintain data quality checks and data validation frameworks.
- Mentor junior data engineers and foster a strong engineering culture within the team.
- Collaborate on data governance efforts, including metadata management, data lineage, and access controls.
- Evaluate and integrate new tools and technologies to evolve our data stack.
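To give a concrete flavor of the data quality work described above, here is a minimal, illustrative sketch of the kind of validation checks such a framework might start from. The function names, column names, and sample rows are hypothetical, not part of our actual stack:

```python
from typing import Any

def check_not_null(rows: list[dict[str, Any]], column: str) -> list[int]:
    """Return indices of rows where `column` is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_unique(rows: list[dict[str, Any]], column: str) -> list[Any]:
    """Return values of `column` that appear more than once."""
    counts: dict[Any, int] = {}
    for row in rows:
        value = row.get(column)
        counts[value] = counts.get(value, 0) + 1
    return [value for value, n in counts.items() if n > 1]

# Hypothetical sample rows standing in for a loaded dataset.
orders = [
    {"order_id": 1, "customer_id": "a"},
    {"order_id": 2, "customer_id": None},
    {"order_id": 2, "customer_id": "b"},
]

null_rows = check_not_null(orders, "customer_id")  # rows missing a customer
dupes = check_unique(orders, "order_id")           # duplicated primary keys
```

In practice these checks would typically live as dbt tests or Airflow task assertions rather than ad-hoc scripts; the sketch only shows the underlying idea.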
Requirements:
- 8+ years of experience in data engineering with at least 2 years in a lead role.
- Strong experience designing and managing data pipelines with dbt and Airflow.
- Proven expertise in data modeling techniques (dimensional modeling, star/snowflake schemas, normalization, denormalization) and translating business requirements into scalable data models.
- Deep understanding of Snowflake, including performance tuning and cost optimization.
- Strong SQL and Python skills for data transformation and automation.
- Experience with Git-based workflows and CI/CD for data pipelines.
- Excellent communication skills and experience working with cross-functional teams.
- Experience with data cataloging and lineage tools.
- Exposure to event-driven architectures and real-time data processing.
- Understanding of data privacy and security standards.
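As a small illustration of the dimensional-modeling and SQL skills listed above, the sketch below builds a toy star schema (one fact table joined to one dimension table) and runs a typical aggregate query against it. It uses SQLite purely so the example is self-contained; the table and column names are illustrative only:

```python
import sqlite3

# Toy star schema: a fact table keyed to a dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,
        category    TEXT NOT NULL
    );
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        revenue     REAL NOT NULL
    );
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales VALUES (10, 1, 12.5), (11, 1, 7.5), (12, 2, 30.0);
""")

# Typical analytics query: join the fact to its dimension and aggregate.
rows = conn.execute("""
    SELECT p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.category
    ORDER BY p.category
""").fetchall()
# rows == [('books', 20.0), ('games', 30.0)]
conn.close()
```

The same star-schema pattern scales up directly in Snowflake, where dimension tables stay small and descriptive while fact tables carry the high-volume measures.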
Education:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.