Data Engineer
Job Title: Data Engineer (Python, dbt, Snowflake, Airflow)
Role Summary
We are looking for a Data Engineer to design, build, and operate ELT pipelines and analytics models using Python, Airflow, dbt, and Snowflake, with a focus on data quality, performance, and reliable production operations.
Key Responsibilities
- Design and maintain batch and incremental ELT pipelines using Python and Airflow
- Build analytics models using dbt (staging, marts, tests, documentation)
- Develop and optimize Snowflake schemas, warehouses, and SQL performance
- Implement data quality checks, monitoring, and failure handling
- Apply RBAC, secure views, and cost/performance optimization in Snowflake
- Support production pipelines, SLAs, and on-call rotation
- Follow Git-based CI/CD and engineering best practices
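To illustrate the kind of work the responsibilities above describe, here is a minimal, hedged sketch of an incremental ELT step with a watermark and a dbt-style not-null check. The in-memory source rows, column names, and watermark value are all hypothetical; a real pipeline would extract from a database or API and load into Snowflake, typically orchestrated as an Airflow task.

```python
from datetime import datetime, timezone

# Hypothetical in-memory "source" standing in for an upstream system.
SOURCE_ROWS = [
    {"id": 1, "updated_at": datetime(2024, 1, 1, tzinfo=timezone.utc), "amount": 10.0},
    {"id": 2, "updated_at": datetime(2024, 1, 2, tzinfo=timezone.utc), "amount": 20.0},
    {"id": 3, "updated_at": datetime(2024, 1, 3, tzinfo=timezone.utc), "amount": None},
]

def extract_incremental(rows, watermark):
    """Return only rows modified after the last processed watermark."""
    return [r for r in rows if r["updated_at"] > watermark]

def check_not_null(rows, column):
    """Fail fast when any row violates a not-null rule (a dbt-style test)."""
    bad = [r for r in rows if r[column] is None]
    if bad:
        raise ValueError(f"{len(bad)} row(s) failed not_null check on {column!r}")
    return rows

# The watermark would normally be persisted between runs; hard-coded here.
watermark = datetime(2024, 1, 1, tzinfo=timezone.utc)
new_rows = extract_incremental(SOURCE_ROWS, watermark)
```

The watermark pattern is what makes the pipeline incremental rather than full-refresh: each run processes only rows newer than the last successful run, and the quality check halts the load before bad data reaches the warehouse.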
Required Skills
- Strong Python and SQL
- Hands-on Snowflake experience (performance, security, cost)
- Production dbt experience (models, tests, macros)
- Apache Airflow for orchestration
- Solid understanding of ELT, dimensional modeling, and data quality
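As a sketch of the dimensional-modeling concept listed above: fact tables reference dimension rows through surrogate keys rather than raw business keys. The tiny type-1 dimension below is a hypothetical illustration, not a production implementation (in practice this mapping lives in the warehouse, e.g. a dbt model in Snowflake).

```python
# Toy star schema: a customer dimension keyed by surrogate key, and a fact
# table storing the surrogate key instead of the business key.
dim_customer = {}  # business key -> surrogate key
_next_sk = 1

def get_or_create_sk(business_key):
    """Assign a surrogate key on first sight of a business key (type-1 dimension)."""
    global _next_sk
    if business_key not in dim_customer:
        dim_customer[business_key] = _next_sk
        _next_sk += 1
    return dim_customer[business_key]

fact_orders = [
    {"customer_sk": get_or_create_sk("C-100"), "amount": 50.0},
    {"customer_sk": get_or_create_sk("C-200"), "amount": 75.0},
    {"customer_sk": get_or_create_sk("C-100"), "amount": 25.0},
]
```

Both "C-100" orders resolve to the same surrogate key, which is what lets the fact table join cleanly to the dimension regardless of how the business key is formatted upstream.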
Experience
- 4–8+ years in data engineering or analytics engineering
- Bachelor’s degree in Computer Science or related field