Snowflake DBT
We are looking for a skilled Data Engineer with strong experience in Snowflake and SnapLogic to design, build, and optimize scalable data pipelines and cloud-based data solutions. The ideal candidate will have hands-on experience with modern data warehousing, ETL/ELT pipelines, and data integration frameworks.
Key Responsibilities
Design, develop, and maintain ELT pipelines using SnapLogic for data ingestion, transformation, and integration
Build and optimize large-scale data models and data warehouse solutions on Snowflake
Develop reusable data components, scripts, and frameworks for data processing
Implement data quality checks, validation rules, and monitoring dashboards
Collaborate with data architects, BI developers, and business stakeholders to understand data needs
Optimize Snowflake performance (warehouses, micro-partitioning, clustering, query optimization)
Manage data security, governance, and compliance within Snowflake environments
Troubleshoot data pipeline issues and ensure high data availability and reliability
Participate in code reviews, best practices, and continuous improvement efforts
Required Skills and Experience
4 to 7 years of experience as a Data Engineer or in a similar role
Strong hands-on experience with Snowflake (data modeling, performance tuning, SQL, Snowpipe, tasks, streams)
Practical experience creating and managing pipelines with SnapLogic (pipeline design, Snap Packs, API integration)
Advanced SQL skills and familiarity with complex query optimization
Experience working with cloud platforms (AWS, Azure, GCP)
Strong understanding of data warehousing concepts (ETL/ELT, star/snowflake schema, data marts)
Proficiency in a programming language such as Python
Knowledge of version control systems (Git) and CI/CD workflows for data pipelines.