Data Engineer (P2, C2, STS)

Primary Skills
Must have working knowledge of designing and implementing data pipelines on any of the cloud providers (AWS is preferred). Must be able to work with large volumes of data coming from various sources.

Responsibilities
1. Perform data cleansing, data validation, etc.
2. Hands-on ETL development with strong Python and SQL skills; AWS services such as Glue, Glue Crawlers, Lambda, Redshift, Athena, S3, EC2, and IAM; monitoring and logging mechanisms such as AWS CloudWatch, including setting up alerts.
3. Deployment knowledge on the cloud; integrate CI/CD pipelines to build artifacts and deploy changes to higher environments.
4. Scheduling frameworks: Airflow, AWS Step Functions.
5. Excellent communication skills; should be able to work collaboratively with other teams.

JD
As a Data Engineer, you will be responsible for designing and implementing data pipelines on cloud providers, with a preference for AWS. You will work with large volumes of data from various sources, performing tasks such as data cleansing and validation. In this role, you will also draw on hands-on ETL development experience, proficiency in Python and SQL, and familiarity with AWS services such as Glue, Glue Crawlers, Lambda, Redshift, Athena, S3, EC2, and IAM, as well as monitoring and logging mechanisms such as AWS CloudWatch, including setting up alerts. Additionally, you should have experience with deployment on the cloud and with integrating CI/CD pipelines to build artifacts and deploy changes to higher environments. Familiarity with scheduling frameworks such as Airflow and AWS Step Functions is also required. Excellent communication skills are a must, as you will work collaboratively with other teams in a hybrid work mode.
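As an illustrative sketch of the data cleansing and validation work this role describes, the following shows one common pattern in Python with pandas. The column names and sample records are entirely hypothetical, not drawn from this posting.

```python
import pandas as pd

# Hypothetical raw records landed from an upstream source.
raw = pd.DataFrame({
    "customer_id": [1, 2, 2, None],
    "email": ["a@example.com", " B@EXAMPLE.COM ", " B@EXAMPLE.COM ", "c@example.com"],
    "amount": ["10.5", "20", "20", "bad"],
})

def cleanse(df: pd.DataFrame) -> pd.DataFrame:
    """Drop rows missing a key, normalize strings, coerce types, de-duplicate."""
    out = df.dropna(subset=["customer_id"]).copy()
    out["email"] = out["email"].str.strip().str.lower()
    # Non-numeric amounts become NaN and are then rejected.
    out["amount"] = pd.to_numeric(out["amount"], errors="coerce")
    out = out.dropna(subset=["amount"]).drop_duplicates()
    return out

clean = cleanse(raw)
```

In a pipeline of the kind described above, a function like this would typically run inside a Glue or Lambda job between extraction and load, with rejected rows written to a quarantine location for review.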
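The SQL side of the ETL skill set often takes the form of post-load data-quality checks. A minimal sketch follows, using an in-memory SQLite database so it is self-contained; the table, columns, and allowed status values are assumptions for illustration, though the same query pattern applies on Redshift or Athena.

```python
import sqlite3

# Hypothetical orders table standing in for a loaded warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 10.0, "PAID"), (2, -5.0, "PAID"), (3, 7.5, "UNKNOWN")],
)

# Validation query: flag negative amounts or statuses outside the allowed set.
bad = conn.execute(
    "SELECT COUNT(*) FROM orders "
    "WHERE amount < 0 OR status NOT IN ('PAID', 'PENDING', 'CANCELLED')"
).fetchone()[0]
```

A scheduler such as Airflow or Step Functions would typically run a check like this after each load and raise a CloudWatch alert when the violation count is nonzero.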