Job Responsibility:
Architect and implement scalable, secure, and cost-effective data platforms on AWS
Lead the development of complex ETL/ELT pipelines and data workflows
Define, standardize, and enforce data engineering best practices and coding standards
Collaborate closely with data scientists, analysts, and business stakeholders to deliver data-driven solutions
Conduct code reviews to ensure high-quality, maintainable deliverables
Optimize data workflows for performance, scalability, and cost-efficiency
Ensure strong data quality, governance, and security across all data environments
Work with cross-functional teams to gather, analyze, and refine data requirements
Monitor, maintain, and troubleshoot data pipelines and associated infrastructure
Requirements:
Bachelor’s degree in Computer Science, Engineering, or a related field
5–8 years of experience in data engineering, including at least 3 years of hands-on expertise with AWS Big Data platforms
Proficiency in Python, SQL, Spark, and various AWS data services
Strong understanding of data architecture, data modeling, and data warehousing concepts
Hands-on experience with key AWS services: Glue, S3, Redshift, Lambda, CloudWatch, IAM
Experience with DevOps practices and tools, including CI/CD, monitoring, and infrastructure-as-code (IaC) technologies such as Terraform or CloudFormation
Proficiency with Git for version control and collaborative development