Architecture & Infrastructure
Platform Design: Architect and implement scalable, production-grade data platform components using AWS, Databricks, and modern data stack patterns
IaC Management: Develop and maintain Infrastructure as Code (IaC) to ensure environment standardization, reproducibility, and automation
Pipeline Leadership: Lead the development of critical data pipelines, ensuring they adhere to strict security, reliability, and performance standards
Large-Scale Processing: Design complex transformation workflows capable of handling massive datasets with maximum efficiency
Observability: Implement advanced monitoring and observability solutions across the entire data ecosystem to ensure platform health and visibility
Compliance & Governance
Systems Integrity: Ensure all data components meet organizational requirements for segregation of duties and rigorous data governance
Requirements:
Master’s degree in Computer Science, Data Engineering, or a related technical field
Expert-level proficiency in Python (preferred), Scala, or Java
Deep expertise in cloud platforms, with AWS preferred
Advanced Infrastructure as Code (IaC) skills using Terraform and AWS CDK
Specialization in Spark and distributed computing
Extensive experience with Lakehouse architecture and columnar formats (Parquet, Delta, ORC)
Mastery of modern SQL Warehouses (specifically Databricks, Snowflake, or BigQuery)
Proficient in dbt (data build tool) and advanced data modeling methodologies