Design and build scalable, governed data products aligned to a data‑as‑a‑product strategy. Bridge architecture, engineering, and business process requirements, and enable analytics and AI use cases on AWS and GCP platforms.
Job Responsibilities:
Design end‑to‑end data architectures aligned with UDA standards
Build and optimize batch and near‑real‑time data pipelines
Implement Bronze / Silver / Gold data models and curated data products
Engineer solutions on AWS (S3, Glue, MWAA, Athena/Redshift) and GCP (GCS, BigQuery, Dataproc, Dataflow)
Apply data governance, quality, lineage, and security controls
Partner with business and analytics teams to translate requirements into data solutions
Contribute to reusable patterns, lighthouse initiatives, and platform modernization
Mentor junior engineers and drive engineering best practices
Requirements:
4 to 7 years of experience in data engineering / data platforms
Strong hands‑on experience with AWS and GCP
Proficiency in PySpark, SQL, Python
Experience with Airflow / Cloud Composer / MWAA
Solid understanding of lakehouse architecture and data modeling
Experience working with enterprise business processes (Finance, Supply Chain, Sales, etc.)
Strong communication and stakeholder collaboration skills