We’re looking for a Mid-Level Data Engineer to join our Data & AI Department and work closely with cross-functional marketing and product teams. In this role, you’ll design, build, and maintain scalable data pipelines that support analytics, reporting, and AI/ML use cases. You’ll work in a cloud-native environment on GCP, collaborate with analysts and data scientists, and help ensure data reliability, quality, and accessibility across the organization.
Job Responsibilities:
Build and maintain scalable data pipelines using Apache Airflow and GCP (BigQuery, Cloud Storage, Dataform); a minimal sketch of such a pipeline follows this list
Develop and optimize ETL/ELT workflows for batch and streaming processing across diverse data sources
Improve pipeline and query performance through tuning, partitioning, and clustering
Ensure data quality, reliability, and freshness through monitoring and validation
Apply data governance standards, including access control and schema management
Monitor pipelines, resolve failures, and meet defined SLAs
Write clean, well-documented code and maintain technical documentation
Work closely with analysts, data scientists, and product teams to deliver usable data
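
As an illustration of how several of these responsibilities fit together, here is a minimal sketch of such a pipeline, assuming Airflow 2.4 or newer with the apache-airflow-providers-google package installed. Every project, bucket, dataset, and table name below is hypothetical, not part of our actual stack.

    # Minimal sketch: wait for an export in Cloud Storage, load it into a
    # BigQuery table, then validate freshness. Names are hypothetical.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryCheckOperator,
        BigQueryInsertJobOperator,
    )
    from airflow.providers.google.cloud.sensors.gcs import GCSObjectExistenceSensor

    default_args = {
        "retries": 2,                          # retry transient failures
        "retry_delay": timedelta(minutes=5),
    }

    with DAG(
        dag_id="daily_orders_load",            # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
        default_args=default_args,
    ) as dag:
        # Gate the load on the upstream export landing in Cloud Storage.
        wait_for_export = GCSObjectExistenceSensor(
            task_id="wait_for_export",
            bucket="example-landing-bucket",
            object="orders/{{ ds }}/orders.csv",
        )

        # Load the day's rows. The target table is assumed to be
        # partitioned by DATE(order_ts) and clustered by customer_id,
        # which keeps downstream queries cheap and fast.
        load_orders = BigQueryInsertJobOperator(
            task_id="load_orders",
            configuration={
                "query": {
                    "query": """
                        INSERT INTO analytics.orders
                        SELECT * FROM staging.orders_raw
                        WHERE DATE(order_ts) = '{{ ds }}'
                    """,
                    "useLegacySql": False,
                }
            },
        )

        # Fail the run if the partition is empty, so stale data never
        # reaches downstream consumers.
        check_freshness = BigQueryCheckOperator(
            task_id="check_freshness",
            sql="SELECT COUNT(*) > 0 FROM analytics.orders "
                "WHERE DATE(order_ts) = '{{ ds }}'",
            use_legacy_sql=False,
        )

        wait_for_export >> load_orders >> check_freshness

The sensor gates the load on the upstream export, the retry settings in default_args absorb transient failures, and the final check enforces freshness before anyone reads the partition.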
Requirements:
3+ years of experience as a Data Engineer working with cloud-native data architectures
Strong Python skills with a focus on maintainable, testable, and modular code (OOP, error handling, logging)
Advanced SQL skills, including complex queries and performance optimization
Deep hands-on experience with Apache Airflow (DAG design, retries, sensors, custom operators; see the custom-operator sketch after this list)
Strong experience with Google Cloud Platform, especially BigQuery, Cloud Storage, and Dataform
Solid understanding of data warehousing, data modeling, and ETL/ELT best practices
Experience working with MySQL and Oracle as source systems
Familiarity with Git and CI/CD workflows
Understanding of data security, privacy, and monitoring best practices
Working knowledge of Docker and containerization
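
To make the custom-operator and code-quality expectations concrete, here is a minimal sketch of a small Airflow operator with typed arguments, structured logging, and explicit error handling. It assumes the google-cloud-bigquery client library; the operator name, table, and threshold are illustrative only.

    # Minimal sketch: a custom operator that fails the task when a
    # BigQuery table has fewer rows than expected. Names are hypothetical.
    from airflow.exceptions import AirflowException
    from airflow.models.baseoperator import BaseOperator
    from google.cloud import bigquery


    class RowCountThresholdOperator(BaseOperator):
        """Fail the task if a BigQuery table has fewer rows than expected."""

        def __init__(self, *, table: str, min_rows: int, **kwargs) -> None:
            super().__init__(**kwargs)
            self.table = table
            self.min_rows = min_rows

        def execute(self, context) -> int:
            client = bigquery.Client()
            query = f"SELECT COUNT(*) AS n FROM `{self.table}`"
            try:
                rows = list(client.query(query).result())
            except Exception as exc:
                # Surface the underlying error with context, never swallow it.
                raise AirflowException(
                    f"Row-count query failed for {self.table}"
                ) from exc

            n = rows[0]["n"]
            self.log.info(
                "Table %s has %d rows (minimum %d)", self.table, n, self.min_rows
            )
            if n < self.min_rows:
                raise AirflowException(
                    f"{self.table} has {n} rows, below the required {self.min_rows}"
                )
            return n

In a DAG this wires in like any built-in operator, for example RowCountThresholdOperator(task_id="check_orders", table="analytics.orders", min_rows=1000).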
Nice to have:
Experience supporting analytics and AI/ML workloads
Exposure to streaming data pipelines
Experience working in cross-functional, product-driven teams