We’re looking for a Senior Data Engineer to design, build, and optimize modern data pipelines and architecture. You’ll support analytics, reporting, and data‑driven applications by creating scalable, efficient data systems across cloud environments.
Job Responsibilities:
Design and build ETL/ELT pipelines across cloud platforms (Azure, AWS, or GCP)
Architect and maintain Data Lake / Lakehouse environments
Develop and optimize data ingestion, transformation, and orchestration workflows
Ensure data quality, reliability, and scalability across all pipelines
Collaborate with BI developers, analysts, and business stakeholders
Implement best practices around versioning, testing, and deployment
Support real‑time and batch data processing initiatives
Requirements:
5+ years of data engineering experience
Strong SQL and Python skills
Experience with cloud data platforms (Azure Data Factory, Databricks, AWS Glue, GCP Dataflow, etc.)
Familiarity with Lakehouse architectures and distributed computing
Experience building both batch and streaming data pipelines
Strong understanding of data modeling and performance optimization
Ability to mentor teammates and contribute to architectural decisions