We are looking for a Senior Data Engineer to design and build scalable, cloud-native data platforms. You will work with modern data architectures and process large-scale datasets to power analytics and business intelligence. This is a hands-on technical leadership role with strong ownership of architecture, pipelines, and data quality.
Job Responsibilities:
Design and implement end-to-end data pipelines (batch & streaming)
Build and maintain data lake/lakehouse architectures using modern cloud tools
Implement Medallion Architecture (Bronze, Silver, Gold layers) for scalable data processing
Process and optimize high-volume datasets using Apache Spark (PySpark) or SQL
Develop robust data models for analytics and reporting
Ensure data quality, governance, lineage, and security
Integrate and leverage AI tools (LLMs, embeddings, ML pipelines) where relevant
Collaborate with analysts, data scientists, and stakeholders to deliver insights
Optimize pipelines for performance, scalability, and cost efficiency
Build automated data pipelines and workflows (DataOps mindset, CI/CD)
Mentor junior engineers and drive best practices
Requirements:
5+ years of experience in Data Engineering or similar roles
Strong expertise in Python, Apache Spark (PySpark), SQL, and data warehousing concepts
Proven experience with data lakes, large-scale data processing, and the Medallion Architecture
Hands-on experience with the Microsoft ecosystem: Microsoft Fabric, Azure Data Services, Synapse, Data Factory
Experience with Power BI or similar BI tools
Strong understanding of ETL/ELT pipelines, data modeling, and performance optimization
Experience with real-time or near real-time data pipelines
Experience working with AI/ML-enabled data platforms
Familiarity with LLMs (e.g. GPT-based tools), embeddings, or vector databases
Nice to have:
Experience with Databricks
Experience with dbt
Experience with streaming tools (Kafka, Event Hubs)
Familiarity with automation of data pipelines (DataOps, orchestration tools)
Experience with Infrastructure as Code (Terraform, ARM)
Familiarity with data governance tools
What we offer:
Real responsibilities and challenging projects
Friendly environment that supports professional and personal development
Clear career paths and real support for your professional development
Training and certifications
Fun team-building sessions
Private health insurance
50% discount on a gym or sports pass of your choice