We are looking for a Data Engineer to join our Data Platform (Ingestion) team. Your mission is to help streamline data access and promote scalability as we transition from a web-centric to a data-centric design built on cloud-based microservices.
Job Responsibilities:
Build and maintain scalable data ingestion pipelines and ETL/ELT processes (from staging to production)
Contribute to the evolution of our data architecture to improve performance, scalability, and reliability
Partner with analytics engineers, data scientists, and business teams to understand and implement data requirements
Support and optimize cloud-based data infrastructure (AWS)
Deploy automated data quality checks and monitoring systems to ensure data reliability
Develop and keep up-to-date technical documentation for data processes and systems
Investigate and resolve data-related issues to guarantee data integrity across the stack
Requirements:
Intermediate level with 3 to 5 years of hands-on experience in data engineering
Master's or engineering degree in Computer Science, Engineering, or a related field
Fluent in English and French
Solid SQL and data modeling skills (designing efficient schemas), plus hands-on experience with Python or Java
Experience with AWS, GCP, or Azure is required
Proven experience building robust ETL/ELT pipelines at GB/TB scale with batch/streaming frameworks (e.g., Spark, Apache Beam, Flink, Hadoop), including automated quality checks and proactive monitoring
Experience with Dagster or Airflow (or equivalent)
Ability to apply best practices to build maintainable pipelines while balancing technical debt with feature delivery
Pragmatic & Focused: Able to work in fast-paced environments with a focus on delivering value
Autonomous & Self-driven: Comfortable working independently while contributing effectively to cross-functional teams
Curious & Adaptable: Eager to learn and develop expertise in emerging data technologies
Nice to have:
Prior experience in the Tech or SaaS sector
Experience with lakehouse or data warehouse technologies (Snowflake, BigQuery, Apache Iceberg / S3)
Interest in healthcare data and familiarity with GDPR/HDS; previous exposure to FHIR
We value engineers who use AI-assisted tools (Cursor, Claude, Copilot)
What we offer:
Healthcare plan: Alan (50% employer-funded)
Meal vouchers: €9 (50% employer-funded)
Transport: 50% of your transit pass OR a sustainable mobility allowance
Eligibility for stock options (BSPCE) according to the company's existing rules