Our partner is a premium catering and event services company founded in 1820. It specializes in high-end corporate, institutional, and private events in France and internationally. The company operates in complex logistical environments, managing large-scale events that require coordination across production, transport, warehouse operations, equipment management, and finance.
Job Responsibilities:
Design the data platform environment (AWS, GCP, Azure, or hybrid)
Select and implement the Modern Data Stack components: ingestion tools (e.g., Airbyte, Fivetran), transformation layer (dbt), orchestration (Airflow, Dagster)
Design and implement a scalable Data Warehouse / Lakehouse architecture
Build structured, documented, high-performance data models via dbt
Connect the entire business ecosystem: accounting, CRM, expense management, payment platforms
Build ERP abstraction pipelines that allow switching ERPs without impacting reporting (see the sketch after this list)
Ensure data integrity, freshness, and security
Manage the coexistence of multiple ERPs during the transition phase
Connect BI/DataViz tools (Tableau or future solutions) to the platform
Optimize query performance
Enable real-time or near-real-time business monitoring
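
To illustrate the ERP abstraction responsibility above, here is a minimal, hypothetical Python sketch: the field names (DOC_NO, issuedAt, etc.), the Invoice model, and the two mappers are assumptions, not the company's actual schemas, and in practice this layer would more likely live in dbt staging models. The point is only that reporting reads a common model rather than any single ERP's schema, so the ERP can be swapped behind the mappers.

    from dataclasses import dataclass
    from datetime import date
    from typing import Callable, Iterable


    @dataclass
    class Invoice:
        """ERP-agnostic invoice record that downstream reporting depends on (hypothetical model)."""
        invoice_id: str
        customer: str
        amount_eur: float
        issued_on: date
        source_erp: str


    def from_legacy_erp(row: dict) -> Invoice:
        """Map a row from a hypothetical legacy ERP export to the common model."""
        return Invoice(
            invoice_id=str(row["DOC_NO"]),
            customer=row["CLIENT_NAME"].strip(),
            amount_eur=float(row["TOTAL_TTC"]),
            issued_on=date.fromisoformat(row["DOC_DATE"]),
            source_erp="legacy",
        )


    def from_new_erp(row: dict) -> Invoice:
        """Map a payload from a hypothetical target ERP API to the common model."""
        return Invoice(
            invoice_id=row["id"],
            customer=row["customer"]["name"],
            amount_eur=float(row["total"]["amount"]),
            issued_on=date.fromisoformat(row["issuedAt"][:10]),
            source_erp="new",
        )


    def normalize(rows: Iterable[dict], mapper: Callable[[dict], Invoice]) -> list[Invoice]:
        """Apply one ERP-specific mapper; reporting only ever sees Invoice objects."""
        return [mapper(r) for r in rows]


    if __name__ == "__main__":
        legacy_rows = [{"DOC_NO": 1042, "CLIENT_NAME": " ACME ", "TOTAL_TTC": "1200.50", "DOC_DATE": "2024-03-01"}]
        new_rows = [{"id": "INV-7", "customer": {"name": "ACME"}, "total": {"amount": 980.0}, "issuedAt": "2024-03-02T10:00:00Z"}]
        # Both ERPs feed the same reporting model during the transition phase.
        for inv in normalize(legacy_rows, from_legacy_erp) + normalize(new_rows, from_new_erp):
            print(inv)
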
Requirements:
Proven experience building a Data Platform from scratch (Data Warehouse / Lakehouse)
Experience with ERP migration projects is a strong plus
Expert-level SQL
Strong Python
Deep knowledge of Modern Data Stack (dbt, ingestion tools, cloud warehouses)