We are seeking a highly skilled Data Engineer specialising in Python and Google Cloud Platform (GCP). The individual will design, build, and optimise scalable data pipelines, develop robust back-end components, and work collaboratively with cross-functional partners to deliver high-quality data products. This role is ideal for someone with strong technical expertise in cloud-native data engineering, semantic data modelling, and automated workflows.
Job Responsibilities:
Design and maintain ETL/ELT pipelines to ingest, transform, and load large datasets into GCP-based data platforms such as BigQuery and Cloud Storage
Develop and optimise scalable back-end components and modular Python code for data processing, workflow orchestration, API integrations, and automation
Build and operationalise semantic data layers to support standardised metrics and improved data accessibility
Utilise GCP services including Cloud Composer, Dataflow, Pub/Sub, and Cloud Functions to create automated and reliable workflows
Develop transformation workflows using Dataform, including SQLX transformations, automated tests, CI/CD integrations, and documentation
Collaborate with data scientists, analysts, and business partners to convert data needs into efficient technical solutions
Implement data quality checks to ensure accuracy, integrity, and performance in all data workflows
Integrate semantic layers into BI tools and leverage metadata and lineage tools for improved governance
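For illustration, a minimal pure-Python sketch of the kind of transform-and-validate step the responsibilities above describe (schema, field names, and checks are all hypothetical; a production pipeline would hand the validated records to BigQuery/Dataflow clients rather than return them):

```python
from dataclasses import dataclass


@dataclass
class Record:
    """Hypothetical target schema for one transformed row."""
    user_id: str
    amount: float


def transform(raw_rows):
    """Transform stage: normalise raw dicts into typed Record objects."""
    return [
        Record(user_id=str(row["user_id"]).strip(), amount=float(row["amount"]))
        for row in raw_rows
    ]


def quality_check(records):
    """Data-quality gate: non-empty ids, unique ids, non-negative amounts."""
    ids = [r.user_id for r in records]
    if not all(ids):
        raise ValueError("empty user_id found")
    if len(ids) != len(set(ids)):
        raise ValueError("duplicate user_id found")
    if any(r.amount < 0 for r in records):
        raise ValueError("negative amount found")
    return records
```

In an orchestrated workflow (e.g. a Cloud Composer DAG), each stage would typically be a separate task so that a failed quality gate stops the load step from running.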
Requirements:
Proficient Python developer with hands-on experience in building scalable data processing solutions
Skilled in GCP services such as BigQuery, Cloud Storage, Pub/Sub, Cloud Functions, and Cloud Composer
Experienced in semantic modelling, workflow orchestration, and Dataform-based transformations
Knowledgeable in SQL, data validation techniques, API-driven microservices (Flask/FastAPI), and automation frameworks
Strong collaborator with excellent communication skills and the ability to work effectively with diverse teams
Focused, detail-oriented, and driven to create reliable, high-performance data solutions
What we offer:
Opportunities to work on modern cloud-native data engineering projects with cutting-edge GCP services
Exposure to advanced semantic modelling, orchestration, and automation frameworks
Collaboration with experts across engineering, data science, and analytics teams
Ability to influence large-scale, high-impact data programmes within a global organisation
Continuous professional development through hands-on technical challenges