We are seeking a skilled Data Engineer with strong expertise in Google Cloud Platform to design, build, and optimise scalable data pipelines and modern data solutions. This role focuses on enabling high-quality, reliable, and reusable data assets by working closely with business stakeholders, data science teams, and platform partners. The individual will contribute to stabilising and enhancing existing GCP-based data frameworks while ensuring operational excellence through automation, monitoring, and data quality controls.
Job Responsibilities:
Engage with end users and business stakeholders to understand data requirements and contribute to ETL framework strategies
Design, build, and implement modern, scalable data solutions using Google Cloud Platform services
Automate manual processes to optimise data delivery, including orchestration workflows, logging, and alerting mechanisms
Analyse datasets to assess data quality, reusability, and integrity, and provide effective solutions to resolve data quality issues
Review reports and performance indicators to identify, troubleshoot, and correct code-related issues
Monitor daily data pipelines to ensure solution readiness, minimise downtime, and trigger rapid alerting in case of failures
Maintain and track audit tables to ensure consistency and reliability of data workflows
Perform reverse engineering of existing workflows to understand current functionality and support enhancements
Enhance and optimise existing GCP data pipelines to deliver stable, efficient, and scalable solutions
Collaborate with data science and machine learning teams, along with key stakeholders, to support business objectives
Identify and resolve data anomalies and pipeline-related issues within the GCP framework
Requirements:
Experienced in building and managing data pipelines on Google Cloud Platform
Proficient in BigQuery, Google Cloud Storage, Dataproc, and Cloud Composer
Strong hands-on skills in SQL, Python, Unix, and Spark
Comfortable analysing data quality issues and designing practical, sustainable solutions
Able to work collaboratively with cross-functional teams in a fast-paced, delivery-focused environment
Detail-oriented, with a strong focus on data reliability, performance, and operational stability
What we offer:
Opportunity to work on large-scale, enterprise-grade data platforms within a global organisation
Exposure to modern cloud-native data engineering practices and emerging GCP technologies
Collaboration with diverse, international teams across data engineering, data science, and business domains
A culture that values continuous learning, improvement, and inclusive teamwork