We are seeking a skilled GCP Data Engineer to design, build, and operationalise data processing systems within the VOIS Technology function. The individual will work with modern GCP services including BigQuery, Data Fusion, Dataproc, Cloud Composer, and more, ensuring reliable, efficient, and high-quality data pipelines. This role is ideal for individuals with strong experience in cloud-based data engineering, programming, and pipeline optimisation.
Job Responsibilities:
Build and operationalise data processing systems based on detailed design specifications
Apply strong Spark knowledge and hands-on experience with Dataproc; familiarity with Dataflow is beneficial
Utilise GCP Data Fusion, BigQuery, Airflow, and related tools to deliver optimised data solutions
Apply cloud-based data pipeline patterns and contribute creative approaches to navigate platform limitations
Design, test, and maintain data pipelines following data modelling, warehousing, and manipulation standards
Design and develop programs using languages and frameworks such as Python, Spark/PySpark, Scala, or Java
Contribute to process adoption, resource and tool optimisation, and continuous quality uplift
Recommend improvements to enhance data reliability, operational efficiencies, and solution quality
Requirements:
Experienced in GCP tools including BigQuery, Data Fusion, Dataproc, Cloud Composer, Workflows, and Cloud Scheduler
Skilled in programming languages and frameworks such as Python, Spark/PySpark, or Java
Knowledgeable in Apache Airflow, Dataproc clusters, and Dataflow
Possess 2–4 years of overall experience, including 2–3 years with cloud platforms (GCP/AWS/Azure)
Preferably certified as a Google Cloud Professional Data Engineer
Hold a relevant degree such as B.E./B.Tech, BCA/MCA, or BSc/MSc in Computer Science