We are hiring a highly skilled Senior Data Engineer to design, build, and optimize scalable data platforms and pipelines on Google Cloud Platform (GCP) for our GCC client, Europe's top retail brand. The ideal candidate will have strong experience in modern data architecture, distributed processing, and cloud-native engineering practices; exposure to Azure Data Factory (ADF) is a strong plus. This role offers exposure to enterprise-scale initiatives in retail and supply chain, working alongside talented engineers, architects, and domain specialists across geographies in a collaborative, innovation-driven environment. It not only sharpens your technical expertise but also provides long-term visibility and growth within a global organization.
Job Responsibilities:
Design and implement scalable batch and streaming data pipelines on GCP
Build and maintain data solutions using BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, and related services
Architect robust data models for analytics, BI, and AI/ML use cases
Lead technical discussions and mentor junior data engineers
Optimize data processing performance, cost, and reliability
Implement CI/CD, DevOps, and Infrastructure-as-Code practices
Collaborate with data architects, analysts, and business stakeholders to deliver high-quality solutions
Ensure data governance, security, and compliance standards are followed
Requirements:
6–9 years of experience in Data Engineering or Data Platform development