We are seeking a skilled Big Data Engineer to design, build, and manage data ingestion and processing applications on Google Cloud. The role involves working with technologies such as BigQuery, Dataflow, Composer, Cloud Storage, and Dataproc, while ensuring optimal performance and scalability. You will also contribute to real-time data processing systems, containerisation, and modern DevOps practices, supporting machine learning applications and MLOps industrialisation.
Job Responsibilities:
Design, build, and manage Big Data ingestion and processing applications on Google Cloud using BigQuery, Dataflow, Composer, Cloud Storage, and Dataproc
Optimise and analyse performance of distributed computing workloads using tools such as Spark and Apache Beam (Dataflow)
Develop cloud-optimised solutions for machine learning applications
Build real-time data processing systems using Kafka, Pub/Sub, Spark Streaming, or similar technologies
Manage the development lifecycle for agile software projects
Convert proof-of-concept machine learning models into production-ready solutions (MLOps)
Deliver customer-oriented solutions in a timely and collaborative manner
Proactively plan and manage dependencies across projects
Implement robust solutions in test and production environments
Requirements:
Professional Google Cloud Data Engineer certification
Strong analytical and technical skills for impact analysis and root cause identification
Excellent communication skills for engaging with business, technical, and project stakeholders
Good understanding of Big Data and Data Science ecosystems
3–5 years of relevant work experience
Experience in building and deploying containers using Swarm/Kubernetes
Knowledge of container concepts, including creating lean and secure images
Familiarity with modern DevOps pipelines
Experience building streaming data pipelines with Kafka or Pub/Sub (Kafka experience is mandatory)
Graduate or Postgraduate in a technical stream (Engineering preferred)
Nice to have:
Telecommunications industry experience
Exposure to CI/CD tools such as Jenkins, Git, Jira, and Confluence
What we offer:
Opportunity to work on cutting-edge Big Data and cloud technologies
Exposure to real-time data processing and machine learning applications
Collaborative work environment with global teams
Career growth through continuous learning and certification opportunities