As a Cloud Engineer, you are passionate about innovation and eager to push the boundaries of what’s possible. You bring 6+ years of experience, a growth mindset, and a drive to make a lasting impact.
Job Responsibilities:
Design GCP-native architectures using BigQuery, Dataflow, Cloud Composer (Airflow), Pub/Sub, Cloud Storage, Vertex AI, and Cloud Run
Build and maintain batch and streaming data pipelines using medallion architecture (Bronze, Silver, Gold)
Implement infrastructure as code using Terraform
Manage deployments through CI/CD pipelines such as Cloud Build
Define and enforce GCP landing zone standards including IAM, VPC, Shared VPC, Private Service Connect, and organization policies
Build end-to-end Databricks Lakehouse solutions on GCP
Design Delta Lake tables with proper governance using Unity Catalog
Develop and optimise PySpark and SQL workloads for large-scale transformations
Configure Databricks clusters, job scheduling, autoscaling, and cost controls
Implement Databricks Workflows and Asset Bundles for orchestration and CI/CD
Advise clients on Databricks adoption and migrations from legacy or on-prem platforms
Lead technical workshops and requirements-gathering sessions
Create client-facing deliverables such as architecture diagrams, specifications, runbooks, and data dictionaries
Present solution designs and progress updates to both technical and business stakeholders
Manage technical risks, dependencies, and delivery issues
Actively participate in Agile ceremonies and sprint delivery
Implement data quality checks using tools such as Great Expectations, dbt tests, or Delta constraints
Support metadata management and data catalogue initiatives using Dataplex or Unity Catalog
Ensure GDPR and data residency requirements are embedded in solution design
Support RFP and RFI responses with solution design and effort estimation
Contribute to internal documentation, runbooks, and knowledge-sharing sessions
Stay up to date with GCP and Databricks product developments and recommend new capabilities to clients
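The medallion architecture mentioned above (Bronze, Silver, Gold) can be sketched in miniature. This is a hedged, stdlib-only illustration of the layering idea; a real pipeline on this stack would use PySpark DataFrames and Delta Lake tables, and the record fields here are invented for the example.

```python
# Minimal sketch of medallion layering (Bronze -> Silver -> Gold).
# Plain Python stands in for PySpark/Delta; field names are illustrative.

bronze = [  # Bronze: raw ingested records, schema-on-read, may contain bad rows
    {"order_id": "1", "amount": "19.99", "country": "FR"},
    {"order_id": "2", "amount": "bad", "country": "FR"},
    {"order_id": "3", "amount": "5.00", "country": "DE"},
]

def to_silver(rows):
    """Silver: cleanse and type the raw data, dropping rows that fail validation."""
    silver = []
    for row in rows:
        try:
            silver.append({
                "order_id": int(row["order_id"]),
                "amount": float(row["amount"]),
                "country": row["country"],
            })
        except ValueError:
            continue  # a real pipeline would quarantine bad records instead
    return silver

def to_gold(rows):
    """Gold: aggregate curated data into a business-level metric."""
    totals = {}
    for row in rows:
        totals[row["country"]] = totals.get(row["country"], 0.0) + row["amount"]
    return totals

gold = to_gold(to_silver(bronze))
print(gold)  # -> {'FR': 19.99, 'DE': 5.0}
```

The point of the layering is that each stage has one responsibility: ingestion stays lossless, cleansing is isolated in Silver, and business aggregates in Gold can be rebuilt from Silver at any time.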
Requirements:
6+ years of hands-on professional experience in cloud data engineering or GCP platform roles
Bachelor's degree in Computer Science, Engineering, Information Systems, or equivalent practical experience
BigQuery with advanced SQL, partitioning, clustering, and cost optimisation
Cloud Storage, Cloud Functions, Cloud Run
Dataflow (Apache Beam) for batch and streaming pipelines
Cloud Composer / Airflow for orchestration
Pub/Sub for event-driven architectures
Vertex AI exposure for model serving or pipelines
IAM, VPC, organization policies, and security governance
Terraform for infrastructure as code
Cloud Build and Artifact Registry for CI/CD
Looker or Looker Studio for analytics and reporting
Strong PySpark skills using DataFrame APIs and performance optimisation techniques
Delta Lake concepts including ACID transactions, time travel, and Z-ordering
Unity Catalog for governance and access control
Databricks Workflows and job orchestration
Databricks on GCP cluster configuration and tuning
MLflow for experiment tracking and model registry
dbt on Databricks or BigQuery
Databricks Asset Bundles and CI/CD integration
Python for data engineering and automation
Advanced SQL for analytics
Git-based version control and pull request workflows
Docker and basic containerisation concepts
Unit testing and data pipeline testing strategies
REST API integration experience
Google Cloud Professional Data Engineer or Professional Cloud Architect certification
Databricks Certified Associate Developer for Apache Spark or Databricks Certified Data Engineer certification
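The data-quality and pipeline-testing skills listed above can be illustrated with a small sketch. This is a hedged, stdlib-only example written in the spirit of Great Expectations or dbt tests; the function names and the sample records are invented for illustration and are not part of either library's API.

```python
# Hedged sketch: declarative data-quality checks over row dicts.
# Function names mimic the style of Great Expectations but are illustrative only.

def expect_column_values_not_null(rows, column):
    """Return which row indices violate a not-null expectation."""
    failed = [i for i, row in enumerate(rows) if row.get(column) is None]
    return {"success": not failed, "failed_rows": failed}

def expect_column_values_between(rows, column, low, high):
    """Return which row indices fall outside an inclusive [low, high] range."""
    failed = [i for i, row in enumerate(rows)
              if row.get(column) is None or not (low <= row[column] <= high)]
    return {"success": not failed, "failed_rows": failed}

orders = [
    {"order_id": 1, "amount": 19.99},
    {"order_id": 2, "amount": -5.00},  # violates the range check
    {"order_id": 3, "amount": None},   # violates both checks
]

print(expect_column_values_not_null(orders, "amount"))
# -> {'success': False, 'failed_rows': [2]}
print(expect_column_values_between(orders, "amount", 0, 10_000))
# -> {'success': False, 'failed_rows': [1, 2]}
```

In practice these assertions would run inside the pipeline (e.g., as a Great Expectations checkpoint, a dbt test, or a Delta constraint) so that bad batches are blocked before reaching curated tables.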
Nice to have:
Experience with AWS or Azure in addition to GCP
Exposure to composable commerce or content platforms
Salesforce Marketing Cloud or CDP integrations
Knowledge of Generative AI and LLM integrations using Vertex AI or Gemini APIs
Databricks Agent Framework or LangChain experience
Kafka or Confluent Cloud
Experience in luxury, retail, or FMCG domains
What we offer:
Flexibility, with remote and hybrid work options (country-dependent)
Career advancement, with international mobility and professional development programs
Learning and development, with access to cutting-edge tools, training and industry experts