We are seeking a Senior Data Engineer with a strong background in data engineering, ideally with hands-on experience in Google Cloud Platform (GCP) services such as Pub/Sub, Dataflow, and BigQuery. The ideal candidate will have solid proficiency in SQL and Python, and familiarity with moving and transforming data across cloud-based data zones and streaming pipelines (ETL/ELT). Comfort with cloud services, data management, automation tools, and real-time data processing is essential.
Job Responsibilities:
Apply deep knowledge of the data engineering domain to build and support non-interactive (batch, distributed) and real-time, highly available data pipelines
Build fault-tolerant, self-healing, adaptive, and highly accurate data computational pipelines
Provide consultation and lead the implementation of complex programs
Develop and maintain documentation relating to all assigned systems and projects
Tune queries that scan billions of rows of data in a distributed query engine
Perform root cause analysis to identify permanent resolutions to software or business process issues
Implement and maintain dbt transformation models, CI pipelines, and data contracts for curated campaign, ad group, keyword, audience, and landing-page marts
Build and monitor data quality gates (Great Expectations and reconciliation checks) and freshness SLOs
Optimize BigQuery cost and performance through query tuning, storage design, and reservation strategy (a partitioning and clustering sketch follows this list)
Implement platform hardening controls, including retries, dead-letter queues (see the Pub/Sub sketch after this list), disaster-recovery (DR) runbooks, and support for VPC-SC and DLP validation
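As an illustration of the storage-design lever mentioned above, here is a minimal sketch of creating a partitioned and clustered BigQuery table with the official Python client. The project, dataset, table, and column names are hypothetical placeholders:

```python
from google.cloud import bigquery

# Hypothetical project and table names, for illustration only.
client = bigquery.Client(project="example-project")

ddl = """
CREATE TABLE IF NOT EXISTS marketing.campaign_events (
  event_ts    TIMESTAMP,
  campaign_id STRING,
  ad_group_id STRING,
  cost_micros INT64
)
PARTITION BY DATE(event_ts)  -- prune scans to the dates a query actually touches
CLUSTER BY campaign_id       -- co-locate rows on the most common filter/join key
"""
client.query(ddl).result()  # result() blocks until the DDL job finishes
```

Partitioning on the event timestamp and clustering on the dominant filter key is a standard way to reduce both scanned bytes (cost) and latency for queries over billions of rows.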
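And for the dead-letter-queue control in the hardening item, a minimal sketch using the google-cloud-pubsub client, following the documented dead-letter pattern. The project, topic, and subscription IDs are placeholders, and the dead-letter topic is assumed to already exist with the required IAM bindings:

```python
from google.cloud import pubsub_v1

# Placeholder identifiers, for illustration only.
project_id = "example-project"

publisher = pubsub_v1.PublisherClient()
subscriber = pubsub_v1.SubscriberClient()

topic_path = publisher.topic_path(project_id, "ga4-events")
dlq_topic_path = publisher.topic_path(project_id, "ga4-events-dlq")
subscription_path = subscriber.subscription_path(project_id, "ga4-events-sub")

with subscriber:
    subscriber.create_subscription(
        request={
            "name": subscription_path,
            "topic": topic_path,
            # After five failed delivery attempts, Pub/Sub forwards the message
            # to the dead-letter topic instead of redelivering it forever.
            "dead_letter_policy": {
                "dead_letter_topic": dlq_topic_path,
                "max_delivery_attempts": 5,
            },
        }
    )
```

Messages that land on the dead-letter topic can then be inspected, repaired, and replayed without blocking the main subscription.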
Requirements:
Strong expertise in Python for building data pipelines, processing tasks, and automation
Hands-on experience with the core GCP data stack:
BigQuery: expert-level SQL, performance tuning, partitioning and clustering strategy, production-grade curated marts and feature tables, and working knowledge of BQML
Dataflow (Apache Beam): strong proficiency in building reliable batch, incremental, and streaming pipelines for GA4/Google Ads data into BigQuery (see the Beam sketch after this list)
Pub/Sub: experience with event-driven architecture and message queuing
Hands-on experience with visualization and BI: Looker Core, with advanced proficiency in LookML (derived tables, explores, Liquid syntax)
Semantic modeling: develop robust LookML models
Dashboard creation: design intuitive dashboards in Looker for operational teams and executives
Ability to independently own workstreams while collaborating closely with data science, analytics, and engineering peers in agile delivery
Advanced English skills
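To make the Dataflow expectation above concrete, here is a minimal Apache Beam sketch of a streaming Pub/Sub-to-BigQuery pipeline, a toy rather than production code. The subscription and table names are placeholders, and the target table is assumed to already exist:

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Placeholder resource names, for illustration only.
SUBSCRIPTION = "projects/example-project/subscriptions/ga4-events-sub"
TABLE = "example-project:marketing.campaign_events"

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(subscription=SUBSCRIPTION)
        | "ParseJson" >> beam.Map(json.loads)  # each message arrives as bytes
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            TABLE,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```

Run it locally with the DirectRunner for testing, or pass --runner=DataflowRunner (plus project, region, and temp-location options) to execute the same code on Dataflow.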
Nice to have:
Google Cloud Professional Data Engineer certification
Google Professional Machine Learning Engineer certification
Google Cloud Professional Cloud Architect certification
Bachelor’s or Master’s degree in a quantitative or technical field (e.g., Computer Science, Engineering, Statistics)
Working knowledge of cloud architecture components in GCP
Proficiency in Big Data environments and tools such as Spark, Hive, Impala, Pig, etc.
Proficiency in Terraform
Familiarity with front-end and back-end web application stacks and frameworks, and with API design and usage (REST/GraphQL)
Experience leading and managing technical data/analytics/machine learning projects
Experience supporting data products consumed by conversational analytics surfaces
What we offer:
Growth opportunities
A values-driven culture
International careers
The chance to shape the future of experience
An environment designed for continuous learning, meaningful impact, and professional growth
A workplace culture that fosters creativity, diversity and autonomy
A borderless, global framework that enables seamless collaboration