The Senior BigQuery Engineer will design and maintain scalable data warehouse solutions in BigQuery, optimize performance, and build data processing pipelines. You will join the OneMIS stream, responsible for management, regulatory & risk reporting, and advanced analytics. Our mission includes enhancing data quality via KPIs and migrating data platforms to modern, cloud-native ecosystems. We operate in an agile environment, committed to responsible data practices.
Job Responsibilities:
Design, develop, and maintain scalable data warehouse solutions in BigQuery, including dataset architecture, schema design, and data modeling (star/snowflake)
Design and implement a cloud-native data warehouse to replace on-premises infrastructure
Optimize performance through partitioning, clustering, query tuning, and workload management
Build data processing pipelines and workflows feeding BigQuery from multiple sources
Implement data quality checks, validation frameworks, and reconciliation processes
Work closely with product owners and analysts to ensure solutions align with the business requirements specification
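The data quality and reconciliation work described above typically boils down to comparing a source extract against what landed in the warehouse. A minimal sketch of that idea, in plain Python (function and field names here are hypothetical; in practice equivalent checks would usually run as SQL against BigQuery itself):

```python
from hashlib import md5

def reconcile(source_rows, target_rows, key="id"):
    """Compare two row sets by count and per-row checksum.

    Illustrative only: real pipelines would push these checks down
    to the warehouse rather than pull rows into local Python.
    """
    report = {"source_count": len(source_rows), "target_count": len(target_rows)}

    def checksums(rows):
        # Checksum each row on its sorted items so field order is irrelevant.
        return {
            r[key]: md5(repr(sorted(r.items())).encode()).hexdigest()
            for r in rows
        }

    src, tgt = checksums(source_rows), checksums(target_rows)
    report["missing_in_target"] = sorted(src.keys() - tgt.keys())
    report["unexpected_in_target"] = sorted(tgt.keys() - src.keys())
    report["mismatched"] = sorted(
        k for k in src.keys() & tgt.keys() if src[k] != tgt[k]
    )
    report["reconciled"] = (
        report["source_count"] == report["target_count"]
        and not report["missing_in_target"]
        and not report["unexpected_in_target"]
        and not report["mismatched"]
    )
    return report

source = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 20.0}]
target = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 21.0}]
result = reconcile(source, target)
```

Running this flags row 2 as mismatched and marks the load as not reconciled; a production framework would add tolerances, sampling, and alerting on top of the same core comparison.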
Requirements:
University degree in computer science or a comparable qualification
At least 5 years of data engineering experience
Strong SQL knowledge, preferably on Google Cloud Platform (GCP)
Experience orchestrating workflows with Cloud Composer (Apache Airflow), including DAG development, scheduling, monitoring, and dependency management
Hands-on experience building data processing pipelines using Dataflow (batch and streaming)
Good understanding of data warehousing concepts, data flows, and data feeds
Experience using Bitbucket for Git-based source control and collaboration
Ability to work collaboratively in a dynamic environment
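The Cloud Composer requirement above centers on dependency management: declaring which tasks must finish before others start. The core idea can be sketched with Python's standard-library `graphlib` (task names are made up for illustration; a real Composer pipeline would define these as Airflow operators inside a DAG object):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# mirroring how an Airflow DAG declares task ordering.
dag = {
    "extract_trades": [],
    "extract_reference_data": [],
    "load_staging": ["extract_trades", "extract_reference_data"],
    "data_quality_checks": ["load_staging"],
    "publish_reporting_tables": ["data_quality_checks"],
}

# A valid execution order: every task appears after all of its dependencies.
order = list(TopologicalSorter(dag).static_order())
```

Airflow's scheduler does the same resolution continuously, plus retries, backfills, and parallel execution of independent branches (here, the two extracts could run concurrently).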
Nice to have:
Hands-on experience with Python/PySpark for scalable distributed data processing
Experience with Infrastructure as Code (Terraform, Ansible, Chef)
Knowledge of shell scripting
Experience in financial services or regulated environments
What we offer:
Smooth integration and a supportive mentor
Flexible working style (Remote, Hybrid or Office)
Flexible working hours
Sponsored certifications, training, and access to top e-learning platforms
Private Health Insurance
Individual coaching sessions or an accredited coaching school