As a Senior Data Engineer within the Data Engineering team, you will play a key role in building, enhancing, and maintaining our enterprise data platform on Snowflake. You will develop and optimise scalable data pipelines and models that bring data from core business systems into Snowflake, enabling analytics, reporting, and data-driven insights across the organisation. You will translate the data platform strategy into high-quality technical solutions, ensuring our Snowflake environment is reliable, well-structured, and performant. You will champion engineering best practices and contribute to standards that improve the quality, consistency, and usability of data assets. Your work will ensure the business has access to trusted, timely, and well-modelled data to support decision-making, operational reporting, and the foundations for advanced analytics and future AI/ML capabilities.
Job Responsibilities:
Design, build, and maintain high-quality data pipelines and models in Snowflake to support business analytics, BI, and operational reporting needs
Translate the defined data architecture and standards into implemented solutions, including ingestion, transformation, storage, and performance optimisation
Develop robust ELT/ETL pipelines using dbt and workflow/orchestration tools (e.g., Argo Workflows), ensuring reliability, maintainability, and adherence to engineering best practices
Implement Snowflake warehouse configurations and query optimisation techniques to ensure efficient usage and predictable cost
Apply data quality checks, lineage tracking, and security standards across the data estate, ensuring compliance with data policies, InfoSec controls, and regulatory requirements
Leverage Snowflake capabilities (Tasks, Streams, Snowpark, Time Travel, Secure Data Sharing) to improve automation, reduce manual effort, and enhance data accessibility across the business
Work closely with analysts, data consumers, and business stakeholders to support data product delivery, troubleshoot data issues, and enable effective usage of Snowflake datasets
Implement dimensional models that provide clean, well-structured, reusable datasets for reporting, scenario modelling, and emerging ML/AI use cases
Implement and maintain monitoring, alerting, logging, and cost-management processes for Snowflake and data pipelines to ensure a stable and well-maintained platform
Contribute to shared engineering standards to simplify development and accelerate delivery across the team
Requirements:
Proven experience in delivering cloud-based data engineering solutions, ideally with Snowflake
Strong hands-on proficiency with SQL, Python, and dbt for data transformations, modelling, and pipeline automation
Practical experience with Snowflake and RBAC management
Experience with data ingestion and replication tools such as Airbyte, Fivetran, Hevo, or similar
Working knowledge of cloud services (AWS preferred)
Strong understanding of data modelling and data governance principles
Experience supporting BI/reporting tools (Power BI) and enabling them through well-designed Snowflake data models
Solid knowledge of CI/CD and version-controlled development practices in Git
Nice to have:
Exposure to CRM (Salesforce), BSS/OSS (Netadmin), Call Centre, Telephony, or similar enterprise data sources
Participation in migrating data platforms (e.g., PostgreSQL or other cloud RDBMS) into a data warehouse like Snowflake with minimal disruption and strong data validation controls
Experience supporting business teams during platform transitions (e.g., training, documentation, user onboarding, issue resolution)
Experience contributing to naming conventions, schema standards, environment management, testing frameworks, and security patterns for data platforms
Interest in staying up to date with the latest technologies, modern data stack tooling, and best practices to contribute to ongoing platform evolution