We’re looking for an Analytics Engineer to help build and maintain the data models that power analytics and data science across the business. You’ll focus on developing robust and scalable dbt pipelines and contributing to the evolution of our data platform, ensuring that data is accessible, trusted, and well-structured. This role is hands-on and ideal for someone with a strong technical foundation who enjoys solving data problems, writing clean and efficient SQL, and collaborating with analysts, business stakeholders and product teams.
Job Responsibilities:
Build and maintain dbt models to transform raw data into clean, documented, and accessible data sets (a brief illustrative sketch follows this list)
Translate business and analytics requirements into scalable data models
Design and implement data warehouse schemas using dimensional modelling techniques (fact and dimension tables, slowly changing dimensions, etc.)
Participate in design and code reviews to improve model design and query performance
Expose these models and associated metrics via our Semantic Layer
Implement and maintain dbt tests to ensure data quality and model accuracy
Document data models clearly to support cross-functional use
Use GitHub and CI/CD pipelines to manage code and deploy changes safely and efficiently
Optimise dbt models and SQL queries for performance and maintainability
Work with Snowflake, developing on top of a data lake architecture
Ensure dbt models are well-integrated with data catalogs and accessible for downstream use
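For illustration, the snippet below is a minimal sketch of the kind of dbt model this role would build: a staging model that turns a raw table into a typed, analysis-ready dataset. All table, column, and source names are assumptions made for the example, not details of our actual platform; the matching tests and column descriptions would live alongside it in a schema.yml file.

-- models/staging/stg_orders.sql (hypothetical example)
-- Cleans a raw orders table into a typed, analysis-ready dataset.

with source as (

    -- 'raw.orders' is an assumed source defined in sources.yml
    select * from {{ source('raw', 'orders') }}

),

renamed as (

    select
        id                      as order_id,      -- primary key; unique + not_null tests in schema.yml
        customer_id,                               -- join key to a customers staging model
        cast(order_ts as date)  as order_date,
        lower(status)           as order_status,
        amount_cents / 100.0    as order_amount
    from source

)

select * from renamed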
Requirements:
2+ years of experience writing and optimising complex SQL (including complex joins, window functions, and query optimisation techniques)
Strong understanding of data modelling and warehouse design (e.g., Kimball-style dimensional modelling)
Experience using dbt in production environments, including testing and documentation
Familiar with version control (GitHub)
Experience tuning dbt models and SQL queries for performance
Able to independently transform business logic into technical implementation
Comfortable participating in and contributing to code reviews
Nice to have:
Experience with Snowflake
Experience with Semantic Layers (e.g., Looker, Cube)
Experience with CI/CD for data workflows
Familiarity with Python/Airflow for data transformation or orchestration tasks
Experience with data visualisation tools (e.g., Tableau, Looker)
Working knowledge of infrastructure-as-code tools like Terraform
What we offer:
27 days holiday
5 additional days off: 1 life event day, 2 volunteer days, 2 company-wide wellbeing days (M-Powered Weekend)
8 bank holidays per year
Private medical insurance with Bupa
A medical cashback scheme
Life insurance
Gym membership & wellness resources through Wellhub
Access to Spill - all-in-one mental health support
Hybrid work offering - for most roles we collaborate in the office three days per week
Work-from-anywhere scheme - you'll have the opportunity to work from anywhere, up to 10 days per year
Space to connect: Beyond the desk, we make time for weekly catch-ups, seasonal celebrations, and have a kitchen that’s always stocked!