We are recruiting a Data Engineer on a long-term contract basis for a Challenger Bank. The role is responsible for designing, building and maintaining a scalable on-premise data warehouse using modern data engineering practices. It sits in a non-cloud environment and requires strong ownership of infrastructure-focused data solutions, delivering robust end-to-end ETL/ELT pipelines and curated data models to support analytics and reporting.

Strong hands-on experience with Python, Apache Airflow and dbt is essential, as these tools are central to building, orchestrating and transforming the bank's data pipelines. You will use Python and SQL to develop ETL/ELT processes, Apache Airflow to design and manage workflow orchestration and scheduling, and dbt to build scalable transformation layers, data models and testing frameworks (including medallion architecture where applicable). Strong experience in Unix/Linux environments is also key for scripting, deployment and operational support.

In addition, you will work with advanced SQL (T-SQL/PL-SQL) and the Microsoft BI stack (SSIS, SSRS, SSAS), supporting data warehousing, reporting and analytics capabilities. The role also involves CI/CD practices, test automation, and exposure to containerisation tools such as Docker. You will collaborate with cross-functional teams to translate business requirements into technical solutions and deliver reliable data products and visualisations using tools such as Power BI, Tableau or Qlik.
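To give a flavour of the ETL/ELT work described above, here is a minimal Python sketch of an extract-transform-load step into a warehouse table. This is illustrative only and not part of the role specification; the table and column names (`transactions`, `amount_gbp`) are hypothetical, and a real pipeline would be orchestrated by Airflow and transformed with dbt rather than hand-rolled like this.

```python
import csv
import io
import sqlite3

# Hypothetical raw feed; in practice this would come from a source system.
RAW_CSV = """account_id,amount,currency
A1,100.50,GBP
A2,200.00,GBP
"""

def extract(text):
    """Extract: parse the raw CSV feed into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: cast amounts to float and keep only GBP rows."""
    return [
        {"account_id": r["account_id"], "amount_gbp": float(r["amount"])}
        for r in rows
        if r["currency"] == "GBP"
    ]

def load(rows, conn):
    """Load: write curated rows into a warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS transactions (account_id TEXT, amount_gbp REAL)"
    )
    conn.executemany(
        "INSERT INTO transactions VALUES (:account_id, :amount_gbp)", rows
    )
    conn.commit()

# Run the pipeline end to end against an in-memory database.
conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount_gbp) FROM transactions").fetchone()[0]
```

In the role itself, each of these stages would typically be a task in an Airflow DAG, with the transform layer expressed as dbt models and tests rather than inline Python.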
Job Responsibilities:
Designing, building and maintaining a scalable on-premise data warehouse using modern data engineering practices
Building end-to-end ETL/ELT pipelines and curated data models
Collaborating with cross-functional teams to translate business requirements into technical solutions
Delivering reliable data products and visualisations using tools such as Power BI, Tableau or Qlik
Requirements:
Strong hands-on experience with Python, Apache Airflow and dbt
Experience with advanced SQL (T-SQL/PL-SQL) and the Microsoft BI stack (SSIS, SSRS, SSAS)
Experience with CI/CD practices, test automation, and exposure to containerisation tools such as Docker
Strong experience working in Unix/Linux environments