As a Data Engineer, you will be a key contributor to the design and development of high-performance data pipelines and platform features. This role sits at the heart of our transformational data initiatives, where you will apply advanced problem-solving skills to build practical, well-structured solutions that underpin the bank's future data capabilities.
Job Responsibilities:
Design and write complex configurations for diverse data feeds, ensuring seamless integration and data consistency across the platform
Build and maintain scalable, reliable data pipelines using Python, SQL, and dbt
Develop and optimize data solutions within AWS, specifically leveraging Amazon Redshift
Work directly with business and operational stakeholders to translate complex requirements into structured technical solutions
Requirements:
Strong, hands-on Python expertise for medium-to-complex applications
Advanced SQL for complex querying and database structuring
Proven experience in AWS environments, specifically working with Amazon Redshift and Linux/Unix systems
Proficiency with dbt for data transformation, along with containerization experience (Docker/Kubernetes)
Experience with pipeline tools such as Airflow or Argo Workflows
Familiarity with APIs and big data technologies (Spark, Hive, or Presto)
A solid understanding of security best practices and version control using Bitbucket
Nice to have:
Previous experience in a banking or highly regulated financial environment