The Senior Data Engineer role focuses on designing and maintaining data pipelines for payments and banking systems. Candidates should have at least 5 years of experience in data engineering, strong skills in SQL and Python, and familiarity with cloud platforms like Azure. A bachelor's degree in a related field is preferred. This position requires collaboration in an Agile environment and a commitment to data integrity and compliance.
Job Responsibilities:
Design, develop, and maintain scalable batch and real-time data pipelines supporting payment processing, card transactions, billing, and banking systems
Build and automate ETL/ELT workflows in Python and SQL for ingestion, transformation, validation, aggregation, and loading of high-volume financial data
Develop and support Kafka integrations, including topics, producers, consumers, and streaming applications for near real-time transaction processing (a minimal consumer sketch follows this list)
Develop and optimize cloud-native data solutions in Azure, leveraging services such as Azure Data Factory (ADF), Azure Synapse, Azure Data Lake, and Event Hubs
Build and manage enterprise data platforms using Snowflake, including data modeling, performance tuning, clustering, and secure data sharing
Translate payment and banking data requirements into functional specifications, mapping documents, and technical designs
Ensure data accuracy, integrity, security, and compliance (including PCI-DSS and financial regulatory requirements)
Optimize SQL queries and tune RDBMS performance for high-throughput transactional environments
Design and maintain enterprise microservices (security, logging, APIs) using Java/Spring Boot where applicable
Monitor data pipeline performance and implement optimizations and corrective actions
Collaborate with product owners, architects, QA, and compliance teams in an Agile environment
Troubleshoot and resolve complex production issues across data platforms and transaction systems
Work on-site 3 days per week to support collaboration, architecture discussions, and Agile ceremonies
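For illustration only: the Kafka responsibility above typically amounts to consumer loops that validate incoming events and land them downstream. The sketch below assumes the kafka-python client; the topic, consumer group, broker address, and field names are all hypothetical, not taken from this posting.

    # Minimal near real-time transaction consumer (kafka-python assumed).
    # Topic, group id, and field names are hypothetical.
    import json
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "card-transactions",                 # hypothetical topic
        bootstrap_servers="localhost:9092",
        group_id="txn-enrichment",           # hypothetical consumer group
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
        enable_auto_commit=False,            # commit only after a successful write
    )

    for message in consumer:
        txn = message.value
        # Basic validation before the record reaches downstream aggregation;
        # a real pipeline would route rejects to a dead-letter topic.
        if txn.get("amount") is None or txn.get("currency") is None:
            continue
        # write txn to the staging store here, then commit the offset
        consumer.commit()

Committing offsets manually, as sketched, is one common way to get at-least-once delivery into a warehouse; the exact semantics would depend on the target systems.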
Requirements:
5+ years of experience in data engineering or data warehouse environments
5+ years of experience designing, building, and maintaining batch and real-time data pipelines
5+ years of experience in the payments, banking, or financial services industry, supporting transaction-based systems
5+ years of expertise in SQL, PL/SQL, and query optimization (an illustrative tuning sketch follows this list)
5+ years of experience with relational databases such as Oracle, DB2, Teradata, or SQL Server
5+ years of experience with Kafka or comparable streaming platforms
5+ years of programming experience in Java and/or Python
5+ years of experience developing API integrations with cloud or enterprise systems
5+ years of experience with Unix shell scripting
5+ years of experience working in Agile environments
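As a deliberately simplified illustration of the query-optimization requirement: a frequent tuning pattern is rewriting a non-sargable filter into a range predicate so the optimizer can use an index. The sketch below uses sqlite3 as a stand-in for the production RDBMS; the table and column names are hypothetical.

    # Sargable vs. non-sargable filters; sqlite3 stands in for Oracle/DB2/etc.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE txn (id INTEGER PRIMARY KEY, txn_ts TEXT, amount REAL)")
    conn.execute("CREATE INDEX idx_txn_ts ON txn (txn_ts)")

    # Non-sargable: wrapping the indexed column in a function defeats the index.
    slow = "SELECT SUM(amount) FROM txn WHERE date(txn_ts) = '2024-01-01'"

    # Sargable rewrite: a half-open range on the raw column can use idx_txn_ts.
    fast = ("SELECT SUM(amount) FROM txn "
            "WHERE txn_ts >= '2024-01-01' AND txn_ts < '2024-01-02'")

    for label, sql in (("slow", slow), ("fast", fast)):
        print(label, conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall())

On this toy schema the first plan reports a full table scan and the second a search on idx_txn_ts; the same principle carries over to the Oracle, DB2, Teradata, and SQL Server optimizers.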
Nice to have:
Bachelor’s degree in Computer Science, Engineering, or related technical field
Hands-on experience with any ETL/ELT tool
Understanding of data modeling concepts and ETL frameworks (a minimal star-schema sketch follows)
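To make the data modeling item concrete: in a payments warehouse, "data modeling concepts" usually means dimensional designs such as a star schema. The sketch below is a minimal, hypothetical example (all table and column names invented), again using sqlite3 in place of a warehouse platform like Snowflake.

    # Minimal star-schema sketch for payment analytics; names hypothetical.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE dim_merchant (
        merchant_key  INTEGER PRIMARY KEY,
        merchant_name TEXT,
        category      TEXT
    );
    CREATE TABLE fact_payment (
        payment_id   INTEGER PRIMARY KEY,
        merchant_key INTEGER REFERENCES dim_merchant (merchant_key),
        txn_date     TEXT,
        amount       REAL,
        currency     TEXT
    );
    """)
    conn.execute("INSERT INTO dim_merchant VALUES (1, 'Acme Cafe', 'food')")
    conn.execute("INSERT INTO fact_payment VALUES (10, 1, '2024-01-01', 12.50, 'USD')")

    # A typical ELT-style aggregate: daily volume per merchant category.
    print(conn.execute("""
        SELECT m.category, f.txn_date, SUM(f.amount) AS total_amount
        FROM fact_payment f
        JOIN dim_merchant m ON m.merchant_key = f.merchant_key
        GROUP BY m.category, f.txn_date
    """).fetchall())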