As the Technical Lead, you will drive the high-stakes migration of legacy SAS analytics to a modern, cloud-native PySpark ecosystem on AWS. This isn't just a lift-and-shift: you will refactor complex procedural logic into scalable, production-ready distributed pipelines for a Tier-1 financial services environment.
Job Responsibilities:
Engineering Leadership: Design and develop complex ETL/ELT pipelines and Data Marts using PySpark, EMR, and Glue
Legacy Modernisation: Architect the conversion of SAS Base/Macros into modular, testable Python code using SAS2PY and manual refactoring
Performance Tuning: Optimise Spark execution (partitioning, shuffling, caching) to ensure cost-efficient processing of massive financial datasets
Quality & Governance: Implement rigorous CI/CD, unit testing, and data reconciliation frameworks to ensure "penny-perfect" accuracy
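To illustrate what "penny-perfect" reconciliation means in this context, here is a minimal, illustrative sketch in plain Python (in practice this would run as a PySpark job over the full datasets). All names (`reconcile`, `account_id`, `balance`) are hypothetical; the point is comparing legacy and migrated outputs with exact `Decimal` arithmetic rather than floats.

```python
from decimal import Decimal

def reconcile(legacy_rows, migrated_rows, key="account_id", amount="balance"):
    """Compare per-key totals between a legacy (SAS) extract and a migrated
    (PySpark) extract; return keys whose totals differ by any amount.
    Decimal keeps the comparison exact to the penny (no float rounding)."""
    def totals(rows):
        out = {}
        for r in rows:
            out[r[key]] = out.get(r[key], Decimal("0")) + Decimal(str(r[amount]))
        return out
    legacy, migrated = totals(legacy_rows), totals(migrated_rows)
    return {
        k: (legacy.get(k, Decimal("0")), migrated.get(k, Decimal("0")))
        for k in legacy.keys() | migrated.keys()
        if legacy.get(k, Decimal("0")) != migrated.get(k, Decimal("0"))
    }

# Hypothetical sample data: one account drifts by a single penny.
legacy = [{"account_id": "A1", "balance": "100.10"},
          {"account_id": "A2", "balance": "50.00"}]
migrated = [{"account_id": "A1", "balance": "100.10"},
            {"account_id": "A2", "balance": "50.01"}]
diffs = reconcile(legacy, migrated)
```

A real pipeline would run this kind of check as a post-migration gate in CI/CD, failing the build if `diffs` is non-empty.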
Requirements:
Expert in PySpark
Expert in Python with Clean Code/SOLID principles
Proficiency in reading/debugging SAS (Base, Macros, DI Studio)
Experience with AWS: EMR, Glue, S3, Athena, IAM, Lambda
Experience with Data Modeling: SCD Type 2, Fact/Dimension tables, Data Vault/Star Schema
Experience with DevOps: Git-based workflows, Jenkins/GitLab CI, Terraform
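For candidates unfamiliar with the SCD Type 2 pattern listed above, here is a minimal sketch of the core logic in plain Python: when a tracked attribute changes, the current dimension row is closed out and a new row is opened, preserving full history. Names (`apply_scd2`, `cust_id`, `segment`) and the high-date sentinel are illustrative assumptions; a production version would typically be a PySpark or Delta Lake merge.

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # conventional "open-ended" sentinel

def apply_scd2(history, incoming, key, attrs, as_of):
    """Apply one incoming snapshot to an SCD Type 2 dimension history.
    Each row carries the business key, tracked attrs, valid_from,
    valid_to, and an is_current flag."""
    current = next((r for r in history
                    if r[key] == incoming[key] and r["is_current"]), None)
    if current and all(current[a] == incoming[a] for a in attrs):
        return history  # nothing changed: keep history as-is
    if current:
        current["valid_to"] = as_of      # close the old version
        current["is_current"] = False
    new_row = {key: incoming[key], **{a: incoming[a] for a in attrs},
               "valid_from": as_of, "valid_to": HIGH_DATE, "is_current": True}
    return history + [new_row]

# Hypothetical example: a customer moves from "retail" to "private" banking.
hist = [{"cust_id": 1, "segment": "retail",
         "valid_from": date(2020, 1, 1), "valid_to": HIGH_DATE,
         "is_current": True}]
hist = apply_scd2(hist, {"cust_id": 1, "segment": "private"},
                  "cust_id", ["segment"], date(2024, 6, 1))
```

After the call, `hist` holds two rows: the retired "retail" row with its validity window closed, and a current "private" row open-ended at the high date.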