We are seeking an experienced Python/OOP Developer capable of designing, building, and delivering scalable enterprise solutions in the banking and financial domain. The ideal candidate is a hands-on developer with strong expertise in Python, object-oriented design, data processing, and distributed computing. The role spans end-to-end development, performance optimization, workflow orchestration, and collaboration across business and technology teams.
Job Responsibilities:
Design and implement modular, reusable Python components for index construction, rebalancing, and backtesting
Run large‑scale historical simulations using Pandas, NumPy, and PySpark
Integrate compute engines with Airflow/Temporal using configuration-driven workflows
Query and consume reference data (pricing, security master, corporate actions) from Snowflake
Build automated test harnesses to validate outputs and ensure reproducibility
Optimize performance using vectorization, caching, and distributed computing patterns
Ensure data and calculation reconciliation against benchmarks
Collaborate with Business, Index Operations, and Platform teams to move research into production
Follow best practices in code quality, architecture, testing, and deployment
Contribute to design discussions and technical decision-making
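To give a concrete flavor of the rebalancing, backtesting, and vectorization work described above, here is a minimal hypothetical sketch. The momentum-style weighting scheme, column names, and lookback are illustrative assumptions only, not this role's actual index methodology:

```python
import numpy as np
import pandas as pd

def rebalance_weights(prices: pd.DataFrame, lookback: int = 1) -> pd.DataFrame:
    """Toy long-only rebalance: weight each security by its positive
    trailing return, row-normalized so weights sum to 1 on each date.
    Fully vectorized -- no Python-level loops over dates or securities."""
    trailing = prices.pct_change(lookback)       # vectorized trailing returns
    scores = trailing.clip(lower=0)              # long-only: drop negative scores
    weights = scores.div(scores.sum(axis=1), axis=0)
    return weights.fillna(0.0)                   # dates with no signal get zero weight

# Tiny backtest: portfolio return = sum over securities of weight * next-period return
prices = pd.DataFrame({"AAA": [100, 102, 105, 103, 108],
                       "BBB": [50, 49, 51, 52, 54]})
w = rebalance_weights(prices)
fwd = prices.pct_change().shift(-1)              # next-period simple returns
portfolio = (w * fwd).sum(axis=1)
```

Keeping the weighting rule a pure function of a price frame is what makes such a component reusable across index construction, rebalancing, and historical simulation.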
Requirements:
Strong proficiency in OOP, clean architecture, and maintainable application design
Deep experience in numerical computing and time-series analysis
Working knowledge of distributed data processing
Understanding of portfolio mathematics, weighting algorithms, and time-series transformations
Experience building rules-based or metadata-driven frameworks
Strong SQL skills and ability to consume structured data from Snowflake
Expertise in unit testing, regression testing, and deterministic replay
Familiarity with Airflow, Temporal, or similar orchestration frameworks
Good understanding of S3, Lambda, IAM, and integration with Snowflake Data Cloud
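The deterministic-replay requirement above can be illustrated with a small sketch: route all randomness through a single seeded generator and pin a digest of the output, so a rerun is verifiably bit-identical. The simulated return series and sha256 fingerprint are hypothetical stand-ins for real pipeline outputs:

```python
import hashlib
import numpy as np
import pandas as pd

def run_simulation(seed: int) -> pd.Series:
    """Hypothetical compute step: every random draw flows through one
    seeded generator, so the same seed reproduces the result exactly."""
    rng = np.random.default_rng(seed)
    daily = rng.normal(0.0005, 0.01, size=252)   # one year of synthetic returns
    return pd.Series(daily).add(1).cumprod()     # cumulative growth path

def fingerprint(series: pd.Series) -> str:
    """Stable digest of the raw output bytes, suitable for pinning
    in a regression test."""
    return hashlib.sha256(series.to_numpy().tobytes()).hexdigest()

# Deterministic replay: identical seeds give identical fingerprints,
# different seeds do not.
assert fingerprint(run_simulation(42)) == fingerprint(run_simulation(42))
assert fingerprint(run_simulation(42)) != fingerprint(run_simulation(7))
```

In a regression suite the fingerprint would be stored alongside the test, turning any unintended change to the calculation into an immediate failure.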
Nice to have:
Experience with Docker, containerization, or Kubernetes
Knowledge of CI/CD pipelines for Python applications
Familiarity with event-driven architectures (Kafka, EventBridge)
Experience working in BFSI / financial index domain
Understanding of compute optimization using multiprocessing or Ray
Exposure to data governance, lineage, and metadata management tools