Join our dynamic and innovative team as a Data Engineer III, where you will take ownership of designing, building, and optimizing our data infrastructure and pipelines. As an experienced member of the team, you will work independently on complex data engineering challenges while beginning to provide technical guidance to junior engineers. Working closely with business stakeholders, data scientists, and cross-functional teams, you'll ensure data is reliable, efficient, secure, and accessible at scale. You will contribute to architectural decisions and help drive best practices across the organization.
Job Responsibilities:
Design, build, and maintain complex end-to-end data pipelines using Airflow
Develop and optimize advanced DAGs (Directed Acyclic Graphs) with sophisticated retry logic, error handling, alerting, and monitoring
Contribute to the establishment of best practices and standards for pipeline development, testing, and deployment
Design and implement scalable data integration solutions using Airbyte for ingestion and dbt for transformations
Collaborate with Data Analysts, Data Scientists, and business stakeholders to implement complex transformations and business logic
Design and implement efficient and scalable data models for both structured and semi-structured data in AWS S3 and Snowflake
Utilize AWS Lake Formation APIs and best practices to maintain data security, access controls, and compliance standards
Monitor, analyze, and tune Airflow DAGs, Snowflake queries, AWS Athena configurations, and Databricks jobs to optimize throughput, reliability, and cost-effectiveness
Provide technical guidance and mentorship to junior data engineers
Lead cross-functional partnerships with DevOps, Platform Engineering, Data Science, and business teams
Participate in architectural decisions and provide input on data engineering strategy
Actively participate in and occasionally lead architecture reviews, design reviews, and code reviews
Stay current with emerging trends in data engineering, orchestration tools, and cloud services
Contribute to DevOps and DataOps best practices, including CI/CD for data pipelines, infrastructure as code, and automated testing
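To illustrate the retry, error-handling, and alerting pattern referenced above: in Airflow this is typically configured per task via settings such as `retries`, `retry_delay`, and `on_failure_callback`. The sketch below is a plain-Python analogue of that pattern, not the Airflow API itself; the function name and parameters are illustrative only.

```python
import logging
import time


def run_with_retries(task, retries=3, retry_delay_s=60, on_failure=None):
    """Run a pipeline task, retrying on failure and alerting when retries
    are exhausted.

    A plain-Python analogue of Airflow's per-task `retries`,
    `retry_delay`, and `on_failure_callback` settings.
    """
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception as exc:
            logging.warning("attempt %d/%d failed: %s", attempt, retries, exc)
            if attempt == retries:
                if on_failure is not None:
                    on_failure(exc)  # e.g. notify an on-call channel
                raise
            time.sleep(retry_delay_s)
```

In an actual DAG the same behavior comes from task arguments rather than a wrapper, but the control flow (bounded retries with a backoff delay, then a failure callback before re-raising) is the same.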
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field
4+ years of experience as a Data Engineer or in a similar role working with cloud-based data platforms
Demonstrated track record of delivering complex data engineering projects independently
Advanced proficiency with Airflow (self-managed or managed services like Amazon MWAA) for workflow orchestration, DAG development, and scheduling
Strong hands-on experience with AWS Lake Formation, S3, Athena, and related services
Advanced proficiency in designing and implementing data warehouses in Snowflake
Strong experience with Databricks for large-scale data processing
Strong experience with Airbyte or similar tools for complex data ingestion scenarios
Advanced proficiency with dbt or other SQL-based transformation frameworks
Advanced proficiency in Python and/or Java/Scala
Advanced to expert-level knowledge of SQL
Working experience with CI/CD pipelines, containerization (Docker, Kubernetes), and infrastructure as code (Terraform, CloudFormation)
Strong problem-solving and analytical abilities
Excellent communication and collaboration skills
Ability to work independently and take ownership of projects
Nice to have:
Experience providing technical guidance to other engineers is a plus
Exposure to real-time streaming technologies (e.g., Kafka, Kinesis) is beneficial
AWS certifications (e.g., AWS Certified Data Analytics – Specialty, AWS Certified Solutions Architect) are highly desirable
Snowflake certifications (e.g., SnowPro Core) are a plus
Databricks certifications are beneficial
Working knowledge of DevOps best practices for managing production data environments
Familiarity with financial services data, especially private equity and alternative investments, is not required but would be highly valuable in this role
What we offer:
Support for professional accreditations such as ACCA, including study leave
Flexible arrangements, generous holidays, plus an additional day off for your birthday
Continuous mentoring along your career progression
Active sports, events and social committees across our offices
24/7 support available from our Employee Assistance Program
The opportunity to invest in our growth and success through our Employee Share Plan