We are looking for a skilled Data Engineer to join our growing team. You will play a pivotal role in designing, building, and maintaining our data infrastructure and pipelines. Working closely with data analysts, data scientists, and various business stakeholders, you’ll ensure data is reliable, efficient, and accessible in a scalable manner.
Job Responsibilities:
Data Pipeline Orchestration: Design, build, and maintain end-to-end data pipelines using Airflow (including managed services like Amazon MWAA) to orchestrate, schedule, and monitor batch/streaming workflows
Implement DAGs (Directed Acyclic Graphs) with retry logic, error handling, and alerting to ensure data quality and pipeline reliability
Data Ingestion & Transformation: Integrate data from various sources using Airbyte for ingestion and dbt for transformations in a scalable and modular fashion
Collaborate with Data Analysts and Data Scientists to implement transformations and business logic, ensuring data is analytics-ready
Data Modeling & Warehousing: Design and implement efficient data models for both structured and semi-structured data in AWS S3 (data lake) and Snowflake (data warehouse)
Ensure data schemas and transformations support advanced analytics, BI reporting, and machine learning use cases
Data Governance & Security: Utilize AWS Lake Formation APIs and best practices to maintain data security, access controls, and compliance
Work closely with IT security to establish robust encryption standards, audit trails, and identity/role-based access
Performance Optimization: Optimize AWS Athena queries and configurations (e.g., data partitioning) for performance and cost efficiency
Monitor and tune Airflow DAGs, Snowflake queries, and data transformations to improve throughput and reliability
Collaboration & Stakeholder Management: Partner with cross-functional teams, including DevOps, Platform Engineering, and Data Science, to ensure seamless integration of data workflows and systems
Communicate technical solutions effectively to non-technical stakeholders and leadership, translating requirements into actionable tasks
Continuous Improvement: Participate in architecture reviews, code reviews, and troubleshooting sessions to ensure quality and alignment with best practices
Remain current with emerging trends in data engineering, orchestration tools (Airflow, MWAA), and cloud services (AWS, Snowflake)
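The retry, error-handling, and alerting duties listed above map directly onto Airflow's `retries`, `retry_delay`, and `on_failure_callback` task arguments. As a rough illustration of the underlying pattern, here is a minimal plain-Python sketch (every name in it is hypothetical, not part of Airflow or any real pipeline):

```python
import time

def run_with_retries(task, retries=3, delay_seconds=0, on_failure=None):
    """Run a pipeline task, retrying on failure and alerting once retries are exhausted.

    Mirrors, in miniature, the behavior Airflow provides through the
    `retries`, `retry_delay`, and `on_failure_callback` task parameters.
    """
    last_error = None
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception as exc:
            last_error = exc
            if attempt < retries:
                time.sleep(delay_seconds)  # analogous to Airflow's retry_delay
    if on_failure is not None:
        on_failure(last_error)  # e.g. notify an on-call channel before failing
    raise last_error

# Hypothetical usage: an extract step that fails once, then succeeds on retry.
calls = {"n": 0}

def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient source error")
    return ["row1", "row2"]

rows = run_with_retries(flaky_extract, retries=3)
```

In Airflow itself this logic lives in the scheduler and worker, so a DAG author only declares the policy on each task rather than implementing the loop.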
Requirements:
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
3+ years of experience as a Data Engineer or in a similar role working with cloud-based data platforms
Technical Skills: Cloud & Orchestration: Airflow (self-managed or managed services like Amazon MWAA) for workflow orchestration, DAG development, and scheduling
Familiarity with best practices for Airflow DAG structure, dependency management, and error handling
AWS Expertise: Hands-on experience with AWS Lake Formation, S3, Athena, and related services (e.g., Lambda, Glue, IAM)
Snowflake: Proficient in setting up data warehouses, configuring security, and optimizing queries on Snowflake
Data Ingestion & Transformation: Experience with Airbyte or similar tools for data ingestion
dbt or other SQL-based transformation frameworks for modular data processing
Programming: Proficiency in Python and/or Java/Scala for building data pipelines and custom integrations
Query Languages: Advanced knowledge of SQL for data manipulation and analysis
Soft Skills: Strong problem-solving and analytical abilities
Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams
Ability to operate in a fast-paced, agile environment and manage multiple priorities simultaneously
Nice to have:
AWS certifications (e.g., AWS Certified Data Analytics – Specialty, AWS Certified Solutions Architect) are a plus
Experience with CI/CD pipelines, containerization (Docker, Kubernetes), and infrastructure as code (Terraform, CloudFormation) is beneficial
Familiarity with DevOps best practices for managing Airflow environments (e.g., version control for DAGs, automated testing, monitoring)
Familiarity with financial services data, especially private equity and alternative investments, is not required but would be highly valuable in this role
What we offer:
Support for professional accreditations
Flexible arrangements, generous holidays, plus an additional day off for your birthday
Continuous mentoring along your career progression
Active sports, events and social committees across our offices
24/7 support available from our Employee Assistance Program
The opportunity to invest in our growth and success through our Employee Share Plan
Plus additional local benefits depending on your location