We are seeking a Professional Data Engineer to join our dynamic team, where you will play a crucial role in developing and maintaining robust data solutions. You will collaborate with data science, business analytics, and product development teams to deploy cutting-edge techniques and utilise best-in-class third-party products. The Data team operates with engineering precision, prioritising security, privacy, and regulatory compliance in every initiative. You will contribute to the team's commitment to using the latest tools and methodologies, ensuring that our data solutions align with industry best practices.
Job Responsibilities:
Develop and maintain ETL pipelines using SQL and/or Python
Use tools like Dagster/Airflow for pipeline orchestration
Collaborate with cross-functional teams to understand and deliver data requirements
Ensure a consistent flow of high-quality data using stream, batch, and CDC processes
Use data transformation tools like DBT to prepare datasets that enable business users to self-serve
Ensure data quality and consistency in all data stores
Monitor and troubleshoot data pipelines for performance and reliability
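To give candidates a concrete feel for the responsibilities above, here is a minimal sketch of an ETL step combining Python and SQL. The table names and schema are purely hypothetical illustrations, not part of this role's actual stack, and a production pipeline would run under an orchestrator such as Dagster or Airflow rather than as a standalone script:

```python
import sqlite3

def run_etl(raw_rows):
    """Minimal ETL sketch: extract raw rows into staging,
    transform with SQL, load a clean table. Schema is hypothetical."""
    conn = sqlite3.connect(":memory:")
    # Extract: land raw data in a staging table as-is
    conn.execute("CREATE TABLE staging_orders (id INTEGER, amount REAL, status TEXT)")
    conn.executemany("INSERT INTO staging_orders VALUES (?, ?, ?)", raw_rows)
    # Transform + Load: keep only completed orders, deduplicated by id
    conn.execute("""
        CREATE TABLE orders AS
        SELECT id, MAX(amount) AS amount
        FROM staging_orders
        WHERE status = 'completed'
        GROUP BY id
    """)
    return conn.execute("SELECT id, amount FROM orders ORDER BY id").fetchall()

# Duplicate completed order is collapsed; cancelled order is filtered out
rows = run_etl([(1, 9.99, "completed"), (1, 9.99, "completed"), (2, 5.00, "cancelled")])
# → [(1, 9.99)]
```

In a tool like DBT, the transform step above would live as a versioned SQL model rather than inline Python, which is what enables the self-service datasets mentioned in the responsibilities.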
Requirements:
3+ years of experience as a data engineer
Proficiency in SQL is a must
Experience with modern cloud data warehouse and data lake solutions such as Snowflake, BigQuery, Redshift, or Azure Synapse
Experience building ETL/ELT, batch, and streaming data processing pipelines
Excellent ability to investigate and troubleshoot data issues, providing fixes and proposing both short and long-term solutions
Knowledge of AWS services (like S3, DMS, Glue, Athena, etc.)
Familiar with DBT or other data transformation tools
Familiarity with GenAI and how to leverage LLMs to solve engineering challenges
Nice to have:
Experience with AWS services and concepts (like EC2, ECS, EKS, VPC, IAM, etc.)
Familiar with Terraform and Terragrunt
Experience with Python
Experience with orchestration tools like Dagster, Airflow, AWS Step Functions, etc.
Experience with pub-sub, queuing, and streaming frameworks such as AWS Kinesis, Kafka, SQS, SNS