The Data Operations Engineer provides 1st and 2nd level support for operational teams using our data platform. This role focuses on troubleshooting, root cause analysis, and proactive monitoring to ensure the reliability and performance of data services. The ideal candidate will possess strong analytical skills, problem-solving abilities, and a passion for data operations. Responsibilities include monitoring and maintaining systems and pipelines that deliver critical financial data, implementing quality checks, and collaborating with development and infrastructure teams to optimize workflows.
Job Responsibilities:
Serve as the primary point of contact for operational teams, providing 1st and 2nd level support for data platform issues
Monitor and maintain ETL/ELT workflows for uptime, performance, and timely data processing
Implement data validation and quality checks to ensure accuracy and reduce mean time to detect (MTTD) and mean time to resolve (MTTR) for data issues
Troubleshoot and resolve operational issues in data pipelines, conducting root cause analysis and documenting findings to prevent recurrence
Collaborate with upstream teams to ingest data sources, and partner with downstream and lateral teams to normalize data feeds and ensure that pipeline deliveries consistently meet agreed-upon SLAs
Optimize workflows for cost and performance across cloud platforms, while maintaining documentation and runbooks for operational processes
Proactively review and analyze system logs and performance metrics to identify potential issues before they impact operations
Support incident management, create incident playbooks, and perform root cause analysis for data movement failures
Participate in on-call support rotation to address critical incidents
Assist in the deployment and configuration of new data services and tools, ensuring they meet operational requirements
Identify opportunities for process improvements and automation to enhance operational efficiency
Stay current with industry trends and best practices in data operations and support
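The data validation and quality checks described above can take many forms; as a minimal, illustrative sketch (all field names and thresholds here are hypothetical, not taken from the actual platform), a batch-level null and freshness check in Python might look like:

```python
from datetime import datetime, timedelta, timezone

def check_feed(rows, required_fields, max_age_hours=24):
    """Run basic quality checks on a batch of records.

    Returns a list of human-readable issues; an empty list means
    the batch passed. All thresholds are illustrative only.
    """
    issues = []
    if not rows:
        issues.append("feed is empty")
        return issues
    # Flag rows with missing required fields (reduces MTTD for bad data).
    for field in required_fields:
        null_count = sum(1 for r in rows if r.get(field) in (None, ""))
        if null_count:
            issues.append(f"{null_count} rows missing '{field}'")
    # Flag stale deliveries against an agreed freshness window.
    newest = max(r["as_of"] for r in rows if r.get("as_of"))
    if datetime.now(timezone.utc) - newest > timedelta(hours=max_age_hours):
        issues.append("feed is stale")
    return issues

# Hypothetical batch with one missing price
batch = [
    {"symbol": "ABC", "price": 10.5, "as_of": datetime.now(timezone.utc)},
    {"symbol": "XYZ", "price": None, "as_of": datetime.now(timezone.utc)},
]
print(check_feed(batch, ["symbol", "price"]))  # → ["1 rows missing 'price'"]
```

In practice, checks like this would run inside an orchestrated pipeline and raise alerts rather than print, but the shape of the logic is the same.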
Requirements:
Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent experience
2+ years of proven experience in a data operations or support role within a data platform environment
Strong understanding of data management concepts, ETL processes, and data warehousing
Proficient in SQL and scripting languages (Python, Bash)
Experience with workflow orchestration tools such as Apache Airflow or Databricks Lakeflow
Knowledge of AWS data services, including Athena, MWAA, Amazon S3, and Iceberg
Experience with FTP, data distribution platforms, and data integration tools like Airbyte
Familiarity with data transformation tools such as dbt and data processing frameworks like Apache Spark
Basic observability skills, including monitoring, logging, and alerting
Excellent analytical and problem-solving skills, with a keen attention to detail
Strong communication skills, capable of conveying technical information to non-technical stakeholders
Ability to work independently and collaboratively in a fast-paced environment
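The SQL and scripting proficiency listed above typically comes together in small reconciliation checks. As a hedged sketch using Python's built-in sqlite3 module (the table names and data are invented for illustration), flagging records delivered upstream but missing from a downstream table might look like:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical staging and warehouse tables, for illustration only.
cur.execute("CREATE TABLE staging (symbol TEXT, price REAL)")
cur.execute("CREATE TABLE warehouse (symbol TEXT, price REAL)")
cur.executemany("INSERT INTO staging VALUES (?, ?)",
                [("ABC", 10.5), ("XYZ", 7.2), ("DEF", 3.1)])
cur.executemany("INSERT INTO warehouse VALUES (?, ?)",
                [("ABC", 10.5), ("DEF", 3.1)])

# Symbols received upstream but absent downstream: a common SLA check.
missing = cur.execute("""
    SELECT s.symbol
    FROM staging s
    LEFT JOIN warehouse w ON w.symbol = s.symbol
    WHERE w.symbol IS NULL
""").fetchall()
print(missing)  # → [('XYZ',)]
```

The same anti-join pattern applies unchanged on warehouse engines such as Athena; only the connection layer differs.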
Nice to have:
Financial data knowledge
Experience with Grafana
What we offer:
Support for professional accreditations such as ACCA and study leave
Flexible arrangements, generous holidays, plus an additional day off for your birthday
Continuous mentoring along your career progression
Active sports, events and social committees across our offices
24/7 support available from our Employee Assistance Program
The opportunity to invest in our growth and success through our Employee Share Plan
Plus additional local benefits depending on your location