We are seeking an experienced Ab Initio Developer to design, develop, optimise, and support robust ETL solutions within large-scale data processing environments. This role is ideal for an individual with strong hands-on expertise in Ab Initio, Unix, and Teradata, combined with exposure to cloud platforms such as Google Cloud Platform (GCP). The role involves close collaboration with operations, infrastructure, and business stakeholders, as well as active participation in production support and continuous improvement initiatives.
Job Responsibilities:
Design, develop, and maintain ETL workflows using Ab Initio tools, including GDE, EME, and the Co>Operating System
Build, enhance, and optimise data ingestion, transformation, and extraction pipelines
Develop and maintain Unix/Linux shell scripts for automation, scheduling, and monitoring
Create and optimise SQL queries, data models, and utilities within Teradata environments
Perform performance tuning and troubleshooting of ETL jobs and database queries
Collaborate with operations, infrastructure, and business teams to support production issues and enhancements
Follow best practices for data quality, security, governance, and compliance
Provide support during releases, incident resolution, and root cause analysis activities
Maintain clear technical documentation, including designs, workflows, and operational procedures
Requirements:
4–8 years of hands-on experience in Ab Initio development
Proficient in Unix/Linux commands and shell scripting
Experienced in Teradata SQL development, performance tuning, and related utilities
Knowledgeable in data warehousing concepts and ETL best practices
Comfortable working in production support environments with structured incident management processes
Nice to have:
Familiar with scheduling tools such as Autosys or Control-M
Basic working knowledge of GCP services, including Google Cloud Storage, BigQuery, and Compute Engine
Experienced in Agile or Scrum-based delivery environments
What we offer:
Opportunities to work on enterprise metadata environments
Exposure to multi‑stage release cycles, CI/CD pipelines, and industry‑standard governance models
A collaborative environment with continuous learning and cross‑functional engagement
Experience working across a diverse, global organisation