Reliable and accurate asset, inspection, and maintenance data is essential for effective decision-making, investment, and sustainable asset management. To support this, Aurora is rolling out a new Asset Management System (IBM Maximo) along with updated frameworks to improve data quality and operational efficiency. You will be instrumental in ensuring Aurora’s asset data is accurate, structured, and actionable.
Job Responsibilities:
Analyse asset, inspection, and maintenance data to identify gaps, inconsistencies, and opportunities for improvement
Apply best-practice approaches to asset data structures, data flows, and preventative maintenance planning
Configure and implement data structures in IBM Maximo, including job plans, service plans, and maintenance schedules aligned to engineering standards
Work with field service partners, GIS, and integration teams to ensure data flows correctly from project design through to field execution
Support data migration, transformation, and validation to improve quality, completeness, and consistency
Document data models, processes, and data capture points to support handover to BAU teams
Collaborate with engineers, lifecycle managers, and change leads to embed data best practices into everyday workflows
Requirements:
Experience with asset, inspection, or maintenance data in an asset-intensive environment
Ability to analyse datasets to identify trends, gaps, and data quality issues, and propose improvements
Curiosity, independence, and the ability to design and implement processes that keep data clean, structured, and AI-ready
Excellent communication skills to work with engineers, field teams, and integration specialists
Nice to have:
Experience with IBM Maximo or similar Asset Management Systems
SQL and data analysis skills to extract, validate, and manipulate large datasets
A tertiary qualification in data engineering or computer science
Ability to create repeatable, scalable processes that ensure accurate and complete data capture
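The day-to-day data-quality work described above (spotting gaps, duplicates, and incomplete records) can be sketched with a minimal example. This is purely illustrative: the `assets` table, its columns, and the sample rows are all invented, and any real checks would run against the actual Maximo or GIS schemas.

```python
# Minimal data-quality check: flag missing mandatory fields and duplicate asset IDs.
# Table name, columns, and data are hypothetical, for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE assets (
    asset_id       TEXT,
    location       TEXT,
    last_inspected TEXT
);
INSERT INTO assets VALUES
    ('A-001', 'Substation 12', '2024-03-01'),
    ('A-002', NULL,            '2024-01-15'),  -- missing location
    ('A-002', NULL,            '2024-01-15'),  -- duplicate record
    ('A-003', 'Feeder 7',      NULL);          -- never inspected
""")

# Rows with a missing mandatory field.
missing = conn.execute(
    "SELECT asset_id FROM assets "
    "WHERE location IS NULL OR last_inspected IS NULL"
).fetchall()

# Asset IDs that appear more than once.
duplicates = conn.execute(
    "SELECT asset_id FROM assets "
    "GROUP BY asset_id HAVING COUNT(*) > 1"
).fetchall()

print("assets with missing fields:", sorted({r[0] for r in missing}))
print("duplicate asset ids:", [r[0] for r in duplicates])
```

In practice these checks would be parameterised and scheduled as repeatable validation jobs, so the same rules run consistently during migration and in BAU operation.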