Ab Initio Data Engineer

Citi (https://www.citi.com/)

Location:
Pune, India

Category:
IT - Software Development

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We are looking for an Ab Initio Data Engineer to design and build Ab Initio-based applications across the Data Integration, Governance & Quality domains, and to contribute to Compliance Risk programs by acquiring data from multiple internal and external sources, providing analytical insights, and integrating with Citi systems. The position involves working with technical leads and senior engineers, building applications, supporting production environments, and ensuring the success of high-visibility projects.

Job Responsibility:

  • Build production environments leveraging the Ab Initio tech stack
  • Ensure code adheres to performance optimization, interoperability standards, requirements, and compliance policies
  • Conduct bug fixing, code reviews, unit and integration testing
  • Design and build Ab Initio graphs and Conduct>it Plans
  • Create RAML or Swagger documentation for RESTful graphs
  • Develop ETL applications and configure Express>It frameworks
  • Build Autosys or Control Center Jobs for process orchestration
  • Participate in the agile development process and document issues
  • Develop applications leveraging Big Data technologies

Requirements:

  • Bachelor's degree in a quantitative field (such as Engineering, Computer Science, Statistics, Econometrics) and a minimum of 5 years of experience
  • Minimum of 5 years of extensive experience in the design, build, and deployment of Ab Initio-based applications
  • Expertise in handling complex large-scale Data Lake and Warehouse environments
  • Hands-on experience writing complex SQL queries, exporting and importing large amounts of data using utilities

Nice to have:

  • Ability to design and build Ab Initio graphs and Conduct>it Plans
  • Build Web-Service and RESTful graphs and create RAML or Swagger documentation
  • Ability to analyze the Metadata Hub metamodel
  • Hands-on multifile system-level programming, debugging, and optimization skills
  • Experience in developing complex ETL applications
  • Good knowledge of RDBMS – Oracle
  • Strong in UNIX Shell/Perl Scripting
  • Build graphs interfacing with heterogeneous data sources – Oracle, Snowflake, Hadoop, Hive, AWS S3
  • Build automation pipelines for CI/CD integrating Jenkins, JIRA, and/or ServiceNow
  • Parse XML, JSON & YAML documents including hierarchical models
  • Identify performance bottlenecks in graphs and optimize them
  • Pair up with other data engineers to develop analytic applications leveraging Big Data technologies

Additional Information:

Job Posted:
July 12, 2025

Employment Type:
Full-time
Work Type:
On-site work