
Ab Initio Data Engineer


Citi

Location:
India, Chennai

Contract Type:
Not provided


Salary:
Not provided

Job Description:

The Applications Development Intermediate Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Job Responsibility:

  • Ability to design and build Ab Initio graphs (both continuous & batch) and Conduct>it Plans
  • Build Web-Service and RESTful graphs and create RAML or Swagger documentation
  • Complete understanding and analytical ability of Metadata Hub metamodel
  • Strong hands-on multifile system-level programming, debugging, and optimization skills
  • Hands on experience in developing complex ETL applications
  • Good knowledge of RDBMS – Oracle, with ability to write complex SQL needed to investigate and analyze data issues
  • Strong in UNIX Shell/Perl Scripting
  • Build graphs interfacing with heterogeneous data sources – Oracle, Snowflake, Hadoop, Hive, AWS S3
  • Build application configurations for Express>It frameworks – Acquire>It, Spec-To-Graph, Data Quality Assessment
  • Build automation pipelines for Continuous Integration & Delivery (CI/CD), leveraging Testing Framework & JUnit modules, integrating with Jenkins, JIRA and/or ServiceNow
  • Build Query>It data sources for cataloguing data from different sources
  • Parse XML, JSON & YAML documents including hierarchical models
  • Build and implement data acquisition and transformation/curation requirements in a data lake or warehouse environment, and demonstrate experience in leveraging various Ab Initio components
  • Build Autosys or Control Center Jobs and Schedules for process orchestration
  • Build BRE rulesets for reformat, rollup & validation use cases
  • Build SQL scripts on database, performance tuning, relational model analysis and perform data migrations
  • Ability to identify performance bottlenecks in graphs, and optimize them
  • Ensure Ab Initio code base is appropriately engineered to maintain current functionality and development that adheres to performance optimization, interoperability standards and requirements, and compliance with client IT governance policies
  • Build regression test cases, functional test cases and write user manuals for various projects
  • Conduct bug fixing, code reviews, and unit, functional and integration testing
  • Participate in the agile development process, and document and communicate issues and bugs relative to data standards
  • Pair up with other data engineers to develop analytic applications leveraging Big Data technologies: Hadoop, NoSQL, and In-memory Data Grids
  • Challenge and inspire team members to achieve business results in a fast paced and quickly changing environment
  • Perform other duties and/or special projects as assigned
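To illustrate the hierarchical-document parsing named in the responsibilities above (XML, JSON & YAML), here is a minimal Python sketch using only the standard library; the payload and field names are hypothetical, and YAML would additionally need a third-party parser such as PyYAML:

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical hierarchical JSON payload: a customer with nested accounts
raw = (
    '{"customer": {"id": 42, "accounts": '
    '[{"ccy": "USD", "bal": 100.0}, {"ccy": "EUR", "bal": 55.5}]}}'
)
doc = json.loads(raw)

def flatten(node, prefix=""):
    """Walk a nested dict/list structure, yielding (dotted_path, leaf_value)."""
    if isinstance(node, dict):
        for key, val in node.items():
            yield from flatten(val, f"{prefix}{key}.")
    elif isinstance(node, list):
        for i, val in enumerate(node):
            yield from flatten(val, f"{prefix}{i}.")
    else:
        yield prefix.rstrip("."), node

rows = dict(flatten(doc))
# e.g. rows["customer.accounts.0.ccy"] == "USD"

# XML follows the same pattern with the stdlib parser
tree = ET.fromstring("<customer><id>42</id></customer>")
customer_id = tree.findtext("id")  # string "42"
```

Flattening nested documents into path/value rows like this is a common first step before loading them into a relational or warehouse target.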

Requirements:

  • Bachelor's degree in a quantitative field (such as Engineering, Computer Science, Statistics, Econometrics)
  • Minimum 5 years of extensive experience in design, build and deployment of Ab Initio-based applications
  • Expertise in handling complex large-scale Data Lake and Warehouse environments
  • Hands-on experience writing complex SQL queries, exporting and importing large amounts of data using utilities
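The kind of investigative SQL mentioned above (analyzing data issues) can be sketched with Python's built-in sqlite3 module; the table and values are hypothetical, and on Oracle the same GROUP BY / HAVING pattern applies:

```python
import sqlite3

# In-memory database with a hypothetical trades table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trade_id TEXT, account TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("T1", "A", 10.0), ("T1", "A", 10.0), ("T2", "B", 5.0)],
)

# Investigative query: find trade_ids loaded more than once,
# a typical data-quality issue in an ETL landing table
dupes = conn.execute(
    """
    SELECT trade_id, COUNT(*) AS n
    FROM trades
    GROUP BY trade_id
    HAVING COUNT(*) > 1
    """
).fetchall()
# dupes == [("T1", 2)]
```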

Additional Information:

Job Posted:
March 22, 2025

Employment Type:
Fulltime
Work Type:
Hybrid work

Similar Jobs for Ab Initio Data Engineer

Data Engineer

The Data Engineer is accountable for developing high quality data products to su...
Location:
India, Pune
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • First Class Degree in Engineering/Technology/MCA
  • 5 to 8 years’ experience implementing data-intensive solutions using agile methodologies
  • Experience of relational databases and using SQL for data querying, transformation and manipulation
  • Experience of modelling data for analytical consumers
  • Ability to automate and streamline the build, test and deployment of data pipelines
  • Experience in cloud native technologies and patterns
  • A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
  • Excellent communication and problem-solving skills
  • ETL: Hands-on experience of building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
  • Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
Job Responsibility:
  • Developing and supporting scalable, extensible, and highly available data solutions
  • Deliver on critical business priorities while ensuring alignment with the wider architectural vision
  • Identify and help address potential risks in the data supply chain
  • Follow and contribute to technical standards
  • Design and develop analytical data models
Employment Type:
Fulltime

Data Engineer

The Data Engineer is accountable for developing high quality data products to su...
Location:
India, Pune
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • First Class Degree in Engineering/Technology/MCA
  • 3 to 4 years’ experience implementing data-intensive solutions using agile methodologies
  • Experience of relational databases and using SQL for data querying, transformation and manipulation
  • Experience of modelling data for analytical consumers
  • Ability to automate and streamline the build, test and deployment of data pipelines
  • Experience in cloud native technologies and patterns
  • A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
  • Excellent communication and problem-solving skills
  • ETL: Hands-on experience of building data pipelines. Proficiency in at least one data integration platform, such as Ab Initio, Apache Spark, Talend or Informatica
  • Big Data: Exposure to ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
Job Responsibility:
  • Developing and supporting scalable, extensible, and highly available data solutions
  • Deliver on critical business priorities while ensuring alignment with the wider architectural vision
  • Identify and help address potential risks in the data supply chain
  • Follow and contribute to technical standards
  • Design and develop analytical data models
Employment Type:
Fulltime

Senior Data Engineer

Join Inetum as a Data Engineer! At Inetum, we empower innovation and growth thro...
Location:
Portugal, Lisbon
Salary:
Not provided
Inetum
Expiration Date:
Until further notice
Requirements:
  • Teradata – advanced SQL and data warehousing
  • CONTROL-M – job scheduling and automation
  • UNIX – working in a UNIX environment (directories, scripting, etc.)
  • SQL (Teradata) – strong querying and data manipulation skills
  • Ab Initio – data integration and ETL development
  • DevOps – CI/CD practices and automation
  • Collaborative tools – GIT, Jira, Confluence, MEGA, Zeenea
Job Responsibility:
  • Design, development, and optimization of data solutions that support business intelligence and analytics
Employment Type:
Fulltime

Lead Data Engineer

Citi Fund Services is undergoing a major transformation effort to transform t...
Location:
United Kingdom, Belfast
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • Significant years of hands-on experience in software development, with proven experience in data integration / data pipeline developments
  • Exceptional technical leader with a proven background in delivery of significant projects
  • Multi-year experience in data integration development (Ab Initio, Talend, Apache Spark, AWS Glue, SSIS or equivalent), including optimization, tuning and benchmarking
  • Multi-year experience in SQL (Oracle, MSSQL and equivalents), including optimization, tuning and benchmarking
  • Expertise with Cloud-native development and Container Orchestration tools (Serverless, Docker, Kubernetes, OpenShift, etc.) a significant plus
  • Strong understanding of Agile methodologies (Scrum, Kanban) and experience working in Agile teams
  • Exposure to Continuous Integration and Continuous Delivery (CI/CD) pipelines, either on-premises or public cloud (i.e., Tekton, Harness, Jenkins, etc.)
  • Demonstrable expertise in financial services considered a plus
  • Self-starter with the ability to drive projects independently and deliver results in a fast-paced environment
Job Responsibility:
  • Architect and develop enterprise-scale data pipelines using the latest data streaming technologies
  • Implement and optimize delivered solutions through tuning for optimal performance through frequent benchmarking
  • Develop containerised solutions capable of running in private or public cloud
  • Ensure solution is aligned to CI/CD tooling and standards
  • Ensure solution is aligned to observability standards
  • Effectively communicate technical solutions and artifacts to non-technical stakeholders and senior leadership
  • Contribute to the journey of modernizing existing data processors and moving them to common and cloud platforms
  • Collaborate with cross-functional domain experts to translate business requirements into scalable data solutions
What we offer:
  • 27 days annual leave (plus bank holidays)
  • A discretionary annual performance-related bonus
  • Private Medical Care & Life Insurance
  • Employee Assistance Program
  • Pension Plan
  • Paid Parental Leave
  • Special discounts for employees, family, and friends
  • Access to an array of learning and development resources
Employment Type:
Fulltime

Data Engineer

The Data Engineer is accountable for developing high quality data products to su...
Location:
India, Pune
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • First Class Degree in Engineering/Technology (4-year graduate course)
  • 4 to 8 years' experience implementing data-intensive solutions using agile methodologies
  • Experience of relational databases and using SQL for data querying, transformation and manipulation
  • Experience of modelling data for analytical consumers
  • Ability to automate and streamline the build, test and deployment of data pipelines
  • Experience in cloud native technologies and patterns
  • A passion for learning new technologies, and a desire for personal growth
  • Excellent communication and problem-solving skills
  • ETL: Hands-on experience of building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
  • Big Data: Experience of 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
Job Responsibility:
  • Developing and supporting scalable, extensible, and highly available data solutions
  • Deliver on critical business priorities while ensuring alignment with the wider architectural vision
  • Identify and help address potential risks in the data supply chain
  • Follow and contribute to technical standards
  • Design and develop analytical data models
What we offer:
  • Programs and services for physical and mental well-being including access to telehealth options, health advocates, confidential counseling
  • Empowerment to manage financial well-being and help plan for the future
Employment Type:
Fulltime

Data Engineer

Citi Fund Services is undergoing a major transformation effort to transform t...
Location:
United Kingdom, Belfast
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • Substantial hands-on experience in software development, with proven experience in data integration / data pipeline developments
  • Proven experience in Data integration development (Ab Initio, Talend, Apache Spark, AWS Glue, SSIS or equivalent) including optimization, tuning and benchmarking
  • Proven experience in SQL (Oracle, MSSQL and equivalents), including optimization, tuning and benchmarking
  • Understanding of Cloud-native development and Container Orchestration tools (Serverless, Docker, Kubernetes, OpenShift, etc.) a significant plus
  • Understanding of Agile methodologies (Scrum, Kanban) and experience working in Agile teams
  • Exposure to Continuous Integration and Continuous Delivery (CI/CD) pipelines, either on-premises or public cloud (i.e., Tekton, Harness, Jenkins, etc.)
  • Demonstrable expertise in financial services considered a plus
  • Self-starter with the ability to drive projects independently and deliver results in a fast-paced environment.
Job Responsibility:
  • Develop enterprise-scale data pipelines using the latest data streaming technologies
  • Optimize delivered solutions through tuning for optimal performance through frequent benchmarking
  • Develop containerised solutions capable of running in private or public cloud
  • Ensure solution is aligned to CI/CD tooling and standards
  • Ensure solution is aligned to observability standards
  • Contribute to the journey of modernizing existing data processors and moving them to common and cloud platforms
  • Collaborate with cross-functional domain experts to translate business requirements into scalable data solutions
Employment Type:
Fulltime

Ab Initio Developer

We are seeking an Ab Initio Developer to support the development, deployment, an...
Location:
India, Pune
Salary:
Not provided
Vodafone
Expiration Date:
Until further notice
Requirements:
  • Skilled in BI ETL development using Ab Initio and Control Centre
  • Proficient in UNIX/LINUX with strong scripting capabilities
  • Experienced in the telecommunications domain
  • Brings 3–6 years of relevant technical experience
  • Holds a B.E., B.Tech, MCA or M.Sc degree
  • Understands data warehousing concepts
Job Responsibility:
  • Lead the build phase of ETL jobs to load and transform data using Ab Initio, Control Centre, and related tools
  • Develop bespoke graphs, reusable frameworks, and scalable data assets following a “build once, deploy many” approach
  • Create clear and comprehensive low-level design documentation and implementation details
  • Support and mentor junior team members to help them develop technical capability and confidence
  • Contribute to service quality by adopting efficient processes, optimising resources, and encouraging modern tool usage
  • Support the organisation’s quality standards by implementing best practices within the BI Service Line
What we offer:
  • Opportunity to work on enterprise‑grade ETL and data engineering technologies
  • Exposure to a large‑scale telecommunications data environment
  • The ability to influence quality processes and help uplift engineering standards
  • A collaborative environment supporting continuous learning and mentorship

Data Engineer

Join us as a Data Engineer at Barclays, where you'll spearhead the evolution of ...
Location:
India, Pune
Salary:
Not provided
Barclays
Expiration Date:
Until further notice
Requirements:
  • Core Ab Initio Skills: GDE (graphs, XFR, plans, PSETs, DQE)
  • EME (versioning, branching, tagging)
  • Conduct‑IT / Control Center
  • Continuous Flows, PDL, metadata‑driven design
  • Big Data Skills: Hadoop Ecosystem: HDFS, Hive, Yarn, MapReduce basics
  • Optional/Good to Have: Spark (PySpark/Scala)
  • HBase
  • Kafka streaming interfaces
  • Strong knowledge of distributed computing concepts (partitions, parallelism, memory tuning)
  • Databases & Languages: SQL (HiveQL, Teradata, Oracle)
Job Responsibility:
  • Investigation and analysis of data issues related to quality, lineage, controls, and authoritative source identification, documenting data sources, methodologies, and quality findings with recommendations for improvement
  • Designing and building data pipelines to automate data movement and processing
  • Apply advanced analytical techniques to large datasets to uncover trends and correlations, develop validated logical data models, and translate insights into actionable business recommendations that drive operational and process improvements, leveraging machine learning/AI
  • Through data-driven analysis, translate analytical findings into actionable business recommendations, identifying opportunities for operational and process improvements
  • Design and create interactive dashboards and visual reports using applicable tools and automate reporting processes for regular and ad-hoc stakeholder needs
  • Lead a team performing complex tasks, using well developed professional knowledge and skills to deliver on work that impacts the whole business function
  • Set objectives and coach employees in pursuit of those objectives, appraisal of performance relative to objectives and determination of reward outcomes
  • Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues
  • Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda
What we offer:
  • Hybrid working
  • Structured approach to hybrid working with fixed 'anchor' days
  • Supportive and inclusive culture and environment
  • Flexible working arrangements can be discussed
  • Opportunities for colleagues to innovate, collaborate, and deliver great outcomes
  • Welcoming and inclusive culture supports you to bring your whole self to work, explore your potential and pursue your passions
  • Environment-friendly, ultra-modern workplace with excellent facilities for work, socialising and leisure
Employment Type:
Fulltime