
Lead Data Integration Engineer


Citi

Location:
India, Pune

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We are seeking a highly experienced and technically adept Senior Integration Engineer to join our Wealth Lending platforms team. This role demands a blend of strategic thinking, technical leadership, and hands-on development expertise. The ideal candidate will not only own the data and integration strategy but also actively contribute to the design, development, and implementation of robust and scalable integration solutions.

Job Responsibilities:

  • Own the overall data integration strategy, design, development, and delivery across Wealth Lending platforms
  • Provide technical leadership to integration developers while actively contributing to the design, development, and implementation of integration pipelines
  • Design and develop integration patterns, including batch, real-time, and event-based integrations with robust error handling, monitoring, and replay mechanisms
  • Design, build, and manage the data access integration layer, ensuring consistent data extracts and schema management
  • Perform and oversee source-system data profiling, data discovery, and data quality assessments, identifying gaps and driving remediation strategies
  • Define, implement, and validate complex transformation, standardization, and mapping rules for data integration
  • Champion and utilize AI-assisted development tools and authentication frameworks to improve developer productivity, code quality, testing, and documentation across the engineering team
  • Provide technical leadership and delivery ownership, guiding the team on architecture decisions, best practices, and ensuring high quality delivery of features and platform enhancements
  • Identify problems, analyze information, and make evaluative judgments to recommend and implement solutions
  • Resolve issues by identifying and selecting solutions through the application of acquired technical experience, guided by precedents
  • Operate with a limited level of direct supervision
  • Apply fundamental knowledge of programming languages for design specifications and active development
  • Appropriately assess risk when business decisions are made, demonstrating consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency
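By way of illustration only (the posting does not prescribe a stack), the "robust error handling, monitoring & replay mechanisms" responsibility above can be sketched as a minimal retry/dead-letter pattern in Python. The `Event` shape, handler, and queue names here are hypothetical, not part of the role's actual platform:

```python
from dataclasses import dataclass


@dataclass
class Event:
    # Hypothetical event shape; a real integration would carry source
    # metadata, schema version, timestamps, etc.
    key: str
    payload: dict
    attempts: int = 0


def process(event, handler, dead_letter, max_attempts=3):
    """Attempt to handle an event, retrying transient failures.

    After max_attempts failures the event is parked on the dead_letter
    list so it can be inspected and replayed later instead of being lost.
    """
    while event.attempts < max_attempts:
        event.attempts += 1
        try:
            return handler(event)
        except Exception:
            continue  # transient failure: retry (real code would log/backoff)
    dead_letter.append(event)  # retries exhausted: park for replay
    return None


def replay(dead_letter, handler, max_attempts=3):
    """Re-drive parked events once the downstream fault is fixed.

    Returns the events that still fail, so replay can be run again.
    """
    still_failing = []
    for event in dead_letter:
        event.attempts = 0  # reset the retry budget for the replay pass
        process(event, handler, still_failing, max_attempts)
    return still_failing
```

In practice a role like this would realize the same pattern with the messaging systems named in the requirements (IBM MQ, Kafka, Solace), using broker-level dead-letter queues rather than in-memory lists.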

Requirements:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field
  • 10+ years of progressive experience in data integration, data engineering, or a similar role, with at least 4 years in a technical leadership or senior capacity
  • Demonstrable experience within the financial services or banking industry, specifically with wealth management or lending platforms
  • Extensive hands-on experience with end-to-end design, development, coding, and deployment of complex data integration solutions
  • Proficiency in designing, developing, and managing data access integration layers, ensuring data consistency, efficient extraction, and robust schema management
  • Strong background in source-system data profiling, data discovery, data quality assessments, and driving remediation strategies, with knowledge of the wholesale lending domain
  • Skilled in defining and implementing intricate data transformation, standardization, and mapping rules
  • Hands-on experience with leading data integration platforms, ETL/ELT tools, message queuing systems, and API management solutions
  • Solid understanding of and hands-on experience with RDBMS (SQL, PL/SQL), plus knowledge of NoSQL databases
  • Practical experience with AI-assisted development tools and modern authentication frameworks to enhance productivity, code quality, testing, and documentation
  • Solid understanding of fundamental programming concepts and proven experience in developing with relevant languages (e.g., Python, Java, Scala) for design specifications and implementation
  • Proven ability to lead, mentor, and guide integration developers on design, development, and implementation best practices for data pipelines
  • Expertise in designing, building, and governing diverse data integration patterns, including batch, real-time, and event-based architectures
  • Deep understanding of implementing robust error handling, monitoring, logging, and replay mechanisms within integration solutions
  • A strong commitment to ensuring high-quality deliverables through architectural guidance, best practices, and rigorous testing
  • Hands-on knowledge of messaging systems such as IBM MQ, Kafka, and Solace
  • Demonstrated ability to assess and mitigate risks in technical decisions, ensuring compliance with applicable laws, rules, and regulations within a regulated environment
  • Adherence to internal policies, sound ethical judgment, and transparent escalation/management of control issues
  • Ability to operate with a high degree of autonomy and drive initiatives from conception to delivery with limited direct supervision
  • Excellent communication and interpersonal skills, fostering effective collaboration with cross-functional teams and stakeholders

Nice to have:

  • Certifications in relevant data integration technologies, cloud platforms (e.g., AWS, Azure, GCP), or data governance are a plus
  • Experience with hands-on design and implementations of cloud-ready applications and deployment pipelines on large-scale container platform clusters is a plus
  • Experience working in a Continuous Integration and Continuous Delivery environment, with familiarity with Tekton, Harness, Jenkins, code quality tooling, etc.

Additional Information:

Job Posted:
April 10, 2026

Employment Type:
Full-time
Work Type:
Hybrid work

Similar Jobs for Lead Data Integration Engineer

Lead Data Engineer

At Citi, Fund Services is undergoing a major transformation effort to transform t...
Location:
United Kingdom, Belfast

Salary:
Not provided

Citi

Expiration Date:
Until further notice

Requirements:
  • Significant years of hands-on experience in software development, with proven experience in data integration / data pipeline developments
  • Exceptional technical leader with a proven background in delivery of significant projects
  • Multi-year experience in data integration development (Ab Initio, Talend, Apache Spark, AWS Glue, SSIS or equivalent), including optimization, tuning and benchmarking
  • Multi-year experience in SQL (Oracle, MSSQL and equivalents), including optimization, tuning and benchmarking
  • Expertise with Cloud-native development and Container Orchestration tools (Serverless, Docker, Kubernetes, OpenShift, etc.) a significant plus
  • Strong understanding of Agile methodologies (Scrum, Kanban) and experience working in Agile teams
  • Exposure to Continuous Integration and Continuous Delivery (CI/CD) pipelines, either on-premises or public cloud (i.e., Tekton, Harness, Jenkins, etc.)
  • Demonstrable expertise in financial services considered a plus
  • Self-starter with the ability to drive projects independently and deliver results in a fast-paced environment
Job Responsibilities:
  • Architect and develop enterprise-scale data pipelines using the latest data streaming technologies
  • Implement and optimize delivered solutions through tuning for optimal performance through frequent benchmarking
  • Develop containerised solutions capable of running in private or public cloud
  • Ensure the solution is aligned to CI/CD tooling and standards
  • Ensure the solution is aligned to observability standards
  • Effectively communicate technical solutions and artifacts to non-technical stakeholders and senior leadership
  • Contribute to modernizing existing data processors and their move to common and cloud platforms
  • Collaborate with cross-functional domain experts to translate business requirements into scalable data solutions
What we offer:
  • 27 days annual leave (plus bank holidays)
  • A discretionary annual performance-related bonus
  • Private Medical Care & Life Insurance
  • Employee Assistance Program
  • Pension Plan
  • Paid Parental Leave
  • Special discounts for employees, family, and friends
  • Access to an array of learning and development resources
  • Full-time

Data Engineering Lead

The Data Engineering Lead is a strategic professional who stays abreast of developments...
Location:
India, Pune

Salary:
Not provided

Citi

Expiration Date:
Until further notice

Requirements:
  • 10-15 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix Scripting and other Big data frameworks
  • 4+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase
  • Strong proficiency in Python and Spark Java, with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.), Scala, and SQL
  • Data Integration, Migration & Large Scale ETL experience (Common ETL platforms such as PySpark/DataStage/AbInitio etc.) - ETL design & build, handling, reconciliation and normalization
  • Data Modeling experience (OLAP, OLTP, Logical/Physical Modeling, Normalization, knowledge on performance tuning)
  • Experienced in working with large and multiple datasets and data warehouses
  • Experience building and optimizing ‘big data’ data pipelines, architectures, and datasets
  • Strong analytic skills and experience working with unstructured datasets
  • Ability to effectively use complex analytical, interpretive, and problem-solving techniques
  • Experience with Confluent Kafka, Redhat JBPM, CI/CD build pipelines and toolchain – Git, BitBucket, Jira
Job Responsibilities:
  • Strategic Leadership: Define and execute the data engineering roadmap for Global Wealth Data, aligning with overall business objectives and technology strategy
  • Team Management: Lead, mentor, and develop a high-performing, globally distributed team of data engineers, fostering a culture of collaboration, innovation, and continuous improvement
  • Architecture and Design: Oversee the design and implementation of robust and scalable data pipelines, data warehouses, and data lakes, ensuring data quality, integrity, and availability for global wealth data
  • Technology Selection and Implementation: Evaluate and select appropriate technologies and tools for data engineering, staying abreast of industry best practices and emerging trends specific to wealth management data
  • Performance Optimization: Continuously monitor and optimize data pipelines and infrastructure for performance, scalability, and cost-effectiveness, ensuring optimal access to global wealth data
  • Collaboration: Partner with business stakeholders, data scientists, portfolio managers, and other technology teams to understand data needs and deliver effective solutions that support investment strategies and client reporting
  • Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, security, and compliance with relevant regulations, particularly around sensitive financial data
  • Full-time

Big Data / Scala / Python Engineering Lead

The Applications Development Technology Lead Analyst is a senior level position ...
Location:
India, Chennai

Salary:
Not provided

Citi

Expiration Date:
Until further notice

Requirements:
  • At least two years of experience building and leading highly complex, technical data engineering teams (10+ years of hands-on data engineering experience overall)
  • Lead data engineering team, from sourcing to closing
  • Drive strategic vision for the team and product
  • Experience managing a data-focused product or ML platform
  • Hands-on experience designing, developing, and optimizing scalable distributed data processing pipelines using Apache Spark and Scala
  • Experience managing, hiring and coaching software engineering teams
  • Experience with large-scale distributed web services and the processes around testing, monitoring, and SLAs to ensure high product quality
  • 7 to 10+ years of hands-on experience in big data development, focusing on Apache Spark, Scala, and distributed systems
  • Proficiency in Functional Programming: High proficiency in Scala-based functional programming for developing robust and efficient data processing pipelines
  • Proficiency in Big Data Technologies: Strong experience with Apache Spark, Hadoop ecosystem tools such as Hive, HDFS, and YARN
Job Responsibilities:
  • Partner with multiple management teams to ensure appropriate integration of functions to meet goals as well as identify and define necessary system enhancements to deploy new products and process improvements
  • Resolve a variety of high-impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards
  • Provide expertise in area and advanced knowledge of applications programming and ensure application design adheres to the overall architecture blueprint
  • Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
  • Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions
  • Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary
  • Full-time

Data Engineering Lead

The Engineering Lead Analyst is a senior level position responsible for leading ...
Location:
Singapore

Salary:
Not provided

Citi

Expiration Date:
Until further notice

Requirements:
  • 10-15 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix Scripting and other Big data frameworks
  • 4+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase
  • Strong proficiency in Python and Spark Java, with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.), Scala, and SQL
  • Data Integration, Migration & Large Scale ETL experience (Common ETL platforms such as PySpark/DataStage/AbInitio etc.) - ETL design & build, handling, reconciliation and normalization
  • Data Modeling experience (OLAP, OLTP, Logical/Physical Modeling, Normalization, knowledge on performance tuning)
  • Experienced in working with large and multiple datasets and data warehouses
  • Experience building and optimizing ‘big data’ data pipelines, architectures, and datasets
  • Strong analytic skills and experience working with unstructured datasets
  • Ability to effectively use complex analytical, interpretive, and problem-solving techniques
  • Experience with Confluent Kafka, Redhat JBPM, CI/CD build pipelines and toolchain – Git, BitBucket, Jira
Job Responsibilities:
  • Define and execute the data engineering roadmap for Global Wealth Data, aligning with overall business objectives and technology strategy
  • Lead, mentor, and develop a high-performing, globally distributed team of data engineers, fostering a culture of collaboration, innovation, and continuous improvement
  • Oversee the design and implementation of robust and scalable data pipelines, data warehouses, and data lakes, ensuring data quality, integrity, and availability for global wealth data
  • Evaluate and select appropriate technologies and tools for data engineering, staying abreast of industry best practices and emerging trends specific to wealth management data
  • Continuously monitor and optimize data pipelines and infrastructure for performance, scalability, and cost-effectiveness
  • Partner with business stakeholders, data scientists, portfolio managers, and other technology teams to understand data needs and deliver effective solutions
  • Implement and enforce data governance policies and procedures to ensure data quality, security, and compliance with relevant regulations
What we offer:
  • Equal opportunity employer commitment
  • Accessibility and accommodation support
  • Global workforce benefits
  • Full-time

Lead Data Engineer

We are seeking an experienced Senior Data Engineer to lead the development of a ...
Location:
India, Kochi; Trivandrum

Salary:
Not provided

Experion Technologies

Expiration Date:
Until further notice

Requirements:
  • 5+ years of experience in data engineering, with a focus on analytical platform development
  • Proficiency in Python and/or PySpark
  • Strong SQL skills for ETL processes and large-scale data manipulation
  • Extensive AWS experience (Glue, Lambda, Step Functions, S3)
  • Familiarity with big data systems (AWS EMR, Apache Spark, Apache Iceberg)
  • Database experience with DynamoDB, Aurora, Postgres, or Redshift
  • Proven experience designing and implementing RESTful APIs
  • Hands-on CI/CD pipeline experience (preferably GitLab)
  • Agile development methodology experience
  • Strong problem-solving abilities and attention to detail
Job Responsibilities:
  • Architect, develop, and maintain end-to-end data ingestion framework for extracting, transforming, and loading data from diverse sources
  • Use AWS services (Glue, Lambda, EMR, ECS, EC2, Step Functions) to build scalable, resilient automated data pipelines
  • Develop and implement automated data quality checks, validation routines, and error-handling mechanisms
  • Establish comprehensive monitoring, logging, and alerting systems for data quality issues
  • Architect and develop secure, high-performance APIs for data services integration
  • Create thorough API documentation and establish standards for security, versioning, and performance
  • Work with business stakeholders, data scientists, and operations teams to understand requirements
  • Participate in sprint planning, code reviews, and agile ceremonies
  • Contribute to CI/CD pipeline development using GitLab

Lead Data Engineer

Join our dynamic team as a Lead Data Engineer to spearhead the development, opti...
Location:
India, Hyderabad

Salary:
Not provided

Fission Labs

Expiration Date:
Until further notice

Requirements:
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field
  • Expert-level understanding of Salesforce object model
  • Comprehensive knowledge of Salesforce integration patterns
  • Deep understanding of Salesforce data architecture
  • Experience with Salesforce Bulk API jobs
  • Ability to handle complex data transformations and migrations
  • Knowledge of Salesforce metadata and custom object relationships
  • Advanced Python programming skills
  • Proven experience with AWS cloud services, specifically: AWS Glue, AWS Step Functions, AWS Lambda, AWS S3, AWS CloudWatch, Athena
  • Salesforce data model, Bulk jobs and architecture
Job Responsibilities:
  • Experience leading and driving large scale data migration projects
  • Experience in working with all the stakeholders to plan and coordinate the entire migration process
  • Proficiency in Python development
  • Experience building scalable, event-driven data migration solutions
  • Strong understanding of ETL (Extract, Transform, Load) processes
  • Familiarity with cloud-native architecture principles
What we offer:
  • Opportunity to work on business challenges from top global clientele with high impact
  • Vast opportunities for self-development, including online university access and sponsored certifications
  • Sponsored Tech Talks, industry events & seminars to foster innovation and learning
  • Generous benefits package including health insurance, retirement benefits, flexible work hours, and more
  • Supportive work environment with forums to explore passions beyond work
  • Full-time

Lead Data Engineer

As a Lead Data Engineer or architect at Made Tech, you'll play a pivotal role in...
Location:
United Kingdom, Any UK Office Hub (Bristol / London / Manchester / Swansea)

Salary:
80,000 - 96,000 GBP / Year

Made Tech

Expiration Date:
Until further notice

Requirements:
  • Proficiency in Git (incl. GitHub Actions) and ability to explain the benefits of different branch strategies
  • Strong experience in IaC and able to guide how one could deploy infrastructure into different environments
  • Knowledge of handling and transforming various data types (JSON, CSV, etc) with Apache Spark, Databricks or Hadoop
  • Good understanding of possible architectures involved in modern data system design (Data Warehouse, Data Lakes, Data Meshes)
  • Ability to create data pipelines on a cloud environment and integrate error handling within these pipelines
  • Understanding of how to create reusable libraries to encourage uniformity of approach across multiple data pipelines
  • Able to document and present end-to-end diagrams to explain a data processing system on a cloud environment
  • Some knowledge of how you would present diagrams (C4, UML, etc.)
  • Enthusiasm for learning and self-development
  • Experience working on agile delivery-led projects and applying agile practices such as Scrum, XP, and Kanban
Job Responsibilities:
  • Define, shape and perfect data strategies in central and local government
  • Help public sector teams understand the value of their data, and make the most of it
  • Establish yourself as a trusted advisor in data-driven approaches using public cloud services like AWS, Azure and GCP
  • Contribute to our recruitment efforts and take on line management responsibilities
  • Help implement efficient data pipelines & storage
What we offer:
  • 30 days of paid annual leave
  • Flexible parental leave options
  • Part time remote working for all our staff
  • Paid counselling as well as financial and legal advice
  • 7% employer matched pension
  • Flexible benefit platform which includes a Smart Tech scheme, Cycle to work scheme, and an individual benefits allowance which you can invest in a Health care cash plan or Pension plan
  • Optional social and wellbeing calendar of events
  • Full-time

Senior Data Engineer – Data Engineering & AI Platforms

We are looking for a highly skilled Senior Data Engineer (L2) who can design, bu...
Location:
India, Chennai, Madurai, Coimbatore

Salary:
Not provided

OptiSol Business Solutions

Expiration Date:
Until further notice

Requirements:
  • Strong hands-on expertise in cloud ecosystems (Azure / AWS / GCP)
  • Excellent Python programming skills with data engineering libraries and frameworks
  • Advanced SQL capabilities including window functions, CTEs, and performance tuning
  • Solid understanding of distributed processing using Spark/PySpark
  • Experience designing and implementing scalable ETL/ELT workflows
  • Good understanding of data modeling concepts (dimensional, star, snowflake)
  • Familiarity with GenAI/LLM-based integration for data workflows
  • Experience working with Git, CI/CD, and Agile delivery frameworks
  • Strong communication skills for interacting with clients, stakeholders, and internal teams
Job Responsibilities:
  • Design, build, and maintain scalable ETL/ELT pipelines across cloud and big data platforms
  • Contribute to architectural discussions by translating business needs into data solutions spanning ingestion, transformation, and consumption layers
  • Work closely with solutioning and pre-sales teams for technical evaluations and client-facing discussions
  • Lead squads of L0/L1 engineers, ensuring delivery quality, mentoring, and guiding career growth
  • Develop cloud-native data engineering solutions using Python, SQL, PySpark, and modern data frameworks
  • Ensure data reliability, performance, and maintainability across the pipeline lifecycle, from development to deployment
  • Support long-term ODC/T&M projects by demonstrating expertise during technical discussions and interviews
  • Integrate emerging GenAI tools where applicable to enhance data enrichment, automation, and transformations
What we offer:
  • Opportunity to work at the intersection of Data Engineering, Cloud, and Generative AI
  • Hands-on exposure to modern data stacks and emerging AI technologies
  • Collaboration with experts across Data, AI/ML, and cloud practices
  • Access to structured learning, certifications, and leadership mentoring
  • Competitive compensation with fast-track career growth and visibility
  • Full-time