Big Data Developer

Citi

Location:
Chennai, India

Category:
IT - Software Development

Contract Type:
Employment contract

Salary:
Not provided

Job Description:

The Data Engineer is accountable for developing high-quality data products to support the Bank’s regulatory requirements and data-driven decision-making. The Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Job Responsibilities:

  • Developing and supporting scalable, extensible, and highly available data solutions
  • Delivering on critical business priorities while ensuring alignment with the wider architectural vision
  • Identifying and helping address potential risks in the data supply chain
  • Following and contributing to technical standards
  • Designing and developing analytical data models

Requirements:

  • First Class Degree in Engineering/Technology (4-year graduate course)
  • 9 to 11 years’ experience implementing data-intensive solutions using agile methodologies
  • Experience with relational databases and SQL for data querying, transformation, and manipulation
  • Experience of modelling data for analytical consumers
  • Ability to automate and streamline the build, test and deployment of data pipelines
  • Experience in cloud native technologies and patterns
  • A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
  • Excellent communication and problem-solving skills
  • An inclination to mentor
  • An ability to lead and deliver medium-sized components independently

Nice to have:

  • Hands-on experience of building data pipelines
  • Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend, and Informatica
  • Experience of ‘big data’ platforms such as Hadoop, Hive, or Snowflake for data storage and processing
  • Expertise around data warehousing concepts, relational and NoSQL database design
  • Good exposure to data modeling techniques
  • Proficient in one or more programming languages such as Python, Java, or Scala
  • Exposure to CI/CD platforms, version control, and automated quality control management
  • Strong grasp of data quality, security, privacy, and compliance
  • Experience developing Ab Initio Co>Op graphs
  • Demonstrable knowledge of Ab Initio toolsets
  • Exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery
  • Understanding of data validation, cleansing, enrichment, and data controls
  • Understanding of containerization platforms like Docker, Kubernetes
  • Experience with file formats such as Avro, Parquet, Protobuf
  • Experience using a job scheduler such as Autosys
  • Exposure to Business Intelligence tools such as Tableau, Power BI
  • Certification in relevant skills

What we offer:

Best-in-class benefits to be well, live well, and save well

Additional Information:

Job Posted:
June 11, 2025

Employment Type:
Full-time

Work Type:
On-site work