
Senior Data Engineer

Company:
Citi (https://www.citi.com/)

Location:
Chennai, India

Category:
IT - Software Development

Contract Type:
Not provided

Salary:
Not provided

Job Description:

A Senior Data Engineer role focused on designing, developing, and implementing cutting-edge data engineering solutions using modern big data and cloud technologies. The role involves collaborating with product owners, data scientists, analysts, and technologists to deliver scalable, high-performance data products in an agile environment.

Job Responsibilities:

  • Design and develop scalable big data solutions using platforms like Hadoop, Snowflake, or other modern data ecosystems
  • Collaborate with domain experts, product managers, analysts, and data scientists to build robust data pipelines
  • Lead migration of legacy workloads to cloud platforms (AWS, Azure, or GCP)
  • Develop and implement cloud-native solutions for data processing and storage
  • Partner with data scientists to build data pipelines from heterogeneous sources
  • Enable advanced analytics and machine learning workflows
  • Implement CI/CD pipelines to automate data engineering workflows
  • Research and evaluate open-source technologies
  • Mentor team members on big data and cloud technologies
  • Define and enforce coding standards and reusable components
  • Convert SAS-based pipelines into modern frameworks like PySpark, Scala, or Java (see the sketch after this list)
  • Optimize big data applications for performance and scalability
  • Analyze evolving business requirements and recommend enhancements
  • Ensure compliance with applicable laws, regulations, and organizational policies
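
As a rough illustration of the SAS-conversion responsibility above, here is a minimal PySpark sketch of a SAS-style summary (a PROC SUMMARY over an account column) rewritten as a grouped aggregation; the table, column, and path names are hypothetical, not taken from the posting:

  from pyspark.sql import SparkSession
  from pyspark.sql import functions as F

  spark = SparkSession.builder.appName("sas_migration_sketch").getOrCreate()

  # Roughly: PROC SUMMARY DATA=txns; CLASS account_id; VAR amount;
  #          OUTPUT OUT=txn_summary SUM= N=;  (hypothetical SAS step)
  txns = spark.read.parquet("/data/txns")  # hypothetical source path
  txn_summary = (
      txns.groupBy("account_id")
          .agg(F.sum("amount").alias("total_amount"),
               F.count("*").alias("txn_count"))
  )
  txn_summary.write.mode("overwrite").parquet("/data/txn_summary")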

Requirements:

  • 8+ years of experience with Hadoop (Cloudera) and big data technologies
  • Advanced knowledge of Hadoop ecosystem including HDFS, MapReduce, Hive, Pig, Impala, Spark, Kafka, Kudu, and Solr
  • Proficiency in Java, Python, or Scala
  • Hands-on experience with Spark programming (PySpark, Scala, or Java)
  • Experience with cloud platforms like AWS, Azure, or GCP
  • Expertise in designing and developing data pipelines for ingestion, transformation, and processing (see the ingestion sketch after this list)
  • Hands-on experience with containerization tools like Docker and Kubernetes
  • Proficiency in DevOps practices including source control, CI/CD, and automated deployments
  • Experience with Python libraries for machine learning and data science workflows
  • Strong knowledge of data structures, algorithms, distributed storage, and compute systems
  • Strong problem-solving and analytical skills
  • Excellent interpersonal and teamwork abilities
  • Proven leadership experience including mentoring and managing a team
  • Bachelor's degree in Computer Science, Engineering, or related field (or equivalent experience)
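
As a deliberately minimal example of the ingestion-pipeline expertise listed above, the sketch below reads events from Kafka with PySpark Structured Streaming and lands them as Parquet. The broker address, topic, and paths are hypothetical, and the job needs the spark-sql-kafka connector package on its classpath:

  from pyspark.sql import SparkSession
  from pyspark.sql import functions as F

  spark = SparkSession.builder.appName("ingestion_sketch").getOrCreate()

  # Read raw events from Kafka; keep the message value as a string payload.
  events = (
      spark.readStream.format("kafka")
           .option("kafka.bootstrap.servers", "broker:9092")  # assumed broker
           .option("subscribe", "events")                     # hypothetical topic
           .load()
           .select(F.col("value").cast("string").alias("payload"))
  )

  # Land the stream as Parquet; checkpointing makes the job restartable.
  query = (
      events.writeStream.format("parquet")
            .option("path", "/data/events")              # hypothetical sink
            .option("checkpointLocation", "/chk/events")
            .start()
  )
  query.awaitTermination()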

Nice to have:

  • Familiarity with Apache Beam
  • Experience with Snowflake or Delta Lake
  • 1+ year of SAS experience
  • 1+ year of Hadoop administration experience

Additional Information:

Job Posted:
August 21, 2025

Employment Type:
Full-time
Work Type:
On-site work