Data Ops Capability Deployment Sr. Analyst

Citi (https://www.citi.com/)

Location:
Pune, India

Category:
IT - Software Development

Contract Type:
Employment contract

Salary:
Not provided

Job Description:

Data Ops Capability Deployment Sr. Analyst is a seasoned professional role responsible for performing data analytics and building data science/tooling capabilities to support the broader Enterprise Data team. The role emphasizes data pipeline development, data quality solutions, and collaboration with stakeholders to refine data strategy and processes.

Job Responsibility:

  • Design, develop, and maintain robust and scalable data pipelines to ingest, transform, and load data from various sources
  • Implement Data Quality solutions, data validation rules, and monitoring to ensure data accuracy
  • Research and evaluate new data technologies, data mesh architecture and self-service data platforms
  • Work closely with Enterprise Architecture Team on the definition and refinement of overall data strategy
  • Proactively identify and address data-related challenges, including performance bottlenecks, batch orchestration, reports, and dashboards
  • Build analytics dashboards & data science capabilities for Enterprise Data platforms
  • Communicate complicated findings and propose solutions to a variety of stakeholders
  • Understand business and functional requirements provided by business analysts and convert them into technical design documents
  • Work closely with cross-functional teams, e.g. Business Analysis, Product Assurance, Platforms and Infrastructure, Business Office, Control, and Production Support
  • Prepare handover documents and manage SIT, UAT and Implementation
  • Demonstrate an in-depth understanding of how the development function integrates within overall business/technology to achieve objectives
  • Perform other duties and functions as assigned
  • Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets.

Requirements:

  • 13+ years of active development experience in Financial Services or Finance IT
  • Experience with Data Quality/Data Tracing/Data Lineage/Metadata Management Tools
  • Sound knowledge of implementing ETL using PySpark on distributed platforms, along with data ingestion, Spark optimization, resource utilization, capacity planning, and batch orchestration
  • Hands-on experience with Hive, HDFS, Airflow, and job schedulers
  • Strong programming skills in Python with experience in data manipulation and analysis libraries (Pandas, NumPy)
  • Strong knowledge of one or more BI visualization tools such as Tableau or Power BI
  • Proven experience in implementing data lake/data warehouse solutions for enterprise use cases
  • Ability to write complex SQL queries and stored procedures
  • Working knowledge of DevOps, Jenkins/Lightspeed, Git, and CoPilot is a must

Nice to have:

  • Exposure to analytical tools and AI/ML

What we offer:
  • Equal opportunity employer
  • Support for persons with disabilities.

Additional Information:

Job Posted:
October 11, 2025

Employment Type:
Full-time
Work Type:
Hybrid work