Data Ops Capability Deployment - Analyst

Citi (https://www.citi.com/)

Location:
Pune, India

Category:
IT - Administration

Contract Type:
Employment contract

Salary:
Not provided

Job Description:

Data Ops Capability Deployment - Analyst is a seasoned professional role. It applies in-depth disciplinary knowledge, contributing to the development of new solutions, frameworks and techniques and to the improvement of processes and workflows for the Enterprise Data function. The role integrates subject matter and industry expertise within a defined area and requires an in-depth understanding of how areas collectively integrate within the sub-function, as well as how they coordinate with and contribute to the objectives of the function and the overall business.

Job Responsibilities:

  • Hands-on data engineering background with a thorough understanding of distributed data platforms and cloud services
  • sound understanding of data architecture and of data integration with enterprise applications
  • research and evaluate new data technologies, data mesh architecture and self-service data platforms
  • work closely with the Enterprise Architecture Team on the definition and refinement of the overall data strategy
  • address performance bottlenecks, design batch orchestrations and deliver reporting capabilities
  • perform complex data analytics (data cleansing, transformation, joins, aggregation, etc.) on large, complex datasets (see the illustrative sketch after this list)
  • build analytics dashboards and data science capabilities for Enterprise Data platforms
  • communicate complicated findings and propose solutions to a variety of stakeholders
  • understand business and functional requirements provided by business analysts and convert them into technical design documents
  • work closely with cross-functional teams, e.g. Business Analysis, Product Assurance, Platforms and Infrastructure, Business Office, Control and Production Support
  • prepare handover documents and manage SIT, UAT and implementation
  • demonstrate an in-depth understanding of how the development function integrates within the overall business/technology organization to achieve objectives
  • apply a good understanding of the banking industry
  • perform other duties and functions as assigned.
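
The analytics bullet above refers to the following minimal PySpark sketch (not part of the original posting); the input paths, table and column names are hypothetical, chosen only to illustrate cleansing, a join and an aggregation on large datasets:

    # Minimal PySpark sketch (hypothetical schema): cleanse, join, aggregate.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("trade_position_summary").getOrCreate()

    trades = spark.read.parquet("/data/trades")      # hypothetical input path
    accounts = spark.read.parquet("/data/accounts")  # hypothetical input path

    # Cleansing: drop rows missing key fields, normalize currency codes.
    clean_trades = (trades
        .dropna(subset=["account_id", "notional"])
        .withColumn("currency", F.upper(F.trim(F.col("currency")))))

    # Join and aggregate: total notional and trade count per account/currency.
    summary = (clean_trades
        .join(accounts, on="account_id", how="inner")
        .groupBy("account_id", "currency")
        .agg(F.sum("notional").alias("total_notional"),
             F.count("*").alias("trade_count")))

    summary.write.mode("overwrite").parquet("/data/position_summary")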

Requirements:

  • 10+ years of active development background; experience in Financial Services or Finance IT is required
  • experience with Data Quality/Data Tracing/Data Lineage/Metadata Management tools
  • hands-on experience with ETL using PySpark on distributed platforms, along with data ingestion, Spark optimization, resource utilization, capacity planning and batch orchestration (see the orchestration sketch after this list)
  • in-depth understanding of Hive, HDFS, Airflow and job scheduling
  • strong programming skills in Python, with experience in data manipulation and analysis libraries (Pandas, NumPy)
  • ability to write complex SQL/stored procedures
  • experience with DevOps, Jenkins/Lightspeed, Git, CoPilot
  • strong knowledge of one or more BI visualization tools such as Tableau or Power BI
  • proven experience implementing a data lake/data warehouse for enterprise use cases
  • exposure to analytical tools and AI/ML is desired.
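
The batch-orchestration requirement above is illustrated by the following minimal Airflow sketch (not part of the original posting); it assumes Airflow 2.4+ and uses hypothetical DAG, task and script names to show a nightly PySpark ingestion job followed by a data-quality check:

    # Minimal Airflow DAG sketch (hypothetical names): nightly batch orchestration.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="nightly_trade_ingestion",   # hypothetical DAG name
        start_date=datetime(2025, 1, 1),
        schedule="0 2 * * *",               # run daily at 02:00
        catchup=False,
    ) as dag:
        ingest = BashOperator(
            task_id="spark_ingest",
            bash_command="spark-submit /jobs/ingest_trades.py",  # hypothetical job
        )
        dq_check = BashOperator(
            task_id="data_quality_check",
            bash_command="spark-submit /jobs/dq_checks.py",      # hypothetical job
        )
        ingest >> dq_check                  # run the quality check after ingestion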

Nice to have:

Exposure to analytical tools and AI/ML.

What we offer:
  • Equal opportunity employer
  • inclusive work environment
  • career growth opportunities.

Additional Information:

Job Posted:
August 06, 2025

Employment Type:
Full-time
Work Type:
On-site work