
Data Ops Capability Deployment - Analyst

Company:
Citi (https://www.citi.com/)

Location:
India, Pune

Category:
IT - Software Development

Contract Type:
Employment contract

Salary:

Not provided

Job Description:

Data Ops Capability Deployment - Analyst is a seasoned professional role focused on data engineering, data analytics, and data governance. The role requires an in-depth understanding of distributed data platforms, cloud services, and industry-specific skills to design and improve overall data strategies. Core responsibilities include cross-functional collaboration, technical development, and risk management.

Job Responsibilities:

  • apply a hands-on data engineering background and an understanding of distributed data platforms and cloud services
  • research and evaluate new data technologies, data mesh architecture, and self-service data platforms
  • work closely with the enterprise architecture team on the definition and refinement of the overall data strategy
  • address performance bottlenecks, design batch orchestrations, and deliver reporting capabilities
  • perform complex data analytics on large, complex datasets
  • build analytics dashboards and data science capabilities for enterprise data platforms
  • communicate complicated findings and propose solutions to a variety of stakeholders
  • understand business and functional requirements and convert them into technical design documents
  • work closely with cross-functional teams
  • prepare handover documents and manage SIT, UAT, and implementation
  • demonstrate an understanding of how the development function integrates within the overall business/technology organization to achieve objectives.

Requirements:

  • 10+ years of active development background and experience in Financial Services or Finance IT is required
  • experience with Data Quality, Data Tracing, Data Lineage, and Metadata Management tools
  • hands-on experience with ETL using PySpark on distributed platforms, including data ingestion, Spark optimization, resource utilization, capacity planning, and batch orchestration
  • in-depth understanding of Hive, HDFS, Airflow, and job schedulers
  • strong programming skills in Python with experience in data manipulation and analysis libraries (Pandas, NumPy)
  • ability to write complex SQL queries and stored procedures
  • experience with DevOps practices and tools such as Jenkins/Lightspeed, Git, and CoPilot
  • strong knowledge of one or more BI visualization tools, such as Tableau or PowerBI
  • proven experience implementing data lake/data warehouse solutions for enterprise use cases.

Nice to have:

  • exposure to analytical tools and AI/ML
  • experience in additional BI tools such as Tableau, PowerBI
  • experience with metadata management tools.

What we offer:
  • diverse team environment
  • career growth opportunities
  • compliance with equal opportunity laws
  • flexible work options available as accommodations.

Additional Information:

Job Posted:
August 07, 2025

Employment Type:
Full-time
Work Type:
On-site work