
Software Engineer

HSBC (https://www.hsbc.com)


Location:
Hyderabad, India

Category:
IT - Software Development


Contract Type:
Employment contract


Salary:

Not provided

Job Description:

HSBC is one of the largest banking and financial services organisations, operating in 64 countries and territories. The Software Engineer role involves designing and maintaining scalable data pipelines on Google Cloud Platform, optimizing workflows, building ETL processes, and collaborating with stakeholders to meet technical requirements while leveraging tools like Python, PySpark, and SQL.

Job Responsibility:

  • Design, develop, and maintain scalable data pipelines on Google Cloud Platform (GCP)
  • Optimize and automate data workflows using Dataproc, BigQuery, Dataflow, Cloud Storage, and Pub/Sub
  • Build and maintain ETL processes for data ingestion, transformation, and loading into data warehouses
  • Ensure the reliability and performance of data pipelines by implementing Apache Airflow for orchestration
  • Collaborate with stakeholders to gather and translate requirements into technical solutions
  • Work on multiple data warehousing projects, applying a thorough understanding of concepts such as dimensions, facts, and slowly changing dimensions (SCDs)
  • Develop scripts and applications in Python and PySpark to handle large-scale data processing tasks
  • Write optimized SQL queries for data analysis and transformations
  • Use GitHub, Jenkins, Terraform, and Ansible to deploy and manage code in production
  • Troubleshoot and resolve issues related to data pipelines, ensuring high availability and scalability
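As one illustration of the data-warehousing concepts the role calls out (dimensions, facts, and slowly changing dimensions), a Slowly Changing Dimension Type 2 update can be sketched in plain Python. This is a hypothetical, minimal sketch only; in practice this logic would live in PySpark or BigQuery SQL, and all function, key, and column names below are illustrative:

```python
from datetime import date

def scd2_upsert(dim_rows, incoming, key, tracked, today):
    """SCD Type 2: when a tracked attribute changes, expire the
    current dimension row and append a new version, preserving
    history via valid_from / valid_to / is_current columns.
    (Illustrative sketch; names are not from the job posting.)"""
    out = list(dim_rows)
    current = next(
        (r for r in out if r[key] == incoming[key] and r["is_current"]),
        None,
    )
    if current is None:
        # Brand-new business key: insert the first version.
        out.append({**incoming, "valid_from": today,
                    "valid_to": None, "is_current": True})
    elif any(current[c] != incoming[c] for c in tracked):
        # A tracked attribute changed: close the old row, add a new one.
        current["valid_to"] = today
        current["is_current"] = False
        out.append({**incoming, "valid_from": today,
                    "valid_to": None, "is_current": True})
    return out

# Example: a customer moves city, producing two versioned rows.
dim = scd2_upsert([], {"cust_id": 1, "city": "Pune"},
                  "cust_id", ["city"], date(2025, 1, 1))
dim = scd2_upsert(dim, {"cust_id": 1, "city": "Hyderabad"},
                  "cust_id", ["city"], date(2025, 6, 1))
```

After the second call, the dimension holds an expired Pune row and a current Hyderabad row, which is the history-preserving behaviour interviewers typically probe when SCDs appear in a role description.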

Requirements:

  • Cloud Expertise: Hands-on experience with GCP services like Dataproc, BigQuery, Dataflow, Cloud Storage, and Pub/Sub
  • Programming & Tools: Strong skills in Python, PySpark, and SQL for data processing and analysis
  • Experience with Apache Airflow for workflow orchestration
  • Data Warehousing: Clear understanding of dimensions, facts, SCDs, and other core data warehousing concepts
  • Proven experience working on multiple data warehousing projects
  • DevOps & Deployment Tools: Experience working with GitHub, Jenkins, Terraform, and Ansible for CI/CD processes
  • General Skills: Strong problem-solving and analytical skills
  • Excellent communication and stakeholder management abilities
  • Ability to work independently and as part of a team

Nice to have:

Multi-cloud knowledge (e.g., AWS, Azure) is a plus.

What we offer:
  • Continuous professional development
  • Flexible working
  • Opportunities for growth
  • Inclusive and diverse environment

Additional Information:

Job Posted:
November 11, 2025

Expiration:
November 15, 2025

Employment Type:
Full-time
Work Type:
Hybrid work