Python - Data Engineer/Consultant Specialist

https://www.hsbc.com

HSBC

Location:
India, Hyderabad

Category:
IT - Software Development

Contract Type:
Not provided

Salary:
Not provided
Job offer has expired

Job Description:

Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

Job Responsibility:

  • Design, develop, and optimize data pipelines using Azure Databricks, PySpark, and Prophecy
  • Implement and maintain ETL/ELT pipelines using Azure Data Factory (ADF) and Apache Airflow for orchestration
  • Develop and optimize complex SQL queries and Python-based data transformation logic
  • Work with version control systems (GitHub, Azure DevOps) to manage code and deployment processes
  • Automate deployment of data pipelines using CI/CD practices in Azure DevOps
  • Ensure data quality, security, and compliance with best practices
  • Monitor and troubleshoot performance issues in data pipelines
  • Collaborate with cross-functional teams to define data requirements and strategies
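To give candidates a flavour of the transformation and data-quality work described above, here is a minimal, hypothetical sketch in plain Python (the names `Transaction` and `clean_and_total` are illustrative only; in practice this logic would be written with PySpark DataFrames on Azure Databricks):

```python
# Hypothetical sketch of a pipeline transformation step: drop records that
# fail basic data-quality rules, then aggregate per account. Illustrative
# only - a real pipeline here would use PySpark, not plain Python.
from dataclasses import dataclass


@dataclass
class Transaction:
    account_id: str
    amount: float


def clean_and_total(records):
    """Filter out invalid records, then total amounts per account."""
    totals = {}
    for rec in records:
        # Data-quality rules: a record must have an account id
        # and a non-negative amount.
        if not rec.account_id or rec.amount < 0:
            continue
        totals[rec.account_id] = totals.get(rec.account_id, 0.0) + rec.amount
    return totals


sample = [
    Transaction("A1", 10.0),
    Transaction("", 5.0),      # rejected: missing account id
    Transaction("A1", 2.5),
]
print(clean_and_total(sample))  # {'A1': 12.5}
```

In a PySpark setting the same idea maps to a `filter` on the quality predicate followed by a `groupBy("account_id").sum("amount")`.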

Requirements:

  • 6+ years of experience in data engineering, working with Azure Databricks, PySpark, and SQL
  • Hands-on experience with Prophecy for data pipeline development
  • Proficiency in Python for data processing and transformation
  • Experience with Apache Airflow for workflow orchestration
  • Strong expertise in Azure Data Factory (ADF) for building and managing ETL processes
  • Familiarity with GitHub and Azure DevOps for version control and CI/CD automation
  • Solid understanding of data modelling, warehousing, and performance optimization
  • Ability to work in an agile environment and manage multiple priorities effectively
  • Excellent problem-solving skills and attention to detail
  • Experience with Delta Lake and Lakehouse architecture
  • Hands-on experience with Terraform or Infrastructure as Code (IaC)
  • Understanding of machine learning workflows in a data engineering context

What we offer:

  • Continuous professional development
  • Flexible working
  • Opportunities to grow within an inclusive and diverse environment

Additional Information:

Job Posted:
August 04, 2025

Expiration:
August 18, 2025

Employment Type:
Full-time

Work Type:
On-site work