Senior ETL Engineer

https://www.hsbc.com

HSBC

Location:
India, Hyderabad

Category:
IT - Software Development

Contract Type:
Employment contract

Salary:
Not provided

Job Description:

Join HSBC as a Senior ETL Engineer/Consultant Specialist, leading the design and development of ETL processes while optimizing data pipelines and ensuring data governance and security. Collaborate with teams to integrate data sources, automate workflows, and maintain robust data solutions within GCP. Contribute to HSBC's mission to empower businesses and individuals worldwide.

Job Responsibility:

  • Lead the design and implementation of ETL processes using batch and streaming tools to extract, transform, and load data from various sources into GCP
  • Collaborate with stakeholders to gather requirements and ensure that ETL solutions meet business needs
  • Optimize data pipelines for performance, scalability, and reliability, ensuring efficient data processing workflows
  • Monitor and troubleshoot ETL processes, proactively addressing issues and bottlenecks
  • Integrate data from diverse sources, including databases, APIs, and flat files, ensuring data quality and consistency
  • Manage and maintain data storage solutions in GCP (e.g., BigQuery, Cloud Storage) to support analytics and reporting
  • Write Apache Beam-based Dataflow jobs for data extraction, transformation, and analysis, ensuring optimal performance and accuracy
  • Collaborate with data analysts and data scientists to prepare data for analysis and reporting
  • Implement automation for ETL workflows using tools like Apache Airflow or Cloud Composer, enhancing efficiency and reducing manual intervention
  • Set up monitoring and alerting mechanisms to ensure the health of data pipelines and compliance with SLAs
  • Apply best practices for data governance, ensuring compliance with industry regulations (e.g., GDPR, HIPAA) and internal policies
  • Collaborate with security teams to implement data protection measures and address vulnerabilities
  • Document ETL processes, data models, and architecture to facilitate knowledge sharing and onboarding of new team members
  • Conduct training sessions and workshops to share expertise and promote best practices within the team
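
The core duty above — extracting from flat files, transforming records, and loading them into a warehouse — can be sketched as a minimal batch ETL step. This is an illustrative stand-in using only the Python standard library (sqlite3 in place of BigQuery, an inline CSV in place of a Cloud Storage file); all table and column names are hypothetical, and a real pipeline would use Apache Beam/Dataflow as described above.

```python
import csv
import io
import sqlite3

# Hypothetical raw extract, standing in for a flat file in Cloud Storage.
RAW_CSV = """account_id,amount,currency
A-001,125.50,USD
A-002,80.00,GBP
A-001,19.99,USD
"""

def extract(raw: str):
    """Extract: parse CSV text into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: cast amounts to float and normalise currency codes."""
    return [
        (r["account_id"], float(r["amount"]), r["currency"].upper())
        for r in rows
    ]

def load(rows, conn):
    """Load: write transformed rows into a warehouse table
    (sqlite3 here, standing in for BigQuery)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS transactions "
        "(account_id TEXT, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO transactions VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute(
    "SELECT SUM(amount) FROM transactions WHERE account_id = 'A-001'"
).fetchone()[0]
print(round(total, 2))  # 145.49
```

The three functions mirror the extract/transform/load split the role is responsible for; in a Dataflow job each stage would become a pipeline transform rather than a plain function call.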

Requirements:

  • Minimum of 5 years of industry experience in data engineering or ETL development, with a strong focus on DataStage and GCP
  • Proven experience in designing and managing ETL solutions, including data modeling, data warehousing, and SQL development
  • Strong knowledge of GCP services (e.g., BigQuery, Dataflow, Cloud Storage, Pub/Sub) and their application in data engineering
  • Experience with cloud-based solutions, especially GCP; cloud-certified candidates are preferred
  • Experience with big data processing in both batch and streaming modes, and proficiency with big data ecosystems such as Hadoop, HBase, Hive, MapReduce, Kafka, Flink, and Spark
  • Familiarity with Java and Python for data manipulation on cloud/big data platforms
  • Strong problem-solving skills with a keen attention to detail
  • Ability to analyze complex data sets and derive meaningful insights
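
The batch-versus-streaming distinction in the requirements above can be shown with a toy example: batch processing operates on a bounded dataset all at once, while streaming maintains running state as events arrive. This is a standard-library sketch only — real deployments would use the tools named above (Dataflow, Flink, Spark, Kafka) — and the event data is invented for illustration.

```python
from collections import defaultdict

# Hypothetical event stream: (currency, amount) pairs.
events = [("GBP", 10.0), ("USD", 5.0), ("GBP", 2.5)]

def batch_totals(records):
    """Batch mode: aggregate the full, bounded dataset in one pass."""
    totals = defaultdict(float)
    for currency, amount in records:
        totals[currency] += amount
    return dict(totals)

class StreamingTotals:
    """Streaming mode: update per-key state one event at a time."""
    def __init__(self):
        self.totals = defaultdict(float)

    def on_event(self, currency, amount):
        self.totals[currency] += amount
        return dict(self.totals)

stream = StreamingTotals()
last = None
for currency, amount in events:
    last = stream.on_event(currency, amount)

# Over the same bounded input, both modes converge on the same totals.
print(batch_totals(events) == last)  # True
```

The point of the comparison: streaming produces an up-to-date result after every event, whereas batch waits for the whole dataset — the trade-off behind choosing Dataflow's batch or streaming execution mode.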

What we offer:

  • Continuous professional development
  • Flexible working
  • Opportunities to grow within an inclusive and diverse environment

Additional Information:

Job Posted:
July 17, 2025

Expiration:
December 31, 2025

Employment Type:
Full-time

Work Type:
On-site work