
Consultant Specialist

HSBC (https://www.hsbc.com)


Location:
Xi'an, China

Category:
IT - Software Development


Contract Type:
Employment contract


Salary:

Not provided

Job Description:

As a Senior Data Engineer / ETL Engineer, you will be instrumental in designing, developing, and optimizing the data processing systems that support our organization's data initiatives. Your expertise in DataStage, Dataflow, SQL, big data technologies, and Google Cloud Platform (GCP) will be essential in building robust ETL pipelines that transform raw data into actionable insights. You will collaborate with cross-functional teams to ensure that data is accurate, accessible, and valuable for decision-making.

Job Responsibility:

  • Lead the design and implementation of ETL processes, using batch and streaming tools to extract, transform, and load data from various sources into GCP
  • Collaborate with stakeholders to gather requirements and ensure that ETL solutions meet business needs
  • Optimize data pipelines for performance, scalability, and reliability, ensuring efficient data processing workflows
  • Monitor and troubleshoot ETL processes, proactively addressing issues and bottlenecks
  • Integrate data from diverse sources, including databases, APIs, and flat files, ensuring data quality and consistency
  • Manage and maintain data storage solutions in GCP (e.g., BigQuery, Cloud Storage) to support analytics and reporting
  • Write Apache Beam-based Dataflow jobs for data extraction, transformation, and analysis, ensuring optimal performance and accuracy
  • Collaborate with data analysts and data scientists to prepare data for analysis and reporting
  • Implement automation for ETL workflows using tools like Apache Airflow or Cloud Composer, enhancing efficiency and reducing manual intervention
  • Set up monitoring and alerting mechanisms to ensure the health of data pipelines and compliance with SLAs
  • Apply best practices for data governance, ensuring compliance with industry regulations (e.g., GDPR, HIPAA) and internal policies
  • Collaborate with security teams to implement data protection measures and address vulnerabilities
  • Document ETL processes, data models, and architecture to facilitate knowledge sharing and onboarding of new team members
  • Conduct training sessions and workshops to share expertise and promote best practices within the team
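The responsibilities above center on extract-transform-load logic with built-in data-quality checks. A minimal, framework-free Python sketch of that pattern (the field names and sample records are hypothetical; in this role the equivalent logic would run as a DataStage or Apache Beam/Dataflow job writing to BigQuery):

```python
import csv
import io

def extract(raw_csv):
    """Extract step: parse raw CSV text (e.g., a flat-file source) into dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform step: normalize fields and drop rows that fail a quality check."""
    cleaned = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except (KeyError, ValueError):
            continue  # data-quality check: skip malformed or incomplete rows
        cleaned.append({"customer_id": row["customer_id"].strip(),
                        "amount": round(amount, 2)})
    return cleaned

def load(rows, sink):
    """Load step: append cleaned rows to a sink (standing in for a warehouse table)."""
    sink.extend(rows)
    return len(rows)

# Hypothetical source data, including one malformed row to be filtered out.
raw = "customer_id,amount\n c001 ,12.5\nc002,not-a-number\nc003,7.25\n"
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
```

The same extract/transform/load split maps directly onto Beam's `Read` → `ParDo` → `Write` stages when the pipeline is ported to Dataflow.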

Requirements:

  • Bachelor's degree in Computer Science, Information Systems, or a related field
  • Minimum of 5 years of industry experience in data engineering or ETL development, with a strong focus on DataStage and GCP
  • Proven experience in designing and managing ETL solutions, including data modeling, data warehousing, and SQL development
  • Strong knowledge of GCP services (e.g., BigQuery, Dataflow, Cloud Storage, Pub/Sub) and their application in data engineering
  • Experience with cloud-based solutions, especially on GCP
  • Experience with big data processing in both batch and streaming modes; proficiency in big data ecosystems, e.g., Hadoop, HBase, Hive, MapReduce, Kafka, Flink, Spark
  • Familiarity with Java and Python for data manipulation on cloud/big data platforms
  • Strong problem-solving skills with keen attention to detail
  • Ability to analyze complex data sets and derive meaningful insights

Nice to have:

Cloud-certified candidates are preferred

What we offer:
  • competitive salary and comprehensive benefits package
  • opportunity to work in a dynamic and collaborative environment on cutting-edge data projects
  • professional development opportunities to enhance your skills and advance your career

Additional Information:

Job Posted:
June 20, 2025

Expiration:
July 20, 2025

Employment Type:
Full-time
Work Type:
Hybrid work