Consultant Specialist

HSBC

Location:
Mainland China, Guangzhou

Category:
IT - Software Development

Contract Type:
Not provided

Salary:
Not provided

Job Description:

Some careers have more impact than others. If you’re looking for a career where you can make a real impression, join HSBC and discover how valued you’ll be. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.

Job Responsibilities:

Design and Build Data Processing Systems:
  • Collaborate with cross-functional teams to understand data requirements and design efficient data pipelines
  • Implement data ingestion, transformation, and enrichment processes using GCP services (such as BigQuery, Dataflow, and Pub/Sub)
  • Ensure scalability, reliability, and performance of data processing workflows

Data Ingestion and Processing:
  • Collect and ingest data from various sources (both batch and real-time) into GCP
  • Cleanse, validate, and transform raw data to ensure its quality and consistency
  • Optimize data processing for speed and efficiency

Data Storage and Management:
  • Choose appropriate storage solutions (e.g., Bigtable, Cloud Storage) based on data characteristics and access patterns
  • Create and manage data warehouses, databases, and data lakes
  • Define data retention policies and archival strategies

Data Preparation for Analysis:
  • Prepare data for downstream analytics, reporting, and machine learning
  • Collaborate with data scientists and analysts to understand their requirements
  • Ensure data is accessible, well-organized, and properly documented

Automation and Monitoring:
  • Automate data workflows using tools like Apache Airflow or Cloud Composer
  • Monitor data pipelines, troubleshoot issues, and proactively address bottlenecks
  • Implement alerting mechanisms for data anomalies

Security and Compliance:
  • Apply security best practices to protect sensitive data
  • Ensure compliance with industry regulations (e.g., GDPR, HIPAA) and internal policies
  • Collaborate with security teams to address vulnerabilities

Documentation and Knowledge Sharing:
  • Document data pipelines, architecture, and processes
  • Share knowledge with team members through documentation, training sessions, and workshops
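For illustration, the cleanse/validate/transform duty above might look like the following minimal Python sketch. The field names (customer_id, amount, ts) and validation rules are hypothetical, not taken from the posting; in practice this logic would run inside a Dataflow or Airflow task.

```python
from datetime import datetime

def cleanse_record(raw: dict):
    """Validate and normalize one raw record; return None if invalid.

    Hypothetical schema for illustration only: customer_id, amount,
    and ts are assumed field names, not part of HSBC's actual data.
    """
    required = ("customer_id", "amount", "ts")
    # Reject records with missing or empty required fields.
    if any(raw.get(key) in (None, "") for key in required):
        return None
    try:
        return {
            "customer_id": str(raw["customer_id"]).strip(),
            "amount": round(float(raw["amount"]), 2),
            "ts": datetime.fromisoformat(raw["ts"]).isoformat(),
        }
    except (ValueError, TypeError):
        # Malformed amounts or timestamps are filtered out.
        return None

records = [
    {"customer_id": " 42 ", "amount": "19.991", "ts": "2025-06-14T10:00:00"},
    {"customer_id": None, "amount": "5.00", "ts": "2025-06-14T11:00:00"},
]
clean = [r for r in (cleanse_record(x) for x in records) if r is not None]
```

Returning `None` for bad rows (rather than raising) keeps a streaming pipeline flowing; rejected records would typically be routed to a dead-letter destination for inspection.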

Requirements:

  • Minimum of 3 years of industry experience; cloud experience is a plus
  • Familiarity with GCP services (BigQuery, Dataflow, Pub/Sub, etc.) and related technologies
  • Experience with data modeling, ETL processes, and data warehousing
  • Previous work in highly regulated or complex organizations is a plus
  • Fluency in English
  • Proficiency in Python and Bash for data manipulation and scripting
  • Proficiency in SQL for querying and manipulating data
  • Knowledge of Terraform for infrastructure as code (IaC)
  • Familiarity with Jenkins for continuous integration and deployment
  • Knowledge of cloud platforms; GCP experience is a plus
  • Strong problem-solving skills and attention to detail

What we offer:
  • Continuous professional development
  • Flexible working
  • Opportunities to grow within an inclusive and diverse environment

Additional Information:

Job Posted:
June 14, 2025

Expiration:
August 11, 2025

Employment Type:
Full-time

Work Type:
Hybrid