HSBC is one of the world's largest banking and financial services organisations, operating in 64 countries and territories. The Software Engineer role involves designing and maintaining scalable data pipelines on Google Cloud Platform, optimizing workflows, building ETL processes, and collaborating with stakeholders to turn requirements into technical solutions, using tools such as Python, PySpark, and SQL.
Job Responsibilities:
Design, develop, and maintain scalable data pipelines on Google Cloud Platform (GCP)
Optimize and automate data workflows using Dataproc, BigQuery, Dataflow, Cloud Storage, and Pub/Sub
Build and maintain ETL processes for data ingestion, transformation, and loading into data warehouses (a PySpark pipeline sketch follows this list)
Ensure the reliability and performance of data pipelines by implementing Apache Airflow for orchestration (see the example DAG after this list)
Collaborate with stakeholders to gather and translate requirements into technical solutions
Work on multiple data warehousing projects, applying a thorough understanding of concepts such as dimensions, facts, and slowly changing dimensions (SCDs); a Type 2 SCD sketch follows this list
Develop scripts and applications in Python and PySpark to handle large-scale data processing tasks
Write optimized SQL queries for data analysis and transformations (see the BigQuery query sketch after this list)
Use GitHub, Jenkins, Terraform, and Ansible to deploy and manage code in production
Troubleshoot and resolve issues related to data pipelines, ensuring high availability and scalability
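To illustrate the kind of pipeline work described above, here is a minimal PySpark sketch of an ETL job that reads raw files from Cloud Storage, cleans them, and loads the result into BigQuery. All bucket, project, and table names are hypothetical placeholders, and the BigQuery write assumes the spark-bigquery connector that ships with Dataproc clusters.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV files from a Cloud Storage bucket (hypothetical path).
raw = (
    spark.read
    .option("header", "true")
    .csv("gs://example-raw-bucket/orders/*.csv")
)

# Transform: drop records without an order id and normalise types.
cleaned = (
    raw
    .filter(F.col("order_id").isNotNull())
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
)

# Load: write into BigQuery through the spark-bigquery connector,
# which stages data in a temporary Cloud Storage bucket.
(
    cleaned.write
    .format("bigquery")
    .option("table", "example-project.analytics.orders")     # hypothetical table
    .option("temporaryGcsBucket", "example-staging-bucket")  # hypothetical bucket
    .mode("append")
    .save()
)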
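For orchestration, a pipeline like the one above is typically wrapped in an Airflow DAG. The sketch below waits for a daily file to land in Cloud Storage, then submits the PySpark job to Dataproc; the project, region, bucket, and cluster names are hypothetical, and the operators come from the apache-airflow-providers-google package.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator
from airflow.providers.google.cloud.sensors.gcs import GCSObjectExistenceSensor

with DAG(
    dag_id="orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    # Wait until the day's raw file has landed in Cloud Storage.
    wait_for_file = GCSObjectExistenceSensor(
        task_id="wait_for_file",
        bucket="example-raw-bucket",          # hypothetical bucket
        object="orders/{{ ds }}/orders.csv",  # templated with the run date
    )

    # Submit the PySpark ETL job to an existing Dataproc cluster.
    run_etl = DataprocSubmitJobOperator(
        task_id="run_etl",
        project_id="example-project",         # hypothetical project
        region="europe-west2",
        job={
            "placement": {"cluster_name": "example-cluster"},
            "pyspark_job": {"main_python_file_uri": "gs://example-code-bucket/etl.py"},
        },
    )

    wait_for_file >> run_etl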
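The slowly changing dimension work mentioned above usually means Type 2 SCDs: instead of overwriting a dimension row, the current row is closed out and a new version is inserted, preserving history. Below is a sketch of that pattern as a BigQuery MERGE issued from Python; the dim_customer and staging tables, their columns, and the project name are all hypothetical.

from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# Classic SCD Type 2 MERGE: each changed record appears twice in the
# source - once joining on customer_id (to close the current row) and
# once with a NULL merge key (so it falls through to the INSERT branch
# as the new version). Brand-new customers simply never match.
scd2_merge = """
MERGE `example-project.dw.dim_customer` tgt
USING (
  SELECT customer_id AS merge_key, customer_id, name, segment
  FROM `example-project.staging.customers`
  UNION ALL
  SELECT NULL, s.customer_id, s.name, s.segment
  FROM `example-project.staging.customers` s
  JOIN `example-project.dw.dim_customer` d
    ON d.customer_id = s.customer_id AND d.is_current
  WHERE d.name != s.name OR d.segment != s.segment
) src
ON tgt.customer_id = src.merge_key AND tgt.is_current
WHEN MATCHED AND (tgt.name != src.name OR tgt.segment != src.segment) THEN
  UPDATE SET is_current = FALSE, valid_to = CURRENT_DATE()
WHEN NOT MATCHED THEN
  INSERT (customer_id, name, segment, valid_from, valid_to, is_current)
  VALUES (src.customer_id, src.name, src.segment, CURRENT_DATE(), NULL, TRUE)
"""

client.query(scd2_merge).result()  # blocks until the MERGE completes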
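On the query-optimization side, two habits matter most in BigQuery: select only the columns you need (scanned columns drive cost) and filter on the partitioning column so untouched partitions are pruned. A small sketch with the BigQuery Python client, again with hypothetical table and column names:

from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# Project only the needed columns and filter on the partition column
# (order_date here) so BigQuery skips irrelevant partitions.
query = """
SELECT order_id, customer_id, amount
FROM `example-project.analytics.orders`
WHERE order_date BETWEEN @start AND @end
  AND customer_id = @customer_id
"""

job = client.query(
    query,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("start", "DATE", "2024-01-01"),
            bigquery.ScalarQueryParameter("end", "DATE", "2024-01-31"),
            bigquery.ScalarQueryParameter("customer_id", "STRING", "C123"),
        ]
    ),
)
for row in job.result():
    print(row.order_id, row.amount)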
Requirements:
Cloud Expertise: Hands-on experience with GCP services like Dataproc, BigQuery, Dataflow, Cloud Storage, and Pub/Sub (a Pub/Sub publisher sketch follows this list)
Programming & Tools: Strong skills in Python, PySpark, and SQL for data processing and analysis
Experience with Apache Airflow for workflow orchestration
Data Warehousing: Clear understanding of dimensions, facts, SCDs, and other core data warehousing concepts
Proven experience working on multiple data warehousing projects
DevOps & Deployment Tools: Experience working with GitHub, Jenkins, Terraform, and Ansible for CI/CD processes
General Skills: Strong problem-solving and analytical skills
Excellent communication and stakeholder management abilities
Ability to work independently and as part of a team
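As a taste of the Pub/Sub work referenced above, here is a minimal publisher sketch using the google-cloud-pubsub client; the project and topic names are hypothetical placeholders.

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "orders-events")  # hypothetical

# publish() takes bytes for the payload; extra keyword arguments become
# string message attributes. It returns a future resolving to the message id.
future = publisher.publish(
    topic_path,
    data=b'{"order_id": "O-1001", "amount": "42.50"}',
    source="checkout",
)
print(f"Published message {future.result()}")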
Nice to have:
Multi-cloud knowledge (e.g., AWS, Azure)