HSBC Markets and Securities is an emerging-markets-led and financing-focused investment banking and trading business. Global Business Insights (GBI) provides critical metrics and reports to Markets and Securities Services Operations. The GBI Transformation is a large and complex data integration programme spanning all of MSS Ops globally. We are a critical enabler for the Rubix 2025 Strategy and the MSS control agenda, providing operational KPI and KRI metrics. We are looking for a GCP developer who can design, develop, test and deploy ETL/SQL pipelines connected to a variety of on-prem and cloud data sources, both data stores and files. We will mainly use GCP technologies such as Cloud Storage, BigQuery and Data Fusion. You will also work with our DevOps tooling to deliver continuous integration/deployment capabilities, automated testing, security, and IT compliance.
Job Responsibility:
Design, build, test and deploy Google Cloud data models and transformations in BigQuery environment (e.g. SQL, stored procedures, indexes, clusters, partitions, triggers, etc.)
Create and manage ETL/ELT data pipelines that model raw/unstructured data into a universal Data Vault model, and enrich, transform and optimise raw data into forms suitable for end consumers
Review, refine, interpret and implement business and technical requirements
Deliver non-functional requirements, IT standards, and developer and support tools to ensure our applications are secure, compliant, scalable, reliable and cost-effective
Monitor data pipelines for failures or performance issues and implement fixes or improvements as needed
Optimise ETL/ELT processes for performance and scalability, ensuring they can handle large volumes of data efficiently
Integrate data from multiple sources, ensuring consistency and accuracy
Manage code artefacts and CI/CD using tools like Git, Jenkins, Google Secret Manager, etc. Fix defects and provide enhancements during the development period, and hand over knowledge, expertise, code and support responsibilities to the support team
Provide subject matter expertise to support the Enterprise Risk Management (ERM) Leadership Team (LT) and ERM Assurance teams in discharging their responsibilities in relation to operational risk and resilience risk steward delivery across all service areas, the delivery of assurance activities, the embedding of assurance practices, and the embedding of stewardship activities and the service catalogue in the respective GB/GF/Specialist team
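The responsibilities above mention modelling raw data into a Data Vault. As a rough illustration of the kind of transformation involved (not HSBC's actual implementation), the sketch below derives a deterministic hub hash key from a business key using only the Python standard library; the trade identifier used is hypothetical.

```python
import hashlib

def hub_hash_key(*business_keys: str) -> str:
    """Derive a deterministic Data Vault hub hash key.

    Business key parts are trimmed, upper-cased and joined with a
    delimiter before hashing, so the same logical key yields the
    same hash regardless of source-system formatting.
    """
    normalised = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

# Two differently formatted records for the same (hypothetical)
# trade identifier hash to the same hub key.
key_a = hub_hash_key("  trd-1001 ")
key_b = hub_hash_key("TRD-1001")
```

Normalising before hashing is the design point here: it lets records from multiple source systems land on the same hub row, which is what makes the Data Vault model "universal" across feeds.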
Requirements:
Proven (3+ years) hands-on experience in SQL querying and optimisation of complex queries/transformations in BigQuery, with a focus on cost- and time-effective SQL coding and on concurrency/data integrity
Proven (3+ years) hands-on experience in developing, testing and implementing SQL data transformation/ETL/ELT pipelines, ideally in GCP Data Fusion
Proven experience in Data Vault modelling and usage
Hands-on experience in Cloud Composer/Airflow, Cloud Run and Pub/Sub
Hands-on development in Python and Terraform
Proficiency in Git usage for version control and collaboration
Proficiency in designing, creating and maintaining CI/CD processes/pipelines in DevOps tools such as Ansible and Jenkins for cloud-based applications (ideally GCP)
Experience working in a DataOps model
Experience working in an Agile environment and with Agile toolsets
Strong problem-solving and analytical skills
Enthusiastic willingness to learn and develop technical and soft skills rapidly and independently as needs require
Strong organisational and multi-tasking skills
Good team player who embraces teamwork and mutual support
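The requirements above emphasise cost- and time-effective BigQuery SQL, where partitioning and clustering are the main levers for limiting bytes scanned. As a hedged sketch only, the snippet below renders such DDL from Python so it could be templated inside a pipeline; the dataset, table and column names are invented for illustration.

```python
def partitioned_table_ddl(table: str, partition_col: str, cluster_cols: list) -> str:
    """Render BigQuery DDL for a date-partitioned, clustered table.

    Queries that filter on the partition column scan only the
    matching partitions, which keeps slot time and cost down.
    """
    clusters = ", ".join(cluster_cols)
    return (
        f"CREATE TABLE IF NOT EXISTS {table} (\n"
        f"  trade_id STRING,\n"       # hypothetical columns
        f"  desk STRING,\n"
        f"  trade_date DATE,\n"
        f"  notional NUMERIC\n"
        f")\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {clusters}"
    )

ddl = partitioned_table_ddl("mss_ops.trades", "trade_date", ["desk", "trade_id"])
```

Generating DDL as a string like this is one simple way to keep table definitions versioned in Git alongside the pipeline code that populates them.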
Nice to have:
Experience designing, testing and implementing data ingestion pipelines on GCP Data Fusion, CDAP or similar tools, including ingesting, parsing and wrangling CSV-, JSON- and XML-formatted data from RESTful and SOAP APIs, SFTP servers, etc.
Understanding of modern data contract best practices, with experience independently directing, negotiating and documenting best-in-class data contracts
Java development, testing and deployment skills (ideally custom plugins for Data Fusion)
Proficiency in working with Continuous Integration (CI), Continuous Delivery (CD) and continuous testing tools, ideally for cloud-based data solutions
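The ingestion item above mentions parsing and wrangling CSV and JSON payloads. In a real Data Fusion/CDAP pipeline this wrangling would be a plugin stage; as a standard-library-only sketch (the field names are invented), a small normaliser that accepts either format might look like:

```python
import csv
import io
import json

def parse_records(payload: str, fmt: str) -> list:
    """Normalise a CSV or JSON payload into a list of row dicts."""
    if fmt == "json":
        data = json.loads(payload)
        # Accept either a JSON array of records or a single record.
        return data if isinstance(data, list) else [data]
    if fmt == "csv":
        # DictReader uses the header row as the dict keys.
        return list(csv.DictReader(io.StringIO(payload)))
    raise ValueError(f"unsupported format: {fmt}")

rows_csv = parse_records("id,desk\n1,rates\n2,fx\n", "csv")
rows_json = parse_records('[{"id": "1", "desk": "rates"}]', "json")
```

Emitting one canonical record shape regardless of source format is what lets downstream stages (validation, Data Vault loading) stay format-agnostic.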
What we offer:
Annual performance-based bonus
Additional bonuses for recognition awards
Multisport card
Private medical care
Life insurance
One-time reimbursement of home office set-up (up to 800 PLN)
Corporate parties & events
CSR initiatives
Nursery discounts
Financial support for training and education
Social fund
Flexible working hours
Free parking
Comprehensive and competitive package of benefits covering healthcare, family friendly leaves, pension and life assurance