Join us as an AWS Data Engineer at Barclays, where you will be responsible for supporting the successful delivery of Location Strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence, and harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.
Job Responsibilities:
Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data
Design and implement data warehouses and data lakes that handle the appropriate data volumes and velocity and adhere to the required security measures
Development of processing and analysis algorithms fit for the intended data complexity and volumes
Collaborate with data scientists to build and deploy machine learning models
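As a minimal illustration of the pipeline work described above, the sketch below validates and deduplicates a raw extract in plain Python so that only complete, consistent records move downstream. The schema and field names are hypothetical, not part of the role description.

```python
import csv
import io

REQUIRED_FIELDS = {"id", "amount", "currency"}  # hypothetical schema


def validate(record):
    """Keep only records where every required field is present and non-empty."""
    populated = {k for k, v in record.items() if v not in (None, "")}
    return REQUIRED_FIELDS <= populated


def dedupe(records, key="id"):
    """Drop duplicate records by business key, keeping the first occurrence."""
    seen, out = set(), []
    for record in records:
        if record[key] not in seen:
            seen.add(record[key])
            out.append(record)
    return out


def run_pipeline(raw_csv):
    """Parse a CSV extract, then validate and dedupe it into consistent records."""
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    return dedupe([r for r in rows if validate(r)])
```

In a production pipeline the same validate-then-dedupe stages would typically run in PySpark or AWS Glue rather than in-memory Python, but the shape of the logic is the same.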
Requirements:
First-hand experience in developing, testing, and maintaining applications on the AWS Cloud
Strong hands‑on experience with the AWS Data Analytics stack (Amazon S3, AWS Glue, Athena, Lambda, IAM, Lake Formation, KMS, STS, and Step Functions), with a proven ability to build, test, and support secure, scalable, and well‑governed data pipelines
First-hand experience with Airflow and PySpark, and strong knowledge of Python
Experience designing and implementing scalable, efficient data transformation and storage solutions using Snowflake
Experience with data ingestion into Snowflake from different storage formats such as Parquet, Iceberg, JSON, and CSV
Experience using dbt (Data Build Tool) with Snowflake for ELT pipeline development
Experience writing advanced SQL and PL/SQL programs
Experience in AWS data pipeline development
Hands-on experience building reusable components using Snowflake and AWS tools and technologies
Must have completed two major projects
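The multi-format ingestion skill listed above can be sketched with the standard library alone: a loader that normalises CSV or JSON payloads into a uniform list of records, the shape a Snowflake COPY or INSERT step would then consume. The function name and format handling here are illustrative assumptions; real ingestion of Parquet or Iceberg would use dedicated readers (e.g. PyArrow or Spark) rather than the stdlib.

```python
import csv
import io
import json


def parse_records(payload, fmt):
    """Normalise a raw extract (CSV or JSON text) into a list of dicts.

    This is the common pre-load step before staging data for a warehouse:
    whatever the source format, downstream code sees one record shape.
    """
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(payload)))
    if fmt == "json":
        data = json.loads(payload)
        # Accept either a JSON array of records or a single object.
        return data if isinstance(data, list) else [data]
    raise ValueError(f"unsupported format: {fmt}")
```

Keeping format detection at the edge of the pipeline means transformation logic never has to branch on source format, which is the same design principle dbt models rely on when reading from a staged Snowflake table.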
Nice to have:
Exposure to data governance or lineage tools such as Immuta and Alation is an added advantage
Experience using orchestration tools such as Apache Airflow or Snowflake Tasks is an added advantage
Knowledge of the Ab Initio ETL tool is a plus
Hands-on experience with Unix scripting is a plus
Automation experience with tools such as Selenium and Java is a plus
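The orchestration tools mentioned above (Apache Airflow, Snowflake Tasks) all solve the same core problem: running tasks in dependency order. A toy version of that scheduling idea, with hypothetical task and dependency names, can be sketched in a few lines of plain Python; a real Airflow DAG would declare the same structure with operators and `>>` dependencies instead.

```python
def run_dag(tasks, deps):
    """Run tasks in dependency order: a task runs only after its upstreams.

    tasks: mapping of task name -> zero-argument callable
    deps:  mapping of task name -> list of upstream task names
    Returns the order in which tasks executed. (No cycle detection:
    this is a sketch of the scheduling idea, not a scheduler.)
    """
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for upstream in deps.get(name, []):
            run(upstream)  # ensure every upstream has finished first
        tasks[name]()
        done.add(name)
        order.append(name)

    for name in tasks:
        run(name)
    return order
```

For example, wiring `extract -> transform -> load` through `deps` guarantees the load step never runs against untransformed data, which is exactly the guarantee an Airflow DAG provides at much larger scale.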