Join us as a Data Engineer at Barclays, where you will be responsible for supporting the successful delivery of Location Strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence, and harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. You will build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure.
Job Responsibilities:
Building and maintenance of data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data
Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures
Development of processing and analysis algorithms fit for the intended data complexity and volumes
Collaboration with data scientists to build and deploy machine learning models
Supporting the successful delivery of Location Strategy projects to plan, budget, agreed quality and governance standards
Spearheading the evolution of our digital landscape, driving innovation and excellence
Harnessing cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences
Requirements:
Proficiency in Ab Initio (GDE, EME, Co>Operating System) for graph development, debugging, and performance tuning
Ability to write and debug Unix/Linux shell scripts for job orchestration
SQL & PL/SQL querying skills for data extraction, transformation, and validation
Familiarity with Tivoli Workload Scheduler (TWS) or similar scheduling tools
Exposure to SIT, E2E, and OAT testing cycles
Experience with JIRA for sprint planning, backlog grooming, and delivery tracking
Ability to design and implement scalable ETL solutions for real-time and batch data processing
Ability to write test cases and execute testing scenarios
Ability to collaborate with architects, analysts, and QA teams to ensure data quality and system stability
Ability to participate in code reviews, peer testing, and production deployments
Ability to maintain documentation and support audit and compliance requirements
Ability to contribute to continuous improvement initiatives and automation efforts
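The batch ETL and validation skills listed above can be illustrated with a minimal sketch. The table names, columns, and validation rules here are hypothetical and purely for illustration; the role's actual stack centres on Ab Initio and PL/SQL rather than Python:

```python
import sqlite3

# Hypothetical batch ETL sketch: extract rows from a staging table,
# transform and validate them, then load the clean records into a
# target table. Table and column names are illustrative only.

def run_batch_etl(conn: sqlite3.Connection) -> int:
    """Move validated rows from staging_trades to trades; return rows loaded."""
    cur = conn.cursor()

    # Extract: pull the raw batch from staging.
    rows = cur.execute(
        "SELECT trade_id, amount, currency FROM staging_trades"
    ).fetchall()

    loaded = 0
    for trade_id, amount, currency in rows:
        # Transform: normalise the currency code.
        currency = (currency or "").strip().upper()

        # Validate: reject incomplete or non-positive records.
        if not trade_id or amount is None or amount <= 0 or len(currency) != 3:
            continue  # a real pipeline would route these to a reject/audit table

        # Load: insert the cleaned record into the target table.
        cur.execute(
            "INSERT INTO trades (trade_id, amount, currency) VALUES (?, ?, ?)",
            (trade_id, amount, currency),
        )
        loaded += 1

    conn.commit()
    return loaded
```

The same extract-transform-validate-load shape applies whether the job is a nightly batch run scheduled by TWS or a near-real-time feed.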
Nice to have:
Working knowledge of MongoDB or other document databases
Experience with IBM MQ, Apache Kafka, or similar message queue technologies
Awareness of access control, vulnerability assessments, and audit readiness
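The message-queue experience mentioned above boils down to the producer/consumer pattern that IBM MQ and Apache Kafka implement at scale. A minimal in-process sketch of that pattern, using Python's standard-library queue purely for illustration (no broker involved):

```python
import queue
import threading

# Producer/consumer sketch: the producer publishes messages to a queue
# and signals completion with a sentinel; the consumer drains the queue
# until it sees the sentinel. Real brokers add durability, partitioning,
# and delivery guarantees on top of this basic shape.

def produce(q: "queue.Queue[str | None]", messages: list[str]) -> None:
    for msg in messages:
        q.put(msg)   # publish a message
    q.put(None)      # sentinel: no more messages

def consume(q: "queue.Queue[str | None]") -> list[str]:
    received = []
    while True:
        msg = q.get()
        if msg is None:  # sentinel reached, stop consuming
            break
        received.append(msg)
    return received
```

Running the producer in a separate thread while the consumer drains the queue mirrors the decoupling that a real message broker provides between publishing and consuming applications.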