The Data Engineer is accountable for developing high-quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.
Job Responsibilities:
Develop and support scalable, extensible, and highly available data solutions
Deliver on critical business priorities while ensuring alignment with the wider architectural vision
Identify and help address potential risks in the data supply chain
Follow and contribute to technical standards
Design and develop analytical data models
Requirements:
First Class Degree in Engineering/Technology (4-year graduate course)
7 to 12 years’ experience implementing data-intensive solutions using agile methodologies
Experience of relational databases and using SQL for data querying, transformation and manipulation
Experience of modelling data for analytical consumers
Ability to automate and streamline the build, test and deployment of data pipelines
Experience in cloud native technologies and patterns
A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
Excellent communication and problem-solving skills
An inclination to mentor
An ability to lead and deliver medium-sized components independently
ETL: Design, develop, and maintain scalable and efficient data processing pipelines using Python and Spark (see the sketch after this list)
Big Data: Experience of 'big data' platforms such as Hadoop and Hive for data storage and processing
Data Warehousing & Database Management: Understanding of data warehousing concepts and relational databases (Oracle, MSSQL, MySQL), including SQL and performance tuning
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization, and maintenance of data models and data structures
Languages: Proficient in the Python programming language
DevOps: Exposure to concepts and enablers such as CI/CD platforms, version control, and automated quality control management
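The ETL bullet above can be made concrete with a minimal PySpark sketch of a read-transform-write pipeline. The paths, column names, and aggregation (e.g., transactions, account_id, amount) are hypothetical placeholders, not part of the role description; the point is only the overall shape of such a pipeline.

```python
# Minimal PySpark ETL sketch: extract raw records, transform them, load a curated table.
# All paths, column names, and the aggregation logic are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: load raw transaction records (schema inferred for brevity).
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("s3://example-bucket/raw/transactions/"))

# Transform: basic cleansing plus an analytical aggregate per account.
curated = (raw
           .dropna(subset=["account_id", "amount"])
           .withColumn("amount", F.col("amount").cast("double"))
           .groupBy("account_id")
           .agg(F.sum("amount").alias("total_amount"),
                F.count("*").alias("txn_count")))

# Load: persist the result as a columnar table for analytical consumers.
(curated.write
 .mode("overwrite")
 .parquet("s3://example-bucket/curated/account_totals/"))

spark.stop()
```

The same structure automates cleanly in a CI/CD pipeline: the transform logic can be unit-tested against small in-memory DataFrames before the job is promoted.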
Nice to have:
Experience with AWS Cloud Platform
Experience with CI/CD Tools like LSE (Light Speed Enterprise), Jenkins, GitHub
Certification on any of the above topics would be an advantage
Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
Containerization: Fair understanding of containerization platforms like Docker, Kubernetes
File Formats: Exposure to working with event, file, and table formats such as Avro, Parquet, Protobuf, Iceberg, and Delta (a brief sketch follows this list)
Others: Experience using a job scheduler, e.g., Autosys; exposure to Business Intelligence tools, e.g., Tableau, Power BI
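To illustrate the file-formats bullet, here is a small PySpark sketch that writes the same toy dataset as Parquet (columnar, suited to analytical scans) and as Avro (row-oriented, schema-carrying, common for event streams). The paths and the toy schema are made up for illustration, and the Avro write assumes the external spark-avro package is on the Spark classpath.

```python
# Sketch contrasting a columnar format (Parquet) with a row-oriented event format (Avro).
# Paths and the toy schema are hypothetical; the Avro write requires the spark-avro package.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("format-sketch").getOrCreate()

events = spark.createDataFrame(
    [("acct-1", "DEPOSIT", 250.0), ("acct-2", "WITHDRAWAL", 75.5)],
    ["account_id", "event_type", "amount"],
)

# Parquet: columnar, compresses well, efficient for column-pruned analytical queries.
events.write.mode("overwrite").parquet("/tmp/events_parquet")

# Avro: row-oriented, carries its schema with the data, common for event interchange.
events.write.mode("overwrite").format("avro").save("/tmp/events_avro")

# Reading back is symmetric regardless of the format chosen.
spark.read.parquet("/tmp/events_parquet").show()

spark.stop()
```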