The Data Engineer is accountable for developing high-quality data products that support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.
Job Responsibilities:
Develop and support scalable, extensible, and highly available data solutions
Deliver on critical business priorities while ensuring alignment with the wider architectural vision
Identify and help address potential risks in the data supply chain
Follow and contribute to technical standards
Design and develop analytical data models
Requirements:
First Class Degree in Engineering/Technology (4-year graduate course)
9 to 11 years' experience implementing data-intensive solutions using agile methodologies
Experience of relational databases and using SQL for data querying, transformation and manipulation
Experience of modelling data for analytical consumers
Ability to automate and streamline the build, test and deployment of data pipelines in cloud native technologies and patterns
Excellent communication and problem-solving skills
Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
Experience of 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
Expertise around Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
Good exposure to data modeling techniques; design, optimization, and maintenance of data models and data structures
Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala
Exposure to concepts and enablers such as CI/CD platforms, version control, and automated quality control management
A strong grasp of principles and practices including data quality, security, privacy, and compliance
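To illustrate the kind of day-to-day work the requirements above describe (SQL-based transformation and simple analytical modeling), here is a minimal, self-contained sketch using Python's built-in sqlite3 module. The table and column names are hypothetical, chosen only for the example.

```python
import sqlite3

# Hypothetical source table: raw order events (names are illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "EMEA", 120.0), (2, "EMEA", 80.0), (3, "APAC", 200.0)],
)

# Transform the raw records into a per-region aggregate -- a tiny example of
# the analytical data models a consumer-facing mart might expose.
rows = conn.execute(
    "SELECT region, COUNT(*) AS order_count, SUM(amount) AS revenue "
    "FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('APAC', 1, 200.0), ('EMEA', 2, 200.0)]
```

In a production pipeline the same pattern would run against a warehouse engine such as Snowflake or Hive rather than an in-memory SQLite database, typically orchestrated and tested through a CI/CD pipeline.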
Nice to have:
Ab Initio: experience developing Co>Op graphs and the ability to tune them for performance
Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, and BigQuery
Exposure to data validation, cleansing, enrichment and data controls
Fair understanding of containerization platforms such as Docker and Kubernetes
Exposure to event, file, and table formats such as Avro, Parquet, Protobuf, Iceberg, and Delta
Experience using a job scheduler, e.g., Autosys
Exposure to Business Intelligence tools e.g., Tableau, Power BI
Certification in one or more of the above topics
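The data validation, cleansing, and enrichment work mentioned above can be sketched in a few lines of plain Python. Every field name and the reference-data mapping here are hypothetical, chosen only to show the shape of a record-level control.

```python
# Hypothetical raw feed: field names and values are illustrative only.
RAW = [
    {"id": "1", "email": " ALICE@EXAMPLE.COM ", "country": "gb"},
    {"id": "2", "email": "", "country": "us"},  # will fail validation
    {"id": "3", "email": "bob@example.com", "country": "de"},
]

def clean(record):
    """Cleanse: trim whitespace and lower-case string fields."""
    return {k: v.strip().lower() if isinstance(v, str) else v
            for k, v in record.items()}

def is_valid(record):
    """Validate: require a plausible, non-empty email address."""
    return "@" in record["email"]

# Illustrative reference data used for enrichment.
REGION = {"gb": "EMEA", "us": "AMER", "de": "EMEA"}

def enrich(record):
    """Enrich: attach a region code derived from reference data."""
    return {**record, "region": REGION.get(record["country"], "UNKNOWN")}

cleaned = [clean(r) for r in RAW]
good = [enrich(r) for r in cleaned if is_valid(r)]
rejected = [r for r in cleaned if not is_valid(r)]
print(len(good), len(rejected))  # 2 1
```

Routing rejected records to a quarantine table rather than silently dropping them is a common data-control pattern, since it keeps the failure visible for reconciliation.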