We have an exciting opportunity for a Data Engineer to join the team in Newcastle or London. You will work closely with business and technology teams across Wealth Management Europe (WME) to support the ongoing maintenance and evolution of the Data Lakehouse platform. The primary focus is the ingestion and modelling of new data, and the evolution of the platform itself, utilising new technologies to improve the performance and accuracy of the data.
Job Responsibilities:
Responsible for the development and ongoing maintenance of the Data Lakehouse platform infrastructure using the Microsoft Azure technology stack, including Databricks and Data Factory
Manage data pipelines consisting of a series of stages through which data flows (for example, from data sources or acquisition endpoints, through integration, to consumption for specific use cases). These pipelines must be created, maintained and optimised as workloads move from development to production. Architecting, creating and maintaining data pipelines will be the primary responsibility of the data engineer
Create new and modify existing Notebooks, Functions and Workflows to support efficient reporting and analytics for the business
Create, maintain, and develop Dev, UAT and Production environments, ensuring consistency between them
Responsible for using innovative and modern tools, techniques and architectures to partially or completely automate the most common, repeatable and tedious data preparation and integration tasks, minimising manual and error-prone processes and improving productivity
Competent in using GitHub (or other version control tooling) and in performing data and schema comparisons via Visual Studio
Champion the DevOps process, ensuring that the latest techniques are being used, that implementation methodologies involving new or changed source code or data structures follow the agreed development and release processes, and that all productionised code is adequately documented, reviewed and unit tested where appropriate
Identify, design and implement internal process improvements to automate manual processes and optimise data delivery for greater scalability, as part of the end-to-end data lifecycle
Be curious and knowledgeable about new data initiatives and how to address them, applying data and/or domain understanding to new data requirements. Additionally, be responsible for proposing appropriate (and innovative) data ingestion, preparation, integration and operationalisation techniques to address these data requirements optimally
Requirements:
Proven experience working within Data Engineering and Data Management architectures such as Data Warehouse, Data Lake and Data Hub, and the supporting processes such as Data Integration, Governance and Metadata Management
Proven experience working in cross-functional teams and collaborating with business stakeholders in support of a departmental and/or multi-departmental data management and analytics initiative
Strong experience with popular database programming languages for relational databases (SQL, T-SQL)
Experience working on a cloud data platform such as Databricks or Snowflake
Adept in agile methodologies, and capable of applying DevOps and DataOps principles to data pipelines
Basic experience in working with data governance, data quality and data security teams
Good understanding of datasets, Data Lakehouses, modelling, database design and programming
Knowledge of Data Lakehouse techniques, solutions and methodologies
Strong experience supporting and working with cross-functional teams in a dynamic business environment
Highly creative and collaborative, working closely with business and IT teams to define the business problem, refine the requirements, and design and develop data deliverables accordingly
Nice to have:
Knowledge of Terraform or other Infrastructure-as-Code tools
Experience with advanced analytics tools for object-oriented/functional scripting using languages such as Python, Java, C++, Scala, R, and others
Experience using automated unit testing methodologies
What we offer:
Leaders who support your development through coaching and managing opportunities
Opportunities to work with the best in the field
Ability to make a difference and lasting impact
Work in a dynamic, collaborative, progressive, and high-performing team