The Data Modeler role requires expertise in developing and maintaining data models within the insurance domain. Candidates should have experience with data modeling tools and methodologies, particularly in P&C and Specialty insurance projects. Strong collaboration and communication skills are essential for working with stakeholders and technical teams. A minimum of 5 years of experience is required, with a focus on data integrity and alignment with business needs.
Job Responsibility:
Develop and maintain logical and physical data models to support business needs, ensuring data integrity across insurance domains
Design models aligned to the Databricks medallion architecture by translating requirements from underwriting, delegated authority, reinsurance, actuarial, claims, and finance
Implement and maintain modelling standards, naming conventions, metadata, and documentation, working closely with data analysts, data engineers, and associates in the data office
Review existing models and propose simplification, standardisation, and target-state improvements across projects and for wider requirements
Lead the evolution of requirements, refining and translating specifications into data models to enable the development phase
Ensure the models accurately reflect insurance processes and data elements before handing them over to engineering
Requirements:
Experience delivering data models for P&C and Specialty insurance projects, with exposure to Lloyd's/London Market and Syndicate data structures, is required
Understanding of insurance domain concepts and proficiency with data modelling tools (e.g., erwin)
Expertise in enterprise data modelling, including conceptual, logical, and physical models for transactional domains
Experience in data modelling using Dimensional Star/Snowflake, Data Vault, and 3NF methodologies
Knowledge of data warehousing and ETL/ELT pipelines
Experience designing data pipelines with Azure Data Factory and Databricks is advantageous
Knowledge of system development life cycle methodologies (such as agile software development)
Excellent collaboration and communication skills for working with senior leaders, stakeholders, and technical teams are required
Flexibility and a curious, creative mindset: open to new approaches and able to propose innovative ideas