The Data Platform Lead is responsible for designing, building, and evolving Beazley’s enterprise data platform to enable scalable, secure, and efficient data capabilities across the organisation. This role ensures the platform provides the foundational services and tools required for vertical data product teams to deliver high-quality, business-critical data solutions. The Data Platform Lead will drive innovation, maintain best practices, and ensure alignment with Beazley’s strategic objectives for data and analytics.
Job Responsibilities:
Define and implement the roadmap for the data platform, ensuring it meets current and future business needs
Establish platform standards, governance, and best practices for data ingestion, storage, processing, and consumption
Lead the design and implementation of core platform components leveraging technologies such as Confluent, Azure Data Lake, Snowflake, Power BI, and FiveTran
Ensure platform scalability, reliability, and security across all environments
Act as a key liaison between horizontal platform teams and vertical data product teams, enabling seamless integration and delivery
Provide guidance and support to engineering teams on platform capabilities and usage
Monitor platform performance and optimise for cost, efficiency, and resilience
Implement robust observability, monitoring, and incident management processes
Stay current with emerging technologies and industry trends to enhance platform capabilities
Drive automation and self-service capabilities for data engineering and analytics teams
Requirements:
Strong expertise in data platform architecture and cloud-based data solutions
Proven track record of leading technical teams and influencing stakeholders across the organisation
Demonstrated ability to design and implement enterprise-scale data platform components, working closely with Data Architecture, Enterprise Architecture and Solution Architecture teams
Hands-on experience with Confluent (Kafka), Azure Data Lake, Snowflake, and FiveTran, including data ingestion, storage, processing and consumption patterns
Strong understanding of data governance, security and compliance frameworks, enabling resilient, scalable and cost-efficient platforms
Strong proficiency in data visualisation tools such as Power BI
Proficiency in data integration, ETL/ELT pipelines, and streaming data solutions
Excellent communication and collaboration skills to work effectively with cross-functional teams
Strategic thinker with strong problem-solving and decision-making abilities
Ability to balance innovation with operational stability and cost optimisation
Degree-level education or equivalent commercial experience
Nice to have:
Experience in financial services or regulated industries
Familiarity with DevOps, CI/CD, and Infrastructure-as-Code practices
Knowledge of data modelling, analytics, and BI tools