Senior Group Data Engineer

ODEON Cinemas

Location:
United Kingdom

Contract Type:
Not provided

Salary:
Not provided

Job Description:

As a Senior Data Engineer, you will play a key leadership role in designing, delivering, and maintaining the OCG Group data platform. You’ll combine deep technical expertise with strategic oversight of engineering projects, leading and mentoring team members to deliver high-quality solutions and drive best practices. In addition to owning technical delivery, you will support the Group Data Engineering Lead by stepping into leadership responsibilities when required, contributing to team continuity.

Job Responsibility:

  • Provide leadership, coaching, and mentorship to the data engineering team, acting as Team Lead during the line manager’s absence to ensure continuity and effective delivery
  • Develop, operate, maintain, and deploy data pipelines to extract, transform, and load data into the OCG Data Lake from internal and external sources, ensuring proper documentation
  • Deliver end-to-end data engineering projects, including scoping, development, and deployment, while collaborating with other data engineers across work streams to ensure consistency, scalability, and adherence to best practices
  • Champion the implementation of engineering standards and principles across all data solutions
  • Assess and monitor data quality, designing approaches to maintain agreed standards
  • Report regularly on progress, risks, and issues to the line manager and senior stakeholders across both project and support activities
  • Build strong collaborative relationships across OCG functions, with a particular focus on the Data and Analytics teams

Requirements:

  • Experience leading, prioritising and delivering data engineering workstreams, while influencing stakeholders and mentoring engineers to drive delivery
  • Strong experience of best-practice modern analytics and big data architectures in an Azure environment, with a focus on Databricks, Azure Data Lake, Azure Data Factory, and Power BI
  • Skills in SQL, Spark and Python
  • Experience with operational management of working data pipelines, including troubleshooting and communication with stakeholders
  • Experience in monitoring, measuring, and maintaining data quality, including knowledge of test approaches for automated data pipelines
  • Experience with data modelling approaches such as Kimball, Inmon, or One Big Table
  • Ability to work using Agile principles to deliver regular releases and improvements to users

What we offer:

  • Bonus Scheme
  • Pension Scheme
  • Hybrid working (2-3 days per week in the office)
  • Unlimited free cinema tickets for you, plus 12 friends-and-family tickets every three months
  • 40% discount on our food and drinks, including our in-cinema Costa Coffee stores
  • Private Medical Insurance
  • Critical Illness Cover
  • Life Assurance
  • Free access to our confidential Employee Assistance Programme – an online platform that offers advice and support on topics including finance, health, and mental wellbeing
  • The opportunity to gain professional qualifications through our Bright Lights Apprenticeship scheme
  • Fantastic career development opportunities across our cinemas and support offices

Additional Information:

Job Posted:
January 24, 2026

Expiration:
February 06, 2026

Work Type:
Hybrid work

Similar Jobs for Senior Group Data Engineer

Data Engineer Senior

We are looking for a highly skilled professional to lead the industrialisation o...
Location:
Portugal, Lisbon
Salary:
Not provided
Inetum
Expiration Date:
Until further notice
Requirements:
  • Minimum of 5 years’ experience in MLOps, data engineering, or DevOps with a focus on ML/DL/LLM/AI agents in production environments
  • Strong proficiency in Python
  • Hands-on experience with CI/CD tools such as GitLab, Docker, Kubernetes, Jenkins
  • Solid understanding of ML, DL, and LLM models
  • Experience with ML lifecycle tools such as MLflow or DVC
  • Good understanding of model lifecycle, data traceability, and governance frameworks
  • Experience with on-premise and hybrid infrastructures
  • Excellent communication skills and ability to collaborate with remote teams
  • Proactive mindset, technical rigour, and engineering mentality
  • Willingness to learn, document, and standardise best practices
Job Responsibility:
  • Analyse, monitor, and optimise ML models, tracking their performance
  • Design and implement CI/CD pipelines for ML models and data flows
  • Containerise and deploy models via APIs, batch processes, and streaming
  • Manage model versioning and traceability
  • Ensure continuous improvement and adaptation of AI use cases and ML models
  • Set up monitoring and alerting for model performance
  • Establish incident response protocols in collaboration with IT
  • Maintain dashboards and automated reports on model health
  • Implement validation frameworks for data and models (e.g., Great Expectations, unit tests, stress tests), in collaboration with Group Governance
  • Contribute to documentation and apply technical best practices
What we offer:
  • Work in a constantly evolving environment
  • Contribute to digital impact
  • Opportunity for growth and development
  • Full-time

Senior Data Engineer - Catalog

Join the Catalog team and be part of Deezer’s next steps. We ingest, reconcile a...
Location:
France, Paris
Salary:
Not provided
Deezer
Expiration Date:
Until further notice
Requirements:
  • 5+ years of experience in Data Engineering
  • Familiarity with data quality frameworks and best practices
  • Solid experience with Python, SQL, BigQuery, Spark/Scala, ETL
  • Experience with entity matching, data reconciliation or data science applied to quality problems is a strong plus
  • Proactive and organised, with strong team spirit
  • Passion for music and improving user experience through better metadata
  • Fluent in English
Job Responsibility:
  • Design and maintain ETL pipelines to ingest and process large volumes of catalog data from disparate sources
  • Ingest and match entities like artists, albums, concerts and lyrics
  • Develop solutions to improve metadata quality at scale
  • Contribute to projects such as artist disambiguation and source-of-truth building, and album grouping via ML models in collaboration with Data Scientists
  • Build analytics and quality KPIs to monitor catalog performance
  • Collaborate closely with Data Scientists, Researchers, Product Managers and partners to implement new solutions
What we offer:
  • A Deezer premium family account for free
  • Access to gym classes
  • Join over 70 Deezer Communities
  • Deezer parties several times a year and drinks every Thursday
  • Allowance for sports, travelling and culture
  • Meal vouchers
  • Mental health and well-being support from Moka.Care
  • Great offices
  • Hybrid remote work policy
  • Full-time

Senior Crypto Data Engineer

Token Metrics is seeking a multi-talented Senior Big Data Engineer to facilitate...
Location:
Vietnam, Hanoi
Salary:
Not provided
Token Metrics
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree in Data Engineering, Big Data Analytics, Computer Engineering, or related field
  • A Master's degree in a relevant field is an added advantage
  • 3+ years of development experience in Python, Java, or another programming language
  • 3+ years of SQL & NoSQL experience (Snowflake Cloud DW & MongoDB experience is a plus)
  • 3+ years of experience with schema design and dimensional data modeling
  • Expert proficiency in SQL, NoSQL, Python, C++, Java, R
  • Expert in building data lakes, data warehouses, or a suitable equivalent
  • Expert in AWS Cloud
  • Excellent analytical and problem-solving skills
  • A knack for independence and group work
Job Responsibility:
  • Liaising with coworkers and clients to elucidate the requirements for each task
  • Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed
  • Reformulating existing frameworks to optimize their functioning
  • Testing such structures to ensure that they are fit for use
  • Building a data pipeline from different data sources using different data types like API, CSV, JSON, etc
  • Preparing raw data for manipulation by Data Scientists
  • Implementing proper data validation and data reconciliation methodologies
  • Ensuring that your work remains backed up and readily accessible to relevant coworkers
  • Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs
  • Full-time

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location:
United States, Lisle
Salary:
85000.00 - 150000.00 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field (required)
  • Master's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field (preferred)
  • 2+ years' experience in Google Cloud with services such as BigQuery, Composer, GCS, Datastream, and Dataflow (required)
  • 6+ years' experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics (required)
  • Expert knowledge of SQL and Python programming
  • Experience working with Airflow as a workflow management tool, building operators to connect, extract, and ingest data as needed
  • Experience in tuning queries for performance and scalability
  • Experience in real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar
  • Excellent organizational, prioritization and analytical abilities
  • Proven experience working in incremental execution through successful launches
Job Responsibility:
  • Work closely with various business, IT, Analyst and Data Science groups to collect business requirements
  • Design, develop, deploy and support high performance data pipelines both inbound and outbound
  • Model data platform by applying the business logic and building objects in the semantic layer of the data platform
  • Optimize data pipelines for performance, scalability, and reliability
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products
  • Ensure the quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root
  • Document the design and support strategy of the data pipelines
  • Capture, store and socialize data lineage and operational metadata
  • Troubleshoot and resolve data engineering issues as they arise
  • Develop REST APIs to expose data to other teams within the company
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Full-time

Senior BI Engineer

LumApps is now more than just an Employee Experience Platform — it is an AI-powe...
Location:
France, Tassin-la-Demi-Lune; Sophia-Antipolis; Paris
Salary:
Not provided
LumApps
Expiration Date:
Until further notice
Requirements:
  • Engineering proficiency: Strong command of SQL and experience with dbt
  • Visualization savvy: Experience with Looker (or similar enterprise BI tools) and an understanding of data modeling
  • Language skills: You are fluent in both French and English
  • Process oriented: You are comfortable using Jira/Git and writing clear documentation
Job Responsibility:
  • Build the Foundation: Design and develop robust data transforms (ETL/ELT) within the group Data Lake using dbt and SQL
  • Scale through M&A: Analyze the reporting structures of companies acquired by LumApps and design efficient strategies to integrate their data into our ecosystem
  • Connect the Dots: Create interfaces between the Data Lake and new data sources (via Meltano or custom scripts)
  • Collaborate & Clarify: Translate functional needs from BI Analysts and Business Owners into rigorous technical specifications
  • Guarantee Quality: Implement automated data validation systems and manage access rights, ensuring our stakeholders always trust the numbers
What we offer:
  • Hybrid work model – 2 days at the office, 3 days remote
  • RTT days – ~10 extra days off per year
  • Meal vouchers (SWILE) + free snacks & coffee
  • Yoga classes
  • Supportive parental leave and family moments
  • Health insurance (ALAN) – 60% covered + full life & disability cover
  • Afterworks, team celebrations & seasonal parties
  • Equipment of your choice
  • French & English lessons, professional development & access to Leeto CSE
  • Full-time

Senior Data Architect

Experienced Data Architect to design, develop, and implement comprehensive enter...
Location:
India, Noida; Chennai; Bangalore
Salary:
Not provided
Sopra Steria
Expiration Date:
Until further notice
Requirements:
  • 11+ years of experience in data engineering/management roles
  • 3+ years in enterprise-level data architecture and data governance
  • Proven experience of defining the global architecture of the Single Source of Truth and its three components (Access, Knowledge, Trust)
  • Design and document data models and architecture frameworks aligned with group standards
  • Integrate real-time and streaming (event-driven architecture) capabilities into the overall design
  • Proven experience implementing solutions using Snowflake (Preferable) and Databricks
  • Strong background in Data Governance, Data Quality, and Master Data Management (MDM)
  • BE / B.Tech/ MS/ M.Tech/ MCA qualification
Job Responsibility:
  • Design, develop, and implement enterprise-wide data architecture, models, and integration frameworks
  • Establish and enforce data governance standards, policies, and best practices
  • Design, build and optimize data platforms using Snowflake (preferable) and/or Databricks
  • Oversee and guide complex data transformation for large and diverse datasets, ensuring data integrity, quality, and performance
  • Drive the creation and maintenance of advanced data models to support both analytical and operational needs
  • Ensure consistency and reusability of models across business domains and systems
  • Implement MDM, lineage tracking, and data cataloguing
  • Guarantee consistency between technical design and trust framework (quality, compliance, security)
  • Collaborate with business stakeholders to align data strategy with organizational objectives, document use cases, and design processes, ensuring alignment with technical implementation
  • Provide guidance to engineering teams for solution design and implementation
  • Full-time

Applications Development Senior Group Manager

This role will be part of the Risk Data team and is a senior management level po...
Location:
United Kingdom, London
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • Strong academic record, ideally with a Bachelor’s or Master’s degree in Computer Science or engineering or related technical discipline
  • Proven experience in enterprise application development with full stack technologies
  • Strong architecture and hands-on technical experience implementing large-volume, real-time, complex solutions on big data and public cloud platforms
  • Experience in data architecture; strong software development fundamentals, including data structures, design patterns, and object-oriented principles
  • Experience in the design and delivery of multi-tiered applications and high-performance server-side components
  • Skills in system performance tuning, high performance, low latency, and multithreading, with experience in Java server-side programming
  • Preferred: experience handling high volumes of data and working with in-memory databases and caching solutions
  • Experience of building and leading teams, ideally with a global resource profile and demonstrated ability to deliver large projects efficiently and on time
  • Significant experience in large Financial Services Technology services companies is expected for this position
  • Hands-on development, architecture and leadership experience in real-time data engineering platforms implementation
Job Responsibility:
  • Lead the efforts in Institutional Data Platform (ICG) that span multiple businesses, products and functions
  • Delivery of Price Risk related Data initiatives and Capital reporting (GSIB) related deliverables
  • Establish strong relationships with the global business stakeholders and ensure transparency of project deliveries
  • Actively identify and manage risks and issues, working with disparate teams to create mitigation plans and follow-through to resolution
  • Adhere to all key Project Management (PMQC) & Engineering Excellence standards
  • Ensure timely communications to Senior Technology Management and Business Partners in Front Office, Middle Office & other Operations functions
  • Drive the design and development of system architecture, work with end-users of the systems, and enhance the quality of deliverables
  • Ensure staff follow Citi's documented policies and procedures, and maintain desktop procedures and supporting documentation for filings on a current basis and in a comprehensive manner
  • Ensure change is managed with appropriate controls, documentation, and approvals including implementation of new and revised regulatory reporting requirements
  • Manage and maintain all disaster recovery plans, oversee appropriate testing, and provide permit-to-operate for new applications
What we offer:
  • 27 days annual leave (plus bank holidays)
  • A discretionary annual performance-related bonus
  • Private Medical Care & Life Insurance
  • Employee Assistance Program
  • Pension Plan
  • Paid Parental Leave
  • Special discounts for employees, family, and friends
  • Access to an array of learning and development resources
  • Full-time

Senior Data Consultant

Ivy Partners is a Swiss consulting firm that assists companies in their strategi...
Location:
Portugal, Lisboa
Salary:
Not provided
IVY Partners
Expiration Date:
Until further notice
Requirements:
  • Between 6 and 10 years of professional experience
  • Strong understanding and experience in data architecture within diverse domains such as finance, purchasing, etc.
  • Deep expertise in managing large datasets
  • Proficient in tools like BPMN 2.0 (preferably using Draw.io or LucidChart)
  • Skilled in building and interpreting conceptual and logical data models
  • Able to effectively document architecture and business processes
  • Strong ability to perform gap analysis on existing datasets versus group standards
  • Able to develop and manage a detailed remediation plan
  • Both English and Portuguese are required
Job Responsibility:
  • Collaborate with diverse stakeholder communities, including business experts, data engineers, data scientists, and technical architects, to leverage data analytics, AI, and Master Data Management
  • Structure data platforms optimally for various use cases, ensuring alignment with business needs and governance standards
  • Develop and standardize conceptual data models, and document business processes in BPMN format to support data consumption
  • Identify and define critical business processes and associated data stewards or process owners
  • Assess existing datasets against defined data architecture standards, propose remediation plans, and manage the redesign process to ensure dataset standardization and reusability
  • Ensure adherence to data governance frameworks, including data risk management policies, data classification, access policies, and retention protocols
  • Document all relevant details in data catalogs and governance tools like Collibra
What we offer:
  • A nurturing environment where everyone is valued
  • Training and opportunities for advancement, both in Switzerland and internationally
  • A climate of trust based on transparency, professionalism, and commitment
  • A culture that encourages innovation
  • The collective at the heart of our actions
  • A drive to generate a positive impact