We are currently looking for a Data Engineer to join our fast-paced, data-driven tech team within a global digital media environment. You’ll play a crucial role in shaping how the business collects, models, and activates data across multiple commercial and editorial functions. In this hands-on role, you will architect and build scalable data pipelines, optimise infrastructure, and deliver high-value insight tools that empower decision makers. The Data Engineer will work closely with commercial, product, and analytics teams, making this position vital to the organisation’s long-term data strategy.

As a Data Engineer, you’ll combine engineering, analytics, and product thinking to create a reliable, high-performing data ecosystem. You’ll have the opportunity to work with modern cloud technologies, large-scale datasets, and a business that truly values data as a strategic asset. This role stands out because it sits at the intersection of engineering and commercial impact: your work directly supports revenue optimisation, global reporting, and critical business decisions.
Job Responsibilities:
Building, operating, and optimising end-to-end ETL/ELT data pipelines using APIs, SFTP, and containerised orchestration tools
Developing scalable and well-structured data models that support commercial, programmatic, and affiliate revenue functions
Managing and improving complex data infrastructure that processes high-volume, multi-source Big Data
Creating, maintaining, and enhancing interactive dashboards that drive KPI-focused decision-making
Owning data quality, ensuring accuracy, consistency, and reliability across all core datasets
Analysing campaign, monetisation, and platform performance and providing actionable insights
Collaborating with Operations, Sales, Marketing, Finance, and Senior Analytics teams
Supporting strategic projects with advanced data modelling and insight generation
Requirements:
Strong Python and/or PySpark skills
Experience with cloud technologies such as GCP (BigQuery, Compute Engine, Kubernetes) and AWS (Redshift, EC2)
Experience building ETL/ELT pipelines and working with APIs or SFTP integrations
Understanding of data modelling, warehousing, and Big Data environments
Strong analytical and creative problem-solving skills
Ability to manage projects and collaborate effectively in a team
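To give candidates a concrete sense of the ETL/ELT pipeline work described above, here is a minimal illustrative sketch in Python. It uses only the standard library, and all names in it (raw campaign records, a `revenue_events` table, a revenue-per-click metric) are hypothetical stand-ins, not part of the role's actual stack:

```python
import sqlite3

# Minimal ETL sketch: extract raw records, transform them, load into a database.
# In the real role, extraction would come from APIs or SFTP and the warehouse
# would be BigQuery or Redshift; SQLite here keeps the example self-contained.

def extract():
    # Stand-in for an API or SFTP pull: raw campaign records as strings.
    return [
        {"campaign": "A", "revenue": "10.50", "clicks": "120"},
        {"campaign": "B", "revenue": "3.20", "clicks": "45"},
    ]

def transform(records):
    # Normalise types and derive a revenue-per-click (rpc) metric.
    rows = []
    for r in records:
        revenue = float(r["revenue"])
        clicks = int(r["clicks"])
        rpc = revenue / clicks if clicks else 0.0
        rows.append((r["campaign"], revenue, clicks, rpc))
    return rows

def load(rows, conn):
    # Create the target table if needed and bulk-insert the cleaned rows.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS revenue_events "
        "(campaign TEXT, revenue REAL, clicks INTEGER, rpc REAL)"
    )
    conn.executemany("INSERT INTO revenue_events VALUES (?, ?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(revenue) FROM revenue_events").fetchone()[0]
```

The same extract/transform/load shape scales up when the in-memory steps are replaced by orchestrated, containerised tasks reading from production sources.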