Data sits at the heart of the company. This role will work closely with the Finance Data & Process squad, supporting the development of financial data pipelines and datasets that enable accounting processes, ERP integrations, and operational finance reporting. This includes translating finance and accounting requirements into robust, auditable, and scalable data solutions. As a Data Engineer, your role focuses on designing, building, and optimizing data pipelines, curated datasets, and analytical data models within Azure, AWS, and Databricks environments. The position involves working with large-scale datasets, improving performance and reliability, and translating business logic into well-structured tables, metrics, and transformation rules.
Job Responsibility:
Build and maintain ETL/ELT pipelines using Databricks, PySpark, Spark SQL, and Delta Lake
Develop production-ready notebooks, workflows, and data lake integrations
Apply best practices for Spark optimization (partitioning, caching, avoiding shuffle, file compaction)
Support the development of finance-specific data pipelines, including datasets used for financial bookings, reconciliations, and ERP integrations
Design curated datasets, semantic layers, and data marts that power analytics and reporting
Partner with business stakeholders to understand requirements, operational challenges, and decision-making needs, ensuring alignment between business expectations and technical constraints
Convert business requirements into data models, defining tables, metrics, KPIs, and transformation rules
Work closely with product owners and analysts to align datasets with business processes
Support the structuring of datasets used in financial processes (e.g. revenue, costs, working capital, reconciliations)
Document data models, lineage, logic, and dataset behavior clearly and consistently, with particular attention to traceability and auditability of financial data
Work with very large datasets, optimizing transformations and storage for scale and cost
Tune and maintain MSSQL databases (indexes, running processes, performance diagnostics)
Implement robust data validation, schema enforcement, and quality checks across pipelines
Collaborate with engineers, analysts, and business stakeholders to deliver reliable data solutions
Communicate effectively with both technical and non‑technical audiences
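As a rough illustration of the validation and quality-check work described above, the sketch below shows a schema-enforcement step in plain Python, independent of any specific engine. All field names, currencies, and rules here are hypothetical examples, not Awin's actual schema; in practice this logic would typically live in PySpark transformations or Delta Lake constraints.

```python
# Hypothetical sketch: schema enforcement plus a business-rule check for a
# finance dataset. Field names and rules are illustrative assumptions only.

EXPECTED_SCHEMA = {"booking_id": str, "amount": float, "currency": str}
KNOWN_CURRENCIES = {"EUR", "GBP", "USD"}  # assumed reference data


def validate_row(row: dict) -> list[str]:
    """Return a list of error codes for one row (empty list = valid)."""
    errors = []
    for field, expected_type in EXPECTED_SCHEMA.items():
        if field not in row:
            errors.append(f"missing:{field}")
        elif not isinstance(row[field], expected_type):
            errors.append(f"type:{field}")
    # Business-rule check only runs once the schema is sound.
    if not errors and row["currency"] not in KNOWN_CURRENCIES:
        errors.append("unknown_currency")
    return errors


def split_valid(rows: list[dict]) -> tuple[list[dict], list[dict]]:
    """Partition rows into valid records and quarantined records.

    Rejected rows are kept alongside their error codes rather than dropped,
    which supports the traceability and auditability needs of financial data.
    """
    valid, quarantined = [], []
    for row in rows:
        errors = validate_row(row)
        if errors:
            quarantined.append({"row": row, "errors": errors})
        else:
            valid.append(row)
    return valid, quarantined
```

Keeping rejected rows in a quarantine structure with explicit error codes (rather than silently filtering them) is one common pattern for auditable pipelines: every input record remains accounted for downstream.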
Requirements:
Hands‑on experience with Databricks (PySpark, Spark SQL, Delta Lake, Lakebase), PostgreSQL
Background working with large, distributed datasets
Proficiency in Python, PySpark, and SQL
Experience with data modeling, curated datasets, semantic layers, and medallion architecture
Experience with AWS (in particular Lambda, CloudWatch, and Step Functions)
Competence in using Datadog or similar observability/monitoring platforms
Strong debugging, problem-solving, and communication skills
Comfortable operating in Agile environments
Strong commitment to thorough documentation
Nice to have:
Experience with Power BI, Tableau, or Luzmo
Understanding of CI/CD practices for data pipelines
Bachelor’s degree in Computer Science, Data Engineering, or related field
What we offer:
Flexi-Week and Work-Life Balance: We prioritise your mental health and wellbeing, offering you a flexible four-day Flexi-Week at full pay and with no reduction to your annual holiday allowance. We also offer a variety of paid special leave types
Remote Working Allowance: You will receive a monthly allowance to cover part of your running costs. In addition, we will support you in setting up your remote workspace appropriately
Flexi-Office: We offer an international culture and flexibility through our Flexi-Office and hybrid/remote work possibilities to work across Awin regions
Meal Vouchers: You will be supported with a certain net sum to spend on a variety of lunches
Health & Wellbeing: The insurance covers several types of health, vision, and/or dental treatments for you and for up to one additional family member
Remote Working Furniture Package: After 3 months of employment, you will be eligible for a furniture package, which should enable you to set up a proper workplace at your remote working location
Appreciation: Thank and reward colleagues by sending them a voucher through our peer-to-peer program