In the Marktplaats data and analytics teams, data is at the heart of everything we do. As a Data Engineer on the Data Platform team at Marktplaats, you will be relied on to independently develop and deliver high-quality features for our new Data/ML Platform, refactor and translate our data products, and finish various tasks to a high standard. You will be the cornerstone of the platform’s reliability, scalability, and performance, working hands-on with the batch and streaming data pipelines, storage solutions, and APIs that serve complex analytical and ML workloads. The role encompasses ownership of the self-serve data platform, including data collection, lake management, orchestration, processing, and distribution.
Job Responsibilities:
Independently develop and deliver high-quality features for our new Data/ML Platform
Refactor and translate our data products
Finish various tasks to a high standard
Be the cornerstone of the platform’s reliability, scalability and performance
Work hands-on with batch and streaming data pipelines, storage solutions and APIs
Own the self-serve data platform, including data collection, lake management, orchestration, processing, and distribution
Collaborate in a small, fast-moving team
Requirements:
10+ years of hands-on experience in Software Development/Data Engineering
Proven experience building cloud-native, data-intensive applications (both real-time and batch-based)
Strong Data Engineering background to support other Data Engineers, Backend Engineers and Data Scientists
Hands-on experience building and maintaining Spark applications
Experience with AWS cloud usage and data management
Experience ensuring data quality, schema governance and monitoring across pipelines
Experience with orchestrators such as Airflow, Kubeflow and Databricks Workflows
Solid experience with containerization and orchestration technologies
Fundamental understanding of file formats and open table formats (OTFs) such as Parquet and Delta Lake
Proficiency with an IaC tool such as Terraform, CDK or CloudFormation
Experience with Databricks
Data validation/analysis skills & proficiency in SQL
Strong written and verbal English communication skills
Nice to have:
Prior experience building and operating data platforms
GCP experience
Experience with Spark in Scala
What we offer:
An attractive base salary
Participation in our Short Term Incentive plan (annual bonus)
Work From Anywhere: Enjoy up to 20 days a year of working from anywhere
A 24/7 Employee Assistance Program for you and your family
A collaborative environment with an opportunity to explore your potential and grow