The Data Management Platform (DMP) is the core system that receives, processes, and serves information to external DMPs and real-time platforms. Its purpose is to store the mapping between users/devices and their respective behavioral categories; other teams then use this data for analytics and digital ad targeting. The team roadmap includes building a new system based on a streaming architecture with an appropriate tech stack.
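As a rough illustration of the kind of mapping the DMP maintains, here is a minimal in-memory sketch; all names and structures are hypothetical and stand in for the real streaming-backed platform:

```python
from collections import defaultdict


class MiniDMP:
    """Toy model of a DMP: maps user/device IDs to behavioral categories."""

    def __init__(self):
        # device_id -> set of behavioral categories
        self._categories = defaultdict(set)

    def ingest(self, device_id: str, category: str) -> None:
        # In a real platform, events would arrive from a streaming
        # source (e.g., Kafka) rather than direct method calls.
        self._categories[device_id].add(category)

    def categories_for(self, device_id: str) -> list[str]:
        # Downstream consumers (analytics, ad targeting) query this mapping.
        return sorted(self._categories[device_id])


dmp = MiniDMP()
dmp.ingest("device-42", "sports")
dmp.ingest("device-42", "travel")
print(dmp.categories_for("device-42"))  # ['sports', 'travel']
```

This only models the lookup side; the actual system also exchanges data with external DMPs and real-time platforms.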
Job Responsibilities:
Own end-to-end development, from design to production, with a focus on innovation and efficiency
Lead and mentor developers on complex projects impacting core business systems
Contribute to the design and development of next-generation data solutions
Implement high-scale big data solutions and contribute to platform infrastructure and architecture
Research and evaluate core technologies, integrations, and external APIs
Collaborate with cross-functional stakeholders, including Product, Engineering, and data providers
Participate in an off-hours on-call rotation
Requirements:
Bachelor’s degree in Computer Science, Computer Engineering, or equivalent experience
4+ years of software development experience
2+ years of experience with Apache Spark
2+ years of experience with Scala (preferred) and/or Python or Java
Experience working with high-performance, large-scale, or distributed data systems
Experience with SQL and NoSQL databases
Experience with Kubernetes
Experience with Git
Strong communication skills and ability to work effectively in a team
English level: B1+ or higher
Nice to have:
Experience building and optimizing data pipelines, ETL processes, and streaming systems (e.g., Hadoop, Spark, Kafka)
Familiarity with tools such as GitLab, Grafana, InfluxDB, and Kibana
What we offer:
Technical and non-technical training for professional and personal growth
Internal conferences and meetups to learn from industry experts
Support and mentorship from an experienced colleague to help you grow and develop professionally
Health insurance
English courses
Sports activities to promote a healthy lifestyle
Flexible work options, including remote and hybrid opportunities
Referral program for bringing in new talent
Work anniversary program and additional vacation days