As the Manager of the Realtime Data Platform, a team within the Knowledge Platform, you will oversee large-scale, cross-functional initiatives that impact the entire data ecosystem. You will be responsible for defining the multi-year vision and technical roadmap for all real-time data analysis and serving needs, ensuring they are aligned with broader business objectives and product goals. A key aspect of your role will also be scaling the engineering teams, defining new roles, and establishing best practices for communication and collaboration across multiple sub-teams.
Job Responsibilities:
Provide visionary technical leadership and define a clear 1-3 year strategic roadmap for the Realtime Data Platform
Lead the multi-year effort to modernize our core data platform, introducing real-time analytical processing at petabyte scale
Partner with product teams to enable new data-driven features, such as AI-powered applications or real-time dashboards, by ensuring the underlying platform capabilities are in place
Successfully scale and structure the engineering teams, including hiring new talent and mentoring managers and senior individual contributors
Cultivate and maintain strong relationships with product and other engineering teams, serving as a trusted technical advisor on all things Data and AI
Requirements:
A minimum of 8 years of experience in data or software engineering
At least 3 years in a leadership or management role
Proven experience successfully managing and scaling an engineering team of 10+ people, including managers and senior individual contributors
Demonstrated ability to define and execute a technical strategy that led to a significant business outcome or operational improvement
Strong command of the data and software engineering domains, with a focus on architecture and strategy
Deep understanding of distributed data processing technologies (e.g., Apache Spark, Databricks) and event streaming (e.g., Kafka)
Solid knowledge of modern data storage and serving technologies, such as CubeJS, Elasticsearch, and ClickHouse
Familiarity with observability tooling such as Splunk, Grafana, and Prometheus
Nice to have:
Experience within the financial domain
Hands-on design experience in crafting data processing patterns for a modern Lakehouse architecture
Excellent written and verbal communication skills tailored for diverse audiences (leadership, users, company-wide)
Ability to rapidly evaluate various technologies and conduct proofs of concept to drive architecture design
Experience thriving in a complex environment
What we offer:
Competitive salary plus valuable equity
Collaborative open office space with a fully stocked kitchen