The Streaming Platform team at Sentry is building the next generation of infrastructure that powers our ingestion pipelines and real-time data processing systems. Our platform ingests, processes, and distributes hundreds of thousands of events per second with low latency and high reliability. We are creating a system that makes it easy for Sentry engineers to deploy and run streaming applications at scale by abstracting away the complexity of Kafka, scaling consumers automatically, and managing state, so product teams can focus on building great experiences for developers. As part of this team, you will work on challenges at the intersection of distributed systems, real-time data processing, and developer experience. You will help us create a self-service streaming platform that improves stability, accelerates time to production, and reduces operational overhead.
Job Responsibilities:
Design, build, and operate components of our Streaming Platform, including Kafka, the streaming runtime, high-level APIs, and developer-facing abstractions
Implement resilient, high-throughput stream processing systems that handle unbounded datasets with strong correctness guarantees (delivery, checkpointing, watermarking, and more)
Build scalable automation and a control plane for Kafka fleet management, and improve its efficiency
Partner with product engineers to ensure our abstractions enable fast, reliable, and consistent ingestion pipelines
Improve observability, monitoring, and failover for mission-critical real-time systems
Requirements:
5+ years of software engineering experience, with a background in distributed systems, data infrastructure, or real-time streaming
Proficiency in a programming language such as Python, Rust, Go, or Java (we primarily use Python and Rust, but experience in similar languages is valuable)
Experience with streaming technologies such as Kafka, Flink, Spark Streaming, or similar tools
Strong understanding of partitioning, watermarks, windowing, stateful/stateless processing, and delivery guarantees
Experience building and operating systems on Kubernetes and in cloud environments such as AWS or GCP
Nice to have:
Experience with ClickHouse, Arrow, or other columnar data processing technologies, or modern streaming SQL engines such as Materialize or RisingWave
What we offer:
Incentive compensation, equity grants, paid time off, and group health insurance coverage