About the Team: The ClickPipes - Database Integrations team builds the platform that enables real-time data replication from databases into ClickHouse at petabyte scale. As a member of this team, you will be solving complex database-related challenges and distributed systems problems, such as understanding database internals to optimize snapshotting strategies, handling schema evolution during live replication, managing data type compatibility across systems, maintaining low end-to-end latency under unpredictable loads, and leveraging durable execution frameworks to ensure data consistency over unreliable networks. We work in the open: our database integrations are built on PeerDB, an open-source CDC platform we actively maintain and contribute to.
Job Responsibilities:
Build data-intensive systems
Design and develop high-throughput integrations with databases (Postgres, MySQL, MongoDB), data lakes (Iceberg, Delta Lake), and data warehouses (BigQuery, Snowflake)
Handle edge cases in real-world production scenarios: unconventional database setups, data type internals, database upgrades/failovers, large transactions, etc.
Design integration solutions to enable users to fully harness ClickHouse's performance and throughput
Own end-to-end reliability
Debug complex issues in production, leveraging runtime diagnostics (e.g. pprof, Parca) and observability tools (e.g. metrics, logging, tracing)
Build and improve infrastructure and tools to increase system reliability, reduce incident response time, and simplify/automate operations
Write clear documentation, both publicly and internally
Participate in on-call rotation
Drive product innovation
Work directly with customers to understand integration requirements and discover gaps in the existing product
Collaborate cross-functionally with internal teams to ensure operational efficiency
Lead technical discussions and influence product roadmaps
Requirements:
5+ years of industry experience building data-intensive software solutions
Proficient in Go, or experienced in systems programming with a willingness to ramp up quickly in Go
Cloud-native experience deploying and operating services on at least one major cloud platform (AWS/GCP/Azure)
Practical experience with Kubernetes
Strong problem-solving and production debugging skills
Clear communication, both written (design docs, code reviews) and verbal (technical discussions, customer calls, incident response)
Nice to have:
Experience with database replication technologies (CDC, logical replication)
Experience with durable execution frameworks (Temporal)
Experience with data formats and protocols (Avro, Parquet, Protobuf)
Experience with modern data processing frameworks (e.g. Kafka, Spark, Flink)
Experience with maintaining/contributing to open-source software
What we offer:
Flexible work environment - ClickHouse is a globally distributed, remote-friendly company. We currently operate in 20 countries
Healthcare - Employer contributions towards your healthcare
Equity in the company - Every new team member who joins our company receives stock options
Time off - Flexible time off in the US, generous entitlement in other countries
A $500 home office setup if you're a remote employee
Global Gatherings – We believe in the power of in-person connection and offer opportunities to engage with colleagues at company-wide offsites