Join our Data Platform team at 10x Genomics to architect and implement our strategic Unified Data Platform (UDP). This pivotal role is focused on modernizing our data infrastructure, transitioning to a scalable Event-Driven Architecture (EDA), and building the foundation for next-generation AI/ML and self-service analytics.
Job Responsibilities:
Architect and implement the canonical data layer and Event-Driven Architecture (EDA) using technologies such as Apache Iceberg and Kafka to decouple applications and ensure real-time data flow (see the pipeline sketch following this list)
Design, build, and optimize high-volume, code-first data pipelines (real-time and batch) across a large application landscape (e.g., Salesforce, Oracle, Workday)
Establish Amazon S3 as the Single Source of Truth (SSOT) and govern data using principles like the Medallion Architecture (Silver and Gold layers) and schema evolution
Develop, test, and maintain robust and scalable ELT pipelines and data models in Snowflake, leveraging advanced features such as Snowpipe, Streams, and Stored Procedures
Develop the data presentation layer for self-service analytics, including the Natural Language Query (NLQ) interface integrated with Generative AI (e.g., Amazon Bedrock)
Lead technical efforts to migrate key business domains off legacy middleware and onto the new platform, eliminating the 'Integration Bottleneck'
Define and enforce data governance, quality, and security standards across the Unified Data Platform
Collaborate with the Architecture Review Board (ARB) to promote modern approaches such as serverless computing and Domain-Driven Design
Take ownership of the full development lifecycle, from prototyping and design through deployment, monitoring, and operational excellence
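To make the event-driven landing pattern in the responsibilities above concrete, here is a minimal sketch in Python: a Kafka consumer reads application change events and writes them to S3 as raw objects for downstream processing. The broker address, topic name (orders.events), bucket name (udp-lake), and key layout are all illustrative assumptions, not details taken from this posting.

```python
# Hypothetical sketch: consume application events from Kafka and land them
# in S3 as the single source of truth. Topic, bucket, and prefix names are
# illustrative, not taken from this job description.
import json
from datetime import datetime, timezone

import boto3
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # assumed broker address
    "group.id": "udp-landing",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders.events"])  # hypothetical topic

s3 = boto3.client("s3")
BUCKET = "udp-lake"  # hypothetical bucket

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        # Partition raw objects by date so downstream (Silver/Gold) jobs
        # can pick up new data incrementally.
        ts = datetime.now(timezone.utc)
        key = f"raw/orders/{ts:%Y/%m/%d}/{msg.offset()}.json"
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(event))
finally:
    consumer.close()
```

Because producers only write to the topic and consumers only read from S3, the applications stay decoupled, which is the point of the EDA described above.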
Requirements:
Bachelor's degree in Computer Science, Information Management, or a related field, or equivalent experience
5+ years of hands-on experience in software engineering focused on data platform development, distributed systems, or enterprise integrations
Proven experience designing and implementing highly scalable data platforms on major cloud environments (e.g., AWS, GCP, or Azure)
Deep proficiency in one or more general-purpose programming languages (e.g., Python, Java, or similar)
Strong foundation in computer science fundamentals, including data structures, algorithms, and system design
Nice to have:
Expertise in message queues and event streaming platforms (e.g., Kafka, RabbitMQ, Pub/Sub) and implementing Event-Driven Architecture
Experience with building data lakes/lakehouses using open formats like Apache Iceberg on cloud storage (e.g., Amazon S3)
Expertise in modern ELT development, data modeling for OLAP/data warehousing using tools like dbt, and advanced Snowflake features (e.g., Snowpipe, Streams, Stored Procedures; see the Streams sketch following this list)
Familiarity with containerization (Docker, Kubernetes) and Infrastructure-as-Code (IaC) principles
Prior experience in migrating an organization off a traditional iPaaS platform or eliminating legacy middleware
Experience with Generative AI integration for data access (e.g., NLQ, feature stores)
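As a sketch of the incremental ELT pattern referenced above, the snippet below uses a Snowflake Stream to capture new rows in a raw table and a MERGE to promote them into a modeled table. The connection parameters and object names (RAW.ORDERS, ANALYTICS.ORDERS) are hypothetical placeholders, not systems named in this posting.

```python
# Hypothetical sketch of incremental ELT in Snowflake using a Stream:
# the stream tracks changes on a raw table, and a MERGE promotes them
# into a modeled table. All object names are illustrative.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",  # assumed account and credentials
    user="etl_user",
    password="...",
    warehouse="ETL_WH",
    database="UDP",
    schema="RAW",
)
cur = conn.cursor()

# A stream records inserts/updates on the raw table since its last read.
cur.execute("CREATE STREAM IF NOT EXISTS RAW.ORDERS_STREAM ON TABLE RAW.ORDERS")

# Consuming the stream in a committed DML statement advances its offset,
# so each run processes only new changes.
cur.execute("""
    MERGE INTO ANALYTICS.ORDERS AS tgt
    USING RAW.ORDERS_STREAM AS src
      ON tgt.ORDER_ID = src.ORDER_ID
    WHEN MATCHED THEN UPDATE SET tgt.STATUS = src.STATUS
    WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS)
      VALUES (src.ORDER_ID, src.STATUS)
""")
conn.close()
```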
What we offer:
Equity grants
Comprehensive health and retirement benefit programs