The Data Infrastructure team at Figma builds and operates the foundational platforms that power analytics, AI, and data-driven decision-making across the company. We serve a diverse set of stakeholders, including AI Researchers, Machine Learning Engineers, Data Scientists, Product Engineers, and business teams that rely on data for insights and strategy. Our team owns and scales critical data platforms such as the Snowflake data warehouse, ML Datalake, and large-scale data movement and processing applications, managing all data flowing into and out of these platforms.
Job Responsibilities:
Design and build large-scale distributed data systems that power analytics, AI/ML, and business intelligence
Develop batch and streaming solutions to ensure data is reliable, efficient, and scalable across the company
Manage data ingestion, movement, and processing through core platforms like Snowflake, our ML Datalake, and real-time streaming systems
Improve data reliability, consistency, and performance, ensuring high-quality data for engineering, research, and business stakeholders
Collaborate with AI researchers, data scientists, product engineers, and business teams to understand data needs and build scalable solutions
Drive technical decisions and best practices for data ingestion, orchestration, processing, and storage
Requirements:
5+ years of Software Engineering experience, specifically in backend or infrastructure engineering
Experience designing and building distributed data infrastructure at scale
Strong expertise in batch and streaming data processing technologies such as Spark, Flink, Kafka, or Airflow/Dagster
A proven track record of impact-driven problem-solving in a fast-paced environment
A strong sense of engineering excellence, with a focus on high-quality, reliable, and performant systems
Excellent technical communication skills, with experience working with both technical and non-technical counterparts
Experience mentoring and supporting engineers, fostering a culture of learning and technical excellence
Nice to have:
Experience with data governance, access control, and cost optimization strategies for large-scale data platforms
Familiarity with our stack, including Golang, Python, SQL, frameworks such as dbt, and technologies like Spark, Kafka, Snowflake, and Dagster
Experience designing data infrastructure for AI/ML pipelines
The ability to navigate ambiguity, take ownership, and drive projects from inception to execution
What we offer:
Equity
Health, dental & vision coverage
Retirement plan with company contribution
Parental leave & reproductive or family planning support