Infrastructure Engineer (Data & Automations)

ElevenLabs

Location:
United Kingdom

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We are looking for an Infrastructure Engineer to join our Core Platform team. As ElevenLabs scales, the systems and tooling needed to support our teams have grown significantly. As part of the Core Platform team, you will own the infrastructure that enables every team at ElevenLabs to move fast, safely, and at scale.

Job Responsibilities:

  • Taking end-to-end ownership of platform reliability and security, with a particular focus on improving security across our internal systems
  • Collaborating closely with the Infrastructure team to bridge platform needs with infra capabilities
  • Partnering with Growth, Finance and other internal teams to ensure they have the data and tooling they need
  • Owning the infrastructure underpinning our Automation Engineering teams - setting up internal services, building and maintaining ETLs, and connecting systems with one another
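The ETL work described in the responsibilities above can be sketched as a minimal extract-transform-load step in plain Python. This is only an illustration of the pattern, not ElevenLabs' actual stack; the source data, the `signups` table, and SQLite standing in for a warehouse are all hypothetical:

```python
import sqlite3

# Hypothetical source: in a real pipeline this would call an internal API
# or read from a production system.
def extract():
    return [
        {"team": "Growth", "signups": "120"},
        {"team": "Finance", "signups": "45"},
    ]

# Transform: normalize types and drop rows with missing values.
def transform(rows):
    return [(r["team"], int(r["signups"])) for r in rows if r["signups"]]

# Load into a reporting table (SQLite stands in for a warehouse here).
def load(conn, records):
    conn.execute("CREATE TABLE IF NOT EXISTS signups (team TEXT, count INTEGER)")
    conn.executemany("INSERT INTO signups VALUES (?, ?)", records)
    conn.commit()

def run_etl(conn):
    load(conn, transform(extract()))

conn = sqlite3.connect(":memory:")
run_etl(conn)
print(conn.execute("SELECT SUM(count) FROM signups").fetchone()[0])  # → 165
```

In practice, "connecting systems with one another" means each of these three stages talks to a different service, which is why keeping them as separate, individually testable functions is the usual design choice.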

Requirements:

  • Strong background in infrastructure engineering
  • Software engineering fundamentals: you understand what good code looks like and can write production-quality Python
  • Experience with cloud infrastructure, container orchestration, deployment systems, and security fundamentals
  • You take ownership - when you see a platform problem, you fix it

Nice to have:

  • Experience with Kubernetes
  • Experience with DBT
  • Experience building AI agents
  • Familiarity with CI/CD systems or developer experience tooling
  • Basics of how AI models work

What we offer:
  • Innovative culture
  • Growth paths
  • Learning & development: ElevenLabs proactively supports professional development through an annual discretionary stipend
  • Social travel: We also provide an annual discretionary stipend to meet up with colleagues each year, however you choose
  • Annual company offsite: Each year, we bring the entire team together in a new location - past offsites have included Croatia and Italy
  • Co-working: If you’re not located near one of our main hubs, we offer a monthly co-working stipend

Additional Information:

Job Posted:
May 03, 2026

Employment Type:
Fulltime
Work Type:
Remote work

Similar Jobs for Infrastructure Engineer (Data & Automations)

Research Engineer, Data Infrastructure

As a Research Engineer in Data Infrastructure, you will design and implement a...
Location:
United States, Palo Alto
Salary:
180000.00 - 250000.00 USD / Year
1X Technologies
Expiration Date
Until further notice
Requirements:
  • Strong experience in building data pipelines and ETL systems
  • Ability to design and implement systems that collect, upload, and manage data from robotic fleets
  • Familiarity with architectures combining on‑robot components, on‑premises clusters, and cloud systems
  • Experience with data labeling tools or building tooling for dataset visualization and annotation
  • Skills in creating or applying machine learning models for dataset organization / automated labeling
Job Responsibilities:
  • Optimize operational efficiency of data collection on the NEO fleet
  • Design triggers on the robot to determine if and when data should be uploaded
  • Automate ETL pipelines so fleet‑wide data is easily queryable and available for training
  • Work with external dataset providers to prepare diverse multi-modal pre-training datasets
  • Build frontend tools for visualizing and automating labeling of very large datasets
  • Develop machine learning models to automatically label and organize datasets
What we offer:
  • Health, dental, and vision insurance
  • 401(k) with company match
  • Paid time off and holidays
  • Fulltime

AI Research Engineer, Data Infrastructure

As a Research Engineer in Infrastructure, you will design and implement a robust...
Location:
United States, Palo Alto
Salary:
180000.00 - 250000.00 USD / Year
1X Technologies
Expiration Date
Until further notice
Requirements:
  • Strong experience in building data pipelines and ETL systems
  • Ability to design and implement systems for data collection and management from robotic fleets
  • Familiarity with architectures that span on-robot components, on-premise clusters, and cloud infrastructure
  • Experience with data labeling tools or building dataset visualization and annotation tooling
  • Proficiency in creating or applying machine learning models for dataset organization and automated labeling
Job Responsibilities:
  • Optimize operational efficiency of data collection across the NEO robot fleet
  • Design intelligent triggers to determine when and what data should be uploaded from the robots
  • Automate ETL pipelines to make fleet-wide data easily queryable and training-ready
  • Collaborate with external dataset providers to prepare diverse multi-modal pre-training datasets
  • Build frontend tools for visualizing and automating the labeling of large datasets
  • Develop machine learning models for automatic dataset labeling and organization
What we offer:
  • Equity
  • Health, dental, and vision insurance
  • 401(k) with company match
  • Paid time off and holidays
  • Fulltime

Data Infrastructure Engineer

The Data Infrastructure team builds distributed systems and tools supporting Int...
Location:
Ireland, Dublin
Salary:
Not provided
Intercom
Expiration Date
Until further notice
Requirements:
  • 3+ years of full-time, professional work experience in the data space using Python and SQL
  • Solid experience building and running data pipelines for large and complex datasets, including handling dependencies
  • Hands-on cloud provider experience (preferably AWS) including service integrations and automation via CLI and APIs
  • Solid understanding of data security practices and a passion for privacy
  • Some DevOps experience
  • You care about your craft
Job Responsibilities:
  • Evolve the Data Platform by designing and building the next generation of the stack
  • Develop, run and support our data pipelines using tools like Airflow, PlanetScale, Kinesis, Snowflake, Tableau, all in AWS
  • Collaborate with product managers, data engineers, analysts and data scientists to develop tooling and infrastructure to support their needs
  • Develop automation and tooling to support the creation and discovery of high quality analytics data in an environment where dozens of changes can be shipped daily
  • Implement systems to monitor our infrastructure, detect and surface data quality issues and ensure Operational Excellence
What we offer:
  • Competitive salary and equity in a fast-growing start-up
  • We serve lunch every weekday, plus a variety of snack foods and a fully stocked kitchen
  • Regular compensation reviews - we reward great work!
  • Pension scheme & match up to 4%
  • Peace of mind with life assurance, as well as comprehensive health and dental insurance for you and your dependents
  • Open vacation policy and flexible holidays so you can take time off when you need it
  • Paid maternity leave, as well as 6 weeks paternity leave for fathers, to let you spend valuable time with your loved ones
  • If you’re cycling, we’ve got you covered on the Cycle-to-Work Scheme. With secure bike storage too
  • MacBooks are our standard, but we also offer Windows for certain roles when needed
  • Fulltime

Senior Software Engineer, Data Engineering

Join us in building the future of finance. Our mission is to democratize finance...
Location:
United States, Menlo Park
Salary:
146000.00 - 198000.00 USD / Year
Robinhood
Expiration Date
Until further notice
Requirements:
  • 5+ years of professional experience building end-to-end data pipelines
  • Hands-on software engineering experience, with the ability to write production-level code in Python for user-facing applications, services, or systems (not just data scripting or automation)
  • Expert at building and maintaining large-scale data pipelines using open source frameworks (Spark, Flink, etc)
  • Strong SQL (Presto, Spark SQL, etc) skills
  • Experience solving problems across the data stack (Data Infrastructure, Analytics and Visualization platforms)
  • Expert collaborator with the ability to democratize data through actionable insights and solutions
Job Responsibilities:
  • Help define and build key datasets across all Robinhood product areas. Lead the evolution of these datasets as use cases grow
  • Build scalable data pipelines using Python, Spark and Airflow to move data from different applications into our data lake
  • Partner with upstream engineering teams to enhance data generation patterns
  • Partner with data consumers across Robinhood to understand consumption patterns and design intuitive data models
  • Ideate and contribute to shared data engineering tooling and standards
  • Define and promote data engineering best practices across the company
What we offer:
  • Market competitive and pay equity-focused compensation structure
  • 100% paid health insurance for employees with 90% coverage for dependents
  • Annual lifestyle wallet for personal wellness, learning and development, and more
  • Lifetime maximum benefit for family forming and fertility benefits
  • Dedicated mental health support for employees and eligible dependents
  • Generous time away including company holidays, paid time off, sick time, parental leave, and more
  • Lively office environment with catered meals, fully stocked kitchens, and geo-specific commuter benefits
  • Bonus opportunities
  • Equity
  • Fulltime

Software Engineer, Data Engineering

Join us in building the future of finance. Our mission is to democratize finance...
Location:
Canada, Toronto
Salary:
124000.00 - 145000.00 CAD / Year
Robinhood
Expiration Date
Until further notice
Requirements:
  • 3+ years of professional experience building end-to-end data pipelines
  • Hands-on software engineering experience, with the ability to write production-level code in Python for user-facing applications, services, or systems (not just data scripting or automation)
  • Expert at building and maintaining large-scale data pipelines using open source frameworks (Spark, Flink, etc)
  • Strong SQL (Presto, Spark SQL, etc) skills
  • Experience solving problems across the data stack (Data Infrastructure, Analytics and Visualization platforms)
  • Expert collaborator with the ability to democratize data through actionable insights and solutions
Job Responsibilities:
  • Help define and build key datasets across all Robinhood product areas. Lead the evolution of these datasets as use cases grow
  • Build scalable data pipelines using Python, Spark and Airflow to move data from different applications into our data lake
  • Partner with upstream engineering teams to enhance data generation patterns
  • Partner with data consumers across Robinhood to understand consumption patterns and design intuitive data models
  • Ideate and contribute to shared data engineering tooling and standards
  • Define and promote data engineering best practices across the company
What we offer:
  • Bonus opportunities
  • Equity
  • Benefits
  • Fulltime

Lead Infrastructure and Automation Engineer

The Lead Infrastructure and Automation Engineer is responsible for developing, m...
Location:
United Kingdom, London
Salary:
Not provided
Community Fibre
Expiration Date
Until further notice
Requirements:
  • Minimum of 7-10 years of experience working in a server and storage environment
  • Advanced understanding of Linux (Ubuntu / Debian, and CentOS / RHEL)
  • Experience with infrastructure as code (IAC) practices
  • Configuration management: Puppet and Ansible
  • Orchestration: Terraform
  • DCIM/IPAM tools, e.g. NetBox
  • Log ingestion: OpenSearch/Elasticsearch, Logstash, Kibana, Filebeat, Syslog, Graylog
  • Containerisation: Docker and / or Kubernetes
  • Virtualisation: VMware 7.x
  • Cloud: AWS
Job Responsibilities:
  • Develop, maintain, improve, and support Community Fibre’s infrastructure service environments
  • Manage and maintain existing Linux based servers, both on-prem and cloud hosted
  • Work with the Network Technology Team
  • Build new servers / systems
  • Ensure existing ones are maintained, reliable and resilient
  • Cover backend systems engineering, infrastructure, and site reliability engineering
  • Provide guidance and mentoring to other engineers
  • Create and implement high and low-level designs
  • Act as a senior member for the Network Technology team
  • Provide reports to SLT, Exec, and Board members when required
What we offer:
  • 25 days holiday, increasing by 1 day for each year of service up to 28 days
  • Birthday leave
  • Cycle to work scheme
  • Flexible WFH policy
  • Private Health Cover
  • Fulltime

Automation NoSQL Data Engineer

HPE Operations is our innovative IT services organization. It provides the exper...
Location:
India, Bangalore
Salary:
Not provided
Hewlett Packard Enterprise
Expiration Date
Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Information Systems, or equivalent
  • 7+ years of demonstrated experience working in software development teams with a strong focus on NoSQL databases and distributed data systems
  • Strong experience in automated deployment, troubleshooting, and fine-tuning technologies such as Apache Cassandra, Clickhouse, MongoDB, Apache Spark, Apache Flink, Apache Airflow, and similar technologies
  • Strong knowledge of NoSQL databases such as Apache Cassandra, Clickhouse, and MongoDB, including their installation, configuration, and performance tuning in production environments
  • Expertise in deploying and managing real-time data processing pipelines using Apache Spark, Apache Flink, and Apache Airflow
  • Experience in deploying and managing Apache Spark and Apache Flink operators on Kubernetes and other containerized environments, ensuring high availability and scalability of data processing jobs
  • Hands-on experience in configuring and optimizing Apache Spark and Apache Flink clusters, including fine-tuning resource allocation, fault tolerance, and job execution
  • Proficiency in authoring, automating, and optimizing Apache Airflow DAGs for orchestrating complex data workflows across Spark and Flink jobs
  • Strong experience with container orchestration platforms (like Kubernetes) to deploy and manage Spark/Flink operators and data pipelines
  • Proficiency in creating, managing, and optimizing Airflow DAGs to automate data pipeline workflows, handle retries, task dependencies, and scheduling
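The DAG concepts the requirements above call for (task dependencies, retries, ordered execution) can be illustrated with a tiny stdlib-only scheduler. This is a toy sketch of the ideas only; a real deployment would express the graph with Airflow's `DAG` and operator APIs, and the `extract`/`transform`/`load` task names here are hypothetical:

```python
# Toy task graph: each task lists the tasks it depends on.
# In Airflow, dependencies would be declared between operators instead.
DEPS = {
    "extract": [],
    "transform": ["extract"],
    "load": ["transform"],
}

def run_dag(tasks, deps, retries=2):
    """Run tasks in dependency order, retrying each up to `retries` times."""
    done, order = set(), []
    while len(done) < len(deps):
        for name in deps:
            # Skip tasks already run or with unmet dependencies.
            if name in done or any(d not in done for d in deps[name]):
                continue
            for attempt in range(retries + 1):
                try:
                    tasks[name]()
                    break
                except Exception:
                    if attempt == retries:
                        raise  # retries exhausted: fail the run
            done.add(name)
            order.append(name)
    return order

flaky = {"count": 0}

def transform():
    # Fail on the first call to exercise the retry path.
    flaky["count"] += 1
    if flaky["count"] == 1:
        raise RuntimeError("transient failure")

tasks = {"extract": lambda: None, "transform": transform, "load": lambda: None}
print(run_dag(tasks, DEPS))  # → ['extract', 'transform', 'load']
```

Note that this sketch assumes the graph is acyclic; production schedulers detect cycles and also handle concurrency, backoff between retries, and cron-style scheduling, which are omitted here.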
Job Responsibilities:
  • Think through complex data engineering problems in a fast-paced environment and drive solutions to reality
  • Work in a dynamic, collaborative environment to build DevOps-centered data solutions using the latest technologies and tools
  • Provide engineering-level support for data tools and systems deployed in customer environments
  • Respond quickly and professionally to customer emails/requests for assistance
What we offer:
  • Health & Wellbeing
  • Personal & Professional Development
  • Unconditional Inclusion
  • Fulltime

Senior Data Engineer

As a Senior Software Engineer, you will play a key role in designing and buildin...
Location:
United States
Salary:
156000.00 - 195000.00 USD / Year
Apollo.io
Expiration Date
Until further notice
Requirements:
  • 5+ years experience in platform engineering, data engineering or in a data facing role
  • Experience in building data applications
  • Deep knowledge of the data ecosystem with an ability to collaborate cross-functionally
  • Bachelor's degree in a quantitative field (Physical / Computer Science, Engineering or Mathematics / Statistics)
  • Excellent communication skills
  • Self-motivated and self-directed
  • Inquisitive, able to ask questions and dig deeper
  • Organized, diligent, and great attention to detail
  • Acts with the utmost integrity
  • Genuinely curious and open
Job Responsibilities:
  • Architect and build robust, scalable data pipelines (batch and streaming) to support a variety of internal and external use cases
  • Develop and maintain high-performance APIs using FastAPI to expose data services and automate data workflows
  • Design and manage cloud-based data infrastructure, optimizing for cost, performance, and reliability
  • Collaborate closely with software engineers, data scientists, analysts, and product teams to translate requirements into engineering solutions
  • Monitor and ensure the health, quality, and reliability of data flows and platform services
  • Implement observability and alerting for data services and APIs (think logs, metrics, dashboards)
  • Continuously evaluate and integrate new tools and technologies to improve platform capabilities
  • Contribute to architectural discussions, code reviews, and cross-functional projects
  • Document your work, champion best practices, and help level up the team through knowledge sharing
What we offer:
  • Equity
  • Company bonus or sales commissions/bonuses
  • 401(k) plan
  • At least 10 paid holidays per year
  • Flex PTO
  • Parental leave
  • Employee assistance program and wellbeing benefits
  • Global travel coverage
  • Life/AD&D/STD/LTD insurance
  • FSA/HSA and medical, dental, and vision benefits
  • Fulltime