Our systems operate at the massive speed and scale of algorithmic trading. We process terabytes of data and billions of rows each day, and every layer of the platform must be engineered for performance and reliability. As a small team, we are continuously evolving our analytics platform to keep pace with growing datasets and increasingly complex analytical workloads.

We are seeking a software engineer who enjoys building the systems that make large-scale data usable. This role sits at the intersection of data engineering and backend development, focused on building and improving the infrastructure that powers our analytics platform. That includes designing data pipelines, workflow orchestration, and internal tooling that supports high-volume, time-sensitive data processing.

The ideal candidate is comfortable working with both real-time and historical data streams and has experience managing end-to-end data workflows. You will help design and maintain the services, pipelines, and platform infrastructure that support our analytics initiatives, working closely with both engineering and the business to turn complex datasets into usable systems.
Job Responsibilities:
Work with engineers and business teams to understand our datasets and build systems to store, process, and productionize them
Develop and maintain reliable data pipelines and workflow systems for large-scale data ingestion and transformation
Build internal services and tooling that support analytics, automation, and data operations
Implement ETL and data transformation processes for analytical databases and internal applications
Develop and maintain workflow orchestration for complex data processes
Optimize storage, retrieval, and processing performance across large datasets
Partner with analysts and researchers to support data modeling and analytics workflows
Monitor and troubleshoot data infrastructure to ensure reliability and data integrity
Continuously improve the platform through better system design, tooling, and engineering practices
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent hands-on experience
3-5 years of experience building internal data platform tooling or services that support large-scale data processing and analytics workflows (primarily using Python and SQL; Rust is a plus)
Experience with workflow orchestration systems used to manage complex data pipelines (Temporal, Airflow, Dagster, or similar)
Familiarity with columnar data formats such as Parquet or Arrow
Experience working with large-scale datasets and high-volume data pipelines in production environments
Must be a US citizen or hold a valid US work visa
Nice to have:
Experience working with analytics platforms, data providers, or financial data systems