CrawlJobs

Quantitative Data Engineer


Susquehanna International Group

Location:
Dublin, Ireland

Contract Type:
Not provided

Salary:
Not provided

Job Description:

At Susquehanna, the frontier of trading is no longer just speed; it is also about mastering data. The Quantitative Engineering and Data group in Europe is a rapidly expanding team focused on creating the algorithms, processes, and datasets that enable Susquehanna to generate revenue from data. We partner with Quantitative Researchers, Traders, and other engineering teams in London, Dublin, and across the globe. We are looking for world-leading talent with the skills to write great Python code and navigate complex data and models.

Job Responsibility:

  • Member of the Quantitative Engineering and Data group
  • Responsible for enabling research and trading activities using a range of data science, analysis, and engineering skills
  • Write, test, and deploy Python code that creates new model features, analyses complex datasets, optimises algorithms/quant code, and defines data orchestration
  • Partner with researchers on revenue-generating projects, taking on coding tasks that optimise project delivery
  • Develop expertise in Susquehanna’s systems and business

Requirements:

  • Degree in a STEM subject with evidence of independent work, either from work experience, a Master's, or a PhD
  • 4+ years professional experience in Python with practical experience solving programming problems through algorithm design
  • 3+ years hands-on experience working with data manipulation libraries (e.g. Pandas, Polars, PySpark)
  • Experience working in a Linux environment
  • A desire to work closely with Researchers and accelerate their research
  • The ability to self-manage, self-motivate, and seek process improvement opportunities
  • Strong interpersonal and communication skills that enable you to collaborate effectively in a quantitative domain
  • Attention to detail and ability to react to changing priorities
  • A general interest in quantitative trading

Additional Information:

Job Posted:
February 01, 2026

Similar Jobs for Quantitative Data Engineer

Senior Data Engineer

As a Senior Software Engineer, you will play a key role in designing and buildin...
Location:
United States
Salary:
156000.00 - 195000.00 USD / Year
Apollo.io
Expiration Date:
Until further notice
Requirements:
  • 5+ years of experience in platform engineering, data engineering, or a data-facing role
  • Experience in building data applications
  • Deep knowledge of the data ecosystem, with an ability to collaborate cross-functionally
  • Bachelor's degree in a quantitative field (Physical / Computer Science, Engineering or Mathematics / Statistics)
  • Excellent communication skills
  • Self-motivated and self-directed
  • Inquisitive, able to ask questions and dig deeper
  • Organized, diligent, and great attention to detail
  • Acts with the utmost integrity
  • Genuinely curious and open
Job Responsibility:
  • Architect and build robust, scalable data pipelines (batch and streaming) to support a variety of internal and external use cases
  • Develop and maintain high-performance APIs using FastAPI to expose data services and automate data workflows
  • Design and manage cloud-based data infrastructure, optimizing for cost, performance, and reliability
  • Collaborate closely with software engineers, data scientists, analysts, and product teams to translate requirements into engineering solutions
  • Monitor and ensure the health, quality, and reliability of data flows and platform services
  • Implement observability and alerting for data services and APIs (think logs, metrics, dashboards)
  • Continuously evaluate and integrate new tools and technologies to improve platform capabilities
  • Contribute to architectural discussions, code reviews, and cross-functional projects
  • Document your work, champion best practices, and help level up the team through knowledge sharing
What we offer:
  • Equity
  • Company bonus or sales commissions/bonuses
  • 401(k) plan
  • At least 10 paid holidays per year
  • Flex PTO
  • Parental leave
  • Employee assistance program and wellbeing benefits
  • Global travel coverage
  • Life/AD&D/STD/LTD insurance
  • FSA/HSA and medical, dental, and vision benefits
  • Full-time

Ab Initio Data Engineer

The Applications Development Intermediate Programmer Analyst is an intermediate ...
Location:
Chennai and Pune, India
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree in a quantitative field (such as Engineering, Computer Science, Statistics, Econometrics)
  • Minimum 5 years of extensive experience in the design, build, and deployment of Ab Initio-based applications
  • Expertise in handling complex large-scale Data Lake and Warehouse environments
  • Hands-on experience writing complex SQL queries, exporting and importing large amounts of data using utilities
Job Responsibility:
  • Ability to design and build Ab Initio graphs (both continuous & batch) and Conduct>it Plans
  • Build Web-Service and RESTful graphs and create RAML or Swagger documentations
  • Complete understanding and analytical ability of Metadata Hub metamodel
  • Strong hands-on skills in multifile system-level programming, debugging, and optimization
  • Hands on experience in developing complex ETL applications
  • Good knowledge of RDBMS – Oracle, with ability to write complex SQL needed to investigate and analyze data issues
  • Strong in UNIX Shell/Perl Scripting
  • Build graphs interfacing with heterogeneous data sources – Oracle, Snowflake, Hadoop, Hive, AWS S3
  • Build application configurations for Express>It frameworks – Acquire>It, Spec-To-Graph, Data Quality Assessment
  • Build automation pipelines for Continuous Integration & Delivery (CI-CD), leveraging Testing Framework & JUnit modules, integrating with Jenkins, JIRA and/or Service Now
  • Full-time

Senior Manager, Data Engineering

You will build a team of talented engineers that will work cross functionally to...
Location:
San Jose, United States
Salary:
240840.00 - 307600.00 USD / Year
Archer Aviation
Expiration Date:
Until further notice
Requirements:
  • 6+ years of experience in a similar role, 2 of which are in a data leadership role
  • B.S. in a quantitative discipline such as Computer Science, Computer Engineering, Electrical Engineering, Mathematics, or a related field
  • Expertise with data engineering disciplines including data warehousing, database management, ETL processes, and ML model deployment
  • Experience with processing and storing telemetry data
  • Demonstrated experience with data governance standards and practices
  • 3+ years leading teams, including building and recruiting data engineering teams supporting diverse stakeholders
  • Experience with cloud-based data platforms such as AWS, GCP, or Azure
Job Responsibility:
  • Lead and continue to build a world-class team of engineers by providing technical guidance and mentorship
  • Design and implement scalable data infrastructure to ingest, process, store, and access multiple datasets supporting flight test, manufacturing and supply chain, and airline operations
  • Take ownership of data infrastructure to enable a highly scalable and cost-effective solution serving the needs of various business units
  • Build and support the development of novel tools to enable insight and decision making with teams across the organization
  • Evolve data engineering and AI strategy to align with the short and long term priorities of the organization
  • Help to establish a strong culture of data that is used throughout the company and industry
  • Lead initiatives to integrate AI capabilities in new and existing tools
  • Full-time

Python Data Engineer

The FX Data Analytics & AI Technology team, within Citi's FX Technology organiza...
Location:
Pune, India
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • 8 to 12 years' experience
  • Master’s degree or above (or equivalent education) in a quantitative discipline
  • Proven experience in software engineering and development, and a strong understanding of computer systems and how they operate
  • Excellent Python programming skills, including experience with relevant analytical and machine learning libraries (e.g., pandas, polars, numpy, sklearn, TensorFlow/Keras, PyTorch, etc.), in addition to visualization and API libraries (matplotlib, plotly, streamlit, Flask, etc)
  • Experience developing and implementing Gen AI applications from data in a financial context
  • Proficiency working with version control systems such as Git, and familiarity with Linux computing environments
  • Experience working with different database and messaging technologies such as SQL, KDB, MongoDB, Kafka, etc
  • Familiarity with data visualization and ideally development of analytical dashboards using Python and BI tools
  • Excellent communication skills, both written and verbal, with the ability to convey complex information clearly and concisely to technical and non-technical audiences
  • Ideally, some experience working with CI/CD pipelines and containerization technologies like Docker and Kubernetes
Job Responsibility:
  • Design, develop and implement quantitative models to derive insights from large and complex FX datasets, with a focus on understanding market trends and client behavior, identifying revenue opportunities, and optimizing the FX business
  • Engineer data and analytics pipelines using modern, cloud-native technologies and CI/CD workflows, focusing on consolidation, automation, and scalability
  • Collaborate with stakeholders across sales and trading to understand data needs, translate them into impactful data-driven solutions, and deliver these in partnership with technology
  • Develop and integrate functionality to ensure adherence with best-practices in terms of data management, need-to-know (NTK), and data governance
  • Contribute to shaping and executing the overall data strategy for FX in collaboration with the existing team and senior stakeholders
  • Full-time

Director, Data Engineering & Analytics

We are seeking a proven Data and Analytics leader to run our data team. This rol...
Location:
Washington, DC, United States
Salary:
165000.00 - 295625.00 USD / Year
Arcadia
Expiration Date:
Until further notice
Requirements:
  • Expert Data & Analytics leader with demonstrated experience processing and analyzing large-scale datasets (billions of records)
  • Deep expertise with Snowflake as a data platform, including performance optimization, cost management, and architecting for scale
  • Hands-on experience with modern data stack: dbt for transformation, Hex for analytics, and Fivetran/Airbyte for data ingestion
  • Have built and led Data & Analytics teams at high-growth SaaS companies, specifically those dealing with high-volume data processing
  • Experience with utility data, billing systems, or similar high-volume transactional data is highly valued
  • 12+ years in the workforce with significant experience in data-intensive environments
  • Top-notch technical skills covering both data and quantitative techniques: data facility, descriptive analytics, and predictive modeling
  • SQL and Python are a must, with demonstrated ability to write optimized queries for large-scale data processing
  • Experience with data governance, security, and compliance in handling sensitive customer data
Job Responsibility:
  • Build, lead, and scale a successful data organization
  • Oversee the processing and analysis of several million utility bills per month, ensuring data pipeline reliability, accuracy, and scalability
  • Ensure data quality and that we are building our products to give our customers the insight they need
  • Build a multi-year strategy around data infrastructure, enterprise data modeling, and processing capabilities
  • Build a framework for data investments that ensures we are appropriately balancing R&D with products that deliver strong return on investment
  • Lead the optimization and evolution of our Snowflake-based data architecture to handle exponential data growth
  • Own the enterprise unified data model and architecture that will power all of Arcadia’s applications and use cases
What we offer:
  • Competitive benefits and an equity component to the package
  • Full-time

Software Developer / Data Engineer in Proteomics

The Chair of Proteomics and Bioanalytics at the Technical University of Munich l...
Location:
Freising, Germany
Salary:
Not provided
jobRxiv
Expiration Date:
Until further notice
Requirements:
  • MSc or PhD in Computer Science, Bioinformatics, Computational Biology, or a related quantitative field
  • Solid experience in programming and system architecture design (e.g., Python), workflow management tools (e.g., Nextflow) and containerization (e.g., Docker)
  • Strong knowledge of relational database systems and schema design (e.g., MySQL), and web-development (e.g., Vue.js) including data visualization frameworks (e.g., D3.js)
  • Excellent collaborative skills and the ability to work in multidisciplinary teams
Job Responsibility:
  • Develop and maintain scalable data-processing pipelines for mass spectrometry-based phosphoproteomics and related omics data
  • Design, build, and document database systems and interfaces for managing, visualizing, and mining complex multi-omics datasets
  • Integrate quantification, annotation, and pathway information with genomic and transcriptomic data relevant to cancer signaling and drug response
  • Collaborate closely with proteomics experts, cancer biologists, and clinicians to transform experimental data into biologically and clinically meaningful insights
  • Implement data quality control and reproducibility workflows in accordance with regulatory requirements
  • Contribute to the development of APIs, web tools, and dashboards for internal users and collaborators
What we offer:
  • Join an interdisciplinary team of biochemists, cell biologists, bioinformaticians, and clinicians that uses the latest proteomic approaches to fight cancer
  • The Technical University of Munich is one of the best academic institutions in the world, and offers a stimulating work environment and excellent future perspectives
  • The position is initially available for two years but may be extended
  • The salary follows the TV-L scale
  • Full-time

Senior Software Engineer

We are seeking a highly skilled senior software engineer to join our team. This ...
Location:
Bengaluru, India
Salary:
Not provided
Rearc
Expiration Date:
Until further notice
Requirements:
  • Deep expertise in Java: Proven proficiency in designing, developing, and optimizing high-performance, multithreaded Java applications
  • Comprehensive SDLC Experience: Extensive experience with the entire software development lifecycle, including requirements gathering, architectural design, coding, testing (unit, integration, performance), deployment, and maintenance
  • Data Engineering & Financial Data Processing: Proven experience in data engineering, including building, maintaining, and optimizing complex data pipelines for real-time and historical financial stock market data
  • Financial Market Acumen: A strong background in the financial industry, with a working knowledge of financial instruments, market data (e.g., tick data, OHLC), and common financial models
  • Problem-Solving & Adaptability: Excellent problem-solving skills and the ability to work with complex and evolving requirements
  • Collaboration & Communication: Superior communication skills, capable of collaborating effectively with quantitative analysts, data scientists, and business stakeholders
  • Testing & CI/CD: A strong ability to work on development and all forms of testing, with working knowledge of CI/CD pipelines and deployments
  • Database Proficiency: Experience with various database technologies (SQL and NoSQL) and the ability to design database schemas for efficient storage and retrieval of financial data
Job Responsibility:
  • Design and Development: Architect, build, and maintain robust, scalable, and low-latency Java applications for processing real-time and historical financial stock market data
  • Data Pipeline Engineering: Engineer and manage sophisticated data pipelines using modern data technologies to ensure timely and accurate data availability for analytics and trading systems
  • Performance Optimization: Profile and optimize applications for maximum speed, scalability, and efficiency
  • System Integration: Integrate data from various financial market sources and ensure seamless data flow into downstream systems
  • Mentorship and Best Practices: Provide guidance and mentorship to other engineers, contribute to code reviews, and advocate for best practices
  • Operational Excellence: Participate in the full software development lifecycle, from initial design to production support, ensuring system reliability and performance

AI Data Engineer

The AI Data Engineer role involves designing and implementing cloud platforms fo...
Location:
San Juan, United States
Salary:
Not provided
Hewlett Packard Enterprise
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in computer science, engineering, information systems, or closely related quantitative discipline
  • 4-7 years’ experience
  • Strong programming skills in Python, Java, Golang, or JavaScript
  • Good understanding of distributed systems, event-driven programming paradigms, and designing for scale and performance
  • Experience with cloud-native applications, developer tools, managed services, and next-generation databases
  • Knowledge of DevOps practices like CI/CD, infrastructure as code, containerization, and orchestration using Kubernetes
  • Good written and verbal communication skills
  • Comfortable with AWS services
  • Familiarity with the landscape of big data exploration, visualization, and prototyping platforms
  • Familiarity with statistical and machine learning techniques
Job Responsibility:
  • Research, propose, design, implement, operate and maintain cloud platforms for big data exploration and visualization, in support of a team of data scientists
  • Deploy data science solutions into cloud environments
  • Work with data scientists to troubleshoot cloud workflows
  • Closely collaborate with our datalake team on cloud technologies
  • Identify and implement cost-saving strategies to reduce ongoing cloud expenses
  • Build CI/CD pipelines
  • Deploy and maintain orchestration and monitoring systems for big data processing
  • Help build images and containerize applications
What we offer:
  • Comprehensive suite of benefits that supports physical, financial, and emotional wellbeing
  • Specific programs catered to professional development
  • Inclusive working environment
  • Full-time