Akuna Capital is an innovative trading firm with a strong focus on collaboration, cutting-edge technology, data-driven solutions, and automation. We specialize in providing liquidity as an options market maker, meaning we are committed to posting competitive quotes at which we are willing to both buy and sell. To do this successfully, we design and implement our own low-latency technologies, trading strategies, and mathematical models.

The Akuna Data Engineering team is composed of world-class talent responsible for designing, building, and maintaining the systems, applications, and infrastructure needed to collect, store, process, manage, and query Akuna's data assets. The team plays a crucial role in ensuring that trustworthy data is available, reliable, and accessible to support data-driven initiatives across Akuna's Quant, Trading, and Business Operations business units.
Job Responsibilities:
Work within a growing Data Engineering division supporting the strategic role of data at Akuna
Drive the ongoing design and expansion of our data platform across a wide variety of data sources, supporting an array of streaming, operational, and research workflows
Work closely with Trading, Quant, Technology, and Business Operations teams throughout the firm to identify how data is produced and consumed, helping to define and deliver high-impact projects
Build and deploy batch and streaming pipelines to collect and transform our rapidly growing big data set within our hybrid-cloud architecture, utilizing Kubernetes/EKS, Kafka/MSK, and Databricks/Spark
Mentor junior engineers in software and data engineering best practices
Produce clean, well-tested, and documented code with a clear design to support mission-critical applications
Build automated data validation test suites that ensure data is processed and published in accordance with well-defined Service Level Agreements (SLAs) covering data quality, availability, and correctness
Challenge the status quo and help push our organization forward, as we grow beyond the limits of our current tech stack
Requirements:
BS/MS/PhD in Computer Science, Engineering, Physics, Math, or equivalent technical field
5+ years of professional experience developing software applications
Java/Scala experience required
Highly motivated and willing to take ownership of high-impact projects upon arrival
Prior hands-on experience with data platforms and technologies such as Delta Lake, Spark, Kubernetes, Kafka, ClickHouse, and/or Presto/Trino
Experience building large-scale batch and streaming pipelines with strict SLA and data quality requirements
Must possess excellent communication, analytical, and problem-solving skills
Demonstrated experience working with diverse data sets and frameworks across multiple domains
Recent hands-on experience with AWS cloud development, deployment, and monitoring is necessary
Demonstrated experience working on an Agile team employing software engineering best practices, such as GitOps and CI/CD, to deliver complex software projects
The ability to react quickly and accurately to rapidly changing market conditions, including the ability to respond to and solve math and coding problems quickly and accurately, is an essential function of the role
Nice to have:
Python experience is a significant plus
Financial data experience is not required, but is a strong plus
What we offer:
Discretionary performance bonus
Comprehensive benefits package that may include employer-paid medical, dental, and vision coverage, retirement contributions, paid time off, and other benefits