Akuna Capital is a leading proprietary trading firm specializing in options market making. We are a data-driven organization that treats data as a key competitive advantage essential to our success. The Akuna Data Engineering team is composed of world-class talent responsible for designing, building, and maintaining the systems, applications, and infrastructure needed to collect, store, process, manage, and query Akuna’s data assets. The team plays a crucial role in ensuring that trustworthy data is available, reliable, and accessible to support data-driven initiatives across Akuna’s Quant, Trading, and Business Operations units.
Job Responsibilities:
Work within a growing Data Engineering division supporting the strategic role of data at Akuna
Drive the ongoing design and expansion of our data platform across a wide variety of data sources, supporting an array of streaming, operational and research workflows
Work closely with Trading, Quant, Technology & Business Operations teams throughout the firm to identify how data is produced and consumed, helping to define and deliver high impact projects
Build and deploy batch and streaming pipelines to collect and transform our rapidly growing Big Data set within our hybrid cloud architecture utilizing Kubernetes/EKS, Kafka/MSK and Databricks/Spark
Mentor junior engineers in software and data engineering best practices
Produce clean, well-tested, and documented code with a clear design to support mission critical applications
Build automated data validation test suites that ensure data is processed and published in accordance with well-defined Service Level Agreements (SLAs) covering data quality, availability, and correctness
Challenge the status quo and help push our organization forward, as we grow beyond the limits of our current tech stack
Requirements:
BS/MS/PhD in Computer Science, Engineering, Physics, Math, or equivalent technical field
5+ years of professional experience developing software applications
Java/Scala experience required
Highly motivated and willing to take ownership of high-impact projects from day one
Prior hands-on experience with data platforms and technologies such as Delta Lake, Spark, Kubernetes, Kafka, ClickHouse, and/or Presto/Trino
Experience building large-scale batch and streaming pipelines with strict SLA and data quality requirements
Must possess excellent communication, analytical, and problem-solving skills
Recent hands-on experience with AWS cloud development, deployment, and monitoring is necessary
Demonstrated experience working on an Agile team employing software engineering best practices, such as GitOps and CI/CD, to deliver complex software projects
The ability to react quickly and accurately to rapidly changing market conditions, including solving math and coding problems under time pressure, is an essential function of the role
Nice to have:
Python experience
Demonstrated experience working with diverse data sets and frameworks across multiple domains – financial data experience not required, but a strong plus
What we offer:
Discretionary performance bonus
Comprehensive benefits package that may encompass employer-paid medical, dental, vision, retirement contributions, paid time off, and other benefits