The Fixed Income Data team is growing rapidly and is committed to delivering cutting-edge, data-centric solutions across critical domains including sales, risk, pricing, quants, trading, execution, and trade processing. The successful candidate will collaborate on the design and implementation of data-processing solutions leveraging APIs, big data platforms, and microservices.
Job Responsibility:
design, develop, and implement highly scalable and resilient API services for data access and processing, leveraging big data platforms (see the illustrative sketch after this list)
conduct feasibility studies and time and cost estimates for new API-driven data solutions, and establish and implement new or revised applications and systems to meet the needs of specific business or user areas
monitor and control all phases of the development process (analysis, design, construction, testing, and deployment) for API-driven data applications, providing operational support
utilize in-depth specialty knowledge of API development for big data environments and analytics to analyze complex problems/issues, evaluate business processes, system processes, and industry standards, and make evaluative judgments
ensure essential procedures are followed and help define operating standards and processes for API-driven data infrastructure
serve as an advisor or coach to new or junior analysts on API development and big data access best practices
operate with a limited level of direct supervision, exercising independence of judgment and autonomy
act as a Subject Matter Expert (SME) to senior stakeholders and/or other team members on data API technologies and their application in finance
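To make the kind of API service described above more concrete, the following minimal sketch shows a hypothetical Spring Boot microservice exposing fixed income data over REST; the package, class names, endpoints, and sample instruments are illustrative assumptions rather than details of the actual role or platform.

```java
// Hypothetical sketch only: names, endpoints, and data are illustrative.
package com.example.fixedincome;

import java.util.List;
import java.util.Map;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
public class BondDataApplication {
    public static void main(String[] args) {
        SpringApplication.run(BondDataApplication.class, args);
    }
}

@RestController
class BondPriceController {

    // In a real service this would query a big data store (e.g., Trino or Pinot);
    // a static map stands in for the data layer here.
    private static final Map<String, Double> PRICES = Map.of(
            "US912828U816", 99.42,
            "DE0001102580", 101.15);

    // GET /bonds/{isin}/price returns the latest price, or 404 if the ISIN is unknown.
    @GetMapping("/bonds/{isin}/price")
    public ResponseEntity<Double> latestPrice(@PathVariable String isin) {
        Double price = PRICES.get(isin);
        return price == null ? ResponseEntity.notFound().build()
                             : ResponseEntity.ok(price);
    }

    // GET /bonds lists the instruments the service knows about.
    @GetMapping("/bonds")
    public List<String> knownInstruments() {
        return List.copyOf(PRICES.keySet());
    }
}
```

In this sketch, a request such as GET /bonds/US912828U816/price returns the stored price or a 404 for an unknown ISIN; a production service would replace the in-memory map with queries against a big data engine and add the resilience and security layers listed below.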
Requirements:
3-5 years of demonstrable and relevant experience in software development, with a strong focus on API development and big data solutions
expertise in developing high-performance APIs for large-scale data platforms and distributed systems
extensive hands-on experience with data distribution platforms such as Apache Kafka, and with big data storage/querying systems (e.g., Trino, Pinot, Druid, Ignite) for low-latency access via APIs (see the illustrative consumer sketch after this list)
solid understanding of Java / Scala with a focus on building high-performance, concurrent applications
strong experience with the Spring stack, particularly Spring Boot for building microservices that expose data via APIs
expert-level understanding and demonstrable experience in REST API development for data reporting and consumption
demonstrable experience in writing reusable, testable, and efficient code with proper error and exception handling, especially for fault-tolerant API services
experience with the design and implementation of cloud-native applications and deployment via Kubernetes / OpenShift, specifically for managing API-driven data services
hands-on experience in handling various data structures and optimizing them for API consumption and analytical queries
experience with API Gateway, Circuit Breaker, Spring Security, Discovery Server, and monitoring services (e.g., Prometheus, Grafana) is a plus, particularly in an API-driven data ecosystem
good understanding of data modeling, partitioning, and sharding of large data sets for optimal performance in large-scale data platforms accessed via APIs
experience working on a Continuous Integration and Continuous Delivery (CI/CD) environment, with a focus on rapid and reliable deployment of API services and data access layers
familiarity with TeamCity, SonarQube, and Jenkins
experience with the full SDLC and with working in an Agile environment, adapting to fast-paced data requirements
demonstrable understanding and experience of engineering best practices: design patterns, coding standards, code review, and robust unit/integration testing (e.g., JUnit, Mockito) for API services
strong experience with standard CI tools (Jenkins, TeamCity, SonarQube, Git)
strong communication skills, oral and written, essential for explaining complex API-driven data architectures and solutions to business stakeholders
ability to apply sound technical skills and knowledge of fixed income business to develop creative API-driven data solutions to meet client and business needs
responsible, agile, and collaborative team worker, crucial for cross-functional API and data projects
ability to develop strong relationships with others, effectively influencing peers and business partners
self-motivated and organized, with determination to achieve goals
ability to work autonomously when required in a fast-paced environment
ability to work directly with all business users (traders, financial controllers, risk managers, etc.) to understand their data access and API needs
flexibility and ability to deliver quality results in the required timeframe
flexibility to work with a global team, across geographies and time zones
strong academic record, ideally with a bachelor's or master's degree in computer science or a related technical/quantitative discipline
ideally, an understanding of financial derivatives (fixed income products) or willingness to learn about this area to effectively apply API-driven data insights
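As a rough illustration of the Kafka-based data distribution mentioned in the requirements, the sketch below consumes bond price ticks using the standard Kafka Java client; the broker address, topic name, consumer group, and string-encoded payloads are placeholder assumptions, not details of the actual platform.

```java
// Hypothetical sketch only: broker, topic, and group id are placeholders.
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class PriceTickConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "fixed-income-api");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("bond-price-ticks"));
            while (true) {
                // Poll for new price ticks; key is assumed to be the ISIN,
                // value the price payload. Here they are simply printed.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("%s -> %s%n", record.key(), record.value());
                }
            }
        }
    }
}
```

A real ingestion path would typically deserialize structured payloads (e.g., Avro) and feed a low-latency store that the API layer queries, rather than printing records to stdout.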