We are on a mission to ensure everyone has access to medical expertise, no matter where they are. Half the world still lacks access to quality healthcare. Even in advanced systems, outcomes are uneven and clinicians are overwhelmed, while medical knowledge grows faster than any human capacity to absorb it. Corti is building the infrastructure to close that gap. Our AI platform expands access to medical expertise, reducing errors, restoring time to clinicians, and making care more affordable, accessible, and human again. There is no quality healthcare without a quality dialogue, and no reliable AI without a strong foundation. Help us build both.
Job Responsibilities:
Design and build LLM-powered product features used in production
Develop agentic workflows and frameworks that coordinate multiple AI components
Implement RAG (Retrieval-Augmented Generation) architectures using embeddings and vector search
Build systems for prompting, context engineering, and tool usage
Develop evaluation frameworks to measure LLM and agent performance
Work closely with product and platform teams to turn AI capabilities into reliable, scalable product features
Continuously improve system reliability, latency, and cost efficiency of AI pipelines
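To give candidates a concrete sense of the retrieval work described above, here is a minimal, self-contained sketch of the RAG pattern: embed documents, rank them by similarity to a query, and assemble the top matches into a grounded prompt. It is an illustration only, not Corti's implementation; the toy bag-of-words embedding stands in for a learned embedding model and a real vector database.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; production systems use learned
    # embedding models and a vector database instead.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    # Splice retrieved context into the prompt so the LLM answers
    # from grounded material rather than parametric memory alone.
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

In production the same shape holds, but retrieval runs against a vector index, embeddings come from a model, and the prompt is sent to an LLM whose answers are scored by an evaluation framework.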
Requirements:
Strong programming skills in Python and the ability to contribute to production-grade codebases
Hands-on experience with LLMs, including at least some of the following:
Training, fine-tuning, or post-training transformer-based models
Building or operating LLM inference services in production, including performance work
Experience with embeddings, vector databases, and semantic search