Design, implement and evaluate reproducible fine-tuning and adaptation workflows to incorporate domain context (glossaries, prior translation examples, and English/style guidelines) into LLM outputs that support WOAH’s translation, summarisation and RRA text-generation tasks. The project should prioritise practical, maintainable pipelines that DID staff can run or orchestrate from R and/or command-line tools, and that clearly document trade-offs between cloud API tuning and local/lightweight adaptation.
Job Responsibilities:
Work with WOAH editors/linguists to gather glossaries, style guides, and a representative set of prior translations and editorial corrections
Define the target tasks and acceptance criteria
Review and compare adaptation approaches (e.g., cloud API fine-tuning versus local/lightweight adaptation)
Propose two to three candidate strategies for different operational contexts
Prepare training/validation datasets in the formats required by the selected fine-tuning platforms
Implement reproducible scripts (R and/or Python) to perform tokenisation estimates, convert formats, and upload or launch fine-tuning jobs (see the data-preparation sketch after this list)
Conduct controlled fine-tuning/adaptation experiments on at least two approaches
Evaluate using quantitative metrics and qualitative review by WOAH editors (see the evaluation example after this list)
Document expected cloud fine-tuning costs based on WOAH datasets
Analyse operational implications
Deliver clear step-by-step pipelines, annotated code, and operational instructions for DID staff
Produce a final technical report, reproducible notebooks, and an executive summary for non-technical stakeholders
Develop a prototype graphical interface (in Dash or RShiny) to allow WOAH staff to upload small samples of text for testing, select between models, provide glossary terms or contextual examples, and view side-by-side outputs for editorial evaluation (see the interface sketch after this list)
Ensure that the GUI is lightweight, easy to deploy locally, and documented
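For the data-preparation and tokenisation-estimate responsibility, a minimal Python sketch could look like the following. It assumes a hypothetical translations.csv file with "source" and "target" columns, an OpenAI-style chat JSONL format as the fine-tuning input, and the tiktoken library for a rough token estimate; the actual column names, schema, and tokenizer would depend on the fine-tuning platform selected during the project.

```python
# Minimal sketch: convert prior translations to chat-style JSONL and estimate token counts.
# translations.csv and its "source"/"target" columns are hypothetical; adapt the column
# names and the JSONL schema to whatever the chosen fine-tuning platform requires.
import csv
import json

import tiktoken  # pip install tiktoken

ENCODING = tiktoken.get_encoding("cl100k_base")  # rough proxy; the real tokenizer depends on the model
SYSTEM_PROMPT = "Translate the following WOAH text from English to French, following the glossary."

def build_example(source_text: str, target_text: str) -> dict:
    """Build one chat-format training example (OpenAI-style fine-tuning JSONL)."""
    return {
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": source_text},
            {"role": "assistant", "content": target_text},
        ]
    }

def count_tokens(example: dict) -> int:
    """Approximate token count for one example (message contents only)."""
    return sum(len(ENCODING.encode(m["content"])) for m in example["messages"])

def main(csv_path: str = "translations.csv", out_path: str = "train.jsonl") -> None:
    total_tokens = 0
    n_examples = 0
    with open(csv_path, newline="", encoding="utf-8") as f_in, \
         open(out_path, "w", encoding="utf-8") as f_out:
        for row in csv.DictReader(f_in):
            example = build_example(row["source"], row["target"])
            total_tokens += count_tokens(example)
            n_examples += 1
            f_out.write(json.dumps(example, ensure_ascii=False) + "\n")
    print(f"Wrote {n_examples} examples to {out_path}; ~{total_tokens} tokens (rough estimate).")

if __name__ == "__main__":
    main()
```

The token estimate is what feeds the cloud fine-tuning cost documentation mentioned above, since most API providers price tuning per token.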
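For the quantitative side of the evaluation step, one common starting point is corpus-level BLEU and chrF as computed by the sacrebleu library; the choice of metrics and the placeholder sentences below are assumptions for illustration, not requirements stated in the posting.

```python
# Minimal sketch: corpus-level BLEU and chrF with sacrebleu (pip install sacrebleu).
# The sentences are placeholders; in practice the hypotheses would be model outputs
# and the references would be WOAH's prior human translations.
import sacrebleu

hypotheses = [
    "La santé animale est une priorité mondiale.",
    "Le rapport a été publié la semaine dernière.",
]
references = [
    "La santé animale est une priorité mondiale.",
    "Le rapport a été publié la semaine passée.",
]

# sacrebleu expects a list of reference streams, hence the extra list nesting.
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
chrf = sacrebleu.corpus_chrf(hypotheses, [references])

print(f"BLEU: {bleu.score:.1f}")
print(f"chrF: {chrf.score:.1f}")
```

Automatic scores like these would complement, not replace, the qualitative review by WOAH editors.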
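The side-by-side comparison interface could be prototyped in Dash roughly as sketched below (assuming a recent Dash 2.x release). The component IDs, the model names, and the run_model placeholder are illustrative assumptions; the placeholder would be replaced with real calls to the candidate models.

```python
# Minimal Dash sketch for side-by-side model comparison (pip install dash).
# run_model() is a hypothetical placeholder; replace it with real calls to the candidate models.
from dash import Dash, Input, Output, State, dcc, html

MODELS = ["baseline-model", "fine-tuned-model"]  # illustrative names

def run_model(model_name: str, text: str, glossary: str) -> str:
    """Placeholder: echo the inputs instead of calling a real model."""
    return f"[{model_name}] output for input of {len(text)} characters (glossary: {glossary or 'none'})"

app = Dash(__name__)
app.layout = html.Div(
    [
        html.H3("WOAH LLM comparison prototype"),
        dcc.Textarea(id="input-text", placeholder="Paste a small text sample...",
                     style={"width": "100%", "height": 120}),
        dcc.Textarea(id="glossary", placeholder="Optional glossary terms or contextual examples...",
                     style={"width": "100%", "height": 60}),
        dcc.Dropdown(id="models", options=MODELS, value=MODELS, multi=True),
        html.Button("Run", id="run-button", n_clicks=0),
        html.Div(id="outputs", style={"display": "flex", "gap": "1em", "marginTop": "1em"}),
    ]
)

@app.callback(
    Output("outputs", "children"),
    Input("run-button", "n_clicks"),
    State("input-text", "value"),
    State("glossary", "value"),
    State("models", "value"),
    prevent_initial_call=True,
)
def show_outputs(n_clicks, text, glossary, models):
    """Render one column per selected model for side-by-side editorial review."""
    if not text or not models:
        return html.P("Provide a text sample and select at least one model.")
    return [
        html.Div([html.H4(m), html.Pre(run_model(m, text, glossary or ""))], style={"flex": 1})
        for m in models
    ]

if __name__ == "__main__":
    app.run(debug=True)
```

Running the script locally with python app.py starts the interface in a browser, which keeps the prototype lightweight and easy to deploy locally as required.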
Requirements:
Master's level (or equivalent) in data science; alternatively, in veterinary science, life science, international affairs, public health, or agricultural science with formal training or experience in data science
Advanced level in R or Python
Analytical and problem-solving skills
Good working knowledge of Microsoft Office applications such as Excel, Word and PowerPoint
Excellent knowledge of English, both spoken and written
Basic knowledge of epidemiology and epidemic intelligence principles
Ability to work in an agile way and deliver at pace
Organisation skills and ability to meet deadlines
Team player who can integrate well into the department and is willing to commit to supporting the development of Observatory outputs
Nice to have:
Familiarity with LLM fine-tuning theory and methods