Join the AI-Share team to help build and operate the foundations that power our Generative AI features (LLM, RAG, agents) inside the DataGalaxy data governance platform. This role focuses on MLOps / ModelOps delivery: making GenAI capabilities reliable in production (deployment, monitoring, cost control, traceability), while collaborating with product engineering teams across a polyglot stack.
Responsibilities:
Contribute to the evolution of our ModelOps platform for GenAI: provider integrations, configuration, deployment automation, and operational tooling
Help implement practical patterns for running GenAI workloads in production: evaluation, versioning, reproducibility, safe rollouts/rollbacks, and environment management
Build and improve CI/CD workflows adapted to AI: packaging, automated checks, evaluation steps (when applicable), deployment, and rollback
Improve traceability of AI assets (configs, prompts/templates when applicable, evaluation outputs, versions) to support governance and debugging
Add and maintain observability for GenAI workloads: latency, availability, usage/cost signals, and quality-related indicators (dashboards/alerts)
Develop and improve GenAI features within the platform (agents, RAG pipelines, MCP server): new capabilities, prompt engineering, bug fixes, and client-facing improvements
Work closely with Product / Data / Engineering to integrate GenAI capabilities into the platform in a maintainable way
Participate in code reviews, documentation, and post-incident follow-ups (RCA / action items), with guidance from the team
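To make the "evaluation steps" and "safe rollouts" items above concrete, here is a minimal sketch of an evaluation gate a CI/CD pipeline might run before deploying a model change. All names (`run_eval`, `gate_deployment`, the exact-match scorer, the threshold) are illustrative assumptions, not part of the DataGalaxy stack; a real pipeline would use semantic or LLM-as-judge metrics.

```python
def score_answer(expected: str, actual: str) -> float:
    """Toy scorer: exact-match accuracy (illustrative only)."""
    return 1.0 if expected.strip().lower() == actual.strip().lower() else 0.0


def run_eval(model_fn, eval_set: list[dict]) -> float:
    """Run a candidate model over a small golden dataset, return mean score."""
    scores = [score_answer(case["expected"], model_fn(case["prompt"]))
              for case in eval_set]
    return sum(scores) / len(scores)


def gate_deployment(model_fn, eval_set: list[dict], threshold: float = 0.8) -> bool:
    """Return True only if the candidate clears the quality bar;
    a CI job would fail the deploy step (or trigger rollback) otherwise."""
    return run_eval(model_fn, eval_set) >= threshold
```

In CI this would typically run as a dedicated pipeline stage between packaging and deployment, so a regression blocks the rollout automatically instead of being discovered in production.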
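The traceability item above (versioning configs, prompts/templates, and evaluation outputs) can be sketched as content-addressed versioning: the same asset content always maps to the same version id, so an evaluation result can be traced back to the exact prompt or config that produced it. The registry and naming here are hypothetical, not the platform's actual design.

```python
import hashlib
import json


def fingerprint(asset: dict) -> str:
    """Stable content hash of a prompt/config asset: identical content
    always yields the same version id, regardless of key order."""
    canonical = json.dumps(asset, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:12]


class AssetRegistry:
    """In-memory registry mapping version ids to immutable assets
    (a real system would back this with durable storage)."""

    def __init__(self):
        self._assets: dict[str, dict] = {}

    def register(self, asset: dict) -> str:
        vid = fingerprint(asset)
        self._assets.setdefault(vid, asset)
        return vid

    def get(self, vid: str) -> dict:
        return self._assets[vid]
```

Logging the version id alongside each model call then gives both governance (which prompt was live when?) and debugging (reproduce the exact inputs of a bad answer).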
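For the observability item (latency, usage/cost signals feeding dashboards and alerts), a minimal sketch is a wrapper that records per-call latency and token usage and derives an estimated cost. The pricing figure, the `(text, tokens)` return shape, and all names are assumptions for illustration; production code would export these as metrics to a monitoring backend rather than hold them in memory.

```python
import time


class LLMMetrics:
    """Accumulates per-call latency, token usage, and estimated cost."""

    def __init__(self, usd_per_1k_tokens: float = 0.002):  # assumed price
        self.usd_per_1k_tokens = usd_per_1k_tokens
        self.calls = 0
        self.total_latency_s = 0.0
        self.total_tokens = 0

    def record(self, latency_s: float, tokens: int) -> None:
        self.calls += 1
        self.total_latency_s += latency_s
        self.total_tokens += tokens

    @property
    def est_cost_usd(self) -> float:
        return self.total_tokens / 1000 * self.usd_per_1k_tokens

    def observed(self, llm_fn):
        """Wrap an LLM call; assumes llm_fn returns (text, token_count)."""
        def wrapper(prompt: str) -> str:
            start = time.perf_counter()
            text, tokens = llm_fn(prompt)
            self.record(time.perf_counter() - start, tokens)
            return text
        return wrapper
```

Thresholds on these counters (e.g. cost per hour, p95 latency) are what alerting rules would be built on.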
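The RAG pipelines mentioned above can be sketched at their simplest: retrieve the documents most relevant to a query, then build an augmented prompt from them. This toy version scores by term overlap purely for illustration; a production pipeline would use embeddings and a vector index, and the function names are hypothetical.

```python
def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by shared terms with the query, keep the top k.
    Stand-in for embedding similarity search in a real RAG pipeline."""
    q_terms = set(query.lower().split())
    ranked = sorted(docs,
                    key=lambda d: len(q_terms & set(d.lower().split())),
                    reverse=True)
    return ranked[:k]


def build_prompt(query: str, docs: list[str]) -> str:
    """Assemble the augmented prompt sent to the LLM."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

Grounding answers in retrieved context is what lets a governance platform answer questions from its own metadata rather than from the model's training data alone.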