This role is accountable for the end-to-end global process that ensures AI-enabled systems used in GxP contexts are validated/qualified and maintained in a state of control throughout their lifecycle, consistent with the company’s Quality Management System (QMS) expectations for global process ownership and controlled document architecture. The role is validation/qualification-specific and operates in partnership with the existing global Business Process Owner (BPO) for Responsible Use of AI for GxP, who owns overarching responsible-use governance and user controls (e.g., human-in-the-loop expectations and usage constraints). It also requires a solid understanding of global AI regulatory requirements (e.g., the EU AI Act and FDA guidance).
Job Responsibilities:
Maintain a compliant global quality process for AI validation/qualification and ensure alignment with QMS architecture and global documentation expectations
Define and maintain the global procedural framework for validating AI-enabled GxP systems, including integration points with system validation/SDLC expectations and responsible-use controls
Ensure the process is risk-based and scaled to AI context of use and impact, with clear decision gates and required evidence expectations
Robust understanding of LLM and machine learning development lifecycle approaches
Able to work in shifts based on project needs across EST and PST time zones
Own/maintain the core global procedures and templates supporting AI validation/qualification, including credibility/trustworthiness assessment documentation and controlled report formats
Ensure documents are periodically reviewed and revised consistent with QMS role expectations for global process ownership
Partner with the Responsible Use BPO and governance stakeholders to ensure validation/qualification requirements align with responsible-use constraints (e.g., independent human verification and traceability expectations)
Coordinate with QA, Business Owners, System SMEs/IT Application Owners, and AI Model SMEs to ensure validation deliverables reflect business, compliance, and IS requirements and are stored in appropriate controlled repositories
Provide execution guidance, coaching, and job aids to sites/functions on implementing the global AI validation/qualification process locally while maintaining global consistency
Contribute to training curricula and support communities of practice to build consistent capability across the network
Define and communicate process performance metrics and drive continuous improvement opportunities
Support inspection readiness by ensuring validation evidence expectations are clear, auditable, and consistently applied
Requirements:
Master’s degree with a minimum of 10 years of experience in software and systems quality assurance, OR
Bachelor’s degree with a minimum of 12 years of experience in software and systems quality assurance
Demonstrated experience in GxP computerized systems validation/qualification and lifecycle management
Experience developing, maintaining, and deploying controlled documents and templates within a regulated QMS environment
Familiarity with AI governance expectations in GxP contexts, including responsible-use constraints such as human oversight and independent verification
Working knowledge of AI trustworthiness/credibility assessment concepts and documentation practices used to support validation decisions
Strong process orientation and document quality discipline (controlled documentation, traceability, clarity)
Ability to influence without authority across global, cross-functional stakeholders
Risk-based thinking and ability to translate policy/standards into practical, scalable validation deliverables