We are seeking an L&D and Assessment Specialist dedicated to Globalization programs, with a focus on translation and transcription workflows. This role is responsible for designing, developing, and maintaining training content and assessment frameworks used to onboard and certify linguists and transcriptionists on our platform. The role sits at the intersection of linguistic quality, platform enablement, and operational scalability, ensuring that linguists and transcriptionists meet defined quality standards before participating in production work.
Job Responsibilities:
Design and deliver structured onboarding and continuous learning programs for linguists and transcriptionists working in translation and transcription
Develop scalable training materials, including learning modules, style guides, workflow documentation, and best-practice guidelines for quality, consistency, and compliance
Partner with Localization, Quality, Product, and Operations teams to ensure training content aligns with platform capabilities, client requirements, and evolving linguistic quality standards
Maintain and regularly update training materials to reflect changes in tools, processes, and language quality expectations
Track training completion and effectiveness using defined success metrics
Design, build, and maintain linguist onboarding assessments, including translation tests, transcription accuracy tests, and evaluations of language proficiency and domain knowledge
Define scoring rubrics, pass/fail criteria, and quality thresholds aligned with production requirements
Ensure assessments fairly and accurately measure linguistic accuracy, fluency and style, and terminology usage, as well as audio comprehension, verbatim accuracy, and timestamp accuracy for transcription
Work closely with Quality and PgM Leads to validate assessment outcomes and continuously improve test reliability
Analyze assessment results to identify skill gaps and inform training improvements
Define assessment requirements with the Ops and ML engineers who build assessment tools (GenAI models)
Monitor post-onboarding performance data to validate assessment effectiveness
Use feedback from audits, QA reviews, and production metrics to refine assessments and enhance training content
Support remediation and re-certification processes for linguists when quality gaps are identified
Help standardize L&D and assessment processes across languages and programs
Requirements:
2+ years of experience in Learning & Development, Assessment Design, Localization, or a related field
Hands-on experience with translation and/or transcription workflows
Strong understanding of linguistic quality evaluation criteria
Experience designing assessments, tests, or certification programs
Ability to write clear, structured instructional and evaluation content
Experience working with linguists, language leads, or vendor teams
Strong analytical skills to interpret assessment results and quality data