As a Senior Data Scientist, you will bridge the gap between complex datasets and strategic action. You’ll synthesize information from external sources such as Google Search Console, Semrush, Ahrefs, and Matomo, as well as from WMF’s data lake, to develop visibility and insights into search recall gaps and emerging topic trends. Your mission is to provide leadership with the clarity needed to make confident, informed decisions. You will help inform Wikimedia’s strategy for ensuring that people continue to find the content they need on the Wikipedias and other wiki projects, as well as on the Wikimedia Foundation’s sites. You will provide visibility into web analytics, search trends, content trends, patterns of usage, and search recall gaps.
Job Responsibilities:
Integrating fragmented data from our internal data lake, BigQuery, Matomo, and external intelligence tools such as Ahrefs, Semrush, and Trakkr AI, among others, to create a unified, granular view of how users discover and use our content across desktop and mobile domains
Translating analytics and complex trends by geography, project, and topic into actionable narratives that help leadership navigate uncertainty and prioritize high-impact interventions
Developing automated systems to detect where our content is under-indexed or missing from results, providing the data necessary to improve article coverage, alternative titles, content for the Wikimedia Foundation’s websites, and FAQ structures
Establishing the metrics and frameworks to track how our content is utilized within Large Language Models and generative search engines, directly informing high-level negotiations and the valuation of our Enterprise offerings, as well as driving grassroots donor growth
Defining measurable outcomes: collaborating with product leaders to set achievable goals and draft the instrumentation requirements necessary to track success with technical precision
Assessing the efficacy of new initiatives through the rigorous design and analysis of experiments (e.g., A/B testing) to understand their true impact on our global audience
Interpreting website performance data to uncover insights, inform UX/UI enhancements, and support strategies aimed at improving customer experience and conversion outcomes on the Wikimedia Foundation’s websites
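As a concrete illustration of the experiment analysis work described above (the A/B testing responsibility), here is a minimal sketch of a two-proportion z-test for comparing conversion rates between two variants. The function name and the sample counts are hypothetical, not taken from the posting; in practice this kind of test is often run via a statistics library rather than by hand.

```python
from math import sqrt, erfc

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates.

    conv_a, conv_b: conversion counts in variants A and B
    n_a, n_b: total visitors in variants A and B
    Returns the z statistic and two-sided p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal survival function
    p_value = erfc(abs(z) / sqrt(2))
    return z, p_value

# Hypothetical 50/50 traffic split: 120 vs 156 conversions per 10,000 visitors
z, p = two_proportion_ztest(120, 10_000, 156, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A real deployment would also fix the sample size in advance via a power analysis and correct for multiple comparisons when several metrics are tested at once.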
Requirements:
Fluent in Python and Jupyter for data analysis, statistical modeling, simulation, visualization & reporting
Fluent in SQL and working with large-scale data using tools such as Hive, Presto, Druid, and Spark
Experience with building dashboards & reports using tools such as Jupyter, Quarto, Superset
A solid, intuitive understanding of how websites operate (including structure, user journeys, and tracking mechanisms), combined with practical expertise in web search SEO/GEO
A Bachelor’s degree + 5 years of related experience, a Master’s degree + 3 years of related experience, or equivalent work experience
Excellent communication skills
The statistical nuance required to design experiments, identify statistically significant phenomena, and frame findings as clear, high-level narratives
Comfortable working in a highly collaborative, remote environment
Nice to have:
Familiarity with Wikimedia projects, moderation processes, and communities