The Admin Ops – Data Analytics role is a critical enabler of smooth operations within the ZDI team and of data and intelligence sharing with Business Units (BUs). The role is responsible for managing tool access, ensuring efficient CI/CD processes, serving as the go-to administrator for various technology stacks, building developer tools (CI/CD, Test-Driven Development, and InfoSec pre-validation tools), and acting as a Cloud Admin for data analytics-specific needs.
Job Responsibilities:
Serve as the primary administrator for GitHub, ensuring appropriate access control, governance, and repository security
Manage CI/CD workflows across different data analytics and engineering projects, enforcing automation best practices
Maintain role-based access control (RBAC) across tools and platforms used by the Data Analytics team
Troubleshoot and resolve access and permission-related issues for tools, repositories, and cloud environments
Act as the single point of contact for administering analytics tools and platforms such as Snowflake, DBT, Streamlit, and cloud-native services
Manage smooth onboarding and offboarding of users across platforms, in compliance with data governance policies
Collaborate with IT security and infrastructure teams to uphold compliance and security standards
Maintain documentation and guidelines for tool usage, best practices, and governance policies
Build and maintain developer tools for the Data Analytics team, including CI/CD Pipelines, Test-Driven Development (TDD) Frameworks, and InfoSec Pre-Validation Tools
Work closely with data engineers and DevOps teams to standardize deployment processes for data pipelines, models, and dashboards
Ensure that CI/CD processes align with enterprise governance policies while optimizing for efficiency and security
Serve as Cloud Admin for data analytics environments across AWS, Azure, or GCP, applying governance and security best practices
Manage and optimize Snowflake usage for the Data Analytics team, including performance tuning, cost monitoring, and security controls
Oversee DBT (Data Build Tool) administration, including permissions, environment configurations, and development best practices
Manage Streamlit deployment environments, ensuring stability and governance of internal data applications
Work with IT and DevOps teams to ensure compliance with cloud security standards
Act as a liaison between the Data Analytics team and Business Units (BUs) to coordinate data-sharing requests
Ensure compliance with data governance policies while facilitating smooth inter-team data exchanges
Monitor and optimize access to shared data assets to improve efficiency and security
Define a strategic roadmap for tool development and automation aligned with budget allocations
Identify opportunities for cost optimization and operational efficiency in cloud/data analytics workflows
Collaborate with leadership to prioritize tooling initiatives based on impact and feasibility
Requirements:
Bachelor’s degree in Computer Science, Information Technology, Data Science, or a related field
10+ years of experience in DevOps, IT administration, cloud engineering, or a similar role within a technology or data analytics function
Strong expertise in GitHub administration, including managing access control, repository security, and workflow automation
Experience with CI/CD pipelines (e.g., GitHub Actions, Jenkins, Azure DevOps, or similar)
Hands-on experience with Snowflake, DBT, and Streamlit for administration and governance
Solid knowledge of cloud platforms (AWS, Azure, GCP) and data governance/security best practices
Strong scripting skills (Python, Bash, or PowerShell) for automation and tool development
Experience implementing Test-Driven Development (TDD) and InfoSec pre-validation tools
Strong documentation skills and ability to create process guidelines for teams
Nice to have:
Experience working in a Data Analytics, Data Engineering, or AI/ML environment
Familiarity with Infrastructure as Code (IaC) tools like Terraform or CloudFormation
Experience with DevOps tools (e.g., Docker, Kubernetes, Apache Airflow)
Knowledge of ITSM tools such as ServiceNow or Jira for managing access requests
Strong stakeholder management and ability to collaborate across IT, security, and data teams