The Associate MLOps Analyst will be a key member of Circle K's Data & Analytics team, working on implementing Azure data services, deploying ML models, building pipelines, automating workflows, and collaborating with cross-functional teams.
Job Responsibilities:
Collaborate with data scientists to deploy ML models into production environments
Implement and maintain CI/CD pipelines for machine learning workflows
Use version control tools (e.g., Git) and ML lifecycle management tools (e.g., MLflow) for model tracking, versioning, and management (a short MLflow sketch follows this list)
Design, build, and optimize application containerization and orchestration with Docker, Kubernetes, and cloud platforms such as AWS or Azure
Automate pipelines using Apache Spark and ETL tools such as Informatica PowerCenter, Informatica BDM/DEI, StreamSets, and Apache Airflow
Implement model monitoring and alerting systems to track model performance, accuracy, and data drift in production environments
Work closely with data scientists to ensure that models are production-ready
Collaborate with Data Engineering and Tech teams to ensure infrastructure is optimized for scaling ML applications
Optimize ML pipelines for performance and cost-effectiveness
Help the Data teams apply best practices when implementing enterprise-level solutions
Follow industry coding standards and the programming life cycle to ensure consistent practices across the project
Help define common coding standards and model performance monitoring best practices
Continuously evaluate the latest packages and frameworks in the ML ecosystem
Build automated model deployment and data engineering pipelines from plain Python/PySpark code
Collaborate with Data Scientists, Data Engineers, cloud platform and application engineers to create and implement cloud policies and governance for the ML model life cycle.
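As a concrete illustration of the model tracking and versioning work listed above, below is a minimal MLflow sketch. The tracking server URI, experiment name, model, parameters, and registry name are illustrative assumptions, not Circle K specifics, and it assumes a tracking backend that supports the model registry.

# Minimal sketch, assuming a locally running MLflow tracking server whose
# backend store supports the model registry; all names and values below are
# hypothetical placeholders.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

mlflow.set_tracking_uri("http://localhost:5000")   # hypothetical tracking server
mlflow.set_experiment("fuel-demand-forecast")       # hypothetical experiment name

# Synthetic data stands in for a real feature set
X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    params = {"n_estimators": 200, "max_depth": 8}
    model = RandomForestClassifier(**params).fit(X_train, y_train)

    # Track parameters, metrics, and the model artifact so every run is reproducible
    mlflow.log_params(params)
    mlflow.log_metric("accuracy", accuracy_score(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(
        model,
        "model",
        registered_model_name="fuel-demand-forecast",  # creates or increments a registry version
    )

Registering the model this way gives each run an incrementing registry version that downstream deployment jobs can pin to.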
Requirements:
Bachelor’s degree required, preferably with a quantitative focus (Statistics, Business Analytics, Data Science, Math, Economics, etc.)
Knowledge of core computer science concepts such as common data structures and algorithms, and OOP (object-oriented programming)
Proficiency in programming languages (R, Python, PySpark, etc.)
Experience with big data technologies and frameworks (AWS, Azure, GCP, Hadoop, Spark, etc.)
Familiarity with enterprise reporting systems, relational (MySQL, Microsoft SQL Server, etc.) and non-relational (MongoDB, DynamoDB) database management systems, and data engineering tools
Exposure to ETL tools and version control
Experience in building and maintaining CI/CD pipelines for ML models (a validation-gate sketch follows this list)
Understanding of machine learning, information retrieval, or recommendation systems
Familiarity with DevOps tools (Docker, Kubernetes, Jenkins, GitLab).
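To make the CI/CD requirement above more concrete, here is a minimal sketch of the kind of quality gate a pipeline might run before promoting a model. The model path, holdout file, label column, and 0.85 threshold are illustrative assumptions rather than values from the posting.

# Minimal sketch of a CI quality gate; paths, column names, and the threshold
# are hypothetical.
import sys

import joblib
import pandas as pd
from sklearn.metrics import accuracy_score

MODEL_PATH = "artifacts/model.joblib"   # hypothetical artifact produced by training
HOLDOUT_PATH = "data/holdout.csv"       # hypothetical labelled holdout set
MIN_ACCURACY = 0.85                     # hypothetical promotion threshold


def main() -> int:
    model = joblib.load(MODEL_PATH)
    holdout = pd.read_csv(HOLDOUT_PATH)
    X, y = holdout.drop(columns=["label"]), holdout["label"]

    accuracy = accuracy_score(y, model.predict(X))
    print(f"holdout accuracy: {accuracy:.3f} (minimum {MIN_ACCURACY})")

    # A non-zero exit code fails the CI job and blocks deployment
    return 0 if accuracy >= MIN_ACCURACY else 1


if __name__ == "__main__":
    sys.exit(main())

Wired into a Jenkins or GitLab CI stage, the non-zero exit code is enough to stop an underperforming model from reaching production.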