
Senior Manager Factory Engineering


Nestlé Purina


Location:
United States, San Antonio

Category:
Not provided

Contract Type:
Not provided


Salary:

Not provided

Job Description:

The Senior Manager, Factory Engineering is responsible for managing the engineering activities of the factory and for providing professional engineering and strategic technical leadership, ensuring that people's health and safety, product safety, and the environment are never compromised. The role is the strategic leader for the engineering function, leveraging Nestlé's leadership principles in the following areas: capital execution, asset maintenance and utilities excellence, engineering compliance, and sustainability.

Job Responsibility:

  • Define, propose and monitor the implementation of the Factory Engineering team's plans and objectives
  • Live the Nestlé leadership principles in all aspects of the role
  • Set the technical strategic vision for the factory
  • Develop and maintain technical competencies at the factory
  • Drive equipment reliability through implementing maintenance foundations in TPM
  • Build cooperation and collaboration with the production and other cross functional teams to build world class operation
  • Improve maintenance cost structure
  • Implement the Nestlé capital process to drive excellence in capital execution
  • Proactively develop, maintain and execute the capital market business strategy (MBS)
  • Drive factory master planning
  • Ensure full compliance with all governing bodies for utilities system relevant to the Modesto site
  • Maintain reliability through implementing maintenance foundations and TPM methodology
  • Develop and maintain all skillsets necessary for internal management of utilities systems
  • Drive and execute plant sustainability objectives
  • Drive world class water management in the factory
  • Drive reliability in all electrical components in the factory
  • Develop and drive automation plan for the factory
  • Ensure full compliance with Nestlé standards, including calibration
  • Develop technical skills in maintenance to ensure electrical reliability
  • Drive Innovation/Renovation, initiating/coordinating implementation of new processes/technology

Requirements:

  • University Degree (BA/BS) in Engineering (preferably Mechanical, Chemical, or Process) or equivalent experience, and a minimum of 10 years of related work experience
  • 5+ years of leadership/management experience required
  • Knowledge of food safety, regulatory requirements for food safety and hygienic engineering (OSH&R) preferred
  • Strong understanding of, or ability to learn, the full scope of role responsibilities, including financial analysis and key business drivers such as Nestlé Standard Costing, Asset and Maintenance Management (AMM), CAPEX budgeting and preparation (CIAT), and Industrial Services/Energy Management (NEMT)
  • Familiarity with TPM and continuous improvement methodologies
  • Proficiency with AutoCAD, Visio, CDM, AMM, ESAT, CIAT, NEMT and similar technical tools/systems
  • Demonstrated technical and managerial capabilities with in-depth knowledge of food processing plant operations, technologies, utilities, maintenance systems, and hygienic engineering requirements (OSH&R)
  • Strong analytical and problem-solving skills, with the ability to apply root cause analysis and technical creativity to resolve complex short- and long-term issues
  • Effective communication and networking skills, with the ability to perform under pressure and in ambiguous environments
  • Experience in Project Engineering/Management, including plant design and construction
What we offer:
  • Dynamic career paths
  • Robust development
  • Opportunities to learn from talented colleagues around the globe
  • Benefits that support physical, financial, and emotional wellbeing

Additional Information:

Job Posted:
March 13, 2026

Employment Type:
Full-time
Work Type:
On-site work


Similar Jobs for Senior Manager Factory Engineering

Industrial Engineer and Project Manager

This role blends technical leadership with client-facing project execution. You’...
Location:
Canada, Greater Toronto Area
Salary:
Not provided
Progima
Expiration Date
Until further notice
Requirements:
  • 10+ years of experience in a manufacturing environment is required
  • Bachelor’s degree in Industrial, Manufacturing or Mechanical Engineering (or equivalent experience)
  • Member of Professional Engineers Ontario (PEO)
  • Flexible schedule and the ability to travel regularly throughout the industrial regions of the Greater Toronto Area (GTA)
  • Ability to thrive in a fast-paced and dynamic environment and present strong ethics and a deep commitment to clients’ and coworkers’ success
  • Excellent analytical and problem-solving skills coupled with project management skills
  • Strong written and oral communication skills
  • Strong sense of responsibility and accountability
  • Driven by operational efficiency, results, and strong task performance
  • Must be a Canadian citizen or Landed Immigrant and meet Controlled Goods Program requirements
Job Responsibility:
  • Lead the industrial strategic planning process with the client’s executive team, designing factories, and warehouse layouts to maximize productivity and minimize waste
  • Conduct feasibility studies, including budget assessments, to evaluate prospective projects and present recommendations to senior management and clients
  • Work closely with our customers’ production personnel and managers to implement improvements, manage projects and ensure successful execution
  • Possess a strong background in industrial systems selection, a good knowledge of the supplier ecosystem and of the automation technologies
  • Manage large discrete manufacturing industrial transfers with a straightforward project management approach
  • Support Progima's business development initiatives, including client meetings and proposal writing, to expand our client base and foster long-term relationships
What we offer:
  • Performance-based bonus structure
  • Group insurance benefits
  • Long-term career growth and the opportunity to lead and innovate in a collaborative, forward-thinking company
  • The chance to make a tangible impact across a diverse range of industries
  • Full-time

Senior Data Engineer

At ANS, the Senior Data Engineer plays a key role in delivering robust, scalable...
Location:
United Kingdom, Manchester
Salary:
Not provided
ANS Group
Expiration Date
Until further notice
Requirements:
  • Experience in building and optimising pipelines in Azure Data Factory, Synapse, or Fabric
  • Strong knowledge of Python and SQL
  • Experience in using metadata frameworks in data engineering
  • Experience in best-practice data engineering principles, including CI/CD via Azure DevOps or GitHub
  • Understanding of Azure networking and security in relation to the data platform
  • Experience of data governance and regulation, including GDPR, principle of least privilege, classification etc.
  • Experience of lakehouse architecture, data warehousing principles, and data modelling
  • Familiarity with Microsoft Purview in a data platform context
  • Basic knowledge of Azure AI Foundry
Job Responsibility:
  • Build and optimise data pipelines, notebooks, and data flows in Microsoft Fabric and Synapse Analytics, connecting to a variety of on-premises and cloud based data sources
  • Support Data Architects and Cloud Engineers by implementing solutions based on provided designs and offering feedback where needed
  • Collaborate across disciplines to ensure high-quality delivery of data solutions, including working with presales, managed services, and customer teams
  • Mentor Data engineers and support their development through guidance and task distribution
  • Ensure best-practice adherence in engineering processes, including CI/CD via Azure DevOps and secure data handling (e.g. Key Vault, private endpoints)
  • Contribute to Agile delivery by participating in standups, user story creation, and sprint planning
  • Document implemented solutions clearly and accurately for internal and customer use
  • Troubleshoot and resolve issues across subscriptions and environments
  • Work closely with the Project Manager (where applicable) to align on delivery timelines, report progress, and manage risks, while also acting as a key point of contact for customer SMEs and engineers to support collaboration and clarify technical requirements
  • Engage in continuous learning through certifications (e.g. DP-600 and/or DP-700, AI-900, AI-102, etc.) and development days
What we offer:
  • 25 days’ holiday, plus you can buy up to 5 more days
  • Birthday off
  • An extra celebration day
  • 5 days’ additional holiday in the year you get married
  • 5 volunteer days
  • Private health insurance
  • Pension contribution match and 4 x life assurance
  • Flexible working and work from anywhere for up to 30 days per year (some exceptions)
  • Maternity: 16 weeks’ full pay, Paternity: 3 weeks’ full pay, Adoption: 16 weeks’ full pay
  • Company social events
  • Full-time

Senior Data Engineer

At Blue Margin, we are on a mission to build the go-to data platform for PE-back...
Location:
United States, Fort Collins
Salary:
110000.00 - 140000.00 USD / Year
Blue Margin
Expiration Date
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or related field
  • 5+ years of professional experience in data engineering, with emphasis on Python & PySpark/Apache Spark
  • Proven ability to manage large datasets and optimize for speed, scalability, and reliability
  • Strong SQL skills and understanding of relational and distributed data systems
  • Experience with Azure Data Factory, Synapse Pipelines, Fivetran, Delta Lake, Microsoft Fabric, or Snowflake
  • Knowledge of data modeling, orchestration, and Delta/Parquet file management best practices
  • Familiarity with CI/CD, version control, and DevOps practices for data pipelines
  • Experience leveraging AI-assisted tools to accelerate engineering workflows
  • Strong communication skills; ability to convey complex technical details to both engineers and business stakeholders
Job Responsibility:
  • Architect, design, and optimize large-scale data pipelines using tools like PySpark, SparkSQL, Delta Lake, and cloud-native tools
  • Drive efficiency in incremental/delta data loading, partitioning, and performance tuning
  • Lead implementations across Azure Synapse, Microsoft Fabric, and/or Snowflake environments
  • Collaborate with stakeholders and analysts to translate business needs into scalable data solutions
  • Evaluate and incorporate AI/automation to improve development speed, testing, and data quality
  • Oversee and mentor junior data engineers, establishing coding standards and best practices
  • Ensure high standards for data quality, security, and governance
  • Participate in solution design for client engagements, balancing technical depth with practical outcomes
What we offer:
  • Competitive pay
  • Strong benefits
  • Flexible hybrid work setup
  • Full-time

Senior Data Engineer

As a senior data engineer, you will help our clients with building a variety of ...
Location:
Belgium, Brussels
Salary:
Not provided
Sopra Steria
Expiration Date
Until further notice
Requirements:
  • At least 5 years of experience as a Data Engineer or in software engineering in a data context
  • Programming experience with one or more languages: Python, Scala, Java, C/C++
  • Knowledge of relational database technologies/concepts and SQL is required
  • Experience building, scheduling and maintaining data pipelines (Spark, Airflow, Data Factory)
  • Practical experience with at least one cloud provider (GCP, AWS, or Azure); certifications from any of these are considered a plus
  • Knowledge of Git and CI/CD
  • Able to work independently, prioritize multiple stakeholders and tasks, and manage work time effectively
  • You have a degree in Computer Engineering, Information Technology or related field
  • You are proficient in English, knowledge of Dutch and/or French is a plus.
Job Responsibility:
  • Gather business requirements and translate them to technical specifications
  • Design, implement and orchestrate scalable and efficient data pipelines to collect, process, and serve large datasets
  • Apply DataOps best practices to automate testing, deployment and monitoring
  • Continuously follow & learn the latest trends in the data world.
What we offer:
  • A variety of perks, such as mobility options (including a company car), insurance coverage, meal vouchers, eco-cheques, and more
  • Continuous learning opportunities through the Sopra Steria Academy to support your career development
  • The opportunity to connect with fellow Sopra Steria colleagues at various team events.

Senior Data Engineer

The Data Engineer is responsible for designing, building, and maintaining robust...
Location:
Germany, Berlin
Salary:
Not provided
ib vogt GmbH
Expiration Date
Until further notice
Requirements:
  • Degree in Computer Science, Data Engineering, or related field
  • 5+ years of experience in data engineering or similar roles; experience in renewable energy, engineering, or asset-heavy industries is a plus
  • Strong experience with modern data stack (e.g., PowerPlatform, Azure Data Factory, Databricks, Airflow, dbt, Synapse, Snowflake, BigQuery, etc.)
  • Proficiency in Python and SQL for data transformation and automation
  • Experience with APIs, message queues (Kafka, Event Hub), data streaming and knowledge of data lakehouse and data warehouse architectures
  • Familiarity with CI/CD pipelines, DevOps practices, and containerization (Docker, Kubernetes)
  • Understanding of cloud environments (preferably Microsoft Azure, PowerPlatform)
  • Strong analytical mindset and problem-solving attitude paired with a structured, detail-oriented, and documentation-driven work style
  • Team-oriented approach and excellent communication skills in English
Job Responsibility:
  • Design, implement, and maintain efficient ETL/ELT data pipelines connecting internal systems (M365, Sharepoint, ERP, CRM, SCADA, O&M, etc.) and external data sources
  • Integrate structured and unstructured data from multiple sources into the central data lake / warehouse / Dataverse
  • Build data models and transformation workflows to support analytics, reporting, and AI/ML use cases
  • Implement data quality checks, validation rules, and metadata management according to the company’s data governance framework
  • Automate workflows, optimize performance, and ensure scalability of data pipelines and processing infrastructure
  • Work closely with Data Scientists, Software Engineers, and Domain Experts to deliver reliable datasets for Digital Twin and AI applications
  • Maintain clear documentation of data flows, schemas, and operational processes
What we offer:
  • Competitive remuneration and motivating benefits
  • Opportunity to shape the data foundation of ib vogt’s digital transformation journey
  • Work on cutting-edge data platforms supporting real-world renewable energy assets
  • A truly international working environment with colleagues from all over the world
  • An open-minded, collaborative, dynamic, and highly motivated team
  • Full-time

Senior Data Engineer

The Data Engineer will build scalable pipelines and data models, implement ETL w...
Location:
United States, Fort Bragg
Salary:
Not provided
Barbaricum
Expiration Date
Until further notice
Requirements:
  • Active DoD TS/SCI clearance (required or pending verification)
  • Bachelor’s degree in Computer Science, Data Science, Engineering, or related field (or equivalent experience) OR CSSLP / CISSP-ISSAP
  • Strong programming skills in Python, Java, or Scala
  • Strong SQL skills; familiarity with analytics languages/tools such as R
  • Experience with data processing frameworks (e.g., Apache Spark, Hadoop) and orchestration tools (e.g., Airflow)
  • Familiarity with cloud-based data services (e.g., AWS Redshift, Google BigQuery, Azure Data Factory)
  • Experience with data modeling, database design, and data architecture concepts
  • Strong analytical and problem-solving skills with attention to detail
  • Strong written and verbal communication skills
Job Responsibility:
  • Build and maintain scalable, reliable data pipelines to collect, process, and store data from multiple sources
  • Design and implement ETL processes to support analytics, reporting, and operational needs
  • Develop and maintain data models, schemas, and standards to support enterprise data usage
  • Collaborate with data scientists, analysts, and stakeholders to understand requirements and deliver solutions
  • Analyze large datasets to identify trends, patterns, and actionable insights
  • Present findings and recommendations through dashboards, reports, and visualizations
  • Optimize database and pipeline performance for scalability and reliability across large datasets
  • Monitor and troubleshoot pipeline issues to minimize downtime and improve system resilience
  • Implement data quality checks, validation routines, and integrity controls
  • Implement security measures to protect data and systems from unauthorized access

Senior Hardware Engineer

As a key contributor in the HW Sustaining group, you will be responsible for inv...
Location:
United States, San Jose
Salary:
117500.00 - 270000.00 USD / Year
Hewlett Packard Enterprise
Expiration Date
Until further notice
Requirements:
  • Bachelor’s degree or higher in Electrical, Electronics, or Computer Engineering
  • 8+ years of hardware design, debug and testing experience at the board and system level
  • Expertise in debugging hardware problems using oscilloscopes, logic analyzers, and specialized networking test equipment such as IXIA/Spirent traffic analyzers, PCI analyzers, and in-circuit emulators
  • PCB design experience including schematic capture, PCB layout and routing
  • Familiarity with Cadence System Design tool suite
  • Knowledge of networking hardware and associated components and interfaces such as switch ASICs, PHY, SerDes, CPU subsystems, optics, high-speed serial links, PoE, PCIe, NVMe, SATA, I2C, and USB
  • Expertise in high speed circuit design and debug
  • Familiarity with signal and power integrity concepts and associated test and simulation tools
  • Ability to manage multiple concurrent priorities and make progress
  • Ability to manage hardware debug and design projects and drive them to closure
Job Responsibility:
  • Lead failure analysis for critical hardware issues reported from field and factory
  • Determine and document root cause and corrective actions (RCCA)
  • Extract 'Lessons Learned' from HW Sustaining failure analysis and share them with the appropriate Juniper teams to drive hardware quality improvement
  • Collaborate with other Engineering and Manufacturing / Operations teams to drive corrective actions and hardware quality improvements
  • Work with Customer Support teams to understand and address customer hardware concerns
  • Analyze hardware quality metrics and Repair Center findings to identify failure trends requiring investigation
  • Provide technical leadership into multi-sourcing and change management activities on shipping products
  • Review hardware test reports for new products prior to product release
  • Identify any quality concerns and work with the design teams to resolve them
  • Mentor and guide junior members, foster a collaborative environment, and encourage curiosity and knowledge sharing
What we offer:
  • Comprehensive suite of benefits that supports physical, financial and emotional wellbeing
  • Career development programs to help achieve career goals
  • Inclusive culture celebrating individual uniqueness
  • Full-time

Senior Data Engineer

Senior Data Engineer role driving Circle K's cloud-first strategy to unlock the ...
Location:
India, Gurugram
Salary:
Not provided
Circle K
Expiration Date
Until further notice
Requirements:
  • Bachelor's Degree in Computer Engineering, Computer Science or related discipline
  • Master's Degree preferred
  • 5+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment
  • 5+ years of experience with setting up and operating data pipelines using Python or SQL
  • 5+ years of advanced SQL Programming: PL/SQL, T-SQL
  • 5+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization
  • Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads
  • 5+ years of strong and extensive hands-on experience in Azure, preferably data heavy / analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data
  • 5+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure functions
  • 5+ years of experience in defining and enabling data quality standards for auditing and monitoring
Job Responsibility:
  • Collaborate with business stakeholders and other technical team members to acquire and migrate data sources
  • Determine solutions that are best suited to develop a pipeline for a particular data source
  • Develop data flow pipelines to extract, transform, and load data from various data sources
  • Develop efficient ETL/ELT solutions using Azure cloud services and Snowflake
  • Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines
  • Provide clear documentation for delivered solutions and processes
  • Identify and implement internal process improvements for data management
  • Stay current with and adopt new tools and applications
  • Build cross-platform data strategy to aggregate multiple sources
  • Proactive in stakeholder communication, mentor/guide junior resources
  • Full-time