Job Title: Data/Analytics Engineer
Job Location: Christchurch (NZ)
Department: Field Systems

Data/Analytics Engineer - Customer 360 Data Pipelines

About the Team: The Field Systems Data Team at Trimble is building and maturing our data strategy. We are seeking a dedicated Data Engineer to join our Business Intelligence team, shape our data backend and infrastructure, and break down data silos in a dynamic environment.

The Role: As a Data/Analytics Engineer, you will take on central tasks in our critical data projects and in technical communication with the data platform team. Your work will be crucial for building a comprehensive 360-degree customer view, meeting the growing demand for insights, and improving the quality and effectiveness of our data analysis.
Job Responsibilities:
Lead or actively participate in the migration of data storage and ETL processes to Snowflake, integrating downstream with BI tools (including Domo)
Design and implement scalable data schemas, audit existing data flows, and ensure the integrity of core finance, marketing, product usage, and licensing datasets for analytics and reporting
Establish necessary platform structures and contribute to knowledge transfer for new data platform components and logic, enabling effective consumption in BI tools
Spearhead data quality and completeness initiatives by designing advanced record linkage and enrichment pipelines using Python and the Splink library
Ensure comprehensive documentation and proper handover of new data enrichment ecosystems and processes
Maintain and optimize SQL-based data extracts crucial for key business dashboards and reporting
Implement and deploy complex classification algorithms (currently prototyped in Python) within the production data environment (Snowflake)
Establish automated, incremental processes for algorithm execution and integration into the existing ETL/dashboard structure
Support the integration and deployment of new data visualization and tooling solutions
Collaborate with business teams (e.g., Sales Operations) to support the integration of critical third-party systems and data sources
Implement and maintain data quality feature sets, including manual data cleanup, to ensure the reliability of the data that supports production and business targets
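To illustrate the record linkage work mentioned above: matching customer records across systems (e.g., CRM vs. licensing) means deciding which rows refer to the same real-world entity. The sketch below is a deliberately naive pure-Python version using only the standard library; all record fields, IDs, and thresholds are hypothetical examples, not Trimble's actual schema. In practice, Splink replaces this kind of pairwise loop with probabilistic (Fellegi-Sunter) scoring and blocking rules that scale to millions of rows.

```python
from difflib import SequenceMatcher

# Hypothetical customer records from two source systems; field names
# are illustrative only.
crm_records = [
    {"id": "c1", "name": "Acme Surveying Ltd", "email": "ops@acme.example"},
    {"id": "c2", "name": "Beta Geo Services", "email": "info@betageo.example"},
]
licensing_records = [
    {"id": "l1", "name": "ACME Surveying Limited", "email": "ops@acme.example"},
    {"id": "l2", "name": "Gamma Mapping", "email": "hello@gamma.example"},
]

def similarity(a: str, b: str) -> float:
    """Normalized string similarity in [0, 1], case-insensitive."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def link_records(left, right, name_threshold=0.8):
    """Naive record linkage: exact email match scores 1.0, otherwise
    fall back to fuzzy name matching above a threshold.

    A real pipeline (e.g., with Splink) would use blocking to avoid
    comparing every pair, and weighted evidence across many fields.
    """
    matches = []
    for l in left:
        for r in right:
            if l["email"] == r["email"]:
                matches.append((l["id"], r["id"], 1.0))
            else:
                score = similarity(l["name"], r["name"])
                if score >= name_threshold:
                    matches.append((l["id"], r["id"], round(score, 2)))
    return matches

print(link_records(crm_records, licensing_records))
# → [('c1', 'l1', 1.0)]
```

The key design point the role implies: linkage quality depends far more on field preparation (normalizing names, emails, addresses) and well-chosen blocking rules than on the comparison function itself.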
Requirements:
Bachelor's or Master's degree in Data Science, Computer Science, Information Systems, or a related field
Proven experience as a Data/Analytics Engineer, ideally in a growing or evolving data environment
Expert knowledge in Snowflake is required
Very good Python programming skills are a must
Experience with ETL processes and migrating from Domo (or similar BI platforms) to modern data warehouses
Strong interest and focus on data backend and infrastructure topics
Ability to work in a fast-paced environment and solve problems hands-on
Strong problem-solving skills and attention to detail
Knowledge of data governance and data quality best practices
Excellent communication skills, stakeholder engagement, and ability to work independently or as part of a team
Only applicants already legally entitled to work in the location where the role is based will be considered