Master Data Analyst Europe

Refresco

Location:
Rotterdam, Netherlands

Contract Type:
Not provided

Salary:
Not provided

Job Description:

As a Master Data Analyst, your mission is to ensure the accuracy, consistency, and governance of critical business data across systems—empowering effective decision-making and smooth operations. You will thrive in a role that balances day-to-day data management with impactful projects, all while collaborating across functions and implementing MDM solutions. Because this is still a relatively new function within our organization, you will have the freedom to shape your role and make a lasting impact.

Job Responsibility:

  • Maintain and manage critical master data objects such as vendors, customers, and materials—preferably within an SAP S/4HANA environment
  • Leverage MDM governance tools like the Syniti Knowledge Platform to ensure data integrity and control
  • Develop robust data quality controls in SQL-based environments to ensure accuracy, completeness, consistency, and timeliness of data
  • Define and govern data standards, business rules, and documentation to support high-quality master data
  • Monitor and report on Master Data Governance KPIs to drive continuous improvement
  • Identify and lead initiatives to enhance master data processes in collaboration with European Business Process Owners, Finance, Procurement, IT SMEs, and other stakeholders
  • Provide training and support to end users to ensure effective adoption and knowledge transfer of developed MDM solutions
  • Champion data governance efforts aimed at harmonizing and optimizing master data practices across Refresco Europe
  • Support the roll-out and hypercare of our Syniti Knowledge Platform
  • Contribute to European strategic projects, including our SAP S/4HANA roadmap
  • Explore the potential of generative AI to improve master data processes, enhance automation, and unlock new efficiencies in data governance
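
The responsibilities above include building data quality controls in SQL-based environments. As a hedged illustration only (not Refresco's actual implementation), the sketch below runs completeness, uniqueness, and validity checks over a hypothetical `vendor_master` table; every table and column name here is invented for the example.

```python
# Sketch of SQL-based master data quality checks (accuracy, completeness,
# consistency). Table and column names (vendor_master, vendor_id,
# vendor_name, country) are hypothetical; a real SAP vendor table differs.
import sqlite3

def run_quality_checks(conn):
    """Return a dict mapping check name -> number of violating rows."""
    cur = conn.cursor()
    checks = {
        # Completeness: vendors with a missing or empty name
        "missing_name": (
            "SELECT COUNT(*) FROM vendor_master "
            "WHERE vendor_name IS NULL OR vendor_name = ''"
        ),
        # Uniqueness: vendor IDs that appear more than once
        "duplicate_id": (
            "SELECT COUNT(*) FROM ("
            "  SELECT vendor_id FROM vendor_master"
            "  GROUP BY vendor_id HAVING COUNT(*) > 1)"
        ),
        # Validity: country code must be exactly two uppercase letters
        "bad_country": (
            "SELECT COUNT(*) FROM vendor_master "
            "WHERE country NOT GLOB '[A-Z][A-Z]'"
        ),
    }
    return {name: cur.execute(sql).fetchone()[0] for name, sql in checks.items()}

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE vendor_master (vendor_id TEXT, vendor_name TEXT, country TEXT)"
    )
    conn.executemany(
        "INSERT INTO vendor_master VALUES (?, ?, ?)",
        [("V001", "Acme BV", "NL"),
         ("V002", "", "DE"),             # fails completeness
         ("V001", "Acme BV (dup)", "NL"),  # fails uniqueness
         ("V003", "Beta GmbH", "de")],     # fails validity
    )
    print(run_quality_checks(conn))
```

In a governed MDM setup, each check's violation count would typically feed a KPI dashboard and trigger remediation when it exceeds zero.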

Requirements:

  • Bachelor's degree in Business, Information Systems, Data Science, or a related field
  • At least 3 years of working experience in MDM
  • A good understanding of master data principles and data governance frameworks
  • Hands-on experience with SAP S/4HANA and MDM tools (preferably Syniti)
  • Proficiency in SQL and experience in building data quality checks and dashboards
  • A collaborative mindset with the ability to work across functions and geographies
  • Passion for continuous improvement and delivering high-impact data solutions

What we offer:
  • A competitive reward package, including bonus eligibility
  • A collaborative, international, and commercial environment where you can make a meaningful impact
  • Opportunities for continuous professional development and growth
  • A dynamic team culture that values integrity, efficiency, and open communication

Additional Information:

Job Posted:
January 19, 2026

Employment Type:
Full-time

Similar Jobs for Master Data Analyst Europe

Azure Data Engineer

The Azure Data Engineer role involves designing, building, and maintaining ETL p...
Location:
Chennai, India
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • 5–8+ years of experience as a Data Engineer
  • Strong hands‑on expertise in Azure (Data Factory, Databricks, Data Lake Storage, SQL, Synapse preferred)
  • Proven ability to build production‑grade ETL/ELT pipelines supporting complex, multi‑regional business processes
  • Experience designing or implementing rules engines (Drools, ODM, or similar)
  • Strong SQL skills and experience with data modeling, data orchestration, and pipeline optimization
  • Experience working in Agile Scrum teams and collaborating across global regions (U.S. and India preferred)
  • Ability to partner closely with analysts and business stakeholders to translate rules into technical solutions
  • Excellent debugging, optimization, and engineering problem‑solving skills
  • Minimum Skills Required: SQL, Python, Azure Data Factory, Databricks, Azure Synapse
Job Responsibility:
  • Design, build, and maintain Azure‑based ETL pipelines (e.g., Data Factory, Databricks, Data Lake) to ingest, clean, transform, and aggregate compensation‑related datasets across multiple regions
  • Engineer upstream processes to produce 9–10 monthly aggregated output files (customer, revenue, product, sales rep, etc.), delivered 3× per month
  • Ensure repeatability, monitoring, orchestration, and error‑handling for all ingestion and transformation workflows
  • Contribute to the creation of a master stitched data file to replace Varicent’s current data‑assembly functions
  • Build, configure, and maintain a rules engine (ODM, Drools, or similar) to externalize business logic previously embedded in code
  • Translate rules and logic captured by analysts and business SMEs into scalable, testable engine components
  • Implement versioning, governance, and validation mechanisms for all logic used in compensation calculations
  • Ensure rule changes can be managed safely, reducing risk in high‑stakes compensation scenarios
  • Partner with data architects to implement the target‑state Azure data architecture for compensation analytics
  • Develop optimized, scalable physical data models aligned to business logic and downstream needs
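
Several responsibilities above concern externalizing business logic from pipeline code into a rules engine. As a rough sketch of that pattern only (this is plain Python, not Drools or IBM ODM, and the rule and field names are invented), rules can live in a versioned data structure that is evaluated against records, so logic changes don't require code changes:

```python
# Minimal sketch of externalized, versioned business rules: the logic lives
# in a data structure rather than hard-coded branches, so it can be
# version-controlled, validated, and changed independently of the pipeline.
# Rule names and record fields (region, revenue, rate) are hypothetical.

RULES_V2 = [  # a versioned rule set, e.g. loaded from a governed store
    {"name": "emea_uplift", "when": lambda r: r["region"] == "EMEA", "rate": 0.05},
    {"name": "large_deal",  "when": lambda r: r["revenue"] > 100_000, "rate": 0.02},
]

def apply_rules(record, rules):
    """Return (total rate, names of rules that fired) for one record."""
    fired = [rule["name"] for rule in rules if rule["when"](record)]
    rate = sum(rule["rate"] for rule in rules if rule["name"] in fired)
    return rate, fired

rate, fired = apply_rules({"region": "EMEA", "revenue": 250_000}, RULES_V2)
print(rate, fired)  # both rules fire for this record
```

Recording which rules fired per record is what makes rule changes auditable, which matters in the high-stakes compensation scenarios the posting describes.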


Data Architect - Analytics

The Data Architect - Analytics role involves leading the design and implementati...
Location:
Chennai, India
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • Minimum of 7 years of experience
  • Strong expertise in Azure and Databricks
  • Expertise in data modeling and integration
  • Team Leadership
  • Client Interviews
  • SQL
Job Responsibility:
  • Engage heavily with business users across North America and Europe, facilitating workshops and data discovery sessions
  • Drive consensus on business rules, data definitions, and data sources, especially where regional processes differ
  • Serve as the architectural thought leader enabling teams to transition from manual, inconsistent processes to standardized, modernized workflows
  • Partner closely with business analysts, data analysts, product owners, and engineering teams across multiple geographies
  • Architect a unified master stitched data model to replace downstream reliance on Varicent for data assembly
  • Lead the re‑architecture of compensation data processing—including internal and external compensation flows—into a scalable, cloud‑native Azure environment
  • Define patterns, frameworks, and integration strategies across Azure services (Data Factory, Databricks, Data Lake, SQL, etc.)
  • Evaluate and evolve the use of rules engines/ODM/Drools to externalize and modernize embedded business logic currently locked in application code
  • Guide decisions to shift logic and data ownership into enterprise‑owned systems rather than third‑party tools
  • Analyze current‑state processes (38 in NA, 9 in Europe) and identify opportunities for re‑engineering, automation, and consolidation

Business Excellence Lead

Drive the development and execution of the North Europe Data, Digital & Business...
Location:
Greater Poland, Poland
Salary:
Not provided
Unilever
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s or master’s degree in information technology, data science, business analytics, or a business field
  • 3-5+ years of experience in data analytics, technology deployment, or digital transformation roles and managing cross-functional teams
  • Experience in managing senior stakeholders
  • Experience in team development and building teams
  • Knowledge of data platforms (e.g., Azure, Databricks), analytics tools (e.g., Power BI, Python), and digital ecosystems
  • Experience in a global or matrixed organization is an advantage
  • Strong understanding of the business model, plus strong communication, leadership, and problem-solving skills
  • Strategic Thinking & Execution
  • Data Governance & Quality Management
  • Technology Deployment & Integration
Job Responsibility:
  • Lead data team strategy at the regional and country level, building the skill set and expertise within the team needed to deliver short- and mid-term goals
  • Facilitate a team of varied data & tech roles (BI experts, Data Clerks, Data Analysts, Sales System Admins) in a regional forum, building the team's identity and a clear roadmap of deliverables
  • Deliver an ecosystem of performance monitoring, tracking, and reporting on key business performance indicators
  • Develop or implement tools and processes that deliver business insights with speed
  • Automate repeatable tasks and challenge current ways of working to drive efficiency in the UFS NE organization
  • Be a Data Business Partner to the North Europe Country Leads and Managing Director, accelerating the implementation of strategies through data & tech enablers
  • Cooperate with the global team to deliver sustainable data & tech solutions that answer regional needs
  • Drive CX (Customer Experience) initiatives across markets and continue the company's digital transformation
  • Ensure consistency in data definitions, metrics, and reporting standards across the region
  • Identify opportunities for innovation and continuous improvement in data and tech capabilities. Support the adoption of new tools and technologies across the region
What we offer:
  • Competitive annual bonus
  • Company car or car allowance
  • Participation in the company share program
  • Private pension plan
  • Private medical care (Medicover)
  • Private life insurance (Unum)
  • Sports and wellness package (Benefit Systems)
  • Two additional vacation days
  • Access to the Unilever employee shop
  • Access to Legimi (e‑book platform)
Employment Type: Full-time