
Master Data Analyst Europe

Refresco

Location:
Rotterdam, Netherlands

Contract Type:
Not provided

Salary:
Not provided

Job Description:

As a Master Data Analyst, your mission is to ensure the accuracy, consistency, and governance of critical business data across systems—empowering effective decision-making and smooth operations. You will thrive in a role that balances day-to-day data management with impactful projects, all while collaborating across functions and implementing MDM solutions. Because this is still a relatively new function within our organization, you will have the freedom to shape your role and make a lasting impact.

Job Responsibility:

  • Maintain and manage critical master data objects such as vendors, customers, and materials—preferably within an SAP S/4HANA environment
  • Leverage MDM governance tools like the Syniti Knowledge Platform to ensure data integrity and control
  • Develop robust data quality controls in SQL-based environments to ensure accuracy, completeness, consistency, and timeliness of data
  • Define and govern data standards, business rules, and documentation to support high-quality master data
  • Monitor and report on Master Data Governance KPIs to drive continuous improvement
  • Identify and lead initiatives to enhance master data processes in collaboration with European Business Process Owners, Finance, Procurement, IT SMEs, and other stakeholders
  • Provide training and support to end users to ensure effective adoption and knowledge transfer of developed MDM solutions
  • Champion data governance efforts aimed at harmonizing and optimizing master data practices across Refresco Europe
  • Support the roll-out and hypercare of our Syniti Knowledge Platform
  • Contribute to European strategic projects, including our SAP S/4HANA roadmap
  • Explore the potential of generative AI to improve master data processes, enhance automation, and unlock new efficiencies in data governance
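The SQL-based data quality controls described above can be sketched with a minimal, self-contained example. This uses Python's built-in sqlite3 as a stand-in for a real SQL environment; the vendor table, its columns, and the two checks are illustrative assumptions, not Refresco's actual schema.

```python
import sqlite3

# Illustrative vendor master data; schema and values are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE vendors (vendor_id TEXT, name TEXT, country TEXT, vat_number TEXT)"
)
conn.executemany(
    "INSERT INTO vendors VALUES (?, ?, ?, ?)",
    [
        ("V001", "Acme BV", "NL", "NL123456789B01"),
        ("V002", "Beta GmbH", "DE", None),           # completeness issue
        ("V001", "Acme BV", "NL", "NL123456789B01"), # uniqueness issue
    ],
)

# Completeness check: every vendor record should carry a VAT number.
missing_vat = conn.execute(
    "SELECT COUNT(*) FROM vendors WHERE vat_number IS NULL"
).fetchone()[0]

# Uniqueness check: vendor_id must not be duplicated.
duplicates = conn.execute(
    "SELECT vendor_id FROM vendors GROUP BY vendor_id HAVING COUNT(*) > 1"
).fetchall()

print(f"records missing VAT: {missing_vat}")                  # → 1
print(f"duplicated vendor ids: {[d[0] for d in duplicates]}") # → ['V001']
```

Checks like these, scheduled against the live master data and rolled up into the governance KPIs mentioned above, are one common way such controls are operationalised.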

Requirements:

  • Bachelor's degree in Business, Information Systems, Data Science, or a related field
  • At least 3 years of work experience in MDM
  • A good understanding of master data principles and data governance frameworks
  • Hands-on experience with SAP S/4HANA and MDM tools (preferably Syniti)
  • Proficiency in SQL and experience in building data quality checks and dashboards
  • A collaborative mindset with the ability to work across functions and geographies
  • Passion for continuous improvement and delivering high-impact data solutions

What we offer:

  • A competitive reward package, including bonus eligibility
  • A collaborative, international, and commercial environment where you can make a meaningful impact
  • Opportunities for continuous professional development and growth
  • A dynamic team culture that values integrity, efficiency, and open communication

Additional Information:

Job Posted:
January 19, 2026

Employment Type:
Full-time

Similar Jobs for Master Data Analyst Europe

Azure Data Engineer

The Azure Data Engineer role involves designing, building, and maintaining ETL p...
Location:
Chennai, India
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • 5–8+ years of experience as a Data Engineer
  • Strong hands‑on expertise in Azure (Data Factory, Databricks, Data Lake Storage, SQL, Synapse preferred)
  • Proven ability to build production‑grade ETL/ELT pipelines supporting complex, multi‑regional business processes
  • Experience designing or implementing rules engines (Drools, ODM, or similar)
  • Strong SQL skills and experience with data modeling, data orchestration, and pipeline optimization
  • Experience working in Agile Scrum teams and collaborating across global regions (U.S. and India preferred)
  • Ability to partner closely with analysts and business stakeholders to translate rules into technical solutions
  • Excellent debugging, optimization, and engineering problem‑solving skills
  • Minimum Skills Required: SQL, Python, Azure Data Factory, Databricks, Azure Synapse
Job Responsibility:
  • Design, build, and maintain Azure‑based ETL pipelines (e.g., Data Factory, Databricks, Data Lake) to ingest, clean, transform, and aggregate compensation‑related datasets across multiple regions
  • Engineer upstream processes to produce 9–10 monthly aggregated output files (customer, revenue, product, sales rep, etc.), delivered 3× per month
  • Ensure repeatability, monitoring, orchestration, and error‑handling for all ingestion and transformation workflows
  • Contribute to the creation of a master stitched data file to replace Varicent’s current data‑assembly functions
  • Build, configure, and maintain a rules engine (ODM, Drools, or similar) to externalize business logic previously embedded in code
  • Translate rules and logic captured by analysts and business SMEs into scalable, testable engine components
  • Implement versioning, governance, and validation mechanisms for all logic used in compensation calculations
  • Ensure rule changes can be managed safely, reducing risk in high‑stakes compensation scenarios
  • Partner with data architects to implement the target‑state Azure data architecture for compensation analytics
  • Develop optimized, scalable physical data models aligned to business logic and downstream needs
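The rules-engine responsibilities above (externalizing business logic that was previously embedded in code) can be pictured with a toy sketch. This is plain Python, not ODM or Drools, and every rule name, field, and rate here is a hypothetical assumption; the point is only that the logic lives in a data structure that can be versioned and validated separately from the code that applies it.

```python
# Toy illustration of compensation logic externalized as data-driven rules.
# Rule names, record fields, and rates are hypothetical, not a real design.
RULES = [
    {"name": "base_rate",  "when": lambda r: True,                  "rate": 0.02},
    {"name": "new_logo",   "when": lambda r: r["new_customer"],     "rate": 0.01},
    {"name": "emea_bonus", "when": lambda r: r["region"] == "EMEA", "rate": 0.005},
]

def commission(record: dict) -> float:
    """Sum the rates of every matching rule and apply them to revenue.

    The business logic itself stays outside this function, so a rule change
    is a data change that can be reviewed and versioned on its own.
    """
    rate = sum(rule["rate"] for rule in RULES if rule["when"](record))
    return round(record["revenue"] * rate, 2)

deal = {"revenue": 10_000.0, "region": "EMEA", "new_customer": True}
print(commission(deal))  # all three rules match: 0.035 * 10000 → 350.0
```

Swapping a rate or adding a rule touches only the `RULES` list, which is what makes safe change management (the "reducing risk in high-stakes compensation scenarios" bullet) tractable.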


Sr Data Engineer

We are currently seeking a Sr Data Engineer to join our team in Chennai, Tamil N...
Location:
Chennai, India
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • Azure
  • Databricks
  • Data Modeling
  • Team Leadership
  • Client Interviews
  • SQL
Job Responsibility:
  • Engage heavily with business users across North America and Europe, facilitating workshops and data discovery sessions
  • Drive consensus on business rules, data definitions, and data sources, especially where regional processes differ
  • Serve as the architectural thought leader enabling teams to transition from manual, inconsistent processes to standardized, modernized workflows
  • Partner closely with business analysts, data analysts, product owners, and engineering teams across multiple geographies
  • Architect a unified master stitched data model to replace downstream reliance on Varicent for data assembly
  • Lead the re‑architecture of compensation data processing—including internal and external compensation flows—into a scalable, cloud‑native Azure environment
  • Define patterns, frameworks, and integration strategies across Azure services (Data Factory, Databricks, Data Lake, SQL, etc.)
  • Evaluate and evolve the use of rules engines/ODM/Drools to externalize and modernize embedded business logic currently locked in application code
  • Guide decisions to shift logic and data ownership into enterprise‑owned systems rather than third‑party tools
  • Analyze current‑state processes (38 in NA, 9 in Europe) and identify opportunities for re‑engineering, automation, and consolidation
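The "master stitched data model" responsibility above can be pictured as joining per-region extracts into one record per key. A minimal sketch, assuming hypothetical customer IDs and revenue sources (nothing here reflects the actual Varicent replacement):

```python
# Toy sketch of "stitching" regional extracts into one master record per key.
# Source names, keys, and fields are hypothetical.
customers = {"C1": {"name": "Globex"}, "C2": {"name": "Initech"}}
revenue_na = {"C1": 120.0}               # North America extract
revenue_eu = {"C1": 80.0, "C2": 50.0}    # Europe extract

def stitch(customers: dict, *revenue_sources: dict) -> dict:
    """Produce one master record per customer, summing revenue across regions."""
    master = {}
    for cid, attrs in customers.items():
        total = sum(src.get(cid, 0.0) for src in revenue_sources)
        master[cid] = {**attrs, "revenue": total}
    return master

master = stitch(customers, revenue_na, revenue_eu)
print(master["C1"])  # {'name': 'Globex', 'revenue': 200.0}
```

In a real Azure build the same shape would be a Databricks or Data Factory join over lake tables rather than in-memory dicts; the sketch only shows the stitching idea.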

Technical Architect/Data Architect

The Technical Architect role requires over 10 years of experience in data archit...
Location:
Bangalore, India
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • Over 10 years of experience in data architecture and engineering
  • Azure
  • ETL pipelines
  • Data modeling
  • Team leadership
  • Client interviews
  • SQL
  • Databricks
Job Responsibility:
  • Engage heavily with business users across North America and Europe, facilitating workshops and data discovery sessions
  • Drive consensus on business rules, data definitions, and data sources, especially where regional processes differ
  • Serve as the architectural thought leader enabling teams to transition from manual, inconsistent processes to standardized, modernized workflows
  • Partner closely with business analysts, data analysts, product owners, and engineering teams across multiple geographies
  • Architect a unified master stitched data model to replace downstream reliance on Varicent for data assembly
  • Lead the re‑architecture of compensation data processing—including internal and external compensation flows—into a scalable, cloud‑native Azure environment
  • Define patterns, frameworks, and integration strategies across Azure services (Data Factory, Databricks, Data Lake, SQL, etc.)
  • Evaluate and evolve the use of rules engines/ODM/Drools to externalize and modernize embedded business logic currently locked in application code
  • Guide decisions to shift logic and data ownership into enterprise‑owned systems rather than third‑party tools
  • Analyze current‑state processes (38 in NA, 9 in Europe) and identify opportunities for re‑engineering, automation, and consolidation

Data Architect - Analytics

The Data Architect - Analytics role involves leading the design and implementati...
Location:
Chennai, India
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • Minimum of 7 years of experience
  • Strong expertise in Azure and Databricks
  • Expertise in data modeling and integration
  • Team Leadership
  • Client Interviews
  • SQL
Job Responsibility:
  • Engage heavily with business users across North America and Europe, facilitating workshops and data discovery sessions
  • Drive consensus on business rules, data definitions, and data sources, especially where regional processes differ
  • Serve as the architectural thought leader enabling teams to transition from manual, inconsistent processes to standardized, modernized workflows
  • Partner closely with business analysts, data analysts, product owners, and engineering teams across multiple geographies
  • Architect a unified master stitched data model to replace downstream reliance on Varicent for data assembly
  • Lead the re‑architecture of compensation data processing—including internal and external compensation flows—into a scalable, cloud‑native Azure environment
  • Define patterns, frameworks, and integration strategies across Azure services (Data Factory, Databricks, Data Lake, SQL, etc.)
  • Evaluate and evolve the use of rules engines/ODM/Drools to externalize and modernize embedded business logic currently locked in application code
  • Guide decisions to shift logic and data ownership into enterprise‑owned systems rather than third‑party tools
  • Analyze current‑state processes (38 in NA, 9 in Europe) and identify opportunities for re‑engineering, automation, and consolidation