
Master data agent

Randstad

Location:
Poland, Poznań


Contract Type:
Employment contract

Salary:

Not provided

Job Description:

Invitation to a Career Transformation: building regional capabilities in Poznań. We are seeking qualified specialists and experienced leaders to join newly established, strategic teams in Poznań. In partnership with Bridgestone, the global industry leader, we are developing a high-impact operational structure to support the company's European operations. Successful candidates will play an integral part in the transition and establishment of critical processes being relocated from other regions. You will join a newly formed, dynamic team from its inception, actively contributing to the development of its organizational culture. If you are driven by complex challenges and eager to make an impact, this is your opportunity to define your career within a truly world-class organization.

Job Responsibility:

  • Collecting essential information including contact details and business data
  • Enriching master data records with critical information for customer classification and order-to-cash flow
  • Maintaining and updating customer records in company systems to ensure accuracy and completeness
  • Collaborating with sales and internal departments to support customer onboarding and data integration
  • Ensuring compliance with data protection regulations and company policies in all master data processes
  • Generating and analyzing reports on data accuracy and classification to support business decisions
  • Driving continuous improvement of data quality and identifying opportunities for automation
  • Managing customer master data including creations and changes for payer, sold-to, and ship-to entities
  • Handling vendor extensions and maintaining salesmen and customer tables
  • Managing material master data including creations, governance, and customer material info records
  • Monitoring reporting and KPIs for master data activities

Requirements:

  • High school diploma or equivalent
  • One to two years of experience in customer service, order-to-cash, data management, or a related role
  • Experience with Microsoft 365
  • French C1
  • Polish B2
  • English B2
  • Strong communication and interpersonal skills
  • Attention to detail and ability to work collaboratively with cross-functional teams
  • Adaptability and cross-cultural competence
  • Organized, proactive, and solution-oriented with ability to handle multiple tasks simultaneously

Nice to have:

SAP knowledge is an advantage

What we offer:
  • Stable employment based on a permanent employment contract
  • Hybrid model with 3 days remote work and 2 days in the office after a 3-month onboarding period
  • Flexible schedule with start time between 7:00 AM and 10:00 AM
  • Competitive salary
  • Performance-based semi-annual bonus
  • Dynamic career development and vast opportunities
  • Additional day off for health and preventive examinations
  • Co-financing for training and studies after one year of employment
  • Lunch vouchers
  • Life insurance with option to extend coverage to spouse and adult children
  • Private medical care with costs partially covered by the employer
  • Multisport card for you and your loved ones
  • Mybenefit cafeteria system with employer-funded points
  • Tire discounts after one year of work
  • Jubilee awards after 5, 10, 15, and 20 years of service

Additional Information:

Job Posted:
April 20, 2026

Expiration:
June 30, 2026

Employment Type:
Full-time
Work Type:
Hybrid work

Similar Jobs for Master data agent


Senior ML Data Engineer

As a Senior Data Engineer, you will play a pivotal role in our AI/ML workstream,...
Location:
Poland, Warsaw
Salary:
Not provided
Awin Global
Expiration Date
Until further notice
Requirements:
  • Bachelor's or Master's degree in Data Science, Data Engineering, or Computer Science with a focus on math and statistics; a Master's degree is preferred
  • At least 5 years' experience as an AI/ML data engineer undertaking the above tasks and accountabilities
  • Strong foundation in computer science principles and statistical methods
  • Strong experience with cloud technology (AWS or Azure)
  • Strong experience creating data ingestion pipelines and ETL processes
  • Strong knowledge of big data tools such as Spark, Databricks, and Python
  • Strong understanding of common machine learning techniques and frameworks (e.g. MLflow)
  • Strong knowledge of natural language processing (NLP) concepts
  • Strong knowledge of Scrum practices and an agile mindset
  • Strong analytical and problem-solving skills with attention to data quality and accuracy
Job Responsibility:
  • Design and maintain scalable data pipelines and storage systems for both agentic and traditional ML workloads
  • Productionise LLM- and agent-based workflows, ensuring reliability, observability, and performance
  • Build and maintain feature stores, vector/embedding stores, and core data assets for ML
  • Develop and manage end-to-end traditional ML pipelines: data prep, training, validation, deployment, and monitoring
  • Implement data quality checks, drift detection, and automated retraining processes
  • Optimise cost, latency, and performance across all AI/ML infrastructure
  • Collaborate with data scientists and engineers to deliver production-ready ML and AI systems
  • Ensure AI/ML systems meet governance, security, and compliance requirements
  • Mentor teams and drive innovation across both agentic and classical ML engineering practices
  • Participate in team meetings and contribute to project planning and strategy discussions
What we offer:
  • Flexi-Week and Work-Life Balance: We prioritise your mental health and well-being, offering you a flexible four-day Flexi-Week at full pay and with no reduction to your annual holiday allowance. We also offer a variety of different paid special leaves as well as volunteer days
  • Remote Working Allowance: You will receive a monthly allowance to cover part of your running costs. In addition, we will support you in setting up your remote workspace appropriately
  • Pension: Awin offers access to an additional pension insurance to all employees in Germany
  • Flexi-Office: We offer an international culture and flexibility through our Flexi-Office and hybrid/remote work possibilities to work across Awin regions
  • Development: We’ve built our extensive training suite Awin Academy to cover a wide range of skills that nurture you professionally and personally, with trainings conveniently packaged together to support your overall development
  • Appreciation: Thank and reward colleagues by sending them a voucher through our peer-to-peer program

Manager of Data Science

The Manager of Data Science will lead a dynamic team of data scientists responsi...
Location:
United States, Chicago
Salary:
84,836.00 - 153,548.75 USD / Year
Adtalem Global Education
Expiration Date
Until further notice
Requirements:
  • Master's degree in Data Science, Computer Science, Statistics, Mathematics, or a related quantitative field
  • 7+ years of experience in data science or analytics roles
  • 3+ years of experience leading technical teams and employees, either through direct people management or through projects
  • Demonstrated success with agentic workflows and GenAI in data science product solutions
  • Strong analytical skills with a foundation in statistical techniques (e.g., regression, clustering, classification, Monte Carlo simulation)
  • Excellent understanding of machine learning techniques and algorithms, such as decision trees, RF, XGBoost, LightGBM, etc.
  • Advanced skills in programming and coding, data science toolkits, and query languages (e.g., Python, R, SQL, and Java)
  • Demonstrated experience with data analysis tools and platforms such as GCP (incl. Vertex AI and BigQuery) and SQL Server
  • Knowledge and familiarity with Power BI, Tableau, or comparable reporting and visualization platforms
  • Advanced applied statistics skills, such as distributions, statistical testing, regression, etc.
Job Responsibility:
  • Manage, mentor, and develop a dynamic team of data scientists with diverse technical backgrounds
  • Develop and execute on projects and initiatives on the data science roadmap that align with business objectives
  • Collaborate with business leaders, key stakeholders, and cross-functional teams to identify high-impact opportunities for advanced analytics and translate business challenges into data science solutions
  • Lead and support various projects and initiatives, regularly managing competing priorities, deadlines, and expectations
  • Oversee the development of machine learning models, statistical analyses, and predictive algorithms
  • Drive innovation in modeling techniques, tools, and methodologies, and drive continuous improvement in team processes
  • Ensure model quality, performance, and reliability through robust validation and testing frameworks
  • Establish best practices for model lifecycle management, including deployment, monitoring, and maintenance
  • Facilitate knowledge transfer and communicate complex technical concepts to non-technical audiences
  • Translate analytic findings into simple, actionable, and practical solutions and recommendations
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Eligible to participate in an annual incentive program
  • Full-time

P2P Master Data Agent

Join our TUI Global Business Services (TGBS) Team as part of our Global Finance ...
Location:
Tunisia, Sousse
Salary:
Not provided
TUI
Expiration Date
Until further notice
Requirements:
  • Business University degree is advantageous
  • Excellent level of English and German is mandatory
  • Previous experience in AP department, administration, supplier or customer management
  • Ability to use MS Office tools such as Word, Outlook and Excel
  • ERP (enterprise resource planning) experience in SAP and Asterix/Atlas would be advantageous
  • Communication and influencing skills
  • Proven numeric skills and accounting acumen
  • A methodical, organized approach towards the execution of assigned tasks
Job Responsibility:
  • Dealing with vendor master data set-ups and changes
  • Checking the details given in vendor master data requests against the source documents
  • Getting in touch with suppliers (‘Ringback procedure’) to verify vendor master data, in particular for English-speaking suppliers
  • Doing checks to ensure that data such as VAT IDs or company registration numbers are correct
  • Reviewing returned payments from vendors
  • Monitoring a number of Vendor Master email accounts and dealing with queries
  • Actively participating in team meetings to share information and look at priorities
  • Undertaking other ad hoc activities as deemed necessary
  • Assisting with supplier account management, providing stakeholders with the right support for the Accounts Payable cycle and its accounting activities, in accordance with company policies
  • Maintaining strong relationships with stakeholders and suppliers, communicating effectively, and proposing improvements for any issues identified on supplier accounts
What we offer:
  • Attractive remuneration
  • Discretionary bonus schemes
  • Generous travel benefits
  • Extensive health & well-being support
  • Flexible working
  • Access the TUI Learning Hub
  • Opportunities to work on global projects and teams
  • Get involved with incredible local charity and sustainability initiatives like the TUI Care Foundation
  • Full-time

Forward Deployed Engineer - Data Migration & Data Consolidation Platforms

As a Forward Deployed Engineer (FDE) for Data Migration & Data Consolidation Pla...
Location:
United States
Salary:
Not provided
Rackspace
Expiration Date
Until further notice
Requirements:
  • 7-10+ years of progressive experience in enterprise data engineering, data migration, or large-scale system integration roles within complex, multi-platform environments
  • 3-5+ years directly leading end-to-end data migration or multi-system consolidation programs for Global Enterprises and Industry Leaders, with full ownership of technical delivery and client outcomes
  • Demonstrated client-facing experience serving as a trusted technical advisor to C-level executives, enterprise architecture teams, and cross-functional business stakeholders
  • Proven industry depth in at least two of the following verticals: Healthcare, Financial Services, Manufacturing, Retail, Energy & Utilities, or Public Sector
  • Hands-on migration complexity: successfully delivered programs involving at least 3+ heterogeneous source systems, 100M+ records, complex master data harmonization, and multi-phase cutover execution
  • Advanced proficiency in Python and SQL with working experience in PySpark and TypeScript/JavaScript
  • Hands-on expertise with modern ETL/ELT and data integration platforms (Informatica, Talend, Matillion, Fivetran, AWS Glue, Azure Data Factory)
  • Proven ability to build scalable, version-controlled data pipelines with error handling, incremental loading, and Change Data Capture (CDC)
  • Strong working knowledge of at least one major cloud provider (AWS, Azure, or GCP), including core infrastructure, managed data services, and security configurations
  • Experience with enterprise data warehouse and lakehouse platforms (Snowflake, Databricks, BigQuery, Redshift, Synapse Analytics, Delta Lake)
Job Responsibility:
  • Migration Execution & Cloud Architecture: Lead end-to-end delivery of enterprise data migrations from corporate systems (SAP, Oracle, Epic ERP) to target cloud data platforms, including the design of cloud landing zones, data governance frameworks, and system rationalization strategies. Establish migration compliance controls, automated rollback procedures, and operational readiness gates while owning full technical accountability for 12–18+ month migration roadmaps
  • Data Pipeline Engineering & Transformation: Build production-grade data connectors to SAP (RFC, IDoc, BAPI, OData), Oracle (AQ, GoldenGate, APIs), and SQL/non-relational sources. Develop ETL/ELT pipelines with LLM-enabled transformation logic, multi-layer validation and reconciliation frameworks, and optimized throughput for datasets scaling from tens of millions to billions of records with built-in CDC and incremental loading
  • Ontology Layer Development & Schema Automation: Construct semantic ontology layers translating raw ERP structures into business-consumable objects (Customer, Order, Invoice, Product, Vendor, Asset). Deploy automated schema mapping agents for source-to-target analysis and transformation logic generation. Build unified master data models with row/column-level security, cross-system lineage tracking, and AI-ready semantic structures
  • Application & Workflow Delivery: Build operational dashboards, migration control centers, and agent-driven workflows for automated validation, exception handling, and anomaly detection using low-code platform tools. Generate TypeScript/Python SDKs for custom integrations and deliver real-time monitoring and self-service interfaces for migration progress, data quality KPIs, and compliance tracking
  • Multi-System Consolidation & Master Data Management: Lead consolidation of 5–15+ fragmented ERP instances into standardized master data models. Resolve complex entity resolution challenges including customer matching, product harmonization, and chart of accounts unification. Establish golden record frameworks, data quality scorecards, survivorship rules, and data stewardship workflows for post-migration governance
  • Client Engagement, Discovery & Modernization Advisory: Serve as primary technical advisor to C-suite and enterprise architecture stakeholders across all engagement phases. Deploy discovery agents to analyze legacy data estates, conduct assessment workshops, facilitate solution design sessions, and deliver executive briefings, go/no-go readiness assessments, and prioritized modernization roadmaps
  • Knowledge Transfer, Enablement & IP Development: Build reusable migration accelerators, playbooks, and reference architectures that scale across engagements. Lead knowledge transfer to upskill client teams for post-migration ownership and collaborate with internal product and sales engineering teams to feed field insights back into platform development and delivery methodology
  • Leadership & Executive Engagement: Operate autonomously in ambiguous, high-stakes client environments, driving outcomes with minimal oversight; translate deeply technical concepts into clear, business-level narratives for C-suite audiences through executive briefings and stakeholder communications; navigate organizational complexity, competing stakeholder priorities, and enterprise change management dynamics to maintain momentum across multi-workstream engagements

Data Scientist Specialist

We are seeking a highly experienced Data Scientist Specialist with deep expertis...
Location:
United States, McLean
Salary:
Not provided
Apex Systems
Expiration Date
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in AI, Data Science, Computer Science, or related field
  • Extensive experience in AI/ML, including 3+ years in applied GenAI or LLM-based solutions
  • Deep expertise in prompt engineering, fine-tuning, RAG, GraphRAG, vector databases, and multi-modal models
  • Proven experience with AWS cloud-native AI development (SageMaker, Bedrock, MLFlow/Kubeflow on EKS)
  • Strong programming skills in Python and ML/LLM libraries (Transformers, LangChain, etc.)
  • Strong understanding of GenAI system patterns, agentic architectures, evaluation frameworks, and guardrails
  • Demonstrated success working in cross-functional, agile teams
  • GitHub code repository link required for candidate evaluation
Job Responsibility:
  • Architect and implement GenAI systems: Build scalable AI agents, agentic workflows, and GenAI applications for diverse business use cases
  • Model development & optimization: Fine-tune and optimize lightweight LLMs; evaluate and adapt models such as Claude (Anthropic), Azure OpenAI, and open-source alternatives
  • RAG & GraphRAG architectures: Design and deploy Retrieval-Augmented Generation (RAG) and GraphRAG systems using vector databases and enterprise knowledge bases
  • Enterprise data curation: Curate and prepare enterprise data using connectors integrated with AWS Bedrock Knowledge Bases and/or Elasticsearch
  • Agent interoperability: Implement solutions leveraging Model Context Protocol (MCP) and Agent-to-Agent (A2A) communication patterns
  • Experimentation & ML platforms: Build and maintain Jupyter-based notebooks using SageMaker, MLFlow, or Kubeflow on Kubernetes (EKS)
  • Cross-functional collaboration: Work with UI engineers, microservices teams, designers, and data engineers to deliver full-stack GenAI experiences
  • Enterprise integration: Integrate GenAI solutions with enterprise platforms via APIs and standardized GenAI architectural patterns
  • Evaluation & safety: Establish evaluation frameworks, bias mitigation strategies, safety protocols, and guardrails for production deployment
What we offer:
  • medical
  • dental
  • vision
  • life
  • disability
  • other insurance plans
  • ESPP (employee stock purchase program)
  • 401K program with company match after 12 months
  • HSA (Health Savings Account on the HDHP plan)
  • SupportLinc Employee Assistance Program (EAP) with up to 8 free counseling sessions

AI Engineer

In this role you will design and build intelligent, autonomous AI systems that e...
Location:
United States, San Diego
Salary:
199,500.00 - 299,300.00 USD / Year
Teradata
Expiration Date
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field
  • 3–5+ years of experience in software architecture, backend development, or AI infrastructure
  • Strong Python skills and familiarity with Java, Go, and C++
  • Deep expertise in agent development, LLM integration, prompt engineering, runtime systems, and AI tooling
  • Experience with MCP servers, vector databases, RAG systems, graph-based memory, and NLP frameworks
  • Ability to design core agentic capabilities such as memory management, context handling, observability, and identity
  • Strong background in distributed systems, backend services, API design, and cloud-native deployments (AWS, Azure, GCP)
  • Proficiency with containerization, CI/CD pipelines, and scalable production infrastructures
  • Excellent communication skills, documentation habits, and ability to mentor or collaborate across teams
  • Passion for building safe, human-aligned, autonomous systems and extending open-source tools to innovate
Job Responsibility:
  • Design and build intelligent, autonomous AI systems that enable Teradata to push the boundaries of enterprise-scale agentic technology
  • Lead the development of scalable, secure, cloud-native frameworks that allow AI agents to reason, plan, act, and collaborate in real-world production environments
  • Create the foundational runtime components, automation capabilities, and infrastructure that power next-generation GenAI and Agentic AI solutions
  • Work closely with AI researchers, platform teams, and product leadership to bring advanced agentic capabilities from concept to production across Teradata’s data and AI platform
  • Succeed in this role by enabling enterprise customers to leverage powerful, resilient, and safely governed AI agents that drive measurable business value
What we offer:
  • Healthcare, life and disability insurance plans
  • 401(k)-retirement savings plan
  • Time-off programs
  • Flexible work model
  • Well-being focus
  • Diversity, Equity, and Inclusion commitment
  • Full-time

Senior Data Scientist - Copilot Studio

Microsoft is a company where passionate innovators come to collaborate, envision...
Location:
Israel, Tel Aviv, Herzliya
Salary:
Not provided
Microsoft Corporation
Expiration Date
Until further notice
Requirements:
  • Doctorate in Data Science, Mathematics, Statistics, Econometrics, Physics, Operations Research, Computer Science, or related field AND 3+ year(s) data-science experience (e.g., managing structured and unstructured data, applying statistical techniques and reporting results)
  • OR Master's Degree in Data Science, Mathematics, Statistics, Econometrics, Physics, Operations Research, Computer Science, or related field AND 5+ years data-science experience
  • Proficiency in one or more programming/scripting languages for working with data, such as Python, C++, or C#
  • Experience in building ML and LLM products, focusing on NLP and conversational AI
  • Analytical Mindset: Strong data analysis and problem-solving skills. Ability to use data to draw insights and make decisions, experiment design, and interpret model performance metrics
  • Excellent teamwork and communication skills. Comfortable working in a fast-paced, interdisciplinary environment and presenting complex findings in a clear, impactful way
  • Passion for learning and staying informed of state-of-the-art progress
  • This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter
Job Responsibility:
  • Formulate data-driven approaches to evaluate and improve AI agent performance, leveraging diverse algorithms and data sources
  • Apply state-of-the-art LLM and machine learning techniques to analyze and optimize agent behavior
  • Use data exploration to uncover patterns in agent interactions, identify new opportunities or issues, and assess data limitations within our problem space
  • Engage with product teams and collaborate with other data scientists, engineers, designers, and product managers to translate findings into clear, actionable insights that shape product features and improve our Copilot Studio platform
  • Full-time