MDM Product Owner


Robert Half

Location:
Mequon, Wisconsin, United States

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We are looking for an experienced Master Data Management (MDM) Product Owner to join our team in Mequon, Wisconsin. In this long-term contract position, you will play a pivotal role in designing, deploying, and managing enterprise-level customer data mastering solutions using Profisee. This role offers the opportunity to lead a team of engineers and analysts while collaborating with stakeholders across governance, engineering, privacy, and business areas to ensure high-quality data solutions that support analytics, reporting, and regulatory requirements.

Job Responsibility:

  • Design and implement customer data mastering solutions aligned with enterprise architecture standards using Profisee
  • Define entity models, attributes, hierarchies, and relationships to support customer data mastering processes
  • Develop match/merge rules, survivorship logic, and strategies for creating trusted golden records
  • Administer Profisee platform configurations, manage environments, and oversee release lifecycles
  • Monitor daily operations, address incidents, and manage defect resolution to ensure platform reliability
  • Collaborate with Data Governance teams to establish quality rules, workflows, and exception handling processes
  • Lead the development and maintenance of source and downstream integrations, ensuring data ingestion and publishing performance
  • Oversee platform upgrades, patches, and configuration changes to ensure scalability and availability
  • Provide technical leadership and mentorship to engineers and analysts, fostering a culture of excellence
  • Partner with cloud and infrastructure teams to optimize platform performance and scalability
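
The match/merge and survivorship responsibilities above can be sketched in plain Python. This is an illustrative toy, not Profisee's actual API (Profisee is configured through its platform, not code like this): records that share a normalized match key are merged, and a hypothetical "most recent non-empty value wins" survivorship rule produces the golden record. All record fields and values here are invented for illustration.

```python
# Illustrative sketch of match/merge + survivorship (not Profisee's API).

def normalize(name: str) -> str:
    """Crude match key: lowercase, keep only letters and digits."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

def golden_record(records):
    """Survivorship: the most recently updated non-empty value wins
    for each attribute (a common 'most recent wins' rule)."""
    merged = {}
    # Sort oldest first so newer records overwrite older values.
    for rec in sorted(records, key=lambda r: r["updated"]):
        for attr, value in rec.items():
            if attr != "updated" and value:
                merged[attr] = value
    return merged

sources = [
    {"name": "ACME Corp.", "phone": "", "city": "Mequon", "updated": "2024-01-10"},
    {"name": "Acme Corp", "phone": "555-0100", "city": "", "updated": "2024-06-01"},
]

# Both rows share the match key "acmecorp", so they merge into one golden record.
assert normalize(sources[0]["name"]) == normalize(sources[1]["name"])
print(golden_record(sources))
# → {'name': 'Acme Corp', 'city': 'Mequon', 'phone': '555-0100'}
```

Note how the empty phone in the older record and the empty city in the newer one are both backfilled, which is the point of survivorship logic: no single source holds the complete trusted record.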

Requirements:

  • 6-10 years of experience in Master Data Management, data engineering, or platform operations
  • Proven expertise in designing and operating customer mastering solutions, with hands-on experience using Profisee
  • Strong knowledge of entity modeling, data governance, and integration architecture principles
  • Proficiency in managing platform configurations, upgrades, and environment lifecycles
  • Experience with data quality rules, stewardship workflows, and exception handling processes
  • Advanced skills in API development, data pipelines, and integration engineering
  • Strong leadership abilities with experience mentoring teams and managing stakeholder relationships
  • Familiarity with cloud technologies and security controls in enterprise environments

What we offer:
  • Medical, vision, dental, and life and disability insurance
  • Eligibility to enroll in the company 401(k) plan

Additional Information:

Job Posted:
March 26, 2026

Similar Jobs for MDM Product Owner

Product Owner

Responsible for defining and managing the vision, roadmap, and delivery of the e...
Location:
Noida; Chennai; Bangalore, India
Salary:
Not provided
Sopra Steria
Expiration Date:
Until further notice
Requirements:
  • 11+ years of experience in Data Management, with at least 3+ years in MDM initiatives
  • Proven experience as a Product Owner / Business Analyst / Data Governance Lead in MDM or Data Governance projects
  • Strong understanding of MDM platforms (e.g. Semarchy), Data governance frameworks, Data Catalogues (e.g. Atlan), and Data quality processes
  • Hands-on experience with Agile delivery models and backlog tools (Jira, Confluence)
  • Knowledge of Python or SQL for data validation or analysis is an advantage
  • Strong expertise in Semarchy and Atlan
  • Good knowledge of Snowflake and SQL
  • In-depth knowledge of Data Governance, Data Quality, and MDM frameworks
  • Strategic mindset with the ability to align technical solutions to business outcomes
  • Deep understanding of enterprise data domains and integration landscapes.
Job Responsibility:
  • Define and drive the MDM roadmap to deliver harmonized, trusted Master data
  • Prioritize the backlog
  • Gather requirements
  • Align with governance and architecture standards
  • Ensure integration
  • Monitor adoption of best practices and guidelines.
What we offer:
  • Inclusive and respectful work environment
  • Open positions for people with disabilities
  • Full-time

Data Architect

We are looking for a Data Architect to design & develop Data Products that enable...
Location:
Noida, India
Salary:
Not provided
Sopra Steria
Expiration Date:
Until further notice
Requirements:
  • 10+ years of experience in data engineering/management roles
  • 2+ years in enterprise-level data architecture and data governance
  • Experience designing and validating data architecture standards to deliver reliable, scalable, and compliant data products
  • Design and document data models and architecture frameworks aligned with group standards
  • Excellent understanding of different Data Models (CDM, LDM, PDM)
  • Proven experience implementing solutions using Snowflake
  • Strong background in Data Governance, Data Quality, and Master data Management (MDM)
  • Strong expertise in Data Modelling (CDM, LDM, PDM)
  • Strong expertise in Snowflake
  • In-depth knowledge of Data Governance, Data Quality, and MDM frameworks
Job Responsibility:
  • Design & develop Data Products that enable scalability, governance, and innovation
  • Align business goals with technology, ensuring data quality, and enabling advanced analytics and AI/ML use cases
  • Drive the creation and maintenance of advanced data models for Data Products
  • Ensure consistency and reusability of models across business domains and systems
  • Implement MDM, Lineage tracking, and Data cataloguing
  • Ensure Data readiness and certification for Data Products
  • Design, build and optimize Data products using Snowflake
  • Collaborate with business stakeholders to align Data Product use cases with organizational objectives
  • Collaborate with Data Governance teams to ensure alignment between architecture and governance policies
  • Act as an advisor to Data Architects, Data Governance Officers, Data Product Owners and Managers
What we offer:
  • Commitment to fighting against all forms of discrimination
  • Inclusive and respectful work environment
  • All positions open to people with disabilities
  • Full-time

Sr Architect, Technical

The Sr Technical Architect at T-Mobile is responsible for Retail MDM design and ...
Location:
Bellevue; Overland Park; Frisco, United States
Salary:
Not provided
T-Mobile
Expiration Date:
Until further notice
Requirements:
  • Bachelor's Degree in Computer Science, Engineering, or IT
  • 4-7 years of progressive experience in software engineering/enterprise architecture/technology leadership
  • 4-7 years of experience in solutions design and enterprise architecture delivering IT solutions
  • 4-7 years of experience in enterprise applications, PLM, middle-tier services, database, storage, distributed computing, virtualization, and/or application technology
  • Knowledge and experience with enterprise architecture, MDM / EMM concepts, Ivanti, Jamf, Intune MDM
  • Experience with Agile Scrum methodologies and commonly used tools
  • Excellent written and verbal communication skills with ability to present complex technical information
  • Certified Scrum Master (CSM) or Scrum Product Owner Certification (CSPO) preferred
  • Legally authorized to work in the United States
  • At least 18 years of age.
Job Responsibility:
  • Design and implement Retail MDM
  • Maintain Retail MDM frameworks, standards, and libraries
  • Create scalable, extensible designs that perform well
  • Perform advanced troubleshooting of production and pre-production systems
  • Document and evolve coding standards and best practices
  • Coach engineers and conduct code reviews
  • Author frameworks and utilities to improve efficiency
  • Communicate solution designs clearly
  • Provide support during ideation and vision stages of projects
  • Follow established architecture standards and best practices.
What we offer:
  • Medical, dental, and vision insurance
  • Flexible spending account
  • 401(k)
  • Employee stock grants
  • Employee stock purchase plan
  • Paid time off
  • Paid parental and family leave
  • Family building benefits
  • Back-up care
  • Childcare subsidy
  • Full-time

Distribution and Marketing Data Product Manager

Data is one of Beazley’s greatest assets and this role is critical to supportin...
Location:
Multiple Locations, United States
Salary:
130000.00 - 150000.00 USD / Year
Social Value Portal Ltd
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in Business, Marketing, Data Science, Computer Science, Economics, Statistics or related field
  • Master’s degree preferred
  • Proven experience in data product management, marketing analytics or distribution strategy, preferably in insurance or financial services
  • Experience working with data, building data models, and sharing insights
  • Strategic and curious with the ability to design and develop data and insights that support our Distribution and Marketing team’s goals, planning, performance and incentives that drive growth
  • Understand the specialty insurance market, customer segmentation and distribution channels, with experience in North America, Lloyd’s, Retail and Wholesale markets preferred
  • Ability to lead workshops that help your stakeholders identify data needs and articulate their desired user experience, with the ability to build dashboards preferred
  • Strong organization and communication skills with the ability to direct work, document requirements and present demos
  • Advanced technical skills with the ability to dive into the data, identify anomalies, and provide high quality, trusted data
  • Understanding of Specialty Insurance principles and key drivers to create opportunities, loyalty and growth
Job Responsibility:
  • Partner with the global Distribution and Marketing team to understand, prioritize and develop data products and insights that support their business strategy
  • Build and own a roadmap to provide regular updates on delivery commitments for data products, insights, enhancements and queries
  • Manage stakeholder relationships to support the growth strategy for Beazley customers, brokers, teams and products
  • Produce insights and key data trends that highlight business performance, RoI, efficiencies and game-changing growth opportunities
  • Inspire the adoption and use of insights to drive decisions in investment and operations that improve efficiency and drive growth by leading demonstrations and hands on training sessions
  • Lead a team of Product Owners, Product Analysts, Business Analysts and a development team to deliver and maintain data products and insights, maintaining a backlog of work within Jira
  • Represent the business in data governance discussions, escalating issues as appropriate
  • Ensure that data product development considers policy, methodology and standards, and ensure these are adhered to during product development
  • Evaluate the performance of your data product portfolio against KPIs defined by the business and provide feedback on the value delivered
What we offer:
  • Attractive base compensation and discretionary performance related bonus
  • Competitively priced medical, dental and vision insurance
  • Company paid life, and short- and long-term disability insurance
  • 401(k) plan with 5% company match and immediate vesting
  • 22 days PTO (prorated for 1st calendar year of employment), 11 paid holidays per year, with the ability to flex the religious bank holidays to suit your religious beliefs
  • Up to $700 reimbursement for home office setup
  • Free in-office lunch, travel reimbursement for travel to office, and monthly lifestyle allowance
  • Up to 26 weeks of fully paid parental leave
  • Up to 2.5 days paid annually for volunteering at a charity of your choice
  • Flexible working policy, trusting our employees to do what works best for them and their teams
  • Full-time

Architect - Strategy and Governance

The Architect in Delivery is a hands-on technical lead for cloud solutions. You ...
Location:
United States
Salary:
117500.00 - 176300.00 USD / Year
3Cloud
Expiration Date:
Until further notice
Requirements:
  • 6+ years of experience delivering solutions in a primary domain such as application development, cloud platform engineering, DevOps, or data and analytics
  • 3+ years of experience in solution architecture and leading development or engineering teams
  • Proven experience leading hybrid and remote teams and maintaining alignment, quality, and predictable delivery across time zones
  • Expertise in at least one major public cloud platform, with Azure strongly preferred
  • Deep understanding of core concepts, languages, frameworks, and design patterns in your primary domain, with the ability to apply them to build robust, maintainable systems
  • Experience using AI-augmented development and integrating AI or machine learning capabilities into solutions through APIs, libraries, or cloud-based AI services
  • 7+ years of experience in data governance or data management, or 5+ years in a large consulting firm
  • 2+ years with Azure technologies
  • Proven experience delivering governance programs
  • Strong understanding of data platform and analytics ecosystems
Job Responsibility:
  • Own end-to-end architecture and scope for your projects, leading teams to design and deliver Azure solutions that meet agreed goals for value, security, scalability, reliability, performance, and cost
  • Serve as the technical lead and subject matter expert on your engagements, guiding consultants and engineers so implementation stays aligned with the architecture and standards
  • Apply sound engineering practices—clear interfaces, modular design, and testability—to produce reliable, maintainable solutions in your primary domain
  • Provide architectural guidance throughout implementation, including design and pull-request reviews, shaping cloud patterns and guardrails and offering practical options to unblock teams
  • Use AI tools to accelerate code, tests, documentation, and similar tasks, and share effective practices so they become repeatable gains for teams and clients
  • Oversee and implement at least one of: Microsoft Fabric, Power BI (workspace governance, multi-tenant strategy, semantic model and report ownership), Microsoft Purview (catalog, lineage, classification, glossary), Profisee (MDM domains, hierarchies, survivorship, workflows), Unity Catalog (permissions, data product governance)
  • Lead discovery and requirements sessions for your workstreams, clarifying scope, dependencies, and risks, and shaping a prioritized backlog with clear acceptance criteria tied to business outcomes
  • Explain complex technical topics in simple, direct language for both technical and non-technical audiences, and consistently connect technical decisions back to client goals
  • Build trust with managers and product owners, influencing priorities and roadmaps in ways that support the client’s long-term interests
  • Enable teams to deliver with stable velocity and low defect rates by providing clear architecture, interfaces, and automation across build, test, and deployment, and by designing for incremental delivery with a clear Definition of Done
What we offer:
  • Flexible work location with a virtual first approach to work
  • 401(k) with match up to 50% of your 6% contributions of eligible pay
  • Generous PTO providing a minimum of 15 days in addition to 9 paid company holidays and 2 floating personal days
  • Two medical plan options to allow you the choice to elect what works best for you
  • Option for vision and dental coverage
  • 100% employer paid coverage for life and disability insurance
  • Paid leave for birth parents and non-birth parents
  • Option for Healthcare FSA, HSA, and Dependent Care FSA
  • $67.00 monthly tech and home office allowance
  • Utilization and/or discretionary bonus eligibility based on role
  • Full-time

Azure Data Engineer

The Azure Data Engineer role involves designing, building, and maintaining ETL p...
Location:
Chennai, India
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • 5–8+ years of experience as a Data Engineer
  • Strong hands‑on expertise in Azure (Data Factory, Databricks, Data Lake Storage, SQL, Synapse preferred)
  • Proven ability to build production‑grade ETL/ELT pipelines supporting complex, multi‑regional business processes
  • Experience designing or implementing rules engines (Drools, ODM, or similar)
  • Strong SQL skills and experience with data modeling, data orchestration, and pipeline optimization
  • Experience working in Agile Scrum teams and collaborating across global regions (U.S. and India preferred)
  • Ability to partner closely with analysts and business stakeholders to translate rules into technical solutions
  • Excellent debugging, optimization, and engineering problem‑solving skills
  • Minimum Skills Required: SQL, Python, Azure Data Factory, Databricks, Azure Synapse
Job Responsibility:
  • Design, build, and maintain Azure‑based ETL pipelines (e.g., Data Factory, Databricks, Data Lake) to ingest, clean, transform, and aggregate compensation‑related datasets across multiple regions
  • Engineer upstream processes to produce 9–10 monthly aggregated output files (customer, revenue, product, sales rep, etc.), delivered 3× per month
  • Ensure repeatability, monitoring, orchestration, and error‑handling for all ingestion and transformation workflows
  • Contribute to the creation of a master stitched data file to replace Varicent’s current data‑assembly functions
  • Build, configure, and maintain a rules engine (ODM, Drools, or similar) to externalize business logic previously embedded in code
  • Translate rules and logic captured by analysts and business SMEs into scalable, testable engine components
  • Implement versioning, governance, and validation mechanisms for all logic used in compensation calculations
  • Ensure rule changes can be managed safely, reducing risk in high‑stakes compensation scenarios
  • Partner with data architects to implement the target‑state Azure data architecture for compensation analytics
  • Develop optimized, scalable physical data models aligned to business logic and downstream needs
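
The rules-engine responsibilities above (externalizing compensation logic from pipeline code) can be illustrated with a minimal sketch. This is not Drools or IBM ODM syntax; it only shows the underlying idea of ordered, declarative rules kept as data rather than hard-coded branches, so a rule set can be versioned and validated independently of the code that evaluates it. The regions and commission rates are hypothetical.

```python
# Illustrative sketch of an externalized rule set (not Drools/ODM syntax).
# Each rule pairs a condition with an outcome; rules are evaluated in order.

RULES_V2 = [  # hypothetical versioned rule set for commission rates
    {"when": lambda deal: deal["region"] == "US" and deal["revenue"] > 100_000,
     "rate": 0.08},
    {"when": lambda deal: deal["region"] == "US", "rate": 0.05},
    {"when": lambda deal: True, "rate": 0.03},  # default fallback rule
]

def commission(deal, rules):
    """First matching rule wins, mirroring ordered rule evaluation."""
    for rule in rules:
        if rule["when"](deal):
            return deal["revenue"] * rule["rate"]
    raise ValueError("no rule matched")

print(commission({"region": "US", "revenue": 200_000}, RULES_V2))  # → 16000.0
print(commission({"region": "IN", "revenue": 50_000}, RULES_V2))   # → 1500.0
```

Because the rules live in data rather than in branching code, swapping RULES_V2 for a corrected RULES_V3 is a governed change that can be diffed, reviewed, and validated before use, which is exactly the "manage rule changes safely" goal in the bullets above.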
