
Collibra Architect


Robert Half


Location:
United States, Seattle

Contract Type:
Not provided

Salary:
Not provided

Job Description:

Robert Half is searching for a Collibra Architect to serve as a partner in day-to-day data governance and operational activities, with a strong focus on metadata management, ingestion monitoring, and Collibra adoption. This position is 100% remote and is a 3-month contract opportunity with the possibility of extension.

Job Responsibility:

  • Configure the firm-wide metadata management systems (Collibra) to accurately represent the firm's critical data assets and supporting policies
  • Implement and administer a metadata management system to operationalize firm-wide data governance
  • Build and curate an organizationally accepted and effective business glossary and data dictionaries
  • Design and implement operating, asset, and metamodels in Collibra while ensuring usability and understandability of the system to a non-technical audience
  • Construct and implement workflows, reference data, and data lineage harvesting
  • Create and curate asset types, domain types, attributes, and relations to accurately describe priority data
  • Validate completeness of data lineage between the system of record and point of consumption for key data
  • Translate business and technical requirements into Collibra designs
  • Oversee the implementations and validate outcomes with users
  • Assess, recommend, and implement capabilities related to business and technical metadata, data lineage, data profiling, data quality improvement efforts, and issue/request management in Collibra
  • Design automated and bulk metadata uploads
  • Build dashboards within Collibra that enable stakeholders to quickly find and understand data
  • Provide thought leadership, frameworks, best practices, and subject matter expertise in Collibra to deliver effective data solutions to the firm
  • Develop and deliver training and collateral, including tip sheets, documentation, specifications, guides, etc.
  • Maintain necessary processes, controls, and procedures to ensure data accuracy and integrity within the catalog

Requirements:

  • 3-5 years of progressive experience in metadata management roles, required
  • 3 years of experience working with Collibra, required
  • Hands-on experience working with Collibra in a customized or configured environment
  • Proven experience defining business terms, tools, and workflows within Collibra and driving them through approval processes
  • Strong understanding of metadata management and data lineage concepts
  • Experience partnering closely with a manager and data governance council
  • Ability to translate governance concepts into clear, usable standards for business and technical teams

What we offer:
  • medical
  • vision
  • dental
  • life and disability insurance
  • eligibility to enroll in the company 401(k) plan

Additional Information:

Job Posted:
February 20, 2026

Employment Type:
Full-time
Work Type:
Remote work


Similar Jobs for Collibra Architect

Senior Technical Product Manager - Data Governance Technology Platform Lead

Technical Product Managers at U.S. Bank oversee the strategic product management...
Location:
United States, Cupertino; Atlanta; Hopkins
Salary:
148495.00 - 174700.00 USD / Year
U.S. Bank National Association
Expiration Date
February 27, 2026
Requirements:
  • 12–15+ years in data governance, enterprise data management, or platform architecture leadership roles
  • Expert-level knowledge of metadata management, data lineage, data quality, compliance, and enterprise governance frameworks
  • Hands-on proficiency with Collibra, Snowflake, Databricks Unity Catalog, and related governance ecosystem technologies
  • Proven experience architecting and governing multi-cloud environments (AWS + Azure) and hybrid/on-prem integrations
  • Experience designing API-driven integrations, automation workflows, and governance engineering patterns
  • Background in building governed data marketplaces, secure data exchange models, entitlement frameworks, and data contracts
  • Experience defining governance standards for data products and enterprise data sharing ecosystems
  • Experience with cost optimization, budgeting, and financial governance for platform ecosystems
  • Proven product leadership, including vision setting, roadmap development, and backlog management
  • Strong stakeholder management skills with the ability to influence across technical, business, and executive domains
Job Responsibility:
  • Architect and maintain end-to-end governance frameworks for data lineage, data quality, metadata management, access governance, and compliance across AWS, Azure, and on-prem environments
  • Establish scalable patterns for metadata ingestion, lineage orchestration, and policy enforcement to ensure consistent and automated governance
  • Unify Collibra, Snowflake, Databricks Unity Catalog, and adjacent technologies into a cohesive governance fabric that supports interoperability and consistent controls
  • Define integration standards and interoperability patterns to ensure unified governance across platforms, data domains, and cloud environments
  • Define governance operating models for data products, shared datasets, and marketplace offerings to support standardized, governed data consumption
  • Enable secure, compliant, and frictionless data sharing using entitlement frameworks, data contracts, and standardized exchange patterns
  • Partner with SMEs across lineage, data quality, privacy, risk, and engineering to embed governance capabilities into workflows and development lifecycles
  • Lead governance evangelism to promote adoption, improve usability, and drive consistent enterprise-wide execution
  • Develop API-driven patterns and automation for metadata ingestion, lineage capture, quality metrics, and governance workflows
  • Champion automation-first practices to reduce operational overhead, improve governance scalability, and ensure consistent control enforcement
What we offer:
  • Healthcare (medical, dental, vision)
  • Basic term and optional term life insurance
  • Short-term and long-term disability
  • Pregnancy disability and parental leave
  • 401(k) and employer-funded retirement plan
  • Paid vacation (from two to five weeks depending on salary grade and tenure)
  • Up to 11 paid holiday opportunities
  • Adoption assistance
  • Sick and Safe Leave accruals of one hour for every 30 hours worked, up to 80 hours per calendar year unless otherwise provided by law

Senior Database Architect

Zachary Piper Solutions is seeking a Senior Database Architect to join a federal...
Location:
United States, DC or Springfield
Salary:
155000.00 - 165000.00 USD / Year
Piper Companies
Expiration Date
Until further notice
Requirements:
  • Bachelor’s degree in Computer Science or related field
  • 10+ years in data governance with Collibra expertise
  • Strong knowledge of metadata, lineage, and MDM
  • Hands-on Collibra integration with modern data platforms
  • Familiarity with data privacy and regulatory standards
  • Experience with automation and AI/ML for governance
Job Responsibility:
  • Implement data governance frameworks using Collibra
  • Maintain metadata, lineage, and MDM practices
  • Integrate Collibra with platforms (Snowflake, Databricks, AWS)
  • Ensure compliance with GDPR, CCPA, and federal standards
  • Train users on Collibra and self-service data discovery
  • Automate governance workflows and apply AI/ML for data quality
What we offer:
  • Medical, Dental, Vision
  • 401k, PTO, holidays, and sick leave as required by law

Senior Data Engineer (Graph)

As a Senior Data Engineer, you will play a pivotal role in transforming data int...
Location:
United States, San Francisco
Salary:
90.00 - 93.00 USD / Hour
Software Resources
Expiration Date
Until further notice
Requirements:
  • 5+ years of data engineering experience developing data pipelines
  • Understanding of core graph database concepts and their advantages over a traditional RDBMS for modeling data, including typical use cases
  • Proficiency in at least one major programming language (e.g., Python)
  • ETL development for graph databases (extracting from or loading into a graph database)
  • Hands-on production experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines
  • Experience with Neo4j and Snowflake
  • Strong algorithmic problem-solving expertise
  • Comfortable working in a fast-paced and highly collaborative environment
  • Excellent written and verbal communication
  • Willingness and ability to learn and pick up new skill sets
Job Responsibility:
  • Create and maintain Data Platform pipelines supporting structured, graph, and unstructured datasets
  • Architect and implement graph database models and schema designs, and build robust, scalable solutions
  • Demonstrate fluency with data engineering concepts and platforms (AWS: S3, Lambda, SNS, SQS… Iceberg), data platforms (Snowflake), configuration (data contracts), transformation, orchestration (dbt, Airflow), and data quality (Great Expectations, Anomalo, Soda, Collibra)
  • Be an active participant and advocate of agile/scrum ceremonies to collaborate and improve processes for our team
  • Collaborate with product managers, architects, and other engineers to drive the success of the Core Data Platform
  • Document standards and best practices for pipeline configurations, naming conventions, etc.
  • Ensure high operational efficiency and quality of the Core Data Platform datasets so that our solutions meet SLAs and deliver reliability and accuracy to all our stakeholders (Engineering, Data Science, Operations, and Analytics teams)
  • Engage with and understand our customers, forming relationships that allow us to understand and prioritize both innovative new offerings and incremental technology improvements
What we offer:
  • medical, dental, and vision coverage
  • a 401(k) with company match
  • short-term disability
  • life insurance with AD&D

Senior Software Engineer

Wells Fargo is seeking a Senior Software Engineer. This is for Data Engineering ...
Location:
India, Hyderabad
Salary:
Not provided
Wells Fargo
Expiration Date
February 23, 2026
Requirements:
  • 4+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
  • Hands-on experience with Python, SQL, and bash scripting for automation
  • Strong experience building big data pipelines using Apache Spark, Hive, Hadoop
  • Experience with Autosys/Airflow or similar orchestration tools
  • Working knowledge of REST APIs, Object Storage, Dremio, and CI/CD pipelines
  • Strong troubleshooting and problem-solving capabilities
  • Solid foundation in data modeling (conceptual/logical/physical) and database design
  • Cloud-native engineering experience: serverless, managed Spark, event-driven architectures
  • Familiarity with containerization (Docker, K8s) and workflow operators
Job Responsibility:
  • Lead moderately complex initiatives and deliverables within technical domain environments
  • Contribute to large scale planning of strategies
  • Design, code, test, debug, and document for projects and programs associated with technology domain, including upgrades and deployments
  • Review moderately complex technical challenges that require an in-depth evaluation of technologies and procedures
  • Resolve moderately complex issues and lead a team to meet existing or potential new clients' needs while leveraging a solid understanding of the function, policies, procedures, or compliance requirements
  • Collaborate and consult with peers, colleagues, and mid-level managers to resolve technical challenges and achieve goals
  • Lead projects and act as an escalation point, provide guidance and direction to less experienced staff
  • Deliver high-quality engineering outcomes during Data Center exit migrations and DPC onboarding, ensuring validations, automation, and production readiness
  • Collaborate with cross-functional teams to build scalable, high-performance data solutions using Python, SQL, Spark, Iceberg, Dremio, and Autosys
  • Design, build, test, deploy, and maintain large-scale structured and unstructured data pipelines using Python, SQL, Apache Spark, and modern data lake/lakehouse technologies

Data Architect

The IT company Andersen invites a Data Architect in Abu Dhabi to join its team f...
Location:
United Arab Emirates, Abu Dhabi
Salary:
Not provided
Andersen
Expiration Date
Until further notice
Requirements:
  • 8+ years of experience in data architecture, data engineering, or solution architecture roles
  • Hands-on expertise with Oracle technologies
  • Proven experience designing scalable data platforms in hybrid (cloud + on-prem) environments
  • Strong understanding of data governance, security, and compliance
  • Experience with AI/ML data preparation and orchestration pipelines
  • Excellent communication and documentation skills
  • Bachelor's or Master's degree in Computer Science, Data Science, or related field
  • English level: Upper-Intermediate or above
Job Responsibility:
  • Designing and evolving data architecture across ingestion, transformation, storage, and consumption layers
  • Defining data modeling strategies (conceptual, logical, physical) for structured and unstructured data
  • Architecting data lakes and lakehouses (Delta Lake, Iceberg, Hudi) with bronze-silver-gold layering
  • Leading implementation of ETL/ELT pipelines using Airflow, dbt, Talend, Informatica
  • Ensuring data governance, lineage, and metadata management using tools like Apache Atlas, DataHub, Collibra
  • Collaborating with AI/ML teams to prepare data for model training, including feature engineering and data tagging
  • Integrating streaming solutions (Kafka, Flink, Spark Streaming) for real-time data processing
  • Implementing security frameworks: RBAC, row-level security, encryption, GDPR/HIPAA compliance
  • Supporting BI and analytics platforms (Tableau, Oracle Analytics) with well-structured data models
  • Guiding CI/CD for data pipelines, containerization (Docker), and orchestration (Kubernetes)
What we offer:
  • Experience in teamwork with leaders in FinTech, Healthcare, Retail, Telecom, and others
  • The opportunity to change the project and/or develop expertise in an interesting business domain
  • Guarantee of professional, financial, and career growth
  • The opportunity to earn up to an additional 1,000 USD per month by participating in the company's activities, depending on level of expertise; this amount is included in the annual bonus
  • Access to the corporate training portal
  • Bright corporate life (parties / pizza days / PlayStation / fruits / coffee / snacks / movies)
  • Certification compensation (AWS, PMP, etc)
  • Referral program
  • English courses
  • Private health insurance and compensation for sports activities

Data Catalogue Engineer

We are looking for one Data Catalogue Engineer responsible for maintaining, enha...
Location:
India, Noida
Salary:
Not provided
Sopra Steria
Expiration Date
Until further notice
Requirements:
  • 10+ years of experience in Data cataloguing, Metadata management and Data stewardship
  • Administer and maintain the Data catalogue platform, managing user roles, permissions, and configurations
  • Schedule, monitor, and troubleshoot metadata ingestion
  • Experience automating metadata extraction, transformation, and loading into the catalogue using Python
  • Coordinate with platform and infrastructure teams for upgrades, enhancements, and patches
  • Strong background in Data Governance, Data Quality, and Master data Management (MDM)
  • Strong expertise in Data Catalogue (e.g. Atlan, Collibra) and Data Stewardship
  • Strong expertise in Snowflake/BigQuery/dbt
  • In-depth knowledge of Data Governance, Data Quality, and MDM frameworks
  • Experience working with Python
Job Responsibility:
  • Maintain, enhance, and support the organization's Data catalogue platform (e.g. Atlan, Collibra)
  • Support Metadata management and Data stewardship
  • Ensure data assets from systems such as Snowflake, BigQuery, and dbt are accurately catalogued, governed, and easily discoverable
  • Use Python to develop automation scripts, templates, and reusable frameworks that streamline metadata ingestion and catalogue operations
  • Act as an advisor to Data Architects, Data Governance Officers, Data Product Owners and Managers
What we offer:
  • Commitment to fighting against all forms of discrimination
  • Inclusive and respectful work environment
  • All positions open to people with disabilities

Senior Data Consultant

Ivy Partners is a Swiss consulting firm that assists companies in their strategi...
Location:
Portugal, Lisboa
Salary:
Not provided
IVY Partners
Expiration Date
Until further notice
Requirements:
  • Between 6 and 10 years of professional experience
  • Strong understanding of and experience in data architecture across diverse domains such as finance, purchasing, etc.
  • Deep expertise in managing large datasets
  • Proficiency with tools like BPMN 2.0 (preferably using Draw.io or LucidChart)
  • Skilled in building and interpreting conceptual and logical data models
  • Ability to effectively document architecture and business processes
  • Strong ability to perform gap analysis on existing datasets versus group standards
  • Ability to develop and manage a detailed remediation plan
  • Both English and Portuguese are required
Job Responsibility:
  • Collaborate with diverse stakeholder communities, including business experts, data engineers, data scientists, and technical architects, to leverage data analytics, AI, and Master Data Management
  • Structure data platforms optimally for various use cases, ensuring alignment with business needs and governance standards
  • Develop and standardize conceptual data models, and document business processes in BPMN format to support data consumption
  • Identify and define critical business processes and associated data stewards or process owners
  • Assess existing datasets against defined data architecture standards, propose remediation plans, and manage the redesign process to ensure dataset standardization and reusability
  • Ensure adherence to data governance frameworks, including data risk management policies, data classification, access policies, and retention protocols
  • Document all relevant details in data catalogs and governance tools like Collibra
What we offer:
  • A nurturing environment where everyone is valued
  • Training and opportunities for advancement both in Switzerland and internationally
  • A climate of trust based on transparency, professionalism, and commitment
  • A culture that encourages innovation
  • A collective spirit at the heart of our actions
  • A drive to generate a positive impact

Data Architect

We are looking for Data Architect to design & develop Data Products that enables...
Location:
India, Noida
Salary:
Not provided
Sopra Steria
Expiration Date
Until further notice
Requirements:
  • 10+ years of experience in data engineering/management roles
  • 2+ years in enterprise-level data architecture and data governance
  • Experience designing and validating data architecture standards to deliver reliable, scalable, and compliant Data Products
  • Ability to design and document data models and architecture frameworks aligned with group standards
  • Excellent understanding of different Data Models (CDM, LDM, PDM)
  • Proven experience implementing solutions using Snowflake
  • Strong background in Data Governance, Data Quality, and Master data Management (MDM)
  • Strong expertise in Data Modelling (CDM, LDM, PDM)
  • Strong expertise in Snowflake
  • In-depth knowledge of Data Governance, Data Quality, and MDM frameworks
Job Responsibility:
  • Design & develop Data Products that enable scalability, governance, and innovation
  • Align business goals with technology, ensuring data quality, and enabling advanced analytics and AI/ML use cases
  • Drive the creation and maintenance of advanced data models for Data Products
  • Ensure consistency and reusability of models across business domains and systems
  • Implement MDM, Lineage tracking, and Data cataloguing
  • Ensure Data readiness and certification for Data Products
  • Design, build and optimize Data products using Snowflake
  • Collaborate with business stakeholders to align Data Product use cases with organizational objectives
  • Collaborate with Data Governance teams to ensure alignment between architecture and governance policies
  • Act as an advisor to Data Architects, Data Governance Officers, Data Product Owners and Managers
What we offer:
  • Commitment to fighting against all forms of discrimination
  • Inclusive and respectful work environment
  • All positions open to people with disabilities