The mission of the Data & Analytics (D&A) team is to enable data users to easily discover, understand, and access trusted data products. A critical enabler of this mission is robust governance and automation within Databricks and Unity Catalog. The Senior Backend Engineer will design, build, and scale automation capabilities that enforce governance standards, improve data quality, and provide transparency into metadata, lineage, and usage. This role ensures that the Metadata Catalog UI and its supporting services are powered by trusted, well-governed, and observable data infrastructure.
Job Responsibilities:
Databricks & Unity Catalog Engineering: Build and maintain backend services leveraging Databricks (SQL, PySpark, Delta Lake, Jobs/Workflows)
Administer Unity Catalog including metadata, permissions, lineage, and tags
Integrate Unity Catalog APIs to surface data into the Metadata Catalog UI
Governance Automation: Develop automation scripts and pipelines to enforce access controls, tagging, and role-based policies
Implement governance workflows integrating with tools such as ServiceNow for request and approval processes
Automate compliance checks for regulatory and security requirements (IAM, PII handling, encryption)
Data Quality & Observability: Implement data quality frameworks (Great Expectations, Deequ, or equivalent) to validate datasets
Build monitoring and observability pipelines for logging, usage metrics, audit trails, and alerts
Ensure high system reliability and proactive issue detection
API Development & Integration: Design and implement APIs to integrate Databricks services with external platforms (ServiceNow, monitoring tools)
Build reusable automation utilities and integration frameworks for governance at scale
DevOps & CI/CD: Manage source control and CI/CD pipelines (GitHub, Azure DevOps, Jenkins) for backend workflows
Deploy scalable and secure backend services in cloud environments (Azure preferred)
Document, test, and industrialize automation solutions for production environments
Requirements:
Strong proficiency in Databricks (SQL, PySpark, Delta Lake, Jobs/Workflows)
Deep knowledge of Unity Catalog administration and APIs
Expertise in Python for automation scripts, API integrations, and data quality checks
Experience with governance frameworks (access control, tagging enforcement, lineage, compliance)
Solid foundation in security & compliance best practices (IAM, encryption, PII)
Experience with CI/CD and deployment pipelines (GitHub Actions, Azure DevOps, Jenkins)
Familiarity with monitoring/observability tools and building custom logging & alerting pipelines
Experience integrating with external systems (ServiceNow, monitoring platforms)
Experience with modern data quality frameworks (Great Expectations, Deequ, or equivalent)
Strong problem-solving and debugging skills in distributed systems
Clear communication and documentation skills to collaborate across GT and D&A teams
Bachelor's degree in Computer Science, Engineering, or related field OR equivalent professional experience
5+ years of backend engineering experience in data platforms
3+ years working with Databricks and/or Unity Catalog in enterprise environments
Demonstrated ability to design and deliver automation solutions for governance, quality, and compliance at scale