As Snap continues to scale, clients are investing heavily in modern data platforms built on Databricks. We’re supporting increasingly complex programmes, from enterprise migrations to capability-wide Databricks enablement, and we’re expanding our Databricks practice to meet this demand.

We’re looking for a Senior Data Platform Consultant with strong Databricks expertise, collaborative leadership skills and the ability to shape client outcomes. You’ll lead the design and delivery of complex Databricks-based analytics platform solutions while helping shape Snap’s growing Databricks capability. This role is ideal for someone who enjoys shaping platform strategy, leading team delivery, and enabling others to grow their technical skills.

You’ll join a supportive, collaborative team where everyone’s perspective is valued. If you enjoy helping clients unlock the value of their data through modern engineering and thoughtful leadership, we’d love to hear from you.
Job Responsibilities:
Leading the architecture and delivery of large-scale Databricks Lakehouse solutions
Driving Databricks migration and modernisation projects
Advising clients on Databricks best practices, capability uplift and roadmap development
Guiding cross-functional delivery teams and supporting capability growth across engineering and analytics
Establishing scalable patterns, reusable assets and improvements to Databricks delivery approaches
Coaching consultants and team leads on Databricks engineering, governance, optimisation and platform design
Contributing to Snap’s internal Databricks initiatives, thought leadership and go-to-market strategy
Requirements:
Deep, hands-on experience with Databricks, including:
- Lakeflow Jobs and Lakeflow Declarative Pipelines
- Auto Loader and similar Spark Structured Streaming patterns
- PySpark and Spark SQL
- Implementing CI/CD pipelines using Databricks Asset Bundles
- Unity Catalog and data governance, including attribute-based access controls
Strong knowledge of Python and SQL
Experience deploying and managing Databricks workspaces with IaC tools like Terraform
Strong experience in at least one cloud platform (Azure, GCP, AWS)
Experience with enterprise data extraction tools such as Fivetran, HVR and Azure Data Factory, and their integration with a Databricks platform
Experience leading large, complex data platform deliveries
Strong architectural thinking across ingestion, transformation, modelling, orchestration and governance
Clear communication skills and confidence working with a range of stakeholders
A practical, outcomes-focused approach that connects technical delivery to business value