The Snowflake Lead will be responsible for leading the architecture and design of Snowflake-based data platforms on Azure. The role requires a minimum of 8 years of experience in data engineering, with a strong focus on Snowflake and Azure services. Candidates should possess excellent leadership and communication skills, as well as a deep understanding of data modeling and ETL processes. This position offers the opportunity to work in a dynamic and innovative environment, contributing to significant data migration projects.
Job Responsibilities:
Lead the end-to-end architecture and design of Snowflake-based data platforms on Azure, including integration with Azure services (ADF, Synapse pipelines, Azure Functions, Key Vault, ADLS, etc.)
Define and implement data modeling standards (star/snowflake schema, data vault, dimensional modeling) tailored for analytics, BI, and downstream data products
Design secure, scalable, and cost-efficient Snowflake environments, including databases, schemas, roles, resource monitors, and virtual warehouses
Lead migration strategy and roadmap for moving data from legacy/on-prem systems to Snowflake on Azure
Work with stakeholders to assess current state (source systems, ETL, reporting, data quality) and design target-state architecture on Snowflake
Define migration waves/phases, including data profiling, schema conversion, historical load, incremental load, and cutover strategy
Oversee and implement data ingestion pipelines from various sources (databases, flat files, APIs, streaming) into ADLS / Landing zones and then into Snowflake using tools like Azure Data Factory, Synapse pipelines, or Databricks, plus CDC where applicable
Manage data reconciliation and validation to ensure completeness, accuracy, and performance parity (or improvement) compared to legacy platforms
Lead a team of data engineers / ETL developers delivering Snowflake-based solutions and migration workstreams
Define and enforce coding standards, code review practices, and CI/CD pipelines for Snowflake objects (SQL, stored procedures, views, tasks, streams)
Design & build ELT/ETL patterns (staging → raw → curated → semantic layers), using tools such as dbt, ADF, Synapse, Databricks, or other orchestration tools
Implement automated testing frameworks (unit tests, regression tests, data quality checks) and monitoring against agreed SLAs
Monitor query performance and optimize Snowflake workloads using query profiling, clustering, partitioning, and warehouse sizing strategies
Implement resource monitors, auto-scaling, and auto-suspend policies to optimize compute usage and manage Snowflake consumption costs
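The reconciliation and validation responsibility above can be illustrated with a minimal sketch. This is a hypothetical example, not part of the role description: it compares per-table row counts and checksums captured from a legacy warehouse and from Snowflake and flags mismatches. How the metrics are collected (e.g. via the Snowflake Connector for Python) is out of scope here, and all names are illustrative.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class TableMetrics:
    """Per-table metrics captured on both the legacy source and Snowflake."""
    row_count: int
    checksum: str  # e.g. an aggregated hash over business-key columns


def reconcile(legacy: dict[str, TableMetrics],
              snowflake: dict[str, TableMetrics]) -> dict[str, str]:
    """Return a per-table status: OK, ROW_COUNT_MISMATCH,
    CHECKSUM_MISMATCH, or MISSING_IN_TARGET."""
    report = {}
    for table, src in legacy.items():
        tgt = snowflake.get(table)
        if tgt is None:
            report[table] = "MISSING_IN_TARGET"
        elif src.row_count != tgt.row_count:
            report[table] = "ROW_COUNT_MISMATCH"
        elif src.checksum != tgt.checksum:
            report[table] = "CHECKSUM_MISMATCH"
        else:
            report[table] = "OK"
    return report
```

In practice the checksum would be computed in SQL on both platforms (for example, a hash aggregated over key columns) so that parity can be asserted without moving the data.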
Requirements:
8+ years overall experience in Data Engineering / Data Warehousing / Analytics
5+ years hands-on experience with Snowflake in production environments
Proven experience leading at least one large end-to-end migration from on-prem / legacy DWH to Snowflake on Azure (Netezza, Yellowbrick, Oracle, SQL Server, etc.)
Strong experience with Azure cloud services: Azure Data Factory, Data Lake Storage (ADLS), Azure Databricks and/or Synapse, Key Vault, Azure DevOps or GitHub
Strong expertise in SQL (complex queries, window functions, performance tuning)
Deep understanding of Snowflake features: Virtual warehouses, micro-partitioning, clustering, tasks, streams, time travel, zero-copy cloning, external tables, Snowpipe, etc.
Experience with ELT/ETL tools and frameworks: Azure Data Factory, Databricks, Synapse, dbt, or similar
Strong data modeling skills (dimensional modeling and 3NF; data vault is a plus)
Hands-on experience setting up CI/CD pipelines for data and Snowflake objects (Azure DevOps, GitHub Actions, etc.)
Strong leadership and team management capabilities, including the ability to lead mixed onshore/offshore teams
Excellent communication skills, able to explain complex technical topics to non-technical stakeholders
Strong problem-solving, analytical, and decision-making skills
Nice to have:
Familiarity with Python, Spark, or Scala, especially for large-scale transformations or Databricks-based workloads
Experience working with BI tools (Power BI, Tableau, Looker, etc.) against Snowflake
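The Python familiarity mentioned above connects to the layered ELT pattern listed under the responsibilities (staging → raw → curated). A minimal, hypothetical sketch of one curation step follows; in a real platform this logic would typically live in dbt models, Databricks notebooks, or Snowflake tasks rather than plain Python, and the column names and rules here are assumptions for illustration only.

```python
def curate(raw_rows: list[dict]) -> list[dict]:
    """Hypothetical raw → curated step: standardise types and drop
    rows that fail a basic data quality gate."""
    curated = []
    for row in raw_rows:
        # Data quality gate: require a non-empty business key.
        if not row.get("customer_id"):
            continue
        curated.append({
            "customer_id": str(row["customer_id"]).strip(),
            "country": (row.get("country") or "UNKNOWN").upper(),
            "revenue": round(float(row.get("revenue") or 0.0), 2),
        })
    return curated
```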