
Snowflake Lead


NTT DATA


Location:
India, Remote


Contract Type:
Not provided


Salary:
Not provided

Job Description:

The Snowflake Lead will be responsible for leading the architecture and design of Snowflake-based data platforms on Azure. The role requires a minimum of 8 years of experience in data engineering, with a strong focus on Snowflake and Azure services. Candidates should possess excellent leadership and communication skills, as well as a deep understanding of data modeling and ETL processes. This position offers the opportunity to work in a dynamic and innovative environment, contributing to significant data migration projects.

Job Responsibility:

  • Lead the end-to-end architecture and design of Snowflake-based data platforms on Azure, including integration with Azure services (ADF, Synapse pipelines, Azure Functions, Key Vault, ADLS, etc.)
  • Define and implement data modeling standards (star/snowflake schema, data vault, dimensional modeling) tailored for analytics, BI, and downstream data products
  • Design secure, scalable, and cost-efficient Snowflake environments, including warehouses, databases, schemas, roles, resource monitors, and virtual warehouses
  • Lead migration strategy and roadmap for moving data from legacy/on-prem systems to Snowflake on Azure
  • Work with stakeholders to assess current state (source systems, ETL, reporting, data quality) and design target-state architecture on Snowflake
  • Define migration waves/phases, including data profiling, schema conversion, historical load, incremental load, and cutover strategy
  • Oversee and implement data ingestion pipelines from various sources (databases, flat files, APIs, streaming) into ADLS / Landing zones and then into Snowflake using tools like Azure Data Factory, Synapse pipelines, or Databricks, plus CDC where applicable
  • Manage data reconciliation and validation to ensure completeness, accuracy, and performance parity (or improvement) compared to legacy platforms
  • Lead a team of data engineers / ETL developers delivering Snowflake-based solutions and migration workstreams
  • Define and enforce coding standards, code review practices, and CI/CD pipelines for Snowflake objects (SQL, stored procedures, views, tasks, streams)
  • Design & build ELT/ETL patterns (staging → raw → curated → semantic layers), using tools such as dbt, ADF, Synapse, Databricks, or other orchestration tools
  • Implement automated testing frameworks (unit tests, regression tests, data quality checks) and monitoring (SLAs)
  • Monitor query performance and optimize Snowflake workloads using query profiling, clustering, partitioning, and warehouse sizing strategies
  • Implement resource monitors, auto-scaling, and auto-suspend policies to optimize compute usage and manage Snowflake consumption costs
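The cost-control items in the last two bullets are typically expressed directly in Snowflake DDL. A minimal sketch, assuming illustrative object names (analytics_monitor, analytics_wh) and quota values:

```sql
-- Hypothetical sketch: cap monthly credits and auto-suspend an idle warehouse.
-- The monitor name, warehouse name, and quota are illustrative assumptions.
CREATE RESOURCE MONITOR analytics_monitor
  WITH CREDIT_QUOTA = 100                -- monthly credit cap (assumed value)
  TRIGGERS ON 90 PERCENT DO NOTIFY       -- warn at 90% of quota
           ON 100 PERCENT DO SUSPEND;    -- stop new queries at 100%

CREATE WAREHOUSE analytics_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  AUTO_SUSPEND = 60                      -- suspend after 60 seconds idle
  AUTO_RESUME = TRUE
  RESOURCE_MONITOR = analytics_monitor;  -- attach the monitor above
```

Tuning AUTO_SUSPEND against query cadence is usually the single biggest lever on compute spend, since suspended warehouses accrue no credits.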

Requirements:

  • 8+ years overall experience in Data Engineering / Data Warehousing / Analytics
  • 5+ years hands-on experience with Snowflake in production environments
  • Proven experience leading at least one large end-to-end migration from on-prem / legacy DWH to Snowflake on Azure (Netezza, Yellowbrick, Oracle, SQL Server, etc.)
  • Strong experience with Azure cloud services: Azure Data Factory, Data Lake Storage (ADLS), Azure Databricks and/or Synapse, Key Vault, Azure DevOps or GitHub
  • Strong expertise in SQL (complex queries, window functions, performance tuning)
  • Deep understanding of Snowflake features: Virtual warehouses, micro-partitioning, clustering, tasks, streams, time travel, zero-copy cloning, external tables, Snowpipe, etc.
  • Experience with ELT/ETL tools and frameworks: Azure Data Factory, Databricks, Synapse, dbt, or similar
  • Strong data modeling skills (dimensional modeling, 3NF, data vault is a plus)
  • Hands-on experience setting up CI/CD pipelines for data and Snowflake objects (Azure DevOps, GitHub Actions, etc.)
  • Strong leadership and team management capabilities, with the ability to lead mixed onshore/offshore teams
  • Excellent communication skills, able to explain complex technical topics to non-technical stakeholders
  • Strong problem-solving, analytical, and decision-making skills
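Several of the Snowflake features listed above (streams, tasks, incremental loads) compose into a common change-data-capture pattern. A hedged sketch, with illustrative table, warehouse, and column names:

```sql
-- Hypothetical incremental-load pattern: a stream captures changes on a raw
-- table and a scheduled task merges them into the curated layer.
-- All object names (raw.orders, curated.orders, etl_wh) are assumptions.
CREATE OR REPLACE STREAM orders_stream ON TABLE raw.orders;

CREATE OR REPLACE TASK merge_orders
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')  -- skip runs with no new changes
AS
  MERGE INTO curated.orders AS t
  USING orders_stream AS s
    ON t.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET t.amount = s.amount
  WHEN NOT MATCHED THEN INSERT (order_id, amount)
                        VALUES (s.order_id, s.amount);

ALTER TASK merge_orders RESUME;  -- tasks are created in a suspended state
```

Consuming the stream inside the MERGE advances its offset, so each task run processes only rows changed since the previous run.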

Nice to have:

  • Familiarity with Python / Spark / Scala is a plus, especially for large-scale transformations or Databricks-based workloads
  • Experience working with BI tools (Power BI, Tableau, Looker, etc.) against Snowflake

Additional Information:

Job Posted:
January 26, 2026

Work Type:
Remote work



Similar Jobs for Snowflake Lead

Snowflake Technical Lead

We are seeking a skilled Data Transformation Specialist to join our team and pla...
Location:
India, Noida
Salary:
Not provided
Sopra Steria
Expiration Date
Until further notice
Requirements:
  • Expertise in SQL – Ability to write and optimize complex queries
  • Hands-on experience in Snowflake – Working knowledge of Snowflake architecture, storage, and processing
  • Proficiency in Python – Ability to write scripts for data transformation and automation
  • Familiarity with data modeling and transformation techniques
  • Experience in ETL/ELT processes and working with structured and semi-structured data
  • Strong problem-solving skills and attention to detail
  • Ability to work independently and manage multiple priorities
Job Responsibility:
  • Develop and implement data transformation solutions based on the designs provided by RSMB
  • Write, optimize, and maintain complex SQL queries to ensure efficient data processing
  • Work within the Snowflake environment, ensuring best practices for data storage, processing, and retrieval
  • Collaborate with stakeholders, including data architects and analysts, to ensure smooth execution of data solutions
  • Troubleshoot and resolve performance issues in SQL queries and Snowflake environments
  • Leverage Python for scripting and automation of data processing tasks
  • Ensure data integrity, quality, and compliance with industry standards
What we offer:
  • Inclusive and respectful work environment
  • Open to people with disabilities
  • Full-time

Snowflake - Senior Technical Lead

Position: Snowflake - Senior Technical Lead; Experience: 8-11 years; Education: ...
Location:
India, Noida; Bangalore
Salary:
Not provided
Sopra Steria
Expiration Date
Until further notice
Requirements:
  • Snowflake
  • Snowpipe
  • SQL
  • Data Modelling
  • DV 2.0
  • Data Quality
  • AWS
  • Snowflake Security
Job Responsibility:
  • Define data partitioning, clustering, and micro-partition strategies to optimize performance and cost
  • Lead the implementation of ETL/ELT processes using Snowflake features (Streams, Tasks, Snowpipe)
  • Automate schema migrations, deployments, and pipeline orchestration
  • Monitor query performance and resource utilization; tune warehouses, caching, and clustering
  • Implement workload isolation for concurrent workloads
  • Define and enforce role-based access control (RBAC), masking policies, and object tagging
  • Ensure data encryption, compliance, and audit logging are correctly configured
  • Establish best practices for dimensional modeling and data vault architecture
  • Create and maintain data dictionaries, lineage documentation, and governance standards
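The RBAC and masking responsibilities above usually reduce to policy DDL. A minimal sketch, assuming hypothetical role, table, and column names:

```sql
-- Hypothetical column-masking setup: only a designated role sees raw values.
-- The role (ANALYST_PII), table, and column names are illustrative assumptions.
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('ANALYST_PII') THEN val  -- trusted role: raw value
    ELSE '***MASKED***'                              -- everyone else: redacted
  END;

ALTER TABLE curated.customers
  MODIFY COLUMN email SET MASKING POLICY email_mask;
```

Because the policy attaches to the column rather than to individual queries, every consumer (BI tools, ad-hoc SQL, shares) gets the same enforcement.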
What we offer:
  • Inclusive work environment
  • Open positions for people with disabilities

Power BI Module Lead

Power BI Module Lead role requiring a strong blend of technical skills, business...
Location:
India, Noida
Salary:
Not provided
Sopra Steria
Expiration Date
Until further notice
Requirements:
  • Highly skilled Data Analyst / BI Developer with proven expertise in Power BI, SQL, Snowflake, and data modeling
  • Strong knowledge of Power BI, including DAX, Power Query (M), and building enterprise-grade dashboards
  • Solid understanding of data modeling techniques (3NF, Star Schema, Snowflake Schema, Data Vault 2.0)
  • Experience in ETL development to extract, transform, and load data from multiple sources into Snowflake
  • Write efficient SQL queries to handle large volumes of data with focus on Snowflake
  • Apply UI/UX design principles to Power BI dashboards
  • Experience with Data connectivity with Multiple Sources - Oracle BI, SQL, API's, Dataflows, Data lakes, Data Warehouse
  • Strong Power Query (M language) scripting for transformations
  • Implement performance tuning and optimization techniques for Power BI reports
  • Prepare detailed documentation including data dictionaries, process workflows, and user guides
Job Responsibility:
  • Collaborate with business stakeholders to understand reporting needs, ensuring alignment on report design, usability, and functionality
  • Design, develop, and deliver high-quality BI solutions
  • Balance technical detail with business needs, ensuring insights are actionable and aligned with goals
  • Mentor and lead teams in BI best practices
What we offer:
  • Commitment to fighting against all forms of discrimination
  • Inclusive and respectful work environment
  • All positions open to people with disabilities
  • Full-time

ETL Testing Lead

We are looking for an ETL Testing Lead with strong expertise in SQL and Data War...
Location:
India, Noida; Bengaluru
Salary:
Not provided
Sopra Steria
Expiration Date
Until further notice
Requirements:
  • 8-12 years of experience
  • BE/B.Tech/MS/M.Tech/MCA degree
  • Strong hands-on expertise in ETL testing, SQL, and Data Warehousing
  • Good knowledge of Snowflake, Python, and Data Governance/MDM concepts
  • Experience in data validation automation and reporting
  • Excellent communication and stakeholder management skills
Job Responsibility:
  • Lead ETL and data validation testing across multiple systems
  • Design and implement automated data quality and reconciliation frameworks
  • Define and manage data quality rules, metrics, and thresholds
  • Collaborate with Data Architects, Governance, and Product teams to resolve data issues
  • Monitor data quality KPIs and ensure compliance with standards
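Reconciliation frameworks like the ones described above often start with simple count and control-total comparisons between source and target. A hedged SQL sketch, with illustrative schema and table names:

```sql
-- Hypothetical reconciliation check: compare row counts and a control total
-- between a legacy source and the migrated target.
-- Schema/table names (legacy.orders, curated.orders) are assumptions.
SELECT
  'orders'                        AS table_name,
  src.row_cnt                     AS source_rows,
  tgt.row_cnt                     AS target_rows,
  src.row_cnt - tgt.row_cnt       AS row_diff,       -- expect 0 after full load
  src.amount_sum - tgt.amount_sum AS amount_diff     -- expect 0 after full load
FROM (SELECT COUNT(*) AS row_cnt, SUM(amount) AS amount_sum
      FROM legacy.orders)  AS src
CROSS JOIN
     (SELECT COUNT(*) AS row_cnt, SUM(amount) AS amount_sum
      FROM curated.orders) AS tgt;
```

In practice such checks are parameterized per table and run after each migration wave, with non-zero diffs feeding the data-quality thresholds mentioned above.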
What we offer:
  • Commitment to fighting against all forms of discrimination
  • Inclusive and respectful work environment
  • Positions open to people with disabilities
  • Full-time

Lead Data Engineer

Lead Data Engineer to serve as both a technical leader and people coach for our ...
Location:
India, Gurugram
Salary:
Not provided
Circle K
Expiration Date
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
  • 8-10 years of data engineering experience with strong hands-on delivery using ADF, SQL, Python, Databricks, and Spark
  • Experience designing data pipelines, warehouse models, and processing frameworks using Snowflake or Azure Synapse
  • Proficient with CI/CD tools (Azure DevOps, GitHub) and observability practices
  • Solid grasp of data governance, metadata tagging, and role-based access control
  • Proven ability to mentor and grow engineers in a matrixed or global environment
  • Strong verbal and written communication skills, with the ability to operate cross-functionally
  • Strong Knowledge of Data Engineering concepts (Data pipelines creation, Data Warehousing, Data Marts/Cubes, Data Reconciliation and Audit, Data Management)
  • Working Knowledge of Dev-Ops processes (CI/CD), Git/Jenkins version control tool, Master Data Management (MDM) and Data Quality tools
  • Strong Experience in ETL/ELT development, QA and operation/support process (RCA of production issues, Code/Data Fix Strategy, Monitoring and maintenance)
Job Responsibility:
  • Design, develop, and maintain scalable pipelines across ADF, Databricks, Snowflake, and related platforms
  • Lead the technical execution of non-domain specific initiatives (e.g. reusable dimensions, TLOG standardization, enablement pipelines)
  • Architect data models and re-usable layers consumed by multiple downstream pods
  • Guide platform-wide patterns like parameterization, CI/CD pipelines, pipeline recovery, and auditability frameworks
  • Mentor and coach the team
  • Partner with product and platform leaders to ensure engineering consistency and delivery excellence
  • Act as an L3 escalation point for operational data issues impacting foundational pipelines
  • Own engineering best practices, sprint planning, and quality across the Enablement pod
  • Contribute to platform discussions and architectural decisions across regions
  • Full-time

Lead Data Engineer

Alimentation Couche-Tard Inc., (ACT) is a global Fortune 200 company. A leader i...
Location:
India, Gurugram
Salary:
Not provided
Circle K
Expiration Date
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
  • 7-9 years of data engineering experience with strong hands-on delivery using ADF, SQL, Python, Databricks, and Spark
  • Experience designing data pipelines, warehouse models, and processing frameworks using Snowflake or Azure Synapse
  • Proficient with CI/CD tools (Azure DevOps, GitHub) and observability practices
  • Solid grasp of data governance, metadata tagging, and role-based access control
  • Proven ability to mentor and grow engineers in a matrixed or global environment
  • Strong verbal and written communication skills, with the ability to operate cross-functionally
  • Certifications in Azure, Databricks, or Snowflake are a plus
  • Strong Knowledge of Data Engineering concepts (Data pipelines creation, Data Warehousing, Data Marts/Cubes, Data Reconciliation and Audit, Data Management)
  • Working Knowledge of Dev-Ops processes (CI/CD), Git/Jenkins version control tool, Master Data Management (MDM) and Data Quality tools
Job Responsibility:
  • Design, develop, and maintain scalable pipelines across ADF, Databricks, Snowflake, and related platforms
  • Lead the technical execution of non-domain specific initiatives (e.g. reusable dimensions, TLOG standardization, enablement pipelines)
  • Architect data models and re-usable layers consumed by multiple downstream pods
  • Guide platform-wide patterns like parameterization, CI/CD pipelines, pipeline recovery, and auditability frameworks
  • Mentor and coach the team
  • Partner with product and platform leaders to ensure engineering consistency and delivery excellence
  • Act as an L3 escalation point for operational data issues impacting foundational pipelines
  • Own engineering best practices, sprint planning, and quality across the Enablement pod
  • Contribute to platform discussions and architectural decisions across regions
  • Full-time

Head of Partner Marketing

Sigma is looking for a Head of Partner Marketing with a technical background in ...
Location:
United States, San Francisco
Salary:
190000.00 - 215000.00 USD / Year
Sigma Computing
Expiration Date
Until further notice
Requirements:
  • 10+ years of B2B technology marketing experience, with at least 3 years focused on partner or alliance marketing in the data/BI space
  • Proven success designing and executing co-marketing programs with major cloud platform partners (Snowflake, Databricks) and/or leading ISVs and SIs
  • Deep understanding of cloud data warehouse ecosystems (Snowflake, Databricks) and how analytics/BI layers integrate atop them
  • Demonstrated ability to craft compelling joint value messaging and develop enablement assets for both technical and executive audiences
  • Experience managing multi-channel demand generation campaigns (webinars, ebooks, ABM, events) with a strong focus on KPI definition and ROI analysis
  • Exceptional project management skills: able to juggle multiple partner initiatives with tight deadlines and competing priorities
  • Excellent written and verbal communication skills, with experience presenting co-marketing strategies and performance metrics to internal and external leadership
  • Proficiency with marketing automation (Marketo, HubSpot, or similar), CRM (Salesforce), and analytics tools
  • Self-motivated, detail-oriented, and comfortable navigating ambiguity to build structure and process
Job Responsibility:
  • Design and execute co‑marketing campaigns with Snowflake and Databricks that drive joint lead generation, deal acceleration, and partner certifications
  • Build integrated partner demand programs—including webinars, ebooks, ABM plays, and event sponsorships—aligned to partner roadmaps and Sigma’s product launches
  • Increase Sigma’s brand awareness and visibility through partner‑owned channels
  • Create enablement assets (battlecards, one‑pagers, email templates, presentations) that articulate joint value propositions and empower both Sigma and partner sales teams
  • Collaborate with Product Marketing to develop technical briefs, ROI calculators, and partner playbooks highlighting Sigma’s differentiation on Snowflake and Databricks architectures
  • Partner with ISVs and SIs to identify co‑selling opportunities, co‑author case studies, and co‑host workshops or field events showcasing end‑to‑end Sigma solutions
  • Manage positioning, content, and co‑branded activities at Snowflake Summit, Databricks Data + AI Summit, and other partner conferences, ensuring consistent messaging and high‑impact presence
  • Track and report program performance (influenced pipeline, partner-sourced bookings, MDF utilization); iterate campaigns based on ROI and partner feedback
  • Serve as liaison between internal teams (Sales, Partnerships, Demand Gen, Product) and external partner stakeholders to align on goals, timelines, and deliverables
What we offer:
  • Equity
  • Generous health benefits
  • Flexible time off policy
  • Paid bonding time for all new parents
  • Traditional and Roth 401k
  • Commuter and FSA benefits
  • Lunch Program
  • Dog friendly office
  • Full-time

Collibra Data Quality Lead

The client is looking for someone with very good hands-on technical experience...
Location:
United States, Pittsburgh
Salary:
60.00 USD / Hour
Realign
Expiration Date
Until further notice
Requirements:
  • Strong understanding of Data Quality and Data Governance
  • Proficiency in data profiling, data cleansing, and data validation techniques
  • Strong Collibra development experience
  • Must be able to setup rules using Collibra DQ
  • Must be able to set up workflows in Collibra DIP
  • Strong experience in SQL
  • Strong experience with ETL processes and good knowledge on any ETL tool
  • Good experience with Unix Scripting
  • Experience working with different databases such as Oracle, Vertica, and Snowflake
  • Good experience leading a team