
Senior Snowflake Engineer


NTT DATA


Location:
India, Bangalore


Contract Type:
Not provided


Salary:
Not provided

Job Description:

We are looking for a Senior Snowflake Engineer with 7–12 years of experience in data engineering and analytics. The ideal candidate will have strong expertise in Snowflake and SQL and will lead the design and optimization of data solutions, collaborating with teams across the organization and mentoring junior engineers.

Job Responsibility:

  • Lead design and development of Snowflake data models (staging, integration, and consumption layers)
  • Build and optimize complex SQL-based ELT pipelines
  • Drive performance tuning and cost optimization: Query optimization, Warehouse sizing and workload management, Clustering strategies
  • Implement data quality checks and reconciliation frameworks
  • Support ingestion from multiple source systems (operational, third-party, external)
  • Collaborate with Data Architects on solution design and standards
  • Mentor junior engineers and perform code reviews
  • Support production releases and resolve complex data issues
  • Actively participate in Agile ceremonies and technical discussions
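
The performance-tuning duties listed above (query optimization, warehouse sizing and workload management, clustering strategies) typically translate into Snowflake statements like the following sketch. The warehouse and table names (ANALYTICS_WH, SALES.FACT_ORDERS) are illustrative placeholders, not part of the posting.

```sql
-- Right-size a virtual warehouse and cap idle spend:
ALTER WAREHOUSE ANALYTICS_WH SET
  WAREHOUSE_SIZE = 'MEDIUM'
  AUTO_SUSPEND   = 60      -- suspend after 60 seconds idle
  AUTO_RESUME    = TRUE;

-- Define a clustering key on a large, frequently filtered table:
ALTER TABLE SALES.FACT_ORDERS
  CLUSTER BY (ORDER_DATE, REGION_ID);

-- Inspect how well the table is clustered on those columns:
SELECT SYSTEM$CLUSTERING_INFORMATION(
  'SALES.FACT_ORDERS', '(ORDER_DATE, REGION_ID)');
```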

Requirements:

  • Total Experience: 7–12+ years in data engineering / analytics roles
  • Relevant Snowflake Experience: 4–6+ years hands-on
  • Cloud Data Platform Experience: Required (Snowflake on AWS / Azure / GCP)
  • Advanced SQL (complex transformations, performance tuning)
  • Deep hands-on experience with Snowflake features: Virtual warehouses, Time travel and zero-copy cloning, Secure data sharing
  • Strong understanding of data warehousing, dimensional modeling, and ELT patterns
  • Experience with orchestration and ingestion frameworks
  • Familiarity with CI/CD, version control, and release management
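
For readers unfamiliar with the Snowflake features named above, Time Travel and zero-copy cloning look roughly like the sketch below; the table names are illustrative placeholders.

```sql
-- Time Travel: query a table as it looked one hour ago
SELECT COUNT(*)
FROM SALES.FACT_ORDERS
  AT (OFFSET => -3600);

-- Zero-copy clone: instant copy, no extra storage until data diverges
CREATE TABLE SALES.FACT_ORDERS_DEV
  CLONE SALES.FACT_ORDERS;

-- Recover a dropped table within the Time Travel retention window
UNDROP TABLE SALES.FACT_ORDERS;
```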

Nice to have:

  • Snowflake certification
  • Experience working in regulated industries (insurance, banking, utilities)
  • Exposure to BI tools (Power BI, Tableau)
  • Experience supporting analytics and data science workloads

Additional Information:

Job Posted:
January 24, 2026

Employment Type:
Full-time
Work Type:
Hybrid work

Similar Jobs for Senior Snowflake Engineer

Senior Data Engineer

We are looking for a Senior Data Engineer to join one of the best teams at Sigma ...
Salary:
Not provided
Sigma Software Group
Expiration Date
Until further notice
Requirements
  • Python / strong
  • SQL / strong
  • Snowflake / good
  • English / strong
What we offer
  • Diversity of Domains & Businesses
  • Variety of technology
  • Health & Legal support
  • Active professional community
  • Continuous education and growth
  • Flexible schedule
  • Remote work
  • Outstanding offices (if you choose it)
  • Sports and community activities

Senior Data Engineer

Location:
United States, Flowood
Salary:
Not provided
PhasorSoft Group
Expiration Date
Until further notice
Requirements
  • Experience with Snowflake or Azure Cloud Data Engineering, including setting up and managing data pipelines
  • Proficiency in designing and implementing ETL processes for data integration
  • Knowledge of data warehousing concepts and best practices
  • Strong SQL skills for querying and manipulating data in Snowflake or Azure databases
  • Experience with data modeling techniques and tools to design efficient data structures
  • Understanding of data governance principles and experience implementing them in cloud environments
  • Proficiency in Tableau or Power BI for creating visualizations and interactive dashboards
  • Ability to write scripts (e.g., Python, PowerShell) for automation and orchestration of data pipelines
  • Skills to monitor and optimize data pipelines for performance and cost efficiency
  • Knowledge of cloud data security practices and tools to ensure data protection
Job Responsibility
  • Design, implement, and maintain data pipelines and architectures on Snowflake or Azure Cloud platforms
  • Develop ETL processes to extract, transform, and load data from various sources into data warehouses
  • Optimize data storage, retrieval, and processing for performance and cost-efficiency in cloud environments
  • Collaborate with stakeholders to understand data requirements and translate them into technical solutions
  • Implement data security and governance best practices to ensure data integrity and compliance
  • Work with reporting tools such as Tableau or Power BI to create interactive dashboards and visualizations
  • Monitor and troubleshoot data pipelines, ensuring reliability and scalability
  • Automate data workflows and processes using cloud-native services and scripting languages
  • Provide technical expertise and support to data analysts, scientists, and business users
Employment Type: Full-time

Senior Analytics Engineer

Reporting to our Director of Data and Analytics, we are looking for a Sr. Analyt...
Location:
Ireland, Dublin
Salary:
Not provided
Tines
Expiration Date
Until further notice
Requirements
  • 5–7 years of experience in data engineering with significant architectural experience
  • Proven track record building data architecture in B2B SaaS, particularly with Salesforce/CRM data
  • Deep expertise with modern data stack (Snowflake, DBT, AWS and BI tools - we use Omni)
  • Strong background in data modeling and system design
  • Experience implementing B2B SaaS metrics and GTM reporting
  • Demonstrated ability to translate business requirements into technical solutions
  • AWS experience, particularly with data services and S3
  • Understanding of data governance and security principles
  • Balance of strategic thinking and hands-on implementation skills
  • Able to tailor communication (written and oral) appropriately to different audiences, technical/non-technical, of different seniority
Job Responsibility
  • Design and implement scalable data architecture as Tines grows
  • Establish and maintain data modeling standards and best practices
  • Build and manage ETL processes across diverse data sources
  • Develop core data models in DBT with a focus on efficiency and reusability
  • Own our Snowflake implementation and Omni data layer
  • Drive data governance and security best practices
  • Create clear, maintainable documentation
  • Ensure reliable, high-performance data systems
  • Partner with stakeholders to translate business needs into technical solutions, including presenting solutions to leadership
What we offer
  • Competitive salary
  • Startup equity & extended exercise window
  • Matching retirement plans
  • Home office setup
  • Private healthcare plans
  • 25 days annual leave
  • Extra company holidays
  • Generous parental leave programs
  • Flexibility in how and where you work
  • Phone and home Internet allowance

Senior Data Engineer – Data Engineering & AI Platforms

We are looking for a highly skilled Senior Data Engineer (L2) who can design, bu...
Location:
India, Chennai, Madurai, Coimbatore
Salary:
Not provided
OptiSol Business Solutions
Expiration Date
Until further notice
Requirements
  • Strong hands-on expertise in cloud ecosystems (Azure / AWS / GCP)
  • Excellent Python programming skills with data engineering libraries and frameworks
  • Advanced SQL capabilities including window functions, CTEs, and performance tuning
  • Solid understanding of distributed processing using Spark/PySpark
  • Experience designing and implementing scalable ETL/ELT workflows
  • Good understanding of data modeling concepts (dimensional, star, snowflake)
  • Familiarity with GenAI/LLM-based integration for data workflows
  • Experience working with Git, CI/CD, and Agile delivery frameworks
  • Strong communication skills for interacting with clients, stakeholders, and internal teams
Job Responsibility
  • Design, build, and maintain scalable ETL/ELT pipelines across cloud and big data platforms
  • Contribute to architectural discussions by translating business needs into data solutions spanning ingestion, transformation, and consumption layers
  • Work closely with solutioning and pre-sales teams for technical evaluations and client-facing discussions
  • Lead squads of L0/L1 engineers—ensuring delivery quality, mentoring, and guiding career growth
  • Develop cloud-native data engineering solutions using Python, SQL, PySpark, and modern data frameworks
  • Ensure data reliability, performance, and maintainability across the pipeline lifecycle—from development to deployment
  • Support long-term ODC/T&M projects by demonstrating expertise during technical discussions and interviews
  • Integrate emerging GenAI tools where applicable to enhance data enrichment, automation, and transformations
What we offer
  • Opportunity to work at the intersection of Data Engineering, Cloud, and Generative AI
  • Hands-on exposure to modern data stacks and emerging AI technologies
  • Collaboration with experts across Data, AI/ML, and cloud practices
  • Access to structured learning, certifications, and leadership mentoring
  • Competitive compensation with fast-track career growth and visibility
Employment Type: Full-time

Senior Data Engineer

The Senior Data Engineer is a key contributor to the Data Engineering team, inte...
Location:
Poland, Krakow
Salary:
Not provided
EnergyHub
Expiration Date
Until further notice
Requirements
  • Storied career in data engineering
  • Clear understanding of the difference between OLAP and OLTP RDBMS systems, and demonstrable experience planning and implementing high-performance data processes using both
  • Proven proficiency with our total toolkit (AWS, Snowflake, dbt, Airflow, Docker, Fivetran, Hightouch, Sigma, Jira, Confluence, Slack, etc) or comparable counterparts
  • Love of language, especially SQL & Python
  • Obsessive drive for quality and performance
  • Penchant for concise clarity in your communication, code, and documentation
  • Warmth and inclusiveness for your collaborators, colleagues, and peers
  • Proactive preoccupation with observability
Job Responsibility
  • Design processes for ingesting, processing, observing, and maintaining billions of data points
  • Manage access, roles, and security for our Snowflake instance
  • Architect schema and data flow
  • Collaborate with your team and product to strike the ideal balance between business-driven and team-driven priorities
  • Monitor cloud costs across our toolset
  • Mentor and train your team and others across the business
  • Build and maintain relationships with our diverse partner teams
What we offer
  • Medical care
  • Life insurance
  • Sports package
  • Annual budget for professional development ($2,000)
Employment Type: Full-time

Senior Data Engineer

Senior Data Engineer – Dublin (Hybrid) Contract Role | 3 Days Onsite. We are see...
Location:
Ireland, Dublin
Salary:
Not provided
Solas IT Recruitment
Expiration Date
Until further notice
Requirements
  • 7+ years of experience as a Data Engineer working with distributed data systems
  • 4+ years of deep Snowflake experience, including performance tuning, SQL optimization, and data modelling
  • Strong hands-on experience with the Hadoop ecosystem: HDFS, Hive, Impala, Spark (PySpark preferred)
  • Oozie, Airflow, or similar orchestration tools
  • Proven expertise with PySpark, Spark SQL, and large-scale data processing patterns
  • Experience with Databricks and Delta Lake (or equivalent big-data platforms)
  • Strong programming background in Python, Scala, or Java
  • Experience with cloud services (AWS preferred): S3, Glue, EMR, Redshift, Lambda, Athena, etc.
Job Responsibility
  • Build, enhance, and maintain large-scale ETL/ELT pipelines using Hadoop ecosystem tools including HDFS, Hive, Impala, and Oozie/Airflow
  • Develop distributed data processing solutions with PySpark, Spark SQL, Scala, or Python to support complex data transformations
  • Implement scalable and secure data ingestion frameworks to support both batch and streaming workloads
  • Work hands-on with Snowflake to design performant data models, optimize queries, and establish solid data governance practices
  • Collaborate on the migration and modernization of current big-data workloads to cloud-native platforms and Databricks
  • Tune Hadoop, Spark, and Snowflake systems for performance, storage efficiency, and reliability
  • Apply best practices in data modelling, partitioning strategies, and job orchestration for large datasets
  • Integrate metadata management, lineage tracking, and governance standards across the platform
  • Build automated validation frameworks to ensure accuracy, completeness, and reliability of data pipelines
  • Develop unit, integration, and end-to-end testing for ETL workflows using Python, Spark, and dbt testing where applicable

Senior Data Engineer

Senior Data Engineer role driving Circle K's cloud-first strategy to unlock the ...
Location:
India, Gurugram
Salary:
Not provided
Circle K
Expiration Date
Until further notice
Requirements
  • Bachelor's Degree in Computer Engineering, Computer Science or related discipline
  • Master's Degree preferred
  • 5+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment
  • 5+ years of experience with setting up and operating data pipelines using Python or SQL
  • 5+ years of advanced SQL Programming: PL/SQL, T-SQL
  • 5+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization
  • Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads
  • 5+ years of strong and extensive hands-on experience in Azure, preferably data heavy / analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data
  • 5+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure functions
  • 5+ years of experience in defining and enabling data quality standards for auditing and monitoring
Job Responsibility
  • Collaborate with business stakeholders and other technical team members to acquire and migrate data sources
  • Determine solutions that are best suited to develop a pipeline for a particular data source
  • Develop data flow pipelines to extract, transform, and load data from various data sources
  • Efficient in ETL/ELT development using Azure cloud services and Snowflake
  • Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines
  • Provide clear documentation for delivered solutions and processes
  • Identify and implement internal process improvements for data management
  • Stay current with and adopt new tools and applications
  • Build cross-platform data strategy to aggregate multiple sources
  • Proactive in stakeholder communication, mentor/guide junior resources
Employment Type: Full-time

Senior Data Engineer

We are looking for a Senior Data Engineer with a collaborative, “can-do” attitud...
Location:
India, Gurugram
Salary:
Not provided
Circle K
Expiration Date
Until further notice
Requirements
  • Bachelor’s Degree in Computer Engineering, Computer Science or related discipline, Master’s Degree preferred
  • 5+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment
  • 5+ years of experience with setting up and operating data pipelines using Python or SQL
  • 5+ years of advanced SQL Programming: PL/SQL, T-SQL
  • 5+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization
  • Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads
  • 5+ years of strong and extensive hands-on experience in Azure, preferably data heavy / analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data
  • 5+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure functions
  • 5+ years of experience in defining and enabling data quality standards for auditing and monitoring
  • Strong analytical abilities and a strong intellectual curiosity
Job Responsibility
  • Collaborate with business stakeholders and other technical team members to acquire and migrate data sources that are most relevant to business needs and goals
  • Demonstrate deep technical and domain knowledge of relational and non-relational databases, Data Warehouses, and Data Lakes, among other structured and unstructured storage options
  • Determine solutions that are best suited to develop a pipeline for a particular data source
  • Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development
  • Deliver efficient ETL/ELT development using Azure cloud services and Snowflake, including testing and operations support (root-cause analysis of production issues, code/data fix strategy, monitoring, and maintenance)
  • Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics delivery
  • Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders
  • Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability)
  • Stay current with and adopt new tools and applications to ensure high quality and efficient solutions
  • Build cross-platform data strategy to aggregate multiple sources and process development datasets
Employment Type: Full-time