Snowflake Engineer
NTT DATA

Location: India, Bangalore
Contract Type: Not provided
Salary: Not provided
Job Description:

The Snowflake Engineer will design, develop, and support data solutions on the Snowflake Data Cloud for an insurance client. The role requires 4-7 years of experience in data engineering, with a focus on building scalable data pipelines and analytical models. A bachelor's degree in Computer Science or a related field is required, along with strong SQL and Snowflake skills. Preferred qualifications include Snowflake certification and experience with BI tools.

Job Responsibility:

  • Design and develop Snowflake-based data models supporting insurance use cases (policy, claims, customer, risk)
  • Build and optimize SQL-based ELT pipelines for large insurance datasets
  • Implement performance tuning and cost optimization in Snowflake: query optimization, warehouse sizing, and clustering (see the sketch after this list)
  • Support ingestion of data from core insurance systems (policy admin, claims, billing)
  • Implement basic data quality and reconciliation checks
  • Collaborate with BI and analytics teams to enable dashboards, reports, and advanced analytics
  • Support Agile delivery activities and production support
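
The tuning and cost-optimization duties above come down to a handful of SQL statements in practice. A minimal sketch, assuming an illustrative claims fact table and a dedicated ELT warehouse (all object names are hypothetical, not taken from the posting):

    -- Right-size a dedicated warehouse and let it auto-suspend to control credit spend
    CREATE WAREHOUSE IF NOT EXISTS claims_elt_wh
      WAREHOUSE_SIZE = 'MEDIUM'
      AUTO_SUSPEND   = 60      -- suspend after 60 seconds of inactivity
      AUTO_RESUME    = TRUE;

    -- Cluster a large claims table on the columns most queries filter on
    ALTER TABLE claims_fact CLUSTER BY (policy_id, claim_date);

    -- Check clustering health to confirm partition pruning is effective
    SELECT SYSTEM$CLUSTERING_INFORMATION('claims_fact', '(policy_id, claim_date)');

Warehouse size and suspend thresholds are workload-dependent; the values here are placeholders.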

Requirements:

  • Total Experience: 4–7 years in data engineering / analytics roles
  • Relevant Snowflake Experience: 2–4+ years hands-on
  • Insurance Domain Experience: Preferred (P&C, Life, Health, or Reinsurance)
  • Advanced SQL (complex joins, window functions, performance tuning)
  • Hands-on Snowflake experience: virtual warehouses, time travel and zero-copy cloning, secure data sharing (illustrated in the sketch after this list)
  • Strong understanding of data warehousing and dimensional modelling
  • Experience working with structured and semi-structured insurance data
  • Familiarity with version control and CI/CD practices
  • Experience working with insurance data such as policy, endorsement, and coverage data; claims and loss data; premium, billing, and payments; and customer and risk attributes
  • Awareness of data accuracy, lineage, and audit requirements in regulated insurance environments
  • Bachelor's degree in Computer Science or a related field
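
For the Snowflake-specific skills listed above, the day-to-day commands are short. A hedged sketch of time travel, zero-copy cloning, and secure data sharing, using hypothetical object and account names:

    -- Time travel: query the table as it existed one hour ago
    SELECT COUNT(*) FROM policy_dim AT (OFFSET => -3600);

    -- Zero-copy clone: instant copy for testing or point-in-time recovery
    CREATE TABLE policy_dim_backup CLONE policy_dim;

    -- Secure data sharing: expose a curated table to a consumer account without copying data
    CREATE SHARE claims_share;
    GRANT USAGE ON DATABASE ins_dw TO SHARE claims_share;
    GRANT USAGE ON SCHEMA ins_dw.analytics TO SHARE claims_share;
    GRANT SELECT ON TABLE ins_dw.analytics.claims_summary TO SHARE claims_share;
    ALTER SHARE claims_share ADD ACCOUNTS = partner_org.partner_account;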

Nice to have:

  • Snowflake certification
  • Exposure to actuarial or risk analytics datasets
  • Experience with BI tools (Power BI, Tableau)
  • Experience in regulated financial services environments

Additional Information:

Job Posted: January 24, 2026


Similar Jobs for Snowflake Engineer

Data Engineer – Snowflake & ETL

We are seeking a Data Engineer in Hyderabad (WFO) with expertise in data enginee...
Location: India, Hyderabad
Salary: Not provided
Company: Right Angle Solutions
Expiration Date: Until further notice

Requirements:
  • Minimum 5+ years of experience in data engineering, ETL, and Snowflake development
  • Proven expertise in Snowflake including SQL scripting, performance tuning, and data warehousing concepts
  • Hands-on experience with Matillion ETL for building and maintaining ETL jobs
  • Strong understanding of cloud platforms (AWS, Azure, or GCP) and cloud-based data architectures
  • Proficiency in SQL, Python, or other scripting languages for automation and data transformation
  • Experience with API integrations and data ingestion frameworks
  • Knowledge of data governance, security policies, and access control within Snowflake environments
  • Excellent communication skills with the ability to engage both business and technical stakeholders
  • Self-motivated professional capable of working independently and delivering projects on time
  • Qualification: BE/BS/MTech/MS or equivalent work experience
  • Fulltime

Sr. Data Engineer - Snowflake

Data Ideology is seeking a Sr. Snowflake Data Engineer to join our growing team ...
Location: Not provided
Salary: Not provided
Company: Data Ideology
Expiration Date: Until further notice

Requirements:
  • 7+ years of experience in data engineering, data warehousing, or data architecture
  • 3+ years of hands-on Snowflake experience (performance tuning, data sharing, Snowpark, Snowpipe, etc.); see the Snowpipe sketch after this list
  • Strong SQL and Python skills, with production experience using dbt
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and modern data tooling (Airflow, Fivetran, Power BI, Looker, Informatica, etc.)
  • Prior experience in a consulting or client-facing delivery role
  • Excellent communication skills, with the ability to collaborate across technical and business stakeholders
  • SnowPro Core Certification required (or willingness to obtain upon hire); advanced Snowflake certifications preferred
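
Among the platform features named in the requirements above, Snowpipe involves the most setup. A minimal auto-ingest sketch, assuming a JSON landing table and an external stage (all names are hypothetical):

    -- Landing table for raw JSON documents
    CREATE TABLE raw_claims (payload VARIANT);

    -- External stage over the bucket that receives new files
    -- (in practice this also needs a storage integration or credentials)
    CREATE STAGE raw_claims_stage
      URL = 's3://example-bucket/claims/'
      FILE_FORMAT = (TYPE = JSON);

    -- Pipe that continuously loads newly arriving files
    CREATE PIPE claims_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_claims (payload)
      FROM (SELECT $1 FROM @raw_claims_stage);

    -- Load history for troubleshooting
    SELECT *
    FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
      TABLE_NAME => 'RAW_CLAIMS',
      START_TIME => DATEADD('hour', -24, CURRENT_TIMESTAMP())));
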
Job Responsibility:
  • Design and build scalable, secure, and cost-effective data solutions in Snowflake
  • Develop and optimize data pipelines using tools such as dbt, Python, CloverDX, and cloud-native services
  • Participate in discovery sessions with clients to gather requirements and translate them into solution designs and project plans
  • Collaborate with engagement managers and account teams to help scope work and provide technical input for Statements of Work (SOWs)
  • Serve as a Snowflake subject matter expert, guiding best practices in performance tuning, cost optimization, access control, and workload management
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake
  • Integrate Snowflake with BI tools, governance platforms, and AI/ML frameworks
  • Contribute to internal accelerators, frameworks, and proofs of concept
  • Mentor junior engineers and support knowledge sharing across the team
What we offer:
  • Flexible Time Off Policy
  • Eligibility for Health Benefits
  • Retirement Plan with Company Match
  • Training and Certification Reimbursement
  • Utilization Based Incentive Program
  • Commission Incentive Program
  • Referral Bonuses
  • Work from Home
  • Fulltime

Snowflake Solutions Engineer

We are seeking an innovative Snowflake Solutions Engineer to join our growing IT...
Location: United States, Easton
Salary: Not provided
Company: Victaulic
Expiration Date: Until further notice

Requirements:
  • Bachelor's degree in Computer Science, Information Systems, Data Engineering, Data Science or related technical field
  • At least 2 years of recent hands-on experience with Snowflake platform including advanced features
  • Minimum 3 years of experience in data engineering or solutions architecture roles
  • 7-10 years of experience in Data Architecture/Engineering and/or BI in a multi-dimensional environment
  • Proven track record of developing data applications or analytical solutions for business users
  • Snowflake Expertise: Advanced knowledge of Snowflake architecture including data warehousing, data lakes, and emerging lakehouse features
  • Security and Governance: Deep understanding of RBAC, row-level security, data masking, and Snowflake security best practices (see the sketch after this list)
  • DevOps and CI/CD: Strong experience with GitHub, SnowDDL, automated deployment pipelines, and infrastructure as code
  • Application Development: Proficiency with Snowflake Streamlit for building interactive data applications
  • SQL Proficiency: Expert-level SQL skills with experience in complex analytical queries and optimization
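
The security and governance expectations above map onto concrete Snowflake objects. A minimal sketch of RBAC, dynamic data masking, and row-level security, with hypothetical roles, tables, and columns:

    -- RBAC: a functional role with read-only access to the curated layer
    CREATE ROLE reporting_reader;
    GRANT USAGE ON DATABASE analytics TO ROLE reporting_reader;
    GRANT USAGE ON SCHEMA analytics.gold TO ROLE reporting_reader;
    GRANT SELECT ON ALL TABLES IN SCHEMA analytics.gold TO ROLE reporting_reader;

    -- Dynamic data masking: hide SSNs from everyone except a privileged role
    CREATE MASKING POLICY ssn_mask AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() = 'PII_ADMIN' THEN val ELSE '***-**-****' END;
    ALTER TABLE analytics.gold.customers
      MODIFY COLUMN ssn SET MASKING POLICY ssn_mask;

    -- Row-level security: non-global roles only see US rows
    CREATE ROW ACCESS POLICY us_only AS (country STRING) RETURNS BOOLEAN ->
      CURRENT_ROLE() = 'GLOBAL_ANALYST' OR country = 'US';
    ALTER TABLE analytics.gold.orders
      ADD ROW ACCESS POLICY us_only ON (country);
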
Job Responsibility:
  • Snowflake Native Application Development (30%): Design and develop interactive data applications using Snowflake Streamlit for self-service analytics and operational workflows that enable business users to interact with data through intuitive interfaces
  • Create reusable application frameworks and component libraries for rapid solution delivery
  • Integrate Snowflake Native Apps and third-party marketplace applications to extend platform capabilities
  • Develop custom UDFs and stored procedures to support advanced application logic and business rules
  • Data Architecture and Modern Platform Design (30%): Design and implement modern data architecture solutions spanning data warehousing, data lakes, and lakehouse patterns
  • Implement and maintain medallion architecture (bronze-silver-gold) patterns for data quality and governance
  • Evaluate and recommend architecture patterns for diverse use cases including structured analytics, semi-structured data processing, and AI/ML workloads
  • Establish best practices for data organization, storage optimization, and query performance across different data architecture patterns
  • AI Support and Advanced Analytics Collaboration (15%): Support AI and data science teams with Snowflake platform capabilities and best practices
  • Collaborate on implementing Snowflake Cortex AI features for business use cases
  • Fulltime

Software Engineer-Snowflake

Join our Snowflake Managed Services team as a Software Engineer to work on data ...
Location: India, Hyderabad
Salary: Not provided
Company: Genzeon
Expiration Date: Until further notice

Requirements:
  • 4+ years of hands-on experience in Snowflake development and support
  • Strong SQL, data modeling, and performance tuning experience
  • Exposure to CI/CD pipelines and scripting languages (e.g., Python, Shell)
  • Understanding of Snowflake security (RBAC), warehouse sizing, and cost controls (a cost-control sketch follows this list)
  • Experience with data pipelines and orchestration tools (Airflow, dbt, ADF)
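
Warehouse cost controls like those mentioned above are commonly enforced with resource monitors. A small sketch with assumed names and quotas:

    -- Cap monthly credit consumption and act as thresholds are crossed
    CREATE RESOURCE MONITOR managed_services_rm
      WITH CREDIT_QUOTA = 100
      FREQUENCY = MONTHLY
      START_TIMESTAMP = IMMEDIATELY
      TRIGGERS ON 80 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND;

    -- Attach the monitor to the warehouse used for support workloads
    ALTER WAREHOUSE support_wh SET RESOURCE_MONITOR = managed_services_rm;
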
Job Responsibility:
  • Design and develop Snowflake pipelines, data models, and transformations
  • Provide L2/L3 production support for Snowflake jobs, queries, and integrations
  • Troubleshoot failed jobs, resolve incidents, and conduct RCA
  • Tune queries, monitor warehouses, and help optimize Snowflake usage and cost
  • Handle service requests like user provisioning, access changes, and role management
  • Participate in code reviews, deployment pipelines, and continuous improvement
  • Document issues, enhancements, and standard procedures (runbooks)

Senior Data Engineer

Location: United States, Flowood
Salary: Not provided
Company: PhasorSoft Group
Expiration Date: Until further notice

Requirements:
  • Experience with Snowflake or Azure Cloud Data Engineering, including setting up and managing data pipelines
  • Proficiency in designing and implementing ETL processes for data integration
  • Knowledge of data warehousing concepts and best practices
  • Strong SQL skills for querying and manipulating data in Snowflake or Azure databases
  • Experience with data modeling techniques and tools to design efficient data structures
  • Understanding of data governance principles and experience implementing them in cloud environments
  • Proficiency in Tableau or Power BI for creating visualizations and interactive dashboards
  • Ability to write scripts (e.g., Python, PowerShell) for automation and orchestration of data pipelines
  • Skills to monitor and optimize data pipelines for performance and cost efficiency
  • Knowledge of cloud data security practices and tools to ensure data protection
Job Responsibility:
  • Design, implement, and maintain data pipelines and architectures on Snowflake or Azure Cloud platforms
  • Develop ETL processes to extract, transform, and load data from various sources into data warehouses
  • Optimize data storage, retrieval, and processing for performance and cost-efficiency in cloud environments
  • Collaborate with stakeholders to understand data requirements and translate them into technical solutions
  • Implement data security and governance best practices to ensure data integrity and compliance
  • Work with reporting tools such as Tableau or Power BI to create interactive dashboards and visualizations
  • Monitor and troubleshoot data pipelines, ensuring reliability and scalability
  • Automate data workflows and processes using cloud-native services and scripting languages
  • Provide technical expertise and support to data analysts, scientists, and business users
  • Fulltime

Senior Data Engineer

We are looking for a Senior Data Engineer to join one of the best teams at Sigma ...
Location: Not provided
Salary: Not provided
Company: Sigma Software Group
Expiration Date: Until further notice

Requirements:
  • Python / strong
  • SQL / strong
  • Snowflake / good
  • English / strong
What we offer:
  • Diversity of Domains & Businesses
  • Variety of technology
  • Health & Legal support
  • Active professional community
  • Continuous education and growth
  • Flexible schedule
  • Remote work
  • Outstanding offices (if you choose it)
  • Sports and community activities

Data Engineer

We are looking for a Data Engineer to join a data-intensive real estate/proptech...
Location: Portugal
Salary: Not provided
Company: Ascendix Tech
Expiration Date: Until further notice

Requirements:
  • Snowflake – Advanced level, including User-Defined Functions (UDFs), stored procedures, and performance optimization
  • Python – Proficient in data manipulation, automation, and scripting (pandas, SQL connectors)
  • SQL – Expert level, including query optimization
  • Data Modeling – Dimensional modeling (star/snowflake schemas), normalization (3NF)
  • OLTP & OLAP – Understanding of transactional vs. analytical database design
  • Data Quality – Data cleansing, validation, consistency checks
  • Real estate data (property types, comparable analysis)
  • Multi-source data integration and harmonization
  • KPI reporting and dashboards
  • Real estate or property data systems (preferred)
Job Responsibility:
  • Design scalable data models for high-volume property data (2,600-6,500 properties/month)
  • Build and maintain data integration pipelines from multiple sources
  • Implement complex entity relationships and dynamic grouping
  • Optimize for both transactional and analytical workloads
  • Develop Snowflake UDFs and stored procedures for business logic
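
As a concrete illustration of the last item above, business logic is often packaged as SQL UDFs and Snowflake Scripting procedures. A minimal sketch assuming a hypothetical properties table (not described in the posting):

    -- Scalar UDF encapsulating a simple pricing rule
    CREATE OR REPLACE FUNCTION price_per_sqm(price NUMBER, area_sqm NUMBER)
      RETURNS NUMBER
    AS
    $$
      price / NULLIF(area_sqm, 0)   -- NULL when area is zero, avoiding division errors
    $$;

    -- Stored procedure that rebuilds a derived metrics table
    CREATE OR REPLACE PROCEDURE refresh_property_metrics()
      RETURNS STRING
      LANGUAGE SQL
    AS
    $$
    BEGIN
      CREATE OR REPLACE TABLE property_metrics AS
        SELECT property_id, price_per_sqm(price, area_sqm) AS unit_price
        FROM properties;
      RETURN 'property_metrics refreshed';
    END;
    $$;

    CALL refresh_property_metrics();
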
What we offer:
  • Health insurance
  • Vacation days: 22 days per year
  • Paid Time off Benefits
  • Meal Card
  • Friendly and calm atmosphere in the company
  • Individual development plan
  • Technical leads and mentors
  • Open management and well-established processes
  • Regular performance reviews
  • Free access to the company accounts on educational platforms (Udemy, Pluralsight)
  • Fulltime

Data Analytics Engineer

SDG Group is expanding its global Data & Analytics practice and is seeking a mot...
Location: Egypt, Cairo
Salary: Not provided
Company: SDG
Expiration Date: Until further notice

Requirements:
  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field
  • Hands-on experience in DataOps / Data Engineering
  • Strong knowledge in Databricks OR Snowflake (one of them is mandatory)
  • Proficiency in Python and SQL
  • Experience with Azure data ecosystem (ADF, ADLS, Synapse, etc.)
  • Understanding of CI/CD practices and DevOps for data.
  • Knowledge of data modeling, orchestration frameworks, and monitoring tools
  • Strong analytical and troubleshooting skills
  • Eagerness to learn and grow in a global consulting environment
Job Responsibility:
  • Design, build, and maintain scalable and reliable data pipelines following DataOps best practices
  • Work with modern cloud data stacks using Databricks (Spark, Delta Lake) or Snowflake (Snowpipe, tasks, streams; streams and tasks are sketched after this list)
  • Develop and optimize ETL/ELT workflows using Python, SQL, and orchestration tools
  • Work with Azure data services (ADF, ADLS, Azure SQL, Azure Functions)
  • Implement CI/CD practices using Azure DevOps or Git-based workflows
  • Ensure data quality, consistency, and governance across all delivered data solutions
  • Monitor and troubleshoot pipelines for performance and operational excellence
  • Collaborate with international teams, architects, and analytics consultants
  • Contribute to technical documentation and solution design assets
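
For the Snowflake path named above, incremental processing is typically wired together with streams and tasks. A minimal sketch assuming raw and curated order tables that are not part of the posting:

    -- Stream tracks new and changed rows landing in the raw table
    CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders;

    -- Task runs on a schedule, but only when the stream actually has data
    CREATE OR REPLACE TASK load_orders_task
      WAREHOUSE = transform_wh
      SCHEDULE = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
    AS
      INSERT INTO curated_orders (order_id, customer_id, amount)
      SELECT order_id, customer_id, amount
      FROM raw_orders_stream
      WHERE METADATA$ACTION = 'INSERT';

    -- Tasks are created suspended; resume to start the schedule
    ALTER TASK load_orders_task RESUME;
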
What we offer:
  • Remote working model aligned with international project needs
  • Opportunity to work on European and global engagements
  • Mentorship and growth paths within SDG Group
  • A dynamic, innovative, and collaborative environment
  • Access to world-class training and learning platforms
  • Fulltime