
Snowflake Data Engineer


360 Resourcing Solutions


Location:
United Kingdom

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We are looking for a Snowflake Data Engineer to help build and evolve our platform. You’ll design, develop, and optimise end-to-end data pipelines using Snowflake and Matillion, ensuring high-quality, performant, and reusable datasets. You’ll also support analytics, embedded products, and AI-enabled initiatives while collaborating across product, engineering, analytics, and business teams.

Job Responsibility:

  • Design, build, and maintain robust data pipelines using Matillion and Snowflake
  • Develop analytics-ready Snowflake data models following dimensional modelling best practices
  • Implement and evolve a medallion architecture (bronze, silver, gold) with clear lineage and governance (see the sketch after this list)
  • Optimise Snowflake performance and cost through clustering, warehouse sizing, query tuning, and workload management
  • Build secure, resilient, well-tested pipelines with monitoring, alerting, and error handling
  • Create datasets powering Looker dashboards, embedded analytics, and self-service reporting
  • Translate analytics and business requirements into performant, reusable Snowflake models
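For illustration, the medallion and performance points above can be pictured with a minimal sketch, assuming a hypothetical ANALYTICS database with BRONZE.RAW_ORDERS, SILVER.ORDERS, and GOLD.FCT_DAILY_ORDERS tables (none of these names come from the posting). It shows one bronze-to-silver-to-gold promotion driven from Python with snowflake-connector-python, ending with an illustrative clustering key on the gold table.

```python
# Hedged sketch: one bronze -> silver -> gold promotion step in Snowflake,
# driven from Python via snowflake-connector-python. All object names are
# hypothetical placeholders; adapt to your own database, schemas, and columns.
import os
import snowflake.connector

MEDALLION_STEPS = [
    # Silver: cleaned, deduplicated, typed copy of the raw (bronze) landing table.
    """
    CREATE OR REPLACE TABLE SILVER.ORDERS AS
    SELECT
        ORDER_ID,
        CUSTOMER_ID,
        TRY_TO_DATE(ORDER_DATE_RAW)            AS ORDER_DATE,
        TRY_TO_NUMBER(ORDER_AMOUNT_RAW, 12, 2) AS ORDER_AMOUNT
    FROM BRONZE.RAW_ORDERS
    QUALIFY ROW_NUMBER() OVER (PARTITION BY ORDER_ID ORDER BY LOADED_AT DESC) = 1
    """,
    # Gold: analytics-ready aggregate powering dashboards and self-service reporting.
    """
    CREATE OR REPLACE TABLE GOLD.FCT_DAILY_ORDERS AS
    SELECT ORDER_DATE, COUNT(*) AS ORDER_COUNT, SUM(ORDER_AMOUNT) AS TOTAL_AMOUNT
    FROM SILVER.ORDERS
    GROUP BY ORDER_DATE
    """,
    # Illustrative performance/cost tuning: cluster the gold table on its main filter column.
    "ALTER TABLE GOLD.FCT_DAILY_ORDERS CLUSTER BY (ORDER_DATE)",
]

def run_medallion_refresh() -> None:
    """Execute each promotion step in order inside a single session."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",   # hypothetical warehouse name
        database="ANALYTICS",       # hypothetical database name
    )
    try:
        cur = conn.cursor()
        for statement in MEDALLION_STEPS:
            cur.execute(statement)
        cur.close()
    finally:
        conn.close()

if __name__ == "__main__":
    run_medallion_refresh()
```

In practice a tool such as Matillion would orchestrate steps like these; the sketch only shows the shape of the SQL involved.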

Requirements:

  • Previous experience in Data Engineering / Analytics Engineering on Snowflake platforms
  • Strong hands-on experience with Matillion (or equivalent ETL/ELT tools) and advanced SQL
  • Expertise in dimensional modelling, medallion architecture, and analytics-ready data design
  • Experience with data quality, governance, and building high-impact data products
  • Curious, proactive, and committed to continuous learning

What we offer:
  • Private Counselling with a weekly confidential helpline available
  • Simplyhealth private healthcare plan
  • £150 Wellbeing Allowance per year
  • Working elsewhere policy (4 weeks per year)
  • Hybrid working
  • Buy and sell annual leave scheme (up to 3 days per year)

Additional Information:

Job Posted:
January 05, 2026

Employment Type:
Full-time
Work Type:
Hybrid work

Similar Jobs for Snowflake Data Engineer

Data Engineer – Snowflake & ETL

We are seeking a Data Engineer in Hyderabad (WFO) with expertise in data enginee...
Location:
India, Hyderabad
Salary:
Not provided
Right Angle Solutions
Expiration Date
Until further notice
Requirements:
  • Minimum 5+ years of experience in data engineering, ETL, and Snowflake development
  • Proven expertise in Snowflake including SQL scripting, performance tuning, and data warehousing concepts
  • Hands-on experience with Matillion ETL for building and maintaining ETL jobs
  • Strong understanding of cloud platforms (AWS, Azure, or GCP) and cloud-based data architectures
  • Proficiency in SQL, Python, or other scripting languages for automation and data transformation
  • Experience with API integrations and data ingestion frameworks
  • Knowledge of data governance, security policies, and access control within Snowflake environments
  • Excellent communication skills with the ability to engage both business and technical stakeholders
  • Self-motivated professional capable of working independently and delivering projects on time
  • Qualification: BE/BS/MTech/MS or equivalent work experience
Employment Type: Full-time

Senior Data Engineer – Data Engineering & AI Platforms

We are looking for a highly skilled Senior Data Engineer (L2) who can design, bu...
Location:
India, Chennai, Madurai, Coimbatore
Salary:
Not provided
OptiSol Business Solutions
Expiration Date
Until further notice
Requirements:
  • Strong hands-on expertise in cloud ecosystems (Azure / AWS / GCP)
  • Excellent Python programming skills with data engineering libraries and frameworks
  • Advanced SQL capabilities including window functions, CTEs, and performance tuning
  • Solid understanding of distributed processing using Spark/PySpark
  • Experience designing and implementing scalable ETL/ELT workflows
  • Good understanding of data modeling concepts (dimensional, star, snowflake)
  • Familiarity with GenAI/LLM-based integration for data workflows
  • Experience working with Git, CI/CD, and Agile delivery frameworks
  • Strong communication skills for interacting with clients, stakeholders, and internal teams
Job Responsibility:
  • Design, build, and maintain scalable ETL/ELT pipelines across cloud and big data platforms
  • Contribute to architectural discussions by translating business needs into data solutions spanning ingestion, transformation, and consumption layers
  • Work closely with solutioning and pre-sales teams for technical evaluations and client-facing discussions
  • Lead squads of L0/L1 engineers—ensuring delivery quality, mentoring, and guiding career growth
  • Develop cloud-native data engineering solutions using Python, SQL, PySpark, and modern data frameworks (see the sketch after this list)
  • Ensure data reliability, performance, and maintainability across the pipeline lifecycle—from development to deployment
  • Support long-term ODC/T&M projects by demonstrating expertise during technical discussions and interviews
  • Integrate emerging GenAI tools where applicable to enhance data enrichment, automation, and transformations
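As referenced above, a cloud-native PySpark transform might look roughly like the following sketch. The S3 paths and the event schema (event_id, event_ts, event_type, user_id) are hypothetical placeholders rather than anything specified by the employer.

```python
# Hedged sketch of a small PySpark ELT transform: raw events in, a cleaned and
# aggregated table out. Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_events_rollup").getOrCreate()

# Ingest: raw landing data (hypothetical path and schema).
raw = spark.read.parquet("s3://example-bucket/raw/events/")

# Transform: basic cleaning, typing, and deduplication.
cleaned = (
    raw.filter(F.col("event_id").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

# Aggregate: analytics-ready daily rollup.
daily = cleaned.groupBy("event_date", "event_type").agg(
    F.count("*").alias("event_count"),
    F.countDistinct("user_id").alias("unique_users"),
)

# Load: write partitioned output for downstream consumption (hypothetical path).
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_events/"
)

spark.stop()
```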
What we offer:
  • Opportunity to work at the intersection of Data Engineering, Cloud, and Generative AI
  • Hands-on exposure to modern data stacks and emerging AI technologies
  • Collaboration with experts across Data, AI/ML, and cloud practices
  • Access to structured learning, certifications, and leadership mentoring
  • Competitive compensation with fast-track career growth and visibility
Employment Type: Full-time

Sr. Data Engineer - Snowflake

Data Ideology is seeking a Sr. Snowflake Data Engineer to join our growing team ...
Location:
Salary:
Not provided
Data Ideology
Expiration Date
Until further notice
Requirements:
  • 7+ years of experience in data engineering, data warehousing, or data architecture
  • 3+ years of hands-on Snowflake experience (performance tuning, data sharing, Snowpark, Snowpipe, etc.)
  • Strong SQL and Python skills, with production experience using dbt
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and modern data tooling (Airflow, Fivetran, Power BI, Looker, Informatica, etc.)
  • Prior experience in a consulting or client-facing delivery role
  • Excellent communication skills, with the ability to collaborate across technical and business stakeholders
  • SnowPro Core Certification required (or willingness to obtain upon hire)
  • Advanced Snowflake certifications preferred
Job Responsibility:
  • Design and build scalable, secure, and cost-effective data solutions in Snowflake
  • Develop and optimize data pipelines using tools such as dbt, Python, CloverDX, and cloud-native services
  • Participate in discovery sessions with clients to gather requirements and translate them into solution designs and project plans
  • Collaborate with engagement managers and account teams to help scope work and provide technical input for Statements of Work (SOWs)
  • Serve as a Snowflake subject matter expert, guiding best practices in performance tuning, cost optimization, access control, and workload management (see the sketch after this list)
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake
  • Integrate Snowflake with BI tools, governance platforms, and AI/ML frameworks
  • Contribute to internal accelerators, frameworks, and proofs of concept
  • Mentor junior engineers and support knowledge sharing across the team
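To make the performance and cost-optimization point above concrete, here is a minimal sketch that generates the kind of Snowflake statements such a tuning pass might apply: right-sizing a warehouse, tightening auto-suspend, and attaching a resource monitor. The warehouse name BI_WH, the monitor name, the size, and the credit quota are all hypothetical examples, not values from the posting.

```python
# Hedged sketch: generate the kind of Snowflake statements a cost/workload
# tuning pass might apply (warehouse sizing, auto-suspend, resource monitor).
# Warehouse and monitor names, sizes, and quotas are hypothetical examples.
from typing import List, Optional

def warehouse_tuning_statements(
    warehouse: str,
    size: str = "SMALL",
    auto_suspend_seconds: int = 60,
    monitor: Optional[str] = None,
    monthly_credit_quota: Optional[int] = None,
) -> List[str]:
    """Return SQL to right-size a warehouse and optionally cap its spend."""
    statements = [
        # A smaller size plus aggressive auto-suspend is a common first cost lever.
        f"ALTER WAREHOUSE {warehouse} SET "
        f"WAREHOUSE_SIZE = '{size}' AUTO_SUSPEND = {auto_suspend_seconds} AUTO_RESUME = TRUE"
    ]
    if monitor and monthly_credit_quota:
        statements.append(
            f"CREATE RESOURCE MONITOR {monitor} WITH "
            f"CREDIT_QUOTA = {monthly_credit_quota} FREQUENCY = MONTHLY "
            f"START_TIMESTAMP = IMMEDIATELY "
            f"TRIGGERS ON 80 PERCENT DO NOTIFY ON 100 PERCENT DO SUSPEND"
        )
        statements.append(f"ALTER WAREHOUSE {warehouse} SET RESOURCE_MONITOR = {monitor}")
    return statements

if __name__ == "__main__":
    # Print the statements; in practice they would be run by an account-admin
    # level role, either in a worksheet or via snowflake-connector-python.
    for sql in warehouse_tuning_statements(
        "BI_WH", size="SMALL", auto_suspend_seconds=60,
        monitor="BI_MONTHLY_MONITOR", monthly_credit_quota=100,
    ):
        print(sql + ";")
```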
What we offer:
  • Flexible Time Off Policy
  • Eligibility for Health Benefits
  • Retirement Plan with Company Match
  • Training and Certification Reimbursement
  • Utilization Based Incentive Program
  • Commission Incentive Program
  • Referral Bonuses
  • Work from Home
Employment Type: Full-time

Lead Data Engineer

Our client, a leading energy company based in Edinburgh, Scotland, is seeking a ...
Location:
United Kingdom, Edinburgh
Salary:
Not provided
Net Talent
Expiration Date
Until further notice
Requirements:
  • Proven experience as a Data Engineer, with a focus on Python development and data pipeline architecture
  • Hands-on experience with Snowflake data warehousing platform
  • Experience working in a data management environment, ideally within a major client or enterprise setting
  • Strong understanding of data modelling, ETL/ELT processes, and data security standards
  • Demonstrated leadership capabilities, with experience mentoring or managing junior team members
  • Excellent communication skills and the ability to collaborate across various departments
  • Problem-solving mindset with a passion for innovative data solutions and continuous learning
Job Responsibility:
  • Design, develop, and maintain scalable data pipelines and architectures to support business analytics and reporting needs
  • Lead and mentor a team of data engineers, ensuring best practices in data engineering methodologies and tools
  • Collaborate with cross-functional teams including data analysts, data scientists, and business stakeholders to understand data requirements
  • Implement data security, governance, and compliance standards across all data solutions
  • Utilise Snowflake for data warehousing solutions, ensuring optimal performance and security
  • Develop automation scripts and optimise data workflows for efficiency and reliability
  • Monitor and troubleshoot data pipelines to resolve issues promptly, ensuring data integrity and availability
What we offer:
  • Excellent package on offer
  • Supportive work environment
  • Competitive salary
  • Opportunities for professional development
  • Collaborative culture that fosters growth and innovation
Employment Type: Full-time

Snowflake Solutions Engineer

We are seeking an innovative Snowflake Solutions Engineer to join our growing IT...
Location:
United States, Easton
Salary:
Not provided
Victaulic
Expiration Date
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Information Systems, Data Engineering, Data Science or related technical field
  • At least 2 years of recent hands-on experience with Snowflake platform including advanced features
  • Minimum 3 years of experience in data engineering or solutions architecture roles
  • 7-10 years of experience in Data Architecture/Engineering and/or BI in a multi-dimensional environment
  • Proven track record of developing data applications or analytical solutions for business users
  • Snowflake Expertise: Advanced knowledge of Snowflake architecture including data warehousing, data lakes, and emerging lakehouse features
  • Security and Governance: Deep understanding of RBAC, row-level security, data masking, and Snowflake security best practices
  • DevOps and CI/CD: Strong experience with GitHub, SnowDDL, automated deployment pipelines, and infrastructure as code
  • Application Development: Proficiency with Snowflake Streamlit for building interactive data applications
  • SQL Proficiency: Expert-level SQL skills with experience in complex analytical queries and optimization
Job Responsibility:
  • Snowflake Native Application Development (30%): Design and develop interactive data applications using Snowflake Streamlit for self-service analytics and operational workflows that enable business users to interact with data through intuitive interfaces (see the sketch after this list)
  • Create reusable application frameworks and component libraries for rapid solution delivery
  • Integrate Snowflake Native Apps and third-party marketplace applications to extend platform capabilities
  • Develop custom UDFs and stored procedures to support advanced application logic and business rules
  • Data Architecture and Modern Platform Design (30%): Design and implement modern data architecture solutions spanning data warehousing, data lakes, and lakehouse patterns
  • Implement and maintain medallion architecture (bronze-silver-gold) patterns for data quality and governance
  • Evaluate and recommend architecture patterns for diverse use cases including structured analytics, semi-structured data processing, and AI/ML workloads
  • Establish best practices for data organization, storage optimization, and query performance across different data architecture patterns
  • AI Support and Advanced Analytics Collaboration (15%): Support AI and data science teams with Snowflake platform capabilities and best practices
  • Collaborate on implementing Snowflake Cortex AI features for business use cases
Employment Type: Full-time
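Since this role leans on Snowflake Streamlit for business-facing applications (referenced in the list above), here is a minimal sketch of a Streamlit-in-Snowflake app. The GOLD.FCT_DAILY_ORDERS table and its ORDER_DATE / TOTAL_AMOUNT columns are hypothetical placeholders, and the app assumes it runs inside Snowflake, where get_active_session() supplies an authenticated Snowpark session.

```python
# Hedged sketch of a Streamlit-in-Snowflake app for self-service analytics.
# Table and column names (GOLD.FCT_DAILY_ORDERS, ORDER_DATE, TOTAL_AMOUNT) are
# hypothetical placeholders.
import streamlit as st
from snowflake.snowpark.context import get_active_session

session = get_active_session()

st.title("Daily Orders")
days = st.slider("Days of history", min_value=7, max_value=90, value=30)

# `days` is an integer from the slider, so interpolating it into the query is safe.
df = session.sql(
    f"""
    SELECT ORDER_DATE, TOTAL_AMOUNT
    FROM GOLD.FCT_DAILY_ORDERS
    WHERE ORDER_DATE >= DATEADD('day', -{days}, CURRENT_DATE())
    ORDER BY ORDER_DATE
    """
).to_pandas()

st.bar_chart(df.set_index("ORDER_DATE")["TOTAL_AMOUNT"])
st.dataframe(df)
```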

Senior Data Engineer

Intratek Computer, Inc. is seeking a highly skilled and experienced Sr Data Engi...
Location:
United States, Los Angeles
Salary:
Not provided
Intratek Computer, Inc.
Expiration Date
Until further notice
Requirements:
  • Proficiency in WhereScape RED for data warehouse automation, including designing, building, and managing data warehouses
  • Expertise in Snowflake’s cloud data platform, including data loading, transformation, and querying using Snowflake SQL
  • Experience with SQL-based development, optimization, and tuning for large-scale data processing
  • Strong understanding of dimensional modeling concepts and experience in designing and implementing data models for analytics and reporting purposes
  • Ability to optimize data pipelines and queries for performance and scalability
  • Familiarity with Snowflake’s features such as virtual warehouses, data sharing, and data governance capabilities
  • Knowledge of WhereScape scripting language (WSL) for customizing and extending automation processes
  • Experience with data integration tools and techniques to ingest data from various sources into Snowflake
  • Understanding of data governance principles and experience implementing data governance frameworks within Snowflake
  • Ability to implement data quality checks and ensure data integrity within the data warehouse environment (see the sketch below)
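As a rough illustration of the data quality checks mentioned in the last requirement, the sketch below runs a few null, duplicate, and range checks against a hypothetical SILVER.ORDERS table via snowflake-connector-python and reports any failures. All object names and the credential handling are assumptions, not details from the posting.

```python
# Hedged sketch of basic data quality checks run against Snowflake after a load.
# Table and column names are hypothetical placeholders; each check is a query
# that returns the number of violating rows, expected to be zero.
import os
import snowflake.connector

QUALITY_CHECKS = [
    ("orders_primary_key_not_null",
     "SELECT COUNT(*) FROM SILVER.ORDERS WHERE ORDER_ID IS NULL"),
    ("orders_no_duplicate_ids",
     "SELECT COUNT(*) FROM (SELECT ORDER_ID FROM SILVER.ORDERS "
     "GROUP BY ORDER_ID HAVING COUNT(*) > 1)"),
    ("orders_amount_non_negative",
     "SELECT COUNT(*) FROM SILVER.ORDERS WHERE ORDER_AMOUNT < 0"),
]

def run_quality_checks(cursor) -> list:
    """Run every check and return descriptions of the ones that failed."""
    failures = []
    for name, sql in QUALITY_CHECKS:
        cursor.execute(sql)
        violating_rows = cursor.fetchone()[0]
        if violating_rows:
            failures.append(f"{name}: {violating_rows} violating rows")
    return failures

if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        database="ANALYTICS",   # hypothetical database name
    )
    try:
        failed = run_quality_checks(conn.cursor())
    finally:
        conn.close()
    if failed:
        raise SystemExit("Data quality checks failed:\n" + "\n".join(failed))
    print("All data quality checks passed.")
```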
What we offer:
  • Medical benefits
  • Paid vacation
  • Paid holidays
Employment Type: Full-time

Senior Data Engineer

Senior Data Engineer role driving Circle K's cloud-first strategy to unlock the ...
Location:
India, Gurugram
Salary:
Not provided
Circle K
Expiration Date
Until further notice
Requirements:
  • Bachelor's Degree in Computer Engineering, Computer Science or related discipline
  • Master's Degree preferred
  • 5+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment
  • 5+ years of experience with setting up and operating data pipelines using Python or SQL
  • 5+ years of advanced SQL Programming: PL/SQL, T-SQL
  • 5+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization
  • Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads
  • 5+ years of strong and extensive hands-on experience in Azure, preferably data heavy / analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data
  • 5+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure functions
  • 5+ years of experience in defining and enabling data quality standards for auditing, and monitoring
Job Responsibility:
  • Collaborate with business stakeholders and other technical team members to acquire and migrate data sources
  • Determine solutions that are best suited to develop a pipeline for a particular data source
  • Develop data flow pipelines to extract, transform, and load data from various data sources
  • Develop efficient ETL/ELT processes using Azure cloud services and Snowflake
  • Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines
  • Provide clear documentation for delivered solutions and processes
  • Identify and implement internal process improvements for data management
  • Stay current with and adopt new tools and applications
  • Build cross-platform data strategy to aggregate multiple sources
  • Communicate proactively with stakeholders and mentor/guide junior resources
Employment Type: Full-time

Sr. Data Engineer

We are looking for a Sr. Data Engineer to join our growing Quality Engineering t...
Location:
Salary:
Not provided
Data Ideology
Expiration Date
Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Information Systems, or a related field (or equivalent experience)
  • 5+ years of experience in data engineering, data warehousing, or data architecture
  • Expert-level experience with Snowflake, including data modeling, performance tuning, security, and migration from legacy platforms
  • Hands-on experience with Azure Data Factory (ADF) for building, orchestrating, and optimizing data pipelines
  • Strong experience with Informatica (PowerCenter and/or IICS) for ETL/ELT development, workflow management, and performance optimization
  • Deep knowledge of data modeling techniques (dimensional, tabular, and modern cloud-native patterns)
  • Proven ability to translate business requirements into scalable, high-performance data solutions
  • Experience designing and supporting end-to-end data pipelines across cloud and hybrid architectures
  • Strong proficiency in SQL and experience optimizing large-scale analytic workloads
  • Experience working within SDLC frameworks, CI/CD practices, and version control
Job Responsibility:
  • Ability to collect and understand business requirements and translate those requirements into data models, integration strategies, and implementation plans
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake, ensuring functionality, performance and data integrity
  • Ability to work within the SDLC framework in multiple environments and understand the complexities and dependencies of the data warehouse
  • Optimize and troubleshoot ETL/ELT workflows, applying best practices for scheduling, orchestration, and performance tuning
  • Maintain documentation, architecture diagrams, and migration plans to support knowledge transfer and project tracking
What we offer:
  • PTO Policy
  • Eligibility for Health Benefits
  • Retirement Plan
  • Work from Home
Employment Type: Full-time