
Snowflake Data Engineer


NTT DATA

Location:
Romania, Cluj

Contract Type:
Not provided

Salary:

Not provided

Job Description:

The Snowflake Data Engineer role requires a minimum of 5 years of experience in data engineering, with a strong focus on Snowflake and cloud-native platforms. The successful candidate will collaborate with clients to design and maintain secure, high-performing data pipelines that enable data-driven decision-making and advanced analytics. A BSc/MSc in Computer Science or a related field is required, along with Snowflake certifications. Responsibilities include developing data solutions, ensuring data accuracy, and promoting engineering best practices. This position offers opportunities for remote, hybrid, or office work.

Job Responsibility:

  • Client Engagement & Delivery: Collaborate with clients to understand data requirements, contribute to solution delivery, and ensure successful project outcomes
  • Data Pipeline Development: Design, build, and optimise robust, scalable data pipelines to support analytics, reporting, and operational workloads
  • Snowflake & Cloud Data Platforms: Develop and maintain data solutions on Snowflake and modern cloud ecosystems, ensuring performance, security, and reliability
  • Data Architecture & Modelling: Implement high‑quality data models and contribute to architectural decisions that support enterprise‑grade data platforms
  • Collaboration & Best Practices: Work closely with cross‑functional teams to promote engineering best practices, code quality, and continuous improvement
  • Quality & Governance: Ensure data accuracy, consistency, and compliance through strong testing, documentation, and governance practices
  • Partner with architects to translate solution designs into high‑quality engineering deliverables
  • Collaborate with technical teams to support development, troubleshooting, and end‑to‑end data delivery
  • Client Stakeholders: Engage with client technical leaders
  • Delivery of high-performing, scalable, and secure data pipelines aligned to client requirements
  • High client satisfaction and successful adoption of Snowflake-based solutions
  • Demonstrated ability to innovate and improve data engineering practices
  • Contribution to the growth of the practice through reusable assets, accelerators, and technical leadership

Requirements:

  • BSc/MSc in Computer Science, Data Engineering, or related field
  • Snowflake certifications (SnowPro Core, Advanced) highly desirable
  • Minimum 5–8 years in data engineering, data warehousing, or data architecture roles, with at least 3 years working with Snowflake
  • Proven experience in data engineering and pipeline development on Snowflake and cloud-native platforms
  • Deep expertise with Snowflake features (warehouses, Snowpark, data sharing, performance tuning)
  • Proficiency in ETL/ELT tools such as DBT, Matillion, Talend, or equivalent
  • Strong SQL and Python (or equivalent language) skills for data manipulation and automation
  • Hands-on experience with cloud platforms (AWS, Azure, GCP)
  • Knowledge of data modelling methodologies (star schemas, Data Vault, Kimball, Inmon)
  • Familiarity with data lake architectures and distributed processing frameworks (e.g., Spark, Hadoop)
  • Experience with version control tools (GitHub, Bitbucket) and CI/CD pipelines
  • Understanding of data governance, security, and compliance frameworks
  • Strong consulting values with ability to collaborate effectively in client-facing environments
  • Expertise across the data lifecycle: ingestion, transformation, modelling, governance, and consumption
  • Strong problem-solving, analytical, and communication skills
  • Experience leading or mentoring teams of engineers in delivering high-quality data solutions
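The SQL and Python skills listed above typically show up in tasks like incremental loading. As a rough illustration (not part of the job posting), the upsert logic behind Snowflake's MERGE pattern can be sketched in plain Python; all table and field names here are hypothetical:

```python
# Sketch of an incremental upsert (the MERGE pattern a Snowflake pipeline
# commonly runs): insert new keys, update existing keys only when the
# staged row is newer, so re-runs stay idempotent. Names are illustrative.

def merge_upsert(target, staged, key="id", version="updated_at"):
    """Apply staged rows to target, keeping the newest row per key."""
    by_key = {row[key]: row for row in target}
    for row in staged:
        current = by_key.get(row[key])
        if current is None or row[version] > current[version]:
            by_key[row[key]] = row
    return sorted(by_key.values(), key=lambda r: r[key])

target = [
    {"id": 1, "name": "alice", "updated_at": "2026-01-01"},
    {"id": 2, "name": "bob",   "updated_at": "2026-01-02"},
]
staged = [
    {"id": 2, "name": "robert", "updated_at": "2026-01-05"},  # newer: update
    {"id": 3, "name": "carol",  "updated_at": "2026-01-03"},  # new key: insert
]
result = merge_upsert(target, staged)
```

On Snowflake itself this corresponds to a single `MERGE INTO target USING staged ON ...` statement with matched/not-matched clauses; the Python version only makes the update-if-newer rule explicit.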

What we offer:
  • A smooth onboarding process and a supportive mentor
  • Pick your working style: choose from remote, hybrid, or office work
  • Projects with different working hours to suit your needs
  • Sharpen your tech skills with sponsored certifications, training courses, and top e-learning platforms
  • Private Health Insurance
  • Individual coaching sessions
  • Accredited Coaching School
  • Epic parties or themed events

Additional Information:

Job Posted:
May 03, 2026

Employment Type:
Full-time
Work Type:
On-site work

Similar Jobs for Snowflake Data Engineer

Sr. Data Engineer - Snowflake

Data Ideology is seeking a Sr. Snowflake Data Engineer to join our growing team ...
Salary:
Not provided
Data Ideology
Expiration Date
Until further notice
Requirements:
  • 7+ years of experience in data engineering, data warehousing, or data architecture
  • 3+ years of hands-on Snowflake experience (performance tuning, data sharing, Snowpark, Snowpipe, etc.)
  • Strong SQL and Python skills, with production experience using dbt
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and modern data tooling (Airflow, Fivetran, Power BI, Looker, Informatica, etc.)
  • Prior experience in a consulting or client-facing delivery role
  • Excellent communication skills, with the ability to collaborate across technical and business stakeholders
  • SnowPro Core Certification required (or willingness to obtain upon hire)
  • Advanced Snowflake certifications preferred
Job Responsibility:
  • Design and build scalable, secure, and cost-effective data solutions in Snowflake
  • Develop and optimize data pipelines using tools such as dbt, Python, CloverDX, and cloud-native services
  • Participate in discovery sessions with clients to gather requirements and translate them into solution designs and project plans
  • Collaborate with engagement managers and account teams to help scope work and provide technical input for Statements of Work (SOWs)
  • Serve as a Snowflake subject matter expert, guiding best practices in performance tuning, cost optimization, access control, and workload management
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake
  • Integrate Snowflake with BI tools, governance platforms, and AI/ML frameworks
  • Contribute to internal accelerators, frameworks, and proofs of concept
  • Mentor junior engineers and support knowledge sharing across the team
What we offer:
  • Flexible Time Off Policy
  • Eligibility for Health Benefits
  • Retirement Plan with Company Match
  • Training and Certification Reimbursement
  • Utilization Based Incentive Program
  • Commission Incentive Program
  • Referral Bonuses
  • Work from Home
Employment Type: Full-time

Senior Data Engineer – Data Engineering & AI Platforms

We are looking for a highly skilled Senior Data Engineer (L2) who can design, bu...
Location:
India, Chennai, Madurai, Coimbatore
Salary:
Not provided
OptiSol Business Solutions
Expiration Date
Until further notice
Requirements:
  • Strong hands-on expertise in cloud ecosystems (Azure / AWS / GCP)
  • Excellent Python programming skills with data engineering libraries and frameworks
  • Advanced SQL capabilities including window functions, CTEs, and performance tuning
  • Solid understanding of distributed processing using Spark/PySpark
  • Experience designing and implementing scalable ETL/ELT workflows
  • Good understanding of data modeling concepts (dimensional, star, snowflake)
  • Familiarity with GenAI/LLM-based integration for data workflows
  • Experience working with Git, CI/CD, and Agile delivery frameworks
  • Strong communication skills for interacting with clients, stakeholders, and internal teams
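The advanced-SQL skills this listing names (window functions, CTEs) often boil down to patterns like a per-partition running total. As a hypothetical illustration (not from the posting), here is the logic of `SUM(amount) OVER (PARTITION BY customer ORDER BY ts)` sketched in plain Python; field names are made up:

```python
# Sketch of a SQL window function -- a running total per partition,
# i.e. SUM(amount) OVER (PARTITION BY customer ORDER BY ts).
# All field names are illustrative.
from collections import defaultdict

def running_totals(rows, part="customer", order="ts", val="amount"):
    """Annotate each row with the cumulative sum within its partition."""
    totals = defaultdict(float)
    out = []
    # Sort by (partition, ordering column), mirroring PARTITION BY / ORDER BY.
    for row in sorted(rows, key=lambda r: (r[part], r[order])):
        totals[row[part]] += row[val]
        out.append({**row, "running_total": totals[row[part]]})
    return out

rows = [
    {"customer": "a", "ts": 1, "amount": 10.0},
    {"customer": "a", "ts": 2, "amount": 5.0},
    {"customer": "b", "ts": 1, "amount": 7.0},
]
result = running_totals(rows)
```

The SQL version computes this in one pass inside the warehouse; the Python sketch just spells out the partition-then-accumulate rule.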
Job Responsibility:
  • Design, build, and maintain scalable ETL/ELT pipelines across cloud and big data platforms
  • Contribute to architectural discussions by translating business needs into data solutions spanning ingestion, transformation, and consumption layers
  • Work closely with solutioning and pre-sales teams for technical evaluations and client-facing discussions
  • Lead squads of L0/L1 engineers—ensuring delivery quality, mentoring, and guiding career growth
  • Develop cloud-native data engineering solutions using Python, SQL, PySpark, and modern data frameworks
  • Ensure data reliability, performance, and maintainability across the pipeline lifecycle—from development to deployment
  • Support long-term ODC/T&M projects by demonstrating expertise during technical discussions and interviews
  • Integrate emerging GenAI tools where applicable to enhance data enrichment, automation, and transformations
What we offer:
  • Opportunity to work at the intersection of Data Engineering, Cloud, and Generative AI
  • Hands-on exposure to modern data stacks and emerging AI technologies
  • Collaboration with experts across Data, AI/ML, and cloud practices
  • Access to structured learning, certifications, and leadership mentoring
  • Competitive compensation with fast-track career growth and visibility
Employment Type: Full-time

Data Engineer – Snowflake & ETL

We are seeking a Data Engineer in Hyderabad (WFO) with expertise in data enginee...
Location:
India, Hyderabad
Salary:
Not provided
Right Angle Solutions
Expiration Date
Until further notice
Requirements:
  • Minimum 5+ years of experience in data engineering, ETL, and Snowflake development
  • Proven expertise in Snowflake including SQL scripting, performance tuning, and data warehousing concepts
  • Hands-on experience with Matillion ETL for building and maintaining ETL jobs
  • Strong understanding of cloud platforms (AWS, Azure, or GCP) and cloud-based data architectures
  • Proficiency in SQL, Python, or other scripting languages for automation and data transformation
  • Experience with API integrations and data ingestion frameworks
  • Knowledge of data governance, security policies, and access control within Snowflake environments
  • Excellent communication skills with the ability to engage both business and technical stakeholders
  • Self-motivated professional capable of working independently and delivering projects on time
  • Qualification: BE/BS/MTech/MS or equivalent work experience
Employment Type: Full-time

Senior Data Engineer

Intratek Computer, Inc. is seeking a highly skilled and experienced Sr Data Engi...
Location:
United States, Los Angeles
Salary:
Not provided
Intratek Computer, Inc.
Expiration Date
Until further notice
Requirements:
  • Proficiency in WhereScape RED for data warehouse automation, including designing, building, and managing data warehouses
  • Expertise in Snowflake’s cloud data platform, including data loading, transformation, and querying using Snowflake SQL
  • Experience with SQL-based development, optimization, and tuning for large-scale data processing
  • Strong understanding of dimensional modeling concepts and experience in designing and implementing data models for analytics and reporting purposes
  • Ability to optimize data pipelines and queries for performance and scalability
  • Familiarity with Snowflake’s features such as virtual warehouses, data sharing, and data governance capabilities
  • Knowledge of WhereScape scripting language (WSL) for customizing and extending automation processes
  • Experience with data integration tools and techniques to ingest data from various sources into Snowflake
  • Understanding of data governance principles and experience implementing data governance frameworks within Snowflake
  • Ability to implement data quality checks and ensure data integrity within the data warehouse environment
What we offer:
  • Medical benefits
  • Paid vacation
  • Paid holidays
Employment Type: Full-time

Senior Data Engineer

Senior Data Engineer role driving Circle K's cloud-first strategy to unlock the ...
Location:
India, Gurugram
Salary:
Not provided
Circle K
Expiration Date
Until further notice
Requirements:
  • Bachelor's Degree in Computer Engineering, Computer Science or related discipline
  • Master's Degree preferred
  • 5+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment
  • 5+ years of experience with setting up and operating data pipelines using Python or SQL
  • 5+ years of advanced SQL Programming: PL/SQL, T-SQL
  • 5+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization
  • Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads
  • 5+ years of strong and extensive hands-on experience in Azure, preferably data heavy / analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data
  • 5+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure functions
  • 5+ years of experience in defining and enabling data quality standards for auditing, and monitoring
Job Responsibility:
  • Collaborate with business stakeholders and other technical team members to acquire and migrate data sources
  • Determine solutions that are best suited to develop a pipeline for a particular data source
  • Develop data flow pipelines to extract, transform, and load data from various data sources
  • Efficient in ETL/ELT development using Azure cloud services and Snowflake
  • Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines
  • Provide clear documentation for delivered solutions and processes
  • Identify and implement internal process improvements for data management
  • Stay current with and adopt new tools and applications
  • Build cross-platform data strategy to aggregate multiple sources
  • Proactive in stakeholder communication, mentor/guide junior resources
Employment Type: Full-time

Sr. Data Engineer

We are looking for a Sr. Data Engineer to join our growing Quality Engineering t...
Salary:
Not provided
Data Ideology
Expiration Date
Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Information Systems, or a related field (or equivalent experience)
  • 5+ years of experience in data engineering, data warehousing, or data architecture
  • Expert-level experience with Snowflake, including data modeling, performance tuning, security, and migration from legacy platforms
  • Hands-on experience with Azure Data Factory (ADF) for building, orchestrating, and optimizing data pipelines
  • Strong experience with Informatica (PowerCenter and/or IICS) for ETL/ELT development, workflow management, and performance optimization
  • Deep knowledge of data modeling techniques (dimensional, tabular, and modern cloud-native patterns)
  • Proven ability to translate business requirements into scalable, high-performance data solutions
  • Experience designing and supporting end-to-end data pipelines across cloud and hybrid architectures
  • Strong proficiency in SQL and experience optimizing large-scale analytic workloads
  • Experience working within SDLC frameworks, CI/CD practices, and version control
Job Responsibility:
  • Ability to collect and understand business requirements and translate those requirements into data models, integration strategies, and implementation plans
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake, ensuring functionality, performance and data integrity
  • Ability to work within the SDLC framework in multiple environments and understand the complexities and dependencies of the data warehouse
  • Optimize and troubleshoot ETL/ELT workflows, applying best practices for scheduling, orchestration, and performance tuning
  • Maintain documentation, architecture diagrams, and migration plans to support knowledge transfer and project tracking
What we offer:
  • PTO Policy
  • Eligibility for Health Benefits
  • Retirement Plan
  • Work from Home
Employment Type: Full-time

Senior Data Engineer

Location:
United States, Flowood
Salary:
Not provided
PhasorSoft Group
Expiration Date
Until further notice
Requirements:
  • Experience with Snowflake or Azure Cloud Data Engineering, including setting up and managing data pipelines
  • Proficiency in designing and implementing ETL processes for data integration
  • Knowledge of data warehousing concepts and best practices
  • Strong SQL skills for querying and manipulating data in Snowflake or Azure databases
  • Experience with data modeling techniques and tools to design efficient data structures
  • Understanding of data governance principles and experience implementing them in cloud environments
  • Proficiency in Tableau or Power BI for creating visualizations and interactive dashboards
  • Ability to write scripts (e.g., Python, PowerShell) for automation and orchestration of data pipelines
  • Skills to monitor and optimize data pipelines for performance and cost efficiency
  • Knowledge of cloud data security practices and tools to ensure data protection
Job Responsibility:
  • Design, implement, and maintain data pipelines and architectures on Snowflake or Azure Cloud platforms
  • Develop ETL processes to extract, transform, and load data from various sources into data warehouses
  • Optimize data storage, retrieval, and processing for performance and cost-efficiency in cloud environments
  • Collaborate with stakeholders to understand data requirements and translate them into technical solutions
  • Implement data security and governance best practices to ensure data integrity and compliance
  • Work with reporting tools such as Tableau or Power BI to create interactive dashboards and visualizations
  • Monitor and troubleshoot data pipelines, ensuring reliability and scalability
  • Automate data workflows and processes using cloud-native services and scripting languages
  • Provide technical expertise and support to data analysts, scientists, and business users
Employment Type: Full-time

Data Engineer

Introlligent is a global technology solutions provider known for delivering cutt...
Location:
Singapore, Singapore
Salary:
Not provided
Introlligent
Expiration Date
Until further notice
Requirements:
  • Extensive experience in reporting, data visualisation, data mining, data integration and ad hoc analysis
  • Strong Snowflake & Tableau expertise is a must
  • Familiarity with a Supply Chain/Operations environment preferred
  • Excellent data analysis and presentation skills
  • Excellent attention-to-detail, ability to compile and validate large amounts of data while maintaining a very high degree of accuracy
  • Excellent communication and comprehension skills
  • Ability to operate in a fast-paced, rapidly changing environment
  • Business Acumen and ability to rapidly understand complex business process
  • Excellent problem solving skills: ability to analyze and resolve complex problems in a structured and logical manner
  • Excellent/Advanced Excel skills