Senior Data Engineer (Snowflake)

LeverX

Location:
Poland

Contract Type:
Not provided

Salary:
Not provided

Job Description:

At LeverX, we have had the privilege of delivering over 1,500 projects for various clients. With 20+ years in the market, our team of 2,200+ is strong, reliable, and always evolving: learning, growing, and striving for excellence. We are looking for a Senior Data Engineer (Snowflake) to join us.

Job Responsibilities:

  • Design and build scalable data pipelines in Snowflake using SQL, Snowpark (Python), and Stored Procedures
  • Own Snowflake security and governance, including RBAC, dynamic data masking, row access policies, and object tagging
  • Develop and optimize data ingestion using Snowpipe, COPY INTO, and external stages
  • Optimize performance through virtual warehouse sizing, query profiling, clustering, and search optimization
  • Implement data transformations and semantic layers using dbt, applying medallion or Data Vault modeling patterns
  • Build reliable orchestration and CI/CD for data pipelines using Snowflake Tasks & Streams, external orchestrators, and infrastructure as code
  • Collaborate with and mentor engineers, partner with analytics and ML teams, and drive continuous improvements across the data platform
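As an illustration of the security and governance work above, here is a minimal Python sketch that renders the Snowflake DDL for a dynamic data-masking policy. All object names (`pii_mask`, `ANALYTICS_ADMIN`, `customers.email`) are hypothetical examples, not anything from the posting.

```python
# Minimal sketch: render Snowflake dynamic data-masking DDL as plain strings.
# Policy, role, table, and column names are hypothetical examples.

def masking_policy_ddl(policy, unmasked_role):
    """CREATE MASKING POLICY that reveals values only to one role."""
    return (
        f"CREATE MASKING POLICY {policy} AS (val STRING) RETURNS STRING ->\n"
        f"  CASE WHEN CURRENT_ROLE() = '{unmasked_role}' THEN val\n"
        f"       ELSE '***MASKED***' END;"
    )

def apply_policy_ddl(table, column, policy):
    """ALTER TABLE statement that attaches the policy to a column."""
    return (
        f"ALTER TABLE {table} MODIFY COLUMN {column} "
        f"SET MASKING POLICY {policy};"
    )

print(masking_policy_ddl("pii_mask", "ANALYTICS_ADMIN"))
print(apply_policy_ddl("customers", "email", "pii_mask"))
```

In practice the policy would be created once by a governance role and attached to every column tagged as PII, which is where object tagging comes in.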

Requirements:

  • 5+ years of hands-on experience in Data Engineering
  • Strong expertise in Snowflake, including architecture, advanced SQL, Snowpark (Python), and data governance (RBAC, masking policies)
  • Solid experience with dbt (models, tests, snapshots, incremental strategies, CI/CD)
  • Experience with data orchestration tools (Airflow, Dagster, Prefect) or Snowflake Tasks and Streams
  • Proven ability to design and maintain scalable Data Warehouse or Data Lakehouse architectures
  • Experience optimizing costs by managing Snowflake credits, warehouse configurations, and storage usage
  • Strong Python skills for data transformation, automation, and scripting
  • English B2+
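The dbt incremental strategies mentioned above boil down to merge-by-key semantics. A pure-Python sketch of that idea (illustrative only — dbt actually generates warehouse MERGE SQL; the column names here are made up):

```python
# Pure-Python sketch of dbt's incremental "merge" strategy with a
# unique_key: incoming rows insert or overwrite existing rows by key.

def incremental_merge(existing, new_rows, key="id"):
    """Merge new_rows into existing by `key`; the latest row wins."""
    merged = {row[key]: row for row in existing}
    for row in new_rows:
        merged[row[key]] = row  # insert, or overwrite on key match
    return sorted(merged.values(), key=lambda r: r[key])

existing = [{"id": 1, "amount": 10}, {"id": 2, "amount": 20}]
new_rows = [{"id": 2, "amount": 25}, {"id": 3, "amount": 30}]
merged = incremental_merge(existing, new_rows)
```

The payoff of the incremental approach is that only `new_rows` needs to be scanned on each run, not the full history.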

Nice to have:

  • SnowPro Core or SnowPro Advanced Architect certifications
  • Experience building production-grade dashboards in BI tools (Tableau, Looker, Superset, Metabase)
  • Experience with cloud platforms (AWS/Azure/GCP), particularly IAM, S3/Blob storage, and networking

What we offer:
  • Projects in different domains: healthcare, manufacturing, e-commerce, fintech, etc.
  • Projects for every taste: Startup products, enterprise solutions, research & development initiatives, and projects at the crossroads of SAP and the latest web technologies
  • Global clients based in Europe and the US, including Fortune 500 companies
  • Employment security: We hire for our team, not just a specific project. If your project ends, we will find you a new one
  • Healthy work atmosphere: On average, our employees stay with the company for 4+ years
  • Market-based compensation and regular performance reviews
  • Internal expert communities and courses
  • Perks to support your growth and well-being

Additional Information:

Job Posted:
January 07, 2026


Similar Jobs for Senior Data Engineer (Snowflake)

Senior Data & AI Innovation Engineer

We are seeking a highly proactive, self-driven Senior Data & AI Engineer to serv...
Location: Singapore
Salary: 7000.00 - 8000.00 SGD / Month
Company: Randstad
Expiration Date: January 08, 2026

Requirements:
  • Proven, hands-on experience in implementing and supporting practical AI use cases (beyond academic study), understanding how to embed AI components into existing services
  • 4+ years of hands-on experience in implementing and operating Snowflake Data Cloud in a production environment
  • Certification (e.g., SnowPro Data Engineer) is highly desirable
  • Familiarity with MLOps concepts and tools (e.g., Docker, MLflow, LangChain) and an understanding of LLMs, RAG pipelines, and generative AI deployment
  • Strong programming skills in Python for data manipulation, scripting, and AI model support
Job Responsibilities:
  • Proactively identify, design, and implement initial AI Proof-of-Concepts (POCs) across the APAC region, focusing on quick-win solutions like AI-powered chatbots and intelligent inventory monitoring systems
  • Analyze business processes to identify areas where AI components can be effectively embedded to solve immediate business challenges
  • Partner with business stakeholders to understand AI data needs, perform data engineering/prep, and ensure data readiness to support and sustain deployed AI models
  • Stay ahead of technology trends, perform proactive research on Data and AI solutions, and evangelize new capabilities to regional teams
  • Act as the APAC SME, collaborating closely with cross-regional peers and global teams to contribute to and align with the company Global Data Platform roadmap (Snowflake)
  • Define and execute the complete migration strategy from legacy data warehouses/databases (e.g., PostgreSQL, MS SQL) to the Snowflake Data Cloud platform
  • Design, build, and optimize scalable, robust ETL/ELT data pipelines to curate raw data into BI and Advanced Analytics datasets
  • Implement and manage Snowflake governance, including access control, data security, usage monitoring, and performance optimization aligned with global best practices

Senior Data Platform Engineer

We are looking for an experienced data engineer to join our platform engineering...
Location: United States
Salary: 141000.00 - 225600.00 USD / Year
Company: Axon
Expiration Date: Until further notice

Requirements:
  • 5+ years of experience in data engineering, software engineering with a data focus, data science, or a related role
  • Knowledge of designing data pipelines from a variety of sources (e.g., streaming, flat files, APIs)
  • Proficiency in SQL and experience with relational databases (e.g., PostgreSQL)
  • Experience with real-time data processing frameworks (e.g., Apache Kafka, Spark Streaming, Flink, Pulsar, Redpanda)
  • Strong programming skills in common data-focused languages (e.g., Python, Scala)
  • Experience with data pipeline and workflow management tools (e.g., Apache Airflow, Prefect, Temporal)
  • Familiarity with AWS-based data solutions
  • Strong understanding of data warehousing concepts and technologies (Snowflake)
  • Experience documenting data dependency maps and data lineage
  • Strong communication and collaboration skills
Job Responsibilities:
  • Design, implement, and maintain scalable data pipelines and infrastructure
  • Collaborate with software engineers, product managers, customer success managers, and others across the business to understand data requirements
  • Optimize and manage our data storage solutions
  • Ensure data quality, reliability, and security across the data lifecycle
  • Develop and maintain ETL processes and frameworks
  • Work with stakeholders to define data availability SLAs
  • Create and manage data models to support business intelligence and analytics
What we offer:
  • Competitive salary and 401k with employer match
  • Discretionary time off
  • Paid parental leave for all
  • Medical, Dental, Vision plans
  • Fitness Programs
  • Emotional & Development Programs
  • Snacks in our offices

Senior Data Engineer II

We are looking for a skilled Data Engineer to join our growing team. You will pl...
Location: Hyderabad, India
Salary: Not provided
Company: Alter Domus
Expiration Date: Until further notice

Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
  • 3+ years of experience as a Data Engineer or in a similar role working with cloud-based data platforms
  • Cloud & orchestration: Airflow (self-managed or managed services like Amazon MWAA) for workflow orchestration, DAG development, and scheduling
  • Familiarity with best practices for Airflow DAG structure, dependency management, and error handling
  • AWS Expertise: Hands-on experience with AWS Lake Formation, S3, Athena, and related services (e.g., Lambda, Glue, IAM)
  • Snowflake: Proficient in setting up data warehouses, configuring security, and optimizing queries on Snowflake
  • Data Ingestion & Transformation: Experience with Airbyte or similar tools for data ingestion
  • dbt or other SQL-based transformation frameworks for modular data processing
  • Programming: Proficiency in Python and/or Java/Scala for building data pipelines and custom integrations
  • Query Languages: Advanced knowledge of SQL for data manipulation and analysis
Job Responsibilities:
  • Data Pipeline Orchestration: Design, build, and maintain end-to-end data pipelines using Airflow (including managed services like Amazon MWAA) to orchestrate, schedule, and monitor batch/streaming workflows
  • Implement DAGs (Directed Acyclic Graphs) with retry logic, error handling, and alerting to ensure data quality and pipeline reliability
  • Data Ingestion & Transformation: Integrate data from various sources using Airbyte for ingestion and dbt for transformations in a scalable and modular fashion
  • Collaborate with Data Analysts and Data Scientists to implement transformations and business logic, ensuring data is analytics-ready
  • Data Modeling & Warehousing: Design and implement efficient data models for both structured and semi-structured data in AWS S3 (data lake) and Snowflake (data warehouse)
  • Ensure data schemas and transformations support advanced analytics, BI reporting, and machine learning use cases
  • Data Governance & Security: Utilize AWS Lake Formation APIs and best practices to maintain data security, access controls, and compliance
  • Work closely with IT security to establish robust encryption standards, audit trails, and identity/role-based access
  • Performance Optimization: Optimize AWS Athena queries and configurations (e.g., data partitioning) for performance and cost efficiency
  • Monitor and tune Airflow DAGs, Snowflake queries, and data transformations to improve throughput and reliability
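The retry and error-handling behavior described for Airflow DAGs can be sketched in plain Python. `run_with_retries` below is a hypothetical helper mirroring what Airflow's per-task `retries`/`retry_delay` settings do, not part of any library:

```python
import time

# Plain-Python sketch of the retry behavior an Airflow task gets from its
# `retries` / `retry_delay` settings; `run_with_retries` is a hypothetical
# helper, not a library function.

def run_with_retries(task, retries=3, retry_delay=0.0):
    """Call task(); on failure, retry up to `retries` more times."""
    attempt = 0
    while True:
        try:
            return task()
        except Exception:
            attempt += 1
            if attempt > retries:
                raise  # retries exhausted: surface the error for alerting
            time.sleep(retry_delay)
```

In a real DAG the final re-raised exception is what triggers the alerting path (e.g., an on-failure callback), rather than being handled inline.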
What we offer:
  • Support for professional accreditations
  • Flexible arrangements, generous holidays, plus an additional day off for your birthday
  • Continuous mentoring along your career progression
  • Active sports, events and social committees across our offices
  • 24/7 support available from our Employee Assistance Program
  • The opportunity to invest in our growth and success through our Employee Share Plan
  • Plus additional local benefits depending on your location

Senior Data Engineer

As a Senior Data Engineer, you will be pivotal in designing, building, and optim...
Location: United States
Salary: 102000.00 - 125000.00 USD / Year
Company: Wpromote
Expiration Date: Until further notice

Requirements:
  • Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent practical experience
  • 4+ years of experience in data engineering or a related field
  • Intermediate to advanced programming skills in Python
  • Proficiency in SQL and experience with relational databases
  • Strong knowledge of database and data warehousing design and management
  • Strong experience with DBT (data build tool) and test-driven development practices
  • Proficiency with at least 1 cloud database (e.g. BigQuery, Snowflake, Redshift, etc.)
  • Excellent problem-solving skills, project management habits, and attention to detail
  • Advanced level Excel and Google Sheets experience
  • Familiarity with data orchestration tools (e.g., Airflow, Dagster, AWS Glue, Azure Data Factory)
Job Responsibilities:
  • Developing data pipelines leveraging a variety of technologies including dbt and BigQuery
  • Gathering requirements from non-technical stakeholders and building effective solutions
  • Identifying areas of innovation that align with existing company and team objectives
  • Managing multiple pipelines across Wpromote’s client portfolio
What we offer:
  • Half-day Fridays year round
  • Unlimited PTO
  • Extended Holiday break (Winter)
  • Flexible schedules
  • Work from anywhere options*
  • 100% paid parental leave
  • 401(k) matching
  • Medical, Dental, Vision, Life, Pet Insurance
  • Sponsored life insurance
  • Short Term Disability insurance and additional voluntary insurance

Contract Type: Fulltime

Senior Crypto Data Engineer

Token Metrics is seeking a multi-talented Senior Big Data Engineer to facilitate...
Location: Hanoi, Vietnam
Salary: Not provided
Company: Token Metrics
Expiration Date: Until further notice

Requirements:
  • Bachelor's degree in Data Engineering, Big Data Analytics, Computer Engineering, or related field
  • A Master's degree in a relevant field is an added advantage
  • 3+ years of Python, Java or any programming language development experience
  • 3+ years of SQL & NoSQL experience (Snowflake Cloud DW & MongoDB experience is a plus)
  • 3+ years of experience with schema design and dimensional data modeling
  • Expert proficiency in SQL, NoSQL, Python, C++, Java, R
  • Expert with building Data Lake, Data Warehouse or suitable equivalent
  • Expert in AWS Cloud
  • Excellent analytical and problem-solving skills
  • A knack for independence and group work
Job Responsibilities:
  • Liaising with coworkers and clients to elucidate the requirements for each task
  • Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed
  • Reformulating existing frameworks to optimize their functioning
  • Testing such structures to ensure that they are fit for use
  • Building a data pipeline from different data sources using different data types like API, CSV, JSON, etc
  • Preparing raw data for manipulation by Data Scientists
  • Implementing proper data validation and data reconciliation methodologies
  • Ensuring that your work remains backed up and readily accessible to relevant coworkers
  • Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs

Contract Type: Fulltime
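Building a pipeline over mixed source types (API, CSV, JSON) usually starts by normalizing everything into one record shape. A minimal sketch with a hypothetical `load_records` helper (not a library function):

```python
import csv
import io
import json

# Toy multi-format ingestion: normalize rows arriving as CSV or JSON text
# into one list-of-dicts shape so downstream transforms see a single format.
# `load_records` and the sample fields are illustrative assumptions.

def load_records(source, fmt):
    """Parse `source` text in the given format into a list of dicts."""
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(source)))
    if fmt == "json":
        data = json.loads(source)
        return data if isinstance(data, list) else [data]
    raise ValueError(f"unsupported format: {fmt}")
```

A real ingestion layer would add schema/type coercion on top, since CSV values arrive as strings while JSON preserves numbers.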

Senior Data Engineer

Senior Data Engineer – Dublin (Hybrid) Contract Role | 3 Days Onsite. We are see...
Location: Dublin, Ireland
Salary: Not provided
Company: Solas IT Recruitment
Expiration Date: Until further notice

Requirements:
  • 7+ years of experience as a Data Engineer working with distributed data systems
  • 4+ years of deep Snowflake experience, including performance tuning, SQL optimization, and data modelling
  • Strong hands-on experience with the Hadoop ecosystem: HDFS, Hive, Impala, Spark (PySpark preferred)
  • Oozie, Airflow, or similar orchestration tools
  • Proven expertise with PySpark, Spark SQL, and large-scale data processing patterns
  • Experience with Databricks and Delta Lake (or equivalent big-data platforms)
  • Strong programming background in Python, Scala, or Java
  • Experience with cloud services (AWS preferred): S3, Glue, EMR, Redshift, Lambda, Athena, etc.
Job Responsibilities:
  • Build, enhance, and maintain large-scale ETL/ELT pipelines using Hadoop ecosystem tools including HDFS, Hive, Impala, and Oozie/Airflow
  • Develop distributed data processing solutions with PySpark, Spark SQL, Scala, or Python to support complex data transformations
  • Implement scalable and secure data ingestion frameworks to support both batch and streaming workloads
  • Work hands-on with Snowflake to design performant data models, optimize queries, and establish solid data governance practices
  • Collaborate on the migration and modernization of current big-data workloads to cloud-native platforms and Databricks
  • Tune Hadoop, Spark, and Snowflake systems for performance, storage efficiency, and reliability
  • Apply best practices in data modelling, partitioning strategies, and job orchestration for large datasets
  • Integrate metadata management, lineage tracking, and governance standards across the platform
  • Build automated validation frameworks to ensure accuracy, completeness, and reliability of data pipelines
  • Develop unit, integration, and end-to-end testing for ETL workflows using Python, Spark, and dbt testing where applicable
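At their core, the automated validation frameworks mentioned above reduce to completeness and null checks over each batch of rows. A toy sketch — `validate_rows` is a hypothetical helper; production stacks express the same checks declaratively (e.g., dbt tests):

```python
# Toy automated validation pass: completeness and null checks over ingested
# rows, returning human-readable error strings. `validate_rows` is a
# hypothetical helper, not part of any framework.

def validate_rows(rows, required_cols, non_null_cols=()):
    """Collect errors for missing columns and null values in `rows`."""
    errors = []
    for i, row in enumerate(rows):
        for col in required_cols:
            if col not in row:
                errors.append(f"row {i}: missing column '{col}'")
        for col in non_null_cols:
            if row.get(col) is None:
                errors.append(f"row {i}: null value in '{col}'")
    return errors
```

Returning a list of errors rather than raising on the first one lets a pipeline report every data-quality problem in a batch at once.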

Senior Data Engineer

We are looking for a Senior Data Engineer to join one of the best teams at Sigma ...
Location: Not provided
Salary: Not provided
Company: Sigma Software Group
Expiration Date: Until further notice

Requirements:
  • Python: strong
  • SQL: strong
  • Snowflake: good
  • English: strong
What we offer:
  • Diversity of Domains & Businesses
  • Variety of technology
  • Health & Legal support
  • Active professional community
  • Continuous education and growth
  • Flexible schedule
  • Remote work
  • Outstanding offices (if you choose it)
  • Sports and community activities

Senior Data Engineer

At Relatient, we’re on a mission to simplify access to care – intelligently. As ...
Location: Pune, India
Salary: Not provided
Company: Relatient
Expiration Date: Until further notice

Requirements:
  • Bachelor's degree, B.E./ B. Tech, computer engineering, or equivalent work experience in lieu of a degree is required, Master’s degree preferred
  • 7+ years of experience in database engineering, data warehousing, or data architecture
  • Proven expertise with at least one major data warehouse platform (e.g. Postgres, Snowflake, Redshift, BigQuery)
  • Strong SQL and ETL/ELT development skills
  • Deep understanding of data modeling
  • Experience with cloud data ecosystems (AWS)
  • Hands-on experience with orchestration tools and version control (Git)
  • Experience in data governance, security, and compliance best practices
  • Experience building/generating analytical reports using Power BI
Job Responsibilities:
  • Architect, design, and implement robust end-to-end data warehouse (DW) solutions using modern technologies (e.g. Postgres or on-prem solutions)
  • Define data modeling standards (dimensional and normalized) and build ETL/ELT pipelines for efficient data flow and transformation
  • Integrate data from multiple sources (ERP, CRM, APIs, flat files, real-time streams)
  • Develop and maintain scalable and reliable data ingestion, transformation, and storage pipelines
  • Ensure data quality, consistency, and lineage across all data systems
  • Analyze and tune SQL queries, schemas, indexes, and ETL processes to maximize database and warehouse performance
  • Monitor data systems and optimize storage costs and query response times
  • Implement high availability, backup, disaster recovery, and data security strategies
  • Collaborate with DevOps and Infrastructure teams to ensure optimal deployment, scaling, and performance of DW environments
  • Work closely with Data Scientists, Analysts, and Business Teams to translate business needs into technical data solutions
What we offer:
  • INR 5,00,000/- of life insurance coverage for all full-time employees and their immediate family
  • INR 15,00,000/- of group accident insurance
  • Education reimbursement
  • 10 national and state holidays, plus 1 floating holiday
  • Flexible working hours and a hybrid policy

Contract Type: Fulltime