Lead Snowflake Engineer

Intellibus

Location:
United States, Reston

Contract Type:
Not provided

Salary:
Not provided

Job Description:

Imagine working at Intellibus to engineer platforms that impact billions of lives around the world. With your passion and focus, we will accomplish great things together! Our Platform Engineering Team is working to solve the Multiplicity Problem. We are trusted by some of the most reputable and established FinTech Firms. Recently, our team has spearheaded the Conversion & Go Live of apps that support the backbone of the Financial Trading Industry. Are you a data enthusiast with a natural ability for analytics? We’re looking for skilled Data/Analytics Engineers to fill multiple roles for our exciting new client. This is your chance to shine, demonstrating your dedication and commitment in a role that promises both challenge and reward.

Requirements:

  • Design, develop, and maintain data pipelines to ingest, transform, and load data from various sources into Snowflake
  • Implement ETL (Extract, Transform, Load) processes using Snowflake's features such as Snowpipe, Streams, and Tasks (a brief sketch follows this list)
  • Design and implement efficient data models and schemas within Snowflake to support reporting, analytics, and business intelligence needs
  • Optimize data warehouse performance and scalability using Snowflake features like clustering, partitioning, and materialized views
  • Integrate Snowflake with external systems and data sources, including on-premises databases, cloud storage, and third-party APIs
  • Implement data synchronization processes to ensure consistency and accuracy of data across different systems
  • Monitor and optimize query performance and resource utilization within Snowflake using query profiling, query optimization techniques, and workload management features
  • Identify and resolve performance bottlenecks and optimize data warehouse configurations for maximum efficiency
  • Work on Snowflake modeling – roles, databases, schemas – and ETL tools, with cloud-driven skills
  • Work on SQL performance measuring, query tuning, and database tuning
  • Work confidently with SQL and cloud-based technologies
  • Set up the RBAC model at the infrastructure and data levels
  • Work on Data Masking / Encryption / Tokenization, Data Wrangling / Data Pipeline orchestration (tasks)
  • Set up AWS S3/EC2, configure external stages, and work with SQS/SNS
  • Perform data integration, e.g. MSK Kafka Connect and other partners such as Delta Lake (Databricks)
  • ETL – Experience with ETL processes for data integration
  • SQL – Strong SQL skills for querying and data manipulation
  • Python – Strong command of Python, especially in AWS Boto3, JSON handling, and dictionary operations
  • Unix – Competent in Unix for file operations, searches, and regular expressions
  • AWS – Proficient with AWS services including EC2, Glue, S3, Step Functions, and Lambda for scalable cloud solutions
  • Database Modeling – Solid grasp of database design principles, including logical and physical data models, and change data capture (CDC) mechanisms
  • Snowflake – Experienced in Snowflake for efficient data integration, utilizing features like Snowpipe, Streams, Tasks, and Stored Procedures
  • Airflow – Fundamental knowledge of Airflow for orchestrating complex data workflows and setting up automated pipelines
  • Bachelor's degree in Computer Science or a related field is preferred. Relevant work experience may be considered in lieu of a degree
  • Excellent communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams and stakeholders
  • Proven leadership abilities, with experience mentoring junior developers and driving technical excellence within the team
  • At least 7 years of Data Wrangling Experience
  • At least 7 years of Snowflake Experience
  • At least 7 years of ETL Experience
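
As context for the Snowpipe / Streams / Tasks and external-stage items above, here is a minimal illustrative sketch (not part of the posting) of that ingestion pattern in Python, using boto3 and the snowflake-connector-python package. Every name in it – bucket, storage integration, stage, pipe, tables, warehouse, credentials – is an invented placeholder.

```python
import boto3
import snowflake.connector

# Illustrative only: land a raw file in S3, then wire up Snowpipe -> Stream -> Task.
# All object names below are invented placeholders, not details from this posting.

s3 = boto3.client("s3")
s3.upload_file("events.json", "example-raw-bucket", "events/events.json")

conn = snowflake.connector.connect(
    account="<account>",
    user="<user>",
    password="<password>",
    role="SYSADMIN",
    warehouse="LOAD_WH",
    database="RAW",
    schema="PUBLIC",
)

ddl_statements = [
    # External stage over the S3 prefix (assumes a storage integration already exists).
    """CREATE STAGE IF NOT EXISTS raw_events_stage
         URL = 's3://example-raw-bucket/events/'
         STORAGE_INTEGRATION = example_s3_int
         FILE_FORMAT = (TYPE = JSON)""",
    # Snowpipe: auto-ingest new files from the stage into a landing table.
    """CREATE PIPE IF NOT EXISTS raw_events_pipe AUTO_INGEST = TRUE AS
         COPY INTO raw_events FROM @raw_events_stage""",
    # Stream: track new rows landed by the pipe.
    "CREATE STREAM IF NOT EXISTS raw_events_stream ON TABLE raw_events",
    # Task: periodically move new rows into a curated table when the stream has data.
    """CREATE TASK IF NOT EXISTS curate_events_task
         WAREHOUSE = LOAD_WH
         SCHEDULE = '5 MINUTE'
         WHEN SYSTEM$STREAM_HAS_DATA('raw_events_stream')
         AS INSERT INTO curated_events SELECT * FROM raw_events_stream""",
    "ALTER TASK curate_events_task RESUME",
]

cur = conn.cursor()
try:
    for ddl in ddl_statements:
        cur.execute(ddl)
finally:
    cur.close()
    conn.close()
```

In a real deployment, the RBAC grants, masking policies, and the SQS notification channel for auto-ingest would be configured alongside this.
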
What we offer:
  • A dynamic environment where your skills will make a direct impact
  • The opportunity to work with cutting-edge technologies and innovative projects
  • A collaborative team that values your passion and focus

Additional Information:

Job Posted:
January 02, 2026


Similar Jobs for Lead Snowflake Engineer

Lead Data Engineer

Our client, a leading energy company based in Edinburgh, Scotland, is seeking a ...
Location:
United Kingdom, Edinburgh
Salary:
Not provided
Net Talent
Expiration Date
Until further notice
Requirements
  • Proven experience as a Data Engineer, with a focus on Python development and data pipeline architecture
  • Hands-on experience with Snowflake data warehousing platform
  • Experience working in a data management environment, ideally within a major client or enterprise setting
  • Strong understanding of data modelling, ETL/ELT processes, and data security standards
  • Demonstrated leadership capabilities, with experience mentoring or managing junior team members
  • Excellent communication skills and the ability to collaborate across various departments
  • Problem-solving mindset with a passion for innovative data solutions and continuous learning
Job Responsibility
  • Design, develop, and maintain scalable data pipelines and architectures to support business analytics and reporting needs
  • Lead and mentor a team of data engineers, ensuring best practices in data engineering methodologies and tools
  • Collaborate with cross-functional teams including data analysts, data scientists, and business stakeholders to understand data requirements
  • Implement data security, governance, and compliance standards across all data solutions
  • Utilise Snowflake for data warehousing solutions, ensuring optimal performance and security
  • Develop automation scripts and optimise data workflows for efficiency and reliability
  • Monitor and troubleshoot data pipelines to resolve issues promptly, ensuring data integrity and availability
What we offer
  • Excellent package on offer
  • Supportive work environment
  • Competitive salary
  • Opportunities for professional development
  • Collaborative culture that fosters growth and innovation
  • Full-time

Lead Data Engineer

Alimentation Couche-Tard Inc., (ACT) is a global Fortune 200 company. A leader i...
Location:
India, Gurugram
Salary:
Not provided
Circle K
Expiration Date
Until further notice
Requirements
  • Bachelor’s or master’s degree in Computer Science, Engineering, or a related field
  • 7-9 years of data engineering experience with strong hands-on delivery using ADF, SQL, Python, Databricks, and Spark
  • Experience designing data pipelines, warehouse models, and processing frameworks using Snowflake or Azure Synapse
  • Proficient with CI/CD tools (Azure DevOps, GitHub) and observability practices
  • Solid grasp of data governance, metadata tagging, and role-based access control
  • Proven ability to mentor and grow engineers in a matrixed or global environment
  • Strong verbal and written communication skills, with the ability to operate cross-functionally
  • Certifications in Azure, Databricks, or Snowflake are a plus
  • Strong Knowledge of Data Engineering concepts (Data pipelines creation, Data Warehousing, Data Marts/Cubes, Data Reconciliation and Audit, Data Management)
  • Working Knowledge of Dev-Ops processes (CI/CD), Git/Jenkins version control tool, Master Data Management (MDM) and Data Quality tools
Job Responsibility
  • Design, develop, and maintain scalable pipelines across ADF, Databricks, Snowflake, and related platforms
  • Lead the technical execution of non-domain specific initiatives (e.g. reusable dimensions, TLOG standardization, enablement pipelines)
  • Architect data models and re-usable layers consumed by multiple downstream pods
  • Guide platform-wide patterns like parameterization, CI/CD pipelines, pipeline recovery, and auditability frameworks
  • Mentor and coach the team
  • Partner with product and platform leaders to ensure engineering consistency and delivery excellence
  • Act as an L3 escalation point for operational data issues impacting foundational pipelines
  • Own engineering best practices, sprint planning, and quality across the Enablement pod
  • Contribute to platform discussions and architectural decisions across regions
  • Full-time

Sr. Data Engineer - Snowflake

Data Ideology is seeking a Sr. Snowflake Data Engineer to join our growing team ...
Location:
Not provided
Salary:
Not provided
Data Ideology
Expiration Date
Until further notice
Requirements
  • 7+ years of experience in data engineering, data warehousing, or data architecture
  • 3+ years of hands-on Snowflake experience (performance tuning, data sharing, Snowpark, Snowpipe, etc.) – a brief Snowpark sketch follows this list
  • Strong SQL and Python skills, with production experience using dbt
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and modern data tooling (Airflow, Fivetran, Power BI, Looker, Informatica, etc.)
  • Prior experience in a consulting or client-facing delivery role
  • Excellent communication skills, with the ability to collaborate across technical and business stakeholders
  • SnowPro Core Certification required (or willingness to obtain upon hire); advanced Snowflake certifications preferred
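
For readers unfamiliar with Snowpark (named in the requirements above), here is a minimal, hypothetical sketch of its Python DataFrame API; the connection parameters and table names are placeholders, not details from this listing.

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

# Hypothetical connection parameters; in practice these come from a secrets manager.
session = Session.builder.configs({
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "ANALYTICS_WH",
    "database": "SALES",
    "schema": "PUBLIC",
}).create()

# Build a server-side aggregation over a placeholder ORDERS table.
orders = session.table("ORDERS")
daily_totals = (
    orders.filter(col("STATUS") == "COMPLETE")
          .group_by(col("ORDER_DATE"))
          .agg(sum_(col("AMOUNT")).alias("TOTAL_AMOUNT"))
)

# Persist the result back to Snowflake; the computation runs in the warehouse.
daily_totals.write.save_as_table("DAILY_SALES", mode="overwrite")
session.close()
```
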
Job Responsibility
  • Design and build scalable, secure, and cost-effective data solutions in Snowflake
  • Develop and optimize data pipelines using tools such as dbt, Python, CloverDX, and cloud-native services
  • Participate in discovery sessions with clients to gather requirements and translate them into solution designs and project plans
  • Collaborate with engagement managers and account teams to help scope work and provide technical input for Statements of Work (SOWs)
  • Serve as a Snowflake subject matter expert, guiding best practices in performance tuning, cost optimization, access control, and workload management
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake
  • Integrate Snowflake with BI tools, governance platforms, and AI/ML frameworks
  • Contribute to internal accelerators, frameworks, and proofs of concept
  • Mentor junior engineers and support knowledge sharing across the team
What we offer
  • Flexible Time Off Policy
  • Eligibility for Health Benefits
  • Retirement Plan with Company Match
  • Training and Certification Reimbursement
  • Utilization Based Incentive Program
  • Commission Incentive Program
  • Referral Bonuses
  • Work from Home
  • Full-time

Lead Data Engineer

Lead Data Engineer to serve as both a technical leader and people coach for our ...
Location:
India, Gurugram
Salary:
Not provided
Circle K
Expiration Date
Until further notice
Requirements
  • Bachelor’s or master’s degree in Computer Science, Engineering, or a related field
  • 8-10 years of data engineering experience with strong hands-on delivery using ADF, SQL, Python, Databricks, and Spark
  • Experience designing data pipelines, warehouse models, and processing frameworks using Snowflake or Azure Synapse
  • Proficient with CI/CD tools (Azure DevOps, GitHub) and observability practices
  • Solid grasp of data governance, metadata tagging, and role-based access control
  • Proven ability to mentor and grow engineers in a matrixed or global environment
  • Strong verbal and written communication skills, with the ability to operate cross-functionally
  • Strong Knowledge of Data Engineering concepts (Data pipelines creation, Data Warehousing, Data Marts/Cubes, Data Reconciliation and Audit, Data Management)
  • Working Knowledge of Dev-Ops processes (CI/CD), Git/Jenkins version control tool, Master Data Management (MDM) and Data Quality tools
  • Strong Experience in ETL/ELT development, QA and operation/support process (RCA of production issues, Code/Data Fix Strategy, Monitoring and maintenance)
Job Responsibility
  • Design, develop, and maintain scalable pipelines across ADF, Databricks, Snowflake, and related platforms
  • Lead the technical execution of non-domain specific initiatives (e.g. reusable dimensions, TLOG standardization, enablement pipelines)
  • Architect data models and re-usable layers consumed by multiple downstream pods
  • Guide platform-wide patterns like parameterization, CI/CD pipelines, pipeline recovery, and auditability frameworks
  • Mentor and coach the team
  • Partner with product and platform leaders to ensure engineering consistency and delivery excellence
  • Act as an L3 escalation point for operational data issues impacting foundational pipelines
  • Own engineering best practices, sprint planning, and quality across the Enablement pod
  • Contribute to platform discussions and architectural decisions across regions
  • Full-time

Snowflake Solutions Engineer

We are seeking an innovative Snowflake Solutions Engineer to join our growing IT...
Location:
United States, Easton
Salary:
Not provided
Victaulic
Expiration Date
Until further notice
Requirements
  • Bachelor's degree in Computer Science, Information Systems, Data Engineering, Data Science or related technical field
  • At least 2 years of recent hands-on experience with Snowflake platform including advanced features
  • Minimum 3 years of experience in data engineering or solutions architecture roles
  • 7-10 years of experience in Data Architecture/Engineering and/or BI in a multi-dimensional environment
  • Proven track record of developing data applications or analytical solutions for business users
  • Snowflake Expertise: Advanced knowledge of Snowflake architecture including data warehousing, data lakes, and emerging lakehouse features
  • Security and Governance: Deep understanding of RBAC, row-level security, data masking, and Snowflake security best practices
  • DevOps and CI/CD: Strong experience with GitHub, SnowDDL, automated deployment pipelines, and infrastructure as code
  • Application Development: Proficiency with Snowflake Streamlit for building interactive data applications
  • SQL Proficiency: Expert-level SQL skills with experience in complex analytical queries and optimization
Job Responsibility
  • Snowflake Native Application Development (30%): Design and develop interactive data applications using Snowflake Streamlit for self-service analytics and operational workflows that enable business users to interact with data through intuitive interfaces (a brief sketch follows this list)
  • Create reusable application frameworks and component libraries for rapid solution delivery
  • Integrate Snowflake Native Apps and third-party marketplace applications to extend platform capabilities
  • Develop custom UDFs and stored procedures to support advanced application logic and business rules
  • Data Architecture and Modern Platform Design (30%): Design and implement modern data architecture solutions spanning data warehousing, data lakes, and lakehouse patterns
  • Implement and maintain medallion architecture (bronze-silver-gold) patterns for data quality and governance
  • Evaluate and recommend architecture patterns for diverse use cases including structured analytics, semi-structured data processing, and AI/ML workloads
  • Establish best practices for data organization, storage optimization, and query performance across different data architecture patterns
  • AI Support and Advanced Analytics Collaboration (15%): Support AI and data science teams with Snowflake platform capabilities and best practices
  • Collaborate on implementing Snowflake Cortex AI features for business use cases
  • Full-time
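
As an aside for the Streamlit item above: Streamlit apps that run inside Snowflake typically obtain a Snowpark session from the platform. A minimal, hypothetical sketch follows; the table, columns, and region values are invented, not taken from this listing.

```python
import streamlit as st
from snowflake.snowpark.context import get_active_session
from snowflake.snowpark.functions import col, count

# Inside a Streamlit-in-Snowflake app, the session is provided by the platform.
session = get_active_session()

st.title("Order volume explorer")  # placeholder self-service view
region = st.selectbox("Region", ["NA", "EMEA", "APAC"])  # placeholder values

orders = session.table("SALES.PUBLIC.ORDERS")  # placeholder table
daily = (
    orders.filter(col("REGION") == region)
          .group_by(col("ORDER_DATE"))
          .agg(count(col("ORDER_ID")).alias("ORDERS"))
          .sort(col("ORDER_DATE"))
)

st.line_chart(daily.to_pandas(), x="ORDER_DATE", y="ORDERS")
```
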

Lead Data Engineer

We're looking for a Lead Data Engineer to build the data infrastructure that pow...
Location:
United States
Salary:
185000.00 - 225000.00 USD / Year
Zora
Expiration Date
Until further notice
Requirements
  • 7+ years of experience in data engineering, with at least 2 years in a technical leadership role
  • Strong proficiency in Python and SQL for building production data pipelines, complex data transformations and evolving data platforms, shared infrastructure, and internal tooling with engineering best practices
  • Strong experience in designing, building, and maintaining cloud-based data pipelines using orchestration tools such as Airflow, Dagster, Prefect, Temporal, or similar (a minimal sketch follows this list)
  • Hands-on experience with cloud data platforms (AWS, GCP, or Azure) and modern data stack tools
  • Deep understanding of data warehousing concepts and experience with platforms like Snowflake, BigQuery, Redshift, or similar
  • Strong software engineering fundamentals including testing, CI/CD, version control, and writing maintainable, documented code
  • Track record of optimizing data systems for performance, reliability, and cost efficiency at scale
  • Excellent communication skills and ability to collaborate with cross-functional teams including product, engineering, and design
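
For the orchestration tools named above, here is a minimal Airflow sketch, assuming Airflow 2.x and its TaskFlow API; the DAG name and task bodies are invented stubs, not part of this listing.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@hourly", start_date=datetime(2024, 1, 1), catchup=False)
def trading_events_pipeline():
    """Hypothetical ingest -> transform -> load flow; names are illustrative only."""

    @task
    def extract() -> list[dict]:
        # Placeholder: pull raw events from an upstream source.
        return [{"event_id": 1, "amount": 42.0}]

    @task
    def transform(events: list[dict]) -> list[dict]:
        # Placeholder: clean and filter the raw events.
        return [e for e in events if e["amount"] > 0]

    @task
    def load(events: list[dict]) -> None:
        # Placeholder: write the curated rows to the warehouse.
        print(f"loaded {len(events)} rows")

    load(transform(extract()))


trading_events_pipeline()
```
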
Job Responsibility
  • Design and build scalable data pipelines to ingest, process, and transform blockchain data, trading events, user activity, and market signals at high volume and low latency
  • Architect and maintain data infrastructure that powers real-time trading analytics, P&L calculations, leaderboards, market cap tracking, and liquidity monitoring across the platform
  • Own ETL/ELT processes that transform raw onchain data from multiple blockchains into clean, reliable, and performant datasets used by product, engineering, analytics, and ML teams
  • Build and optimize data models and schemas that support both operational systems (serving live trading data) and analytical use cases (understanding market dynamics and user behavior)
  • Establish data quality frameworks including monitoring, alerting, testing, and validation to ensure pipeline reliability and data accuracy at scale
  • Collaborate with backend engineers to design event schemas, data contracts, and APIs that enable real-time data flow between systems
  • Partner with product and analytics teams to understand data needs and translate them into robust engineering solutions
  • Provide technical leadership by mentoring engineers, conducting code reviews, establishing best practices, and driving architectural decisions for the data platform
  • Optimize performance and costs of data infrastructure as we scale to handle exponentially growing trading volumes
What we offer
  • Remote-First Culture: Work from anywhere in the world!
  • Competitive Compensation: Including salary, pre-IPO stock options, token compensation, and additional financial incentives
  • Comprehensive Benefits: Robust healthcare options, including fully covered medical, dental, and vision for employees
  • Retirement Contributions: Up to 4% employer match on your 401(k) contributions
  • Health & Wellness: Free memberships to One Medical, Teladoc, and Health Advocate
  • Unlimited Time Off: Flexible vacation policies, company holidays, and recharge weeks to prioritize wellness
  • Home Office Reimbursement: To cover home office items, monthly home internet, and monthly cell phone (if applicable)
  • Ease of Life Reimbursement: To cover everything from an Uber home in the rain, childcare, or meal delivery
  • Career Development: Access to mentorship, training, and opportunities to grow your career
  • Inclusive Environment: A culture dedicated to diversity, equity, inclusion, and belonging
  • Full-time

Data Engineering & Analytics Lead

Premium Health is seeking a highly skilled, hands-on Data Engineering & Analytic...
Location:
United States, Brooklyn
Salary:
Not provided
Premium Health
Expiration Date
Until further notice
Requirements
  • Bachelor's degree in Computer Science, Engineering, or a related field. Master's degree preferred
  • Proven track record and progressively responsible experience in data engineering, data architecture, or related technical roles; healthcare experience preferred
  • Strong knowledge of data engineering principles, data integration, ETL processes, and semantic mapping techniques and best practices
  • Experience implementing data quality management processes, data governance frameworks, cataloging, and master data management concepts
  • Familiarity with healthcare data standards (e.g., HL7, FHIR), health information management principles, and regulatory requirements (e.g., HIPAA)
  • Understanding of healthcare data, including clinical, operational, and financial data models, preferred
  • Advanced proficiency in SQL, data modeling, database design, optimization, and performance tuning
  • Experience designing and integrating data from disparate systems into harmonized data models or semantic layers
  • Hands-on experience with modern cloud-based data platforms (e.g., Azure, AWS, GCP)
Job Responsibility
  • Collaborate with the CDIO and Director of Technology to define a clear data vision aligned with the organization's goals and execute the enterprise data roadmap
  • Serve as a thought leader for data engineering and analytics, guiding the evolution of our data ecosystem and championing data-driven decision-making across the organization
  • Build and mentor a small data team, providing technical direction and performance feedback, fostering best practices and continuous learning, while remaining a hands-on implementor
  • Define and implement best practices, standards, and processes for data engineering, analytics, and data management across the organization
  • Design, implement, and maintain a scalable, reliable, and high-performing modern data infrastructure, aligned with the organizational needs and industry best practices
  • Architect and maintain data lake/lakehouse, warehouse, and related platform components to support analytics, reporting, and operational use cases
  • Establish and enforce data architecture standards, governance models, naming conventions, and documentation
  • Develop, optimize, and maintain scalable ETL/ELT pipelines and data workflows to collect, transform, normalize, and integrate data from diverse systems
  • Implement robust data quality processes, validation, monitoring, and error-handling frameworks
  • Ensure data is accurate, timely, secure, and ready for self-service analytics and downstream applications
What we offer
  • Paid Time Off, Medical, Dental and Vision plans, Retirement plans
  • Public Service Loan Forgiveness (PSLF)
  • Full-time

Lead Data Product Engineer

We are looking for an experienced Lead Data Product Engineer to drive the develo...
Location:
United States, Atlanta
Salary:
Not provided
Robert Half
Expiration Date
Until further notice
Requirements
  • 10+ years of experience in data architecture, engineering, and technical leadership
  • Strong proficiency in SQL, Python, and DevOps methodologies
  • Expertise in AWS cloud data services such as Spark, Glue, Lambda, Redshift, Snowflake, and Kafka
  • Hands-on experience in data modeling, ETL development, and scalable data product design
  • Prior experience as a DBA is a plus
  • Background in the manufacturing or technology industry is beneficial
Job Responsibility
  • Design and implement optimized data models for Data Lakes and Data Products
  • Develop business mapping strategies across multiple datasets
  • Lead data architecture efforts, ensuring best practices in ETL and data engineering
  • Provide mentorship and technical guidance to engineering teams
  • Build high-performance, scalable, and automated data solutions leveraging AWS cloud services
  • Collaborate with cross-functional teams to bridge Data Engineering and Product Development
What we offer
  • Medical, vision, dental, and life and disability insurance
  • Eligibility to enroll in company 401(k) plan
  • Full-time