Sr. Data Engineer - Snowflake


Data Ideology

Location:
Not provided

Contract Type:
Not provided

Salary:
Not provided

Job Description:

Data Ideology is seeking a Sr. Snowflake Data Engineer to join our growing team in a client-facing, consultative role that blends solution design with hands-on delivery. You will work directly with clients and our internal teams to design, implement, and optimize modern data platforms in Snowflake. This role is ideal for experienced data engineers who want to expand their impact by engaging with clients early in the process to understand requirements, shape solutions, and then lead the delivery of high-quality implementations.

Job Responsibilities:

  • Design and build scalable, secure, and cost-effective data solutions in Snowflake
  • Develop and optimize data pipelines using tools such as dbt, Python, CloverDX, and cloud-native services
  • Participate in discovery sessions with clients to gather requirements and translate them into solution designs and project plans
  • Collaborate with engagement managers and account teams to help scope work and provide technical input for Statements of Work (SOWs)
  • Serve as a Snowflake subject matter expert, guiding best practices in performance tuning, cost optimization, access control, and workload management
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake
  • Integrate Snowflake with BI tools, governance platforms, and AI/ML frameworks
  • Contribute to internal accelerators, frameworks, and proofs of concept
  • Mentor junior engineers and support knowledge sharing across the team
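Much of the pipeline work described above reduces to merge/upsert logic: reconciling incoming records against what is already loaded, keeping only the newest version of each key. A minimal, stdlib-only Python sketch of that pattern (the record shape and field names here are hypothetical, not taken from the posting):

```python
from datetime import datetime

def upsert_latest(existing, incoming, key="id", ts="updated_at"):
    """Merge incoming records into existing ones, keeping the newest
    version of each key -- the core of a typical warehouse upsert."""
    merged = {row[key]: row for row in existing}
    for row in incoming:
        current = merged.get(row[key])
        if current is None or row[ts] > current[ts]:
            merged[row[key]] = row
    return sorted(merged.values(), key=lambda r: r[key])

# Example: record 1 is updated, record 3 is new
existing = [
    {"id": 1, "name": "a", "updated_at": datetime(2025, 1, 1)},
    {"id": 2, "name": "b", "updated_at": datetime(2025, 1, 2)},
]
incoming = [
    {"id": 1, "name": "a2", "updated_at": datetime(2025, 1, 5)},
    {"id": 3, "name": "c", "updated_at": datetime(2025, 1, 3)},
]
result = upsert_latest(existing, incoming)  # ids 1 (updated), 2, 3
```

In practice this logic would usually live in a dbt incremental model or a Snowflake MERGE statement rather than in application code, but the reconciliation idea is the same.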

Requirements:

  • 7+ years of experience in data engineering, data warehousing, or data architecture
  • 3+ years of hands-on Snowflake experience (performance tuning, data sharing, Snowpark, Snowpipe, etc.)
  • Strong SQL and Python skills, with production experience using dbt
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and modern data tooling (Airflow, Fivetran, Power BI, Looker, Informatica, etc.)
  • Prior experience in a consulting or client-facing delivery role
  • Excellent communication skills, with the ability to collaborate across technical and business stakeholders
  • SnowPro Core Certification required (or willingness to obtain upon hire)
  • Advanced Snowflake certifications preferred

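As an aside on the cost-optimization skill the requirements call for: the underlying arithmetic is simple. A sketch assuming Snowflake's credits-double-per-warehouse-size pattern; the $3.00 price per credit is an assumption, since actual rates vary by contract and edition:

```python
# Illustrative sketch of warehouse-sizing cost arithmetic.
# Credit rates follow Snowflake's doubling-by-size pattern;
# the dollar price per credit below is an assumed figure.
CREDITS_PER_HOUR = {"XS": 1, "S": 2, "M": 4, "L": 8, "XL": 16}

def query_cost(size, runtime_seconds, price_per_credit=3.00):
    """Estimated dollar cost of one query on a dedicated warehouse,
    billed per second with a 60-second minimum."""
    billed = max(runtime_seconds, 60)
    credits = CREDITS_PER_HOUR[size] * billed / 3600
    return credits * price_per_credit

small = query_cost("S", 1200)   # 20 min on Small
medium = query_cost("M", 600)   # 10 min on Medium (2x size, 2x speed)
```

The point of the example: doubling warehouse size doubles the credit burn rate, so a query that speeds up proportionally costs about the same; real savings come from pruning, clustering, and eliminating work, not just resizing.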
What we offer:
  • Flexible Time Off Policy
  • Eligibility for Health Benefits
  • Retirement Plan with Company Match
  • Training and Certification Reimbursement
  • Utilization Based Incentive Program
  • Commission Incentive Program
  • Referral Bonuses
  • Work from Home

Additional Information:

Job Posted:
December 08, 2025

Employment Type:
Full-time
Work Type:
Remote work

Similar Jobs for Sr. Data Engineer - Snowflake

Sr. Data Engineer

We are looking for a skilled Sr. Data Engineer to join our team in Oklahoma City...
Location:
United States, Oklahoma City

Salary:
Not provided

Robert Half

Expiration Date:
Until further notice

Requirements:
  • Proven experience with Snowflake data warehousing and schema design
  • Proficiency in ETL tools such as Matillion or similar platforms
  • Strong knowledge of Python and PowerShell for data automation
  • Experience working with Microsoft SQL Server and related technologies
  • Familiarity with cloud technologies, particularly AWS
  • Understanding of data visualization and analytics tools
  • Background in big data technologies such as Apache Kafka, Hadoop, Spark, or Pig
  • Ability to design and implement APIs for data integration and management
Job Responsibilities:
  • Design, implement, and maintain Snowflake data warehousing solutions to support business needs
  • Assist in the migration of in-house data to Snowflake, ensuring a seamless transition
  • Develop data pipelines and workflows using tools such as Matillion or equivalent ETL solutions
  • Collaborate with teams to optimize and manage the existing data warehouse built on Microsoft SQL Server
  • Utilize Python and PowerShell to automate data processes and enhance system efficiency
  • Partner with the implementation team to shadow and learn best practices for Snowflake deployment
  • Ensure data integrity, scalability, and security across all data engineering processes
  • Provide insights into data visualization and analytics to support decision-making
  • Work with cloud technologies, including AWS, to enhance data storage and accessibility
  • Implement and manage APIs to enable seamless data integration and sharing
What we offer:
  • Medical, vision, dental, and life and disability insurance
  • Eligibility to enroll in a 401(k) plan
  • Access to competitive compensation and free online training
  • Full-time

Sr. Data Engineer

We are looking for a Sr. Data Engineer to join our growing Quality Engineering t...
Location:
Not provided

Salary:
Not provided

Data Ideology

Expiration Date:
Until further notice

Requirements:
  • Bachelor’s degree in Computer Science, Information Systems, or a related field (or equivalent experience)
  • 5+ years of experience in data engineering, data warehousing, or data architecture
  • Expert-level experience with Snowflake, including data modeling, performance tuning, security, and migration from legacy platforms
  • Hands-on experience with Azure Data Factory (ADF) for building, orchestrating, and optimizing data pipelines
  • Strong experience with Informatica (PowerCenter and/or IICS) for ETL/ELT development, workflow management, and performance optimization
  • Deep knowledge of data modeling techniques (dimensional, tabular, and modern cloud-native patterns)
  • Proven ability to translate business requirements into scalable, high-performance data solutions
  • Experience designing and supporting end-to-end data pipelines across cloud and hybrid architectures
  • Strong proficiency in SQL and experience optimizing large-scale analytic workloads
  • Experience working within SDLC frameworks, CI/CD practices, and version control
Job Responsibilities:
  • Ability to collect and understand business requirements and translate those requirements into data models, integration strategies, and implementation plans
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake, ensuring functionality, performance and data integrity
  • Ability to work within the SDLC framework in multiple environments and understand the complexities and dependencies of the data warehouse
  • Optimize and troubleshoot ETL/ELT workflows, applying best practices for scheduling, orchestration, and performance tuning
  • Maintain documentation, architecture diagrams, and migration plans to support knowledge transfer and project tracking
What we offer:
  • PTO Policy
  • Eligibility for Health Benefits
  • Retirement Plan
  • Work from Home
  • Full-time

Sr Data Engineer

(Locals or Nearby resources only). You will work with technologies that include ...
Location:
United States, Glendale

Salary:
Not provided

Enormous Enterprise

Expiration Date:
Until further notice

Requirements:
  • 7+ years of data engineering experience developing large data pipelines
  • Proficiency in at least one major programming language (e.g., Python, Java, Scala)
  • Hands-on production environment experience with distributed processing systems such as Spark
  • Hands-on production experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines
  • Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology (Snowflake, Databricks, BigQuery)
  • Experience in developing APIs with GraphQL
  • Advanced understanding of OLTP vs. OLAP environments
  • Candidates must work on a W2 basis; no Corp-to-Corp
  • US Citizen, Green Card Holder, H4-EAD, or TN Visa
Job Responsibilities:
  • Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines
  • Build and maintain APIs to expose data to downstream applications
  • Develop real-time streaming data pipelines
  • Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform
  • Contribute to developing and documenting both internal and external standards and best practices for pipeline configurations, naming conventions, and more
  • Ensure high operational efficiency and quality of the Core Data platform datasets to ensure our solutions meet SLAs and project reliability and accuracy to all our stakeholders (Engineering, Data Science, Operations, and Analytics teams)
What we offer:
  • 3 levels of medical insurance for you and your family
  • Dental insurance for you and your family
  • 401k
  • Overtime
  • Sick leave policy: accrue 1 hour for every 30 hours worked up to 48 hours

Sr Engineers, Software

T-Mobile is America’s supercharged Un-carrier, delivering an advanced 4G LTE and...
Location:
United States, Frisco

Salary:
131,300.00 - 177,700.00 USD / Year

T-Mobile

Expiration Date:
Until further notice

Requirements:
  • Master's degree in Computer Science, Electrical Engineering, or a related field, and 1 year of relevant work experience, or
  • Bachelor's degree in Computer Science, Electrical Engineering, or a related field, and 3 years of relevant work experience
  • Developing high-performance, scalable Business Intelligence Extract-Transform-Load (ETL) pipelines using ETL tools and scripting, including Unix shell scripting, BTEQ scripting, and Oracle SQL scripting
  • Performing PL/SQL and SQL queries, data analysis, creation of stored procedures, and validation of revenue for prepaid or postpaid data in Teradata and Oracle databases capable of processing at least 4 million transactions per day
  • Monitoring and optimizing the performance of ETL processes, including query performance and data load times, and identifying and resolving performance bottlenecks
  • On-premise Teradata to Snowflake cloud database migration using a modern technology stack: Snowflake, Azure Data Factory, Liquibase data model deployments, dbt for transformations, Jinja scripting, and GitLab for code management and DevOps CI/CD
  • Coding in a DevOps environment with GitLab within a Scrum framework to deliver Continuous Integration/Continuous Delivery automations for market growth
  • Creating and scheduling daily ETL jobs using Control-M to process previous-day files using the provided reference and transactional postpaid or prepaid data for revenue recognition, posting accurate financial results to SAP following SOX controls and audit check mechanisms
Job Responsibilities:
  • Analyze user needs and develop software solutions, applying principles and techniques of computer science, engineering, and mathematical analysis
  • Develop and automate large scale, high-performance ETL processes and data pipelines to drive Data Standardization and reporting utilizing Azure Technologies
  • Possess ability to effectively communicate with non-technical business customers and other stakeholders regarding design, requirements, functionality, and limitations of systems/applications
  • Possess ability to clearly articulate platform options and risks to non-technical team members
  • Contribute to a team that is responsible for the design, development, and launch of Accounting & Finance capabilities
  • Help translate business and functional requirements into well documented technical specifications and user stories
  • Utilize development skills to build (code) and unit test new systems functionality per technical specifications, with deliverables to include code builds and documented unit test results
  • Collaborate with other teams including Accounting & Finance Business Partners, Enterprise Testing, Release Planning and Management, Project Management, and Product Operations on successful delivery of product enhancements and support
  • Maintain technical skills and expertise through continuing education and training
What we offer:
  • Competitive base salary and compensation package
  • Annual stock grant
  • Employee stock purchase plan
  • 401(k)
  • Access to free, year-round money coaches
  • Medical, dental and vision insurance
  • Flexible spending account
  • Paid time off
  • Up to 12 paid holidays
  • Paid parental and family leave
  • Full-time

Sr Staff Data Engineer

Role : Sr Staff Data Engineer (Snowflake expert). Blue Yonder's Platform Data Cl...
Location:
United States, Dallas; Scottsdale

Salary:
122,219.00 - 189,615.00 USD / Year

Blue Yonder

Expiration Date:
Until further notice

Requirements:
  • Master's or Bachelor's degree in Computer Science or a related field required
  • Minimum of 12 to 15 years of experience in software development
  • 5+ years of experience in Snowflake data engineering; Python/Streamlit preferred
  • Expertise with Snowflake architecture: compute, storage, and cloud services
  • Ability to write advanced SQL
  • Bulk data copy using Snowflake / Snowpipe / Tasks / Streams
  • Time Travel and Zero-Copy Cloning
  • Optimizer and Metadata Manager
  • Role-based access control (RBAC)
  • Data Sharing, Marketplace, Listings, and Data Exchanges
Job Responsibilities:
  • Consistently deliver solid quality in both design and implementation, and help the team shape what is built and how
  • Build productionized data ingestion and processing pipelines in Snowflake
  • Focus on innovation and improve delivery effectiveness by driving product development features across the organization in software development, deployment, and infrastructure consistency
  • Provide standardized enterprise solutions for cloud infrastructure and application deployment across multiple products
  • Apply appropriate software engineering patterns, utilizing a meta-driven systems approach to build robust and scalable systems
  • Apply expert object-oriented and functional programming skills in developing Blue Yonder products
  • Influence fellow engineers by proposing software designs and providing feedback on designs and/or implementations
  • Demonstrate experience with large-scale Kubernetes systems, including how to sufficiently scale very large services in Kubernetes
  • Modular and service-based architectural design
What we offer:
  • Comprehensive medical, dental, and vision
  • 401K with matching
  • Flexible time off
  • Corporate fitness program
  • Legal plans, accident and hospital indemnity, pet insurance
  • Full-time

Sr Data Engineer

The Sr Data Engineer is essential for designing and developing data architecture...
Location:
United States, Overland Park; Atlanta; Frisco

Salary:
105,100.00 - 189,600.00 USD / Year

T-Mobile

Expiration Date:
Until further notice

Requirements:
  • Bachelor's degree plus 5 years of related work experience, or an advanced degree with 3 years of related experience
  • Acceptable areas of study include Computer Engineering, Computer Science, or a related subject area
  • 4-7+ years developing cloud solutions using data services
  • Experience with cloud platforms (Azure Data Factory, Azure Databricks, and Snowflake)
  • 4-7+ years of hands-on development using and migrating data to cloud platforms
  • 4-7+ years of experience in SQL, NoSQL, and/or relational database design and development
  • 4-7+ years of advanced knowledge and experience in building complex data pipelines with languages such as Python, SQL, Scala, and Spark
  • Analytical approach to problem-solving and the ability to use technology to tackle business problems
  • Knowledge of message queuing, stream processing, and highly scalable "big data" data stores
Job Responsibilities:
  • Develop data engineering solutions that enable data pipelines, data transformation, data privacy and analytical tools within the T-Mobile Customer Data Platform (CDP)
  • Collaborate on analysis, architecture, design, and development of data products within the T-Mobile Customer Data Platform (CDP)
  • Design and develop data architectures across on-premise, cloud, and hybrid platforms to ensure scalable data infrastructure
  • Perform data wrangling, exploration, and discovery of heterogeneous data to generate new business insights
  • Tackle the most complex and innovative tasks of the organization, often under strict time constraints
  • Support white-boarding sessions, workshops, design sessions, and project meetings as needed
  • Contribute to team knowledge sharing and drive the advancement of new data engineering capabilities
  • Mentor team members to build and enhance their data engineering skillsets and professional growth
  • Assist management in project definition, including estimating, planning, and scoping work to meet objectives
  • Perform other duties/projects as assigned by business management as needed
What we offer:
  • Competitive base salary and compensation package
  • Annual stock grant
  • Employee stock purchase plan
  • 401(k)
  • Access to free, year-round money coaches
  • Medical, dental and vision insurance
  • Flexible spending account
  • Paid time off
  • Full-time

Data Engineer Sr (AWS)

Data Engineers are responsible for designing, building, and maintaining the syst...
Location:
Mexico, GDL

Salary:
Not provided

NTT DATA

Expiration Date:
Until further notice

Requirements:
  • 6 - 8 years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects
  • 2+ years of experience leading a team supporting data related projects to develop end-to-end technical solutions
  • Strong proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server)
  • Experience with data pipeline tools (e.g., Apache Airflow, Luigi, Prefect)
  • Proficiency with at least one programming language (Python, Java, or Scala)
  • Experience with cloud platforms (AWS) and their data services (e.g., Redshift, BigQuery, Snowflake, Databricks)
  • Familiarity with data modeling, warehousing, and schema design
  • Understanding of data governance, privacy, and security best practices
Job Responsibilities:
  • Design and implement tailored data solutions to meet customer needs and use cases
  • Provide thought leadership by recommending the most appropriate technologies and solutions
  • Demonstrate proficiency in coding skills to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations
  • Collaborate seamlessly across diverse technical stacks
  • Develop and deliver detailed presentations to effectively communicate complex technical concepts
  • Generate comprehensive solution documentation
  • Adhere to Agile practices throughout the solution development process
  • Design, build, and deploy databases and data stores to support organizational requirements
  • Ensure data quality, consistency and governance across multiple sources

Sr Data Engineer

As a Data Engineer, you will be responsible for designing, building, maintaining...
Location:
India, Hyderabad

Salary:
Not provided

Amgen

Expiration Date:
Until further notice

Requirements:
  • Doctorate, Master's, or Bachelor's degree and 8 to 13 years of experience in Computer Science, IT, or a related field
  • Hands-on experience with big data technologies and platforms such as Databricks, Apache Spark (PySpark, Spark SQL), and Snowflake, including workflow orchestration and performance tuning for big data processing
  • Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps
  • Proficient in SQL and Python for extracting, transforming, and analyzing complex datasets from relational data stores
  • Proficient in Python, with strong experience in ETL tools such as Apache Spark and various data processing packages, supporting scalable data workflows and machine learning pipeline development
  • Strong understanding of data modeling, data warehousing, and data integration concepts
  • Proven ability to optimize query performance on big data platforms
  • Knowledge of data visualization and analytics tools like Spotfire and Power BI
Job Responsibilities:
  • Design, develop, and maintain data solutions for data generation, collection, and processing
  • Be a key team member that assists in design and development of the data pipeline
  • Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
  • Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
  • Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
  • Implement data security and privacy measures to protect sensitive data
  • Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
  • Collaborate and communicate effectively with product teams
  • Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast paced business needs across geographic regions
  • Identify and resolve complex data-related challenges
What we offer:
  • Competitive and comprehensive Total Rewards Plans that are aligned with local industry standards