
Snowflake Platform Engineer

Bright Vision Technologies (bvteck.com)

Location:
United States, Bridgewater


Contract Type:
Employment contract

Salary:

Not provided

Job Description:

Bright Vision Technologies is looking for a skilled Snowflake Platform Engineer to join our dynamic team and contribute to our mission of transforming business processes through technology. We leverage cutting-edge cloud data platform technologies to design scalable, secure, and high-performance analytics environments.

Requirements:

  • Snowflake Data Cloud
  • Snowflake Architecture & Optimization
  • SQL
  • Data Warehousing
  • Cloud Platforms (AWS / Azure / GCP)
  • Data Security & Governance
  • Role-Based Access Control (RBAC)
  • Performance Tuning
  • Cost Optimization
  • Data Sharing
  • ETL/ELT Pipelines
  • CI/CD for Data
  • Python
  • Linux
  • Git
  • Agile methodologies
  • At least 3 to 5 years of hands-on experience
  • Willing to take an AI-proctored online coding test
  • Willing to relocate nationwide
  • Looking for H-1B sponsorship for the 2026 quota
  • Must proceed exclusively with Bright Vision Technologies and commit to paying the USCIS-mandated $215 H-1B registration fee after an offer is issued
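Several of the skills listed above (SQL, Python, ETL/ELT pipelines) can be illustrated together. The sketch below is a toy ELT flow using only the standard library; the payload, table names, and aggregation are invented for the example, and SQLite stands in for a warehouse such as Snowflake — it is not the company's actual stack.

```python
import sqlite3, json

# Hypothetical raw payload; field names are invented for illustration.
raw = '[{"user": "a", "clicks": "3"}, {"user": "b", "clicks": "7"}]'

# Extract + Load: land the raw rows first (the "EL" of ELT).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user TEXT, clicks TEXT)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [(r["user"], r["clicks"]) for r in json.loads(raw)],
)

# Transform inside the database with SQL (the "T"): cast and aggregate.
conn.execute(
    """CREATE TABLE daily_clicks AS
       SELECT user, SUM(CAST(clicks AS INTEGER)) AS total
       FROM raw_events GROUP BY user"""
)
totals = dict(conn.execute("SELECT user, total FROM daily_clicks").fetchall())
# totals -> {'a': 3, 'b': 7}
```

Landing raw data before transforming it is the defining ELT design choice: the warehouse, not an external tool, does the heavy lifting.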
What we offer:
  • H-1B sponsorship for the 2026 quota
  • H-1B filing with level 4 prevailing wage

Additional Information:

Job Posted:
February 13, 2026

Employment Type:
Full-time
Work Type:
Remote work

Similar Jobs for Snowflake Platform Engineer

Data Engineer – Snowflake & ETL

We are seeking a Data Engineer in Hyderabad (WFO) with expertise in data enginee...
Location:
India, Hyderabad
Salary:
Not provided
Right Angle Solutions (rightanglesol.com)
Expiration Date:
Until further notice
Requirements:
  • Minimum 5+ years of experience in data engineering, ETL, and Snowflake development
  • Proven expertise in Snowflake including SQL scripting, performance tuning, and data warehousing concepts
  • Hands-on experience with Matillion ETL for building and maintaining ETL jobs
  • Strong understanding of cloud platforms (AWS, Azure, or GCP) and cloud-based data architectures
  • Proficiency in SQL, Python, or other scripting languages for automation and data transformation
  • Experience with API integrations and data ingestion frameworks
  • Knowledge of data governance, security policies, and access control within Snowflake environments
  • Excellent communication skills with the ability to engage both business and technical stakeholders
  • Self-motivated professional capable of working independently and delivering projects on time
  • Qualification: BE/BS/MTech/MS or equivalent work experience
Employment Type: Full-time

Senior Data Engineer – Data Engineering & AI Platforms

We are looking for a highly skilled Senior Data Engineer (L2) who can design, bu...
Location:
India, Chennai, Madurai, Coimbatore
Salary:
Not provided
OptiSol Business Solutions (optisolbusiness.com)
Expiration Date:
Until further notice
Requirements:
  • Strong hands-on expertise in cloud ecosystems (Azure / AWS / GCP)
  • Excellent Python programming skills with data engineering libraries and frameworks
  • Advanced SQL capabilities including window functions, CTEs, and performance tuning
  • Solid understanding of distributed processing using Spark/PySpark
  • Experience designing and implementing scalable ETL/ELT workflows
  • Good understanding of data modeling concepts (dimensional, star, snowflake)
  • Familiarity with GenAI/LLM-based integration for data workflows
  • Experience working with Git, CI/CD, and Agile delivery frameworks
  • Strong communication skills for interacting with clients, stakeholders, and internal teams
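The window-function and CTE skills named above fit in a few lines of SQL. This sketch relies on SQLite's window-function support (SQLite ≥ 3.25, bundled with modern Python); the table and column names are invented for illustration, and the same query is essentially identical in Snowflake's dialect.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, day INTEGER, amount INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("east", 1, 10), ("east", 2, 20), ("west", 1, 5), ("west", 2, 15)],
)

# A CTE feeding a window function: running total of sales per region.
query = """
WITH ordered AS (
    SELECT region, day, amount FROM sales
)
SELECT region, day,
       SUM(amount) OVER (PARTITION BY region ORDER BY day) AS running_total
FROM ordered
ORDER BY region, day
"""
rows = conn.execute(query).fetchall()
# rows -> [('east', 1, 10), ('east', 2, 30), ('west', 1, 5), ('west', 2, 20)]
```

Unlike GROUP BY, the window function keeps one output row per input row while still computing the per-partition aggregate.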
Job Responsibilities:
  • Design, build, and maintain scalable ETL/ELT pipelines across cloud and big data platforms
  • Contribute to architectural discussions by translating business needs into data solutions spanning ingestion, transformation, and consumption layers
  • Work closely with solutioning and pre-sales teams for technical evaluations and client-facing discussions
  • Lead squads of L0/L1 engineers—ensuring delivery quality, mentoring, and guiding career growth
  • Develop cloud-native data engineering solutions using Python, SQL, PySpark, and modern data frameworks
  • Ensure data reliability, performance, and maintainability across the pipeline lifecycle—from development to deployment
  • Support long-term ODC/T&M projects by demonstrating expertise during technical discussions and interviews
  • Integrate emerging GenAI tools where applicable to enhance data enrichment, automation, and transformations
What we offer:
  • Opportunity to work at the intersection of Data Engineering, Cloud, and Generative AI
  • Hands-on exposure to modern data stacks and emerging AI technologies
  • Collaboration with experts across Data, AI/ML, and cloud practices
  • Access to structured learning, certifications, and leadership mentoring
  • Competitive compensation with fast-track career growth and visibility
Employment Type: Full-time

Data Platform Engineer

Finaloop revolutionizes financial operations for eCommerce businesses! Our cutti...
Location:
Israel, Tel Aviv
Salary:
Not provided
Finaloop (finaloop.com)
Expiration Date:
Until further notice
Requirements:
  • 3+ years experience in data engineering or platform engineering roles
  • Strong programming skills in Python and SQL
  • Experience with orchestration platforms like Airflow/Dagster/Temporal
  • Experience with MPPs like Snowflake/Redshift/Databricks
  • Hands-on experience with cloud platforms (AWS) and their data services
  • Understanding of data modeling, data warehousing, and data lake concepts
  • Ability to optimize data infrastructure for performance and reliability
  • Experience working with containerization (Docker) in Kubernetes environments
  • Familiarity with CI/CD concepts
  • Fluent in English, both written and verbal
Job Responsibilities:
  • Design, build, and maintain scalable data pipelines and ETL processes for our financial data platform
  • Develop and optimize data infrastructure to support real-time analytics and reporting
  • Implement data governance, security, and privacy controls to ensure data quality and compliance
  • Create and maintain documentation for data platforms and processes
  • Collaborate with data scientists and analysts to deliver actionable insights to our customers
  • Troubleshoot and resolve data infrastructure issues efficiently
  • Monitor system performance and implement optimizations
  • Stay current with emerging technologies and implement innovative solutions

Sr. Data Engineer - Snowflake

Data Ideology is seeking a Sr. Snowflake Data Engineer to join our growing team ...
Location:
Not provided
Salary:
Not provided
Data Ideology (dataideology.com)
Expiration Date:
Until further notice
Requirements:
  • 7+ years of experience in data engineering, data warehousing, or data architecture
  • 3+ years of hands-on Snowflake experience (performance tuning, data sharing, Snowpark, Snowpipe, etc.)
  • Strong SQL and Python skills, with production experience using dbt
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and modern data tooling (Airflow, Fivetran, Power BI, Looker, Informatica, etc.)
  • Prior experience in a consulting or client-facing delivery role
  • Excellent communication skills, with the ability to collaborate across technical and business stakeholders
  • SnowPro Core Certification required (or willingness to obtain upon hire); advanced Snowflake certifications preferred
Job Responsibilities:
  • Design and build scalable, secure, and cost-effective data solutions in Snowflake
  • Develop and optimize data pipelines using tools such as dbt, Python, CloverDX, and cloud-native services
  • Participate in discovery sessions with clients to gather requirements and translate them into solution designs and project plans
  • Collaborate with engagement managers and account teams to help scope work and provide technical input for Statements of Work (SOWs)
  • Serve as a Snowflake subject matter expert, guiding best practices in performance tuning, cost optimization, access control, and workload management
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake
  • Integrate Snowflake with BI tools, governance platforms, and AI/ML frameworks
  • Contribute to internal accelerators, frameworks, and proofs of concept
  • Mentor junior engineers and support knowledge sharing across the team
What we offer:
  • Flexible Time Off Policy
  • Eligibility for Health Benefits
  • Retirement Plan with Company Match
  • Training and Certification Reimbursement
  • Utilization Based Incentive Program
  • Commission Incentive Program
  • Referral Bonuses
  • Work from Home
Employment Type: Full-time

Machine Learning Platform / Backend Engineer

We are seeking a Machine Learning Platform/Backend Engineer to design, build, an...
Location:
Serbia; Romania, Belgrade; Timișoara
Salary:
Not provided
Everseen (everseen.ai)
Expiration Date:
Until further notice
Requirements:
  • 4-5+ years of work experience in either ML infrastructure, MLOps, or Platform Engineering
  • Bachelor's degree or equivalent, preferably with a focus in computer science
  • Excellent communication and collaboration skills
  • Expert knowledge of Python
  • Experience with CI/CD tools (e.g., GitLab, Jenkins)
  • Hands-on experience with Kubernetes, Docker, and cloud services
  • Understanding of ML training pipelines, data lifecycle, and model serving concepts
  • Familiarity with workflow orchestration tools (e.g., Airflow, Kubeflow, Ray, Vertex AI, Azure ML)
  • A demonstrated understanding of the ML lifecycle, model versioning, and monitoring
  • Experience with ML frameworks (e.g., TensorFlow, PyTorch)
Job Responsibilities:
  • Design, build, and maintain scalable infrastructure that empowers data scientists and machine learning engineers
  • Own the design and implementation of the internal ML platform, enabling end-to-end workflow orchestration, resource management, and automation using cloud-native technologies (GCP/Azure)
  • Design and manage Kubernetes-based infrastructure for multi-tenant GPU and CPU workloads with strong isolation, quota control, and monitoring
  • Integrate and extend orchestration tools (Airflow, Kubeflow, Ray, Vertex AI, Azure ML or custom schedulers) to automate data processing, training, and deployment pipelines
  • Develop shared services for model behavior/performance tracking, data/datasets versioning, and artifact management (MLflow, DVC, or custom registries)
  • Build out documentation in relation to architecture, policies and operations runbooks
  • Share skills, knowledge, and expertise with members of the data engineering team
  • Foster a culture of collaboration and continuous learning by organizing training sessions, workshops, and knowledge-sharing sessions
  • Collaborate and drive progress with cross-functional teams to design and develop new features and functionalities
  • Ensure that the developed solutions meet project objectives and enhance user experience
Employment Type: Full-time

Senior Data Platform Engineer

At Fever, our engineering team powers the technology behind our apps and website...
Location:
Spain, Madrid
Salary:
Not provided
Fever (feverup.com)
Expiration Date:
Until further notice
Requirements:
  • Expert in Python and frameworks like FastAPI or Django
  • Experience with Snowflake, PostgreSQL, and data management best practices
  • Familiar with IaC (Terraform), orchestration tools (Airflow, Metaflow), and CI/CD (Jenkins)
  • Experience with Kubernetes, Kafka, and GitOps tools like ArgoCD
  • Skilled in observability tools like Datadog, Grafana, and Prometheus
  • Strong communicator and team player in international, cross-functional environments
  • Proactive, adaptable, and solution-oriented
  • Fluent in English for effective communication
Job Responsibilities:
  • Build and maintain a scalable, reliable data platform
  • Implement data governance policies to ensure data quality, consistency, and security
  • Develop observability systems for platform monitoring and reliability
  • Build and automate infrastructure: data warehouses, lakes, pipelines, and real-time systems
  • Apply Infrastructure as Code (IaC) practices using tools like Terraform
  • Promote best practices for data engineering and platform operations
  • Build internal tools that simplify data-driven application development
  • Work with teams across the company to understand and meet data needs
What we offer:
  • Attractive compensation package consisting of base salary and the potential to earn a significant bonus for top performance
  • Stock options
  • Opportunity to have a real impact in a high-growth global category leader
  • 40% discount on all Fever events and experiences
  • Home office friendly
  • Responsibility from day one and professional and personal growth
  • Great work environment with a young, international team of talented people to work with
  • Health insurance and other benefits such as Flexible remuneration with a 100% tax exemption through Cobee
  • English Lessons
  • Gympass Membership
Employment Type: Full-time

Snowflake Solutions Engineer

We are seeking an innovative Snowflake Solutions Engineer to join our growing IT...
Location:
United States, Easton
Salary:
Not provided
Victaulic (victaulic.com)
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Information Systems, Data Engineering, Data Science or related technical field
  • At least 2 years of recent hands-on experience with Snowflake platform including advanced features
  • Minimum 3 years of experience in data engineering or solutions architecture roles
  • 7-10 years of experience in Data Architecture/Engineering and/or BI in a multi-dimensional environment
  • Proven track record of developing data applications or analytical solutions for business users
  • Snowflake Expertise: Advanced knowledge of Snowflake architecture including data warehousing, data lakes, and emerging lakehouse features
  • Security and Governance: Deep understanding of RBAC, row-level security, data masking, and Snowflake security best practices
  • DevOps and CI/CD: Strong experience with GitHub, SnowDDL, automated deployment pipelines, and infrastructure as code
  • Application Development: Proficiency with Snowflake Streamlit for building interactive data applications
  • SQL Proficiency: Expert-level SQL skills with experience in complex analytical queries and optimization
Job Responsibilities:
  • Snowflake Native Application Development (30%): Design and develop interactive data applications using Snowflake Streamlit for self-service analytics and operational workflows that enable business users to interact with data through intuitive interfaces
  • Create reusable application frameworks and component libraries for rapid solution delivery
  • Integrate Snowflake Native Apps and third-party marketplace applications to extend platform capabilities
  • Develop custom UDFs and stored procedures to support advanced application logic and business rules
  • Data Architecture and Modern Platform Design (30%): Design and implement modern data architecture solutions spanning data warehousing, data lakes, and lakehouse patterns
  • Implement and maintain medallion architecture (bronze-silver-gold) patterns for data quality and governance
  • Evaluate and recommend architecture patterns for diverse use cases including structured analytics, semi-structured data processing, and AI/ML workloads
  • Establish best practices for data organization, storage optimization, and query performance across different data architecture patterns
  • AI Support and Advanced Analytics Collaboration (15%): Support AI and data science teams with Snowflake platform capabilities and best practices
  • Collaborate on implementing Snowflake Cortex AI features for business use cases
Employment Type: Full-time
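The medallion (bronze-silver-gold) pattern the Victaulic listing mentions can be sketched without any warehouse at all. The layer names follow the pattern; the record shapes and cleaning rules below are invented for illustration.

```python
# Bronze: raw ingested records, kept exactly as they arrived.
bronze = [
    {"id": "1", "amount": " 10 "},
    {"id": "2", "amount": "oops"},   # malformed value
    {"id": "1", "amount": " 10 "},   # duplicate record
]

def to_silver(rows):
    """Silver layer: cleanse types and deduplicate."""
    seen, out = set(), []
    for r in rows:
        try:
            rec = (r["id"], int(r["amount"].strip()))
        except ValueError:
            continue  # drop records that fail validation
        if rec not in seen:
            seen.add(rec)
            out.append({"id": rec[0], "amount": rec[1]})
    return out

def to_gold(rows):
    """Gold layer: aggregate into a consumption-ready shape."""
    totals = {}
    for r in rows:
        totals[r["id"]] = totals.get(r["id"], 0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
# gold -> {'1': 10}
```

Keeping the bronze layer untouched is the key design choice: silver and gold can always be rebuilt from it when cleaning or aggregation rules change.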

Big Data Platform Senior Engineer

Seeking a Lead Java Data Engineer to guide and mentor a talented team of engineers in buil...
Location:
Bahrain, Seef, Manama
Salary:
Not provided
Citi (citi.com)
Expiration Date:
Until further notice
Requirements:
  • Significant hands-on experience developing high-performance Java applications (Java 11+ preferred) with strong foundation in core Java concepts, OOP, and OOAD
  • Proven experience building and maintaining data pipelines using technologies like Kafka, Apache Spark, or Apache Flink
  • Familiarity with event-driven architectures and experience in developing real-time, low-latency applications
  • Deep understanding of distributed systems concepts and experience with MPP platforms such as Trino (Presto) or Snowflake
  • Experience deploying and managing applications on container orchestration platforms like Kubernetes, OpenShift, or ECS
  • Demonstrated ability to lead and mentor engineering teams, communicate complex technical concepts effectively, and collaborate across diverse teams
  • Excellent problem-solving skills and data-driven approach to decision-making
Job Responsibilities:
  • Provide technical leadership and mentorship to a team of data engineers
  • Lead the design and development of highly scalable, low-latency, fault-tolerant data pipelines and platform components
  • Stay abreast of emerging open-source data technologies and evaluate their suitability for integration
  • Continuously identify and implement performance optimizations across the data platform
  • Partner closely with stakeholders across engineering, data science, and business teams to understand requirements
  • Drive the timely and high-quality delivery of data platform projects
Employment Type: Full-time