We are looking for a talented Data Engineer to join our team in Grand Rapids, Michigan. In this role, you will focus on designing, building, and optimizing robust data solutions using Snowflake and other cloud-based technologies. You will work closely with business intelligence and analytics teams to deliver scalable, high-performance data pipelines that support organizational goals.
Job Responsibilities:
Design and implement scalable data models, schemas, and tables within Snowflake, including staging, integration, and presentation layers
Develop and optimize data pipelines using Snowflake tools such as Snowpipe, Streams, Tasks, and stored procedures (see the sketch after this list)
Ensure data security and governed access through role-based access controls and best practices for data sharing
Build and maintain ETL pipelines leveraging tools like dbt, Matillion, Fivetran, Informatica, or Azure-native solutions
Integrate data from diverse sources such as APIs, IoT devices, and NoSQL databases to create unified datasets
Enhance performance by utilizing clustering, partitioning, caching, and efficient warehouse sizing strategies
Work with cloud platforms such as AWS, Azure, or Google Cloud to support Snowflake infrastructure and operations
Implement automated workflows and CI/CD processes for seamless deployment of data solutions
Maintain high standards for data accuracy, completeness, and reliability while supporting governance and documentation
Work closely with analytics, reporting, and business teams to troubleshoot issues and deliver scalable solutions
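For illustration, here is a minimal sketch of the Streams-and-Tasks pattern named in the pipeline responsibility above, assuming the snowflake-connector-python client. Every object name (raw_orders, orders_stream, merge_orders_task, LOAD_WH) and all connection parameters are hypothetical placeholders, not details specified by this posting.

```python
# Minimal sketch: change-data-capture with a Snowflake Stream plus a
# scheduled Task that merges changes into the integration layer.
# All names and credentials below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",  # hypothetical; supply your own
    user="<user>",
    password="<password>",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)

statements = [
    # Capture row-level changes on the staging table.
    "CREATE STREAM IF NOT EXISTS orders_stream ON TABLE raw_orders",
    # A task that runs every 5 minutes, but only when the stream has data.
    """
    CREATE TASK IF NOT EXISTS merge_orders_task
      WAREHOUSE = LOAD_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      MERGE INTO integration.orders AS tgt
      USING orders_stream AS src
        ON tgt.order_id = src.order_id
      WHEN MATCHED THEN UPDATE SET tgt.amount = src.amount
      WHEN NOT MATCHED THEN INSERT (order_id, amount)
        VALUES (src.order_id, src.amount)
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK merge_orders_task RESUME",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```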
Requirements:
At least 3 years of hands-on experience with Snowflake, including SnowSQL, Snowpipe, Streams, Tasks, and Warehouses
Strong SQL skills with expertise in performance tuning and complex query development
Familiarity with cloud platforms such as AWS, Azure, or Google Cloud
Proficiency in ETL development using tools like dbt, Matillion, or similar technologies
Solid understanding of data modeling techniques, including star schema and dimensional modeling
Experience with scripting languages, particularly Python (see the sketch after this list)
Knowledge of data warehousing concepts, data lakes, and modern data architecture patterns
Preferred experience with CI/CD tools like GitHub Actions, Azure DevOps, or Jenkins
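Likewise, a minimal sketch of the Python scripting skill listed above: bulk-loading a small extract into Snowflake with write_pandas from snowflake-connector-python. The sample data, table name, and connection parameters are hypothetical placeholders.

```python
# Minimal sketch: load a pandas DataFrame into a Snowflake table.
# All names and credentials below are hypothetical placeholders.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

df = pd.DataFrame({"ORDER_ID": [1001, 1002], "AMOUNT": [250.00, 99.95]})

conn = snowflake.connector.connect(
    account="<account_identifier>",  # hypothetical; supply your own
    user="<user>",
    password="<password>",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    # write_pandas bulk-loads the frame via an internal stage and COPY INTO.
    success, _, nrows, _ = write_pandas(
        conn, df, table_name="RAW_ORDERS", auto_create_table=True
    )
    print(f"loaded={success}, rows={nrows}")
finally:
    conn.close()
```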