We have a 6-month contract-to-hire opening for a motivated Data Engineer with 2–4 years of experience building and supporting production-grade ETL pipelines in Snowflake and Teradata, with strong expertise in advanced SQL and cloud-based data processing. The ideal candidate is a self-starter who thrives in fast-paced, business-critical environments and can partner closely with architects and SMEs to deliver high-quality provider data solutions under tight timelines. 100% Remote.
Job Responsibilities:
Develops and operationalizes data pipelines to make data available for consumption
Engages with the DevSecOps Engineer during continuous integration and continuous deployment
Designs and implements standardized data management procedures around data staging, data ingestion, data preparation, data provisioning, and data destruction
Designs, develops, implements, tests, documents, and operates large-scale, high-volume, high-performance data structures for business intelligence analytics
Designs, develops, and maintains real-time processing applications and real-time data pipelines
Ensures the quality of technical solutions as data moves across internal environments
Provides insight into the changing data environment, data processing, data storage, and utilization requirements for the company and offers suggestions for solutions
Develops, constructs, tests, and maintains architectures using programming languages and tools
Identifies ways to improve data reliability, efficiency, and quality
Uses data to discover tasks that can be automated
Requirements:
Bachelor's degree
2-4 years of experience
Strong experience building ETL pipelines using Snowflake and IDMC
Experience with Big Data
Experience with CI/CD pipelines for data processing, including schema changes