Would you like to help build the data pipeline for the entire organization, sourcing and organizing data and making it accessible to engineering teams, business stakeholders, and external partners? GLG is building a next-generation platform to transform how we work with data across the organization.
Job Responsibilities:
Cleanse, normalize, and improve the quality of data from existing operational systems as well as new data sources flowing through the data platform
Build, monitor, and maintain ETL/ELT pipelines using Python, SQL and Airflow
Design and optimize tables, datasets, and transformations in Snowflake
Develop and support data ingestion workflows, including API integrations, file ingestion, and database connectors
Ensure data quality, reliability, and performance across pipelines
Work in an Agile environment and collaborate with engineering, analytics, and product teams
Use integration tools (e.g., Fivetran, Workato, Informatica, etc.) to onboard new data sources
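To illustrate the cleanse/normalize step in the responsibilities above, here is a minimal sketch in plain Python. The field names (`email`, `notes`, `processed_at`) and the specific rules are illustrative assumptions, not GLG's actual pipeline logic:

```python
from datetime import datetime, timezone

def cleanse_record(raw: dict) -> dict:
    """Normalize a raw record before loading: trim strings,
    lowercase emails, and replace empty values with None."""
    cleaned = {}
    for key, value in raw.items():
        if isinstance(value, str):
            # Empty or whitespace-only strings become explicit nulls.
            value = value.strip() or None
        if key == "email" and value:
            value = value.lower()
        cleaned[key] = value
    # Audit timestamp recording when the record passed through (assumed field).
    cleaned["processed_at"] = datetime.now(timezone.utc).isoformat()
    return cleaned

record = cleanse_record(
    {"name": "  Ada Lovelace ", "email": "ADA@Example.com", "notes": ""}
)
```

In a production setting this kind of transform would typically run as a task inside an Airflow DAG or as a SQL model in Snowflake rather than standalone Python.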
Requirements:
3–5 years of experience as a Data Engineer
Strong hands-on experience with Snowflake, Python, SQL, and Airflow
Experience building data ingestion pipelines across APIs, files, and databases
Strong SQL and data modeling skills, plus a solid understanding of ETL/ELT concepts
Experience working in Agile teams (Scrum, sprint planning, etc.)
Experience with data integration tools is preferred
Knowledge of data engineering best practices across the development lifecycle, including coding standards, code reviews, source management, build processes, testing, and operations
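As a rough sketch of the data-quality and testing practices mentioned above, a pipeline might gate records through simple completeness and uniqueness checks before loading. The required fields and rules here are hypothetical examples, not a prescribed standard:

```python
def check_quality(rows, required=("id", "email")):
    """Return rows failing basic data-quality checks:
    missing required fields or duplicate ids (fields are illustrative)."""
    failures = []
    seen_ids = set()
    for row in rows:
        missing = [field for field in required if not row.get(field)]
        if missing or row.get("id") in seen_ids:
            failures.append(row)
        else:
            seen_ids.add(row["id"])
    return failures

failures = check_quality([
    {"id": 1, "email": "a@example.com"},
    {"id": 1, "email": "b@example.com"},  # duplicate id
    {"id": 2, "email": ""},               # missing email
])
```

Checks like these are usually wrapped in automated tests or pipeline tasks so that bad records are quarantined rather than silently loaded.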