We are seeking a skilled and detail-oriented Data Engineer (4–6 years of experience) to design, build, and optimize scalable data pipelines using modern data stack technologies. The role centers on DBT, Snowflake, SQL, and Python, delivering high-quality, reliable data solutions that support business analytics and decision-making. The ideal candidate will have strong expertise in data modeling, ETL/ELT processes, and cloud-based data warehousing.
Job Responsibilities:
Develop, maintain, and optimize data pipelines using DBT and SQL on Snowflake (see the illustrative sketch after this list)
Collaborate with data analysts, QA, and business teams to build scalable data models
Implement data transformations, testing, and documentation within the DBT framework
Work on Snowflake for data warehousing tasks, including data ingestion, query optimization, and performance tuning
Use Python for automation, scripting, and additional data processing as needed
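For a concrete sense of the DBT work described above, a minimal sketch of an incremental model on Snowflake might look like the following. All model, table, and column names here are hypothetical illustrations, not part of any actual codebase:

-- models/fct_orders.sql (hypothetical): an incremental DBT model on Snowflake
{{ config(
    materialized='incremental',
    unique_key='order_id',
    cluster_by=['order_date']  -- Snowflake clustering key for query pruning
) }}

select
    o.order_id,
    o.customer_id,
    o.order_date,
    sum(li.amount) as order_total
from {{ ref('stg_orders') }} o
join {{ ref('stg_line_items') }} li on li.order_id = o.order_id
{% if is_incremental() %}
-- On incremental runs, process only rows newer than the latest already loaded
where o.order_date > (select max(order_date) from {{ this }})
{% endif %}
group by 1, 2, 3

The testing and documentation responsibilities mentioned above would typically live alongside such a model in a DBT schema.yml file (for example, unique and not_null tests on order_id).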
Requirements:
4+ years of experience in data engineering or related roles
Strong hands-on experience with DBT (Data Build Tool) and advanced SQL
Experience working with Snowflake or similar modern cloud data warehouses
Good understanding of data modeling concepts such as dimensional modeling and normalization (a minimal star-schema sketch follows this list)
Strong knowledge of ETL/ELT processes and data pipeline design
Experience with Python for data processing and automation (strongly preferred)
Ability to work with cross-functional teams and deliver scalable data solutions
Strong analytical and problem-solving skills
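By way of illustration, the dimensional modeling called for above might produce a star-schema pair like the sketch below. Table and column names are hypothetical, and note that Snowflake treats the key constraints as metadata rather than enforcing them:

-- Hypothetical star schema: a conformed dimension plus a fact table keyed to it
create table dim_customer (
    customer_key integer identity primary key,  -- surrogate key
    customer_id  varchar not null,              -- natural/business key
    region       varchar,
    valid_from   timestamp_ntz,                 -- slowly-changing-dimension tracking
    valid_to     timestamp_ntz
);

create table fct_orders (
    order_id     varchar not null,
    customer_key integer references dim_customer (customer_key),
    order_date   date,
    order_total  number(12,2)
);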
Nice to have:
Experience with other cloud platforms (AWS, Azure, GCP)
Familiarity with data orchestration tools (Airflow or similar)
Knowledge of data governance, testing frameworks, and data quality tools
Exposure to BI/reporting tools (Power BI, Tableau)