The dbt Engineer role requires 3–6 years of experience in Data Engineering, with a focus on dbt and Snowflake. Candidates should possess strong SQL skills and be familiar with ETL processes. Responsibilities include translating Informatica mappings into dbt models, ensuring data quality, and collaborating with cross-functional teams. A bachelor's degree in Computer Science or a related field is required.
Job Responsibilities:
Translate Informatica mappings, transformations, and business rules into dbt models (SQL) on Snowflake (see the illustrative sketch after this list)
Design and implement staging, core, and mart layers using standard dbt patterns and folder structures
Develop and maintain dbt tests (schema tests, data tests, custom tests) to ensure data quality and integrity
Implement snapshots, seeds, macros, and reusable components where appropriate
Collaborate with Snowflake developers to ensure physical data structures support dbt models efficiently
Work with functional teams to confirm that new dbt outputs are functionally equivalent to legacy Informatica outputs
Participate in performance tuning of dbt models and Snowflake queries
Integrate dbt with CI/CD pipelines (e.g., Azure DevOps, GitHub Actions) for automated runs and validations
Contribute to documentation of dbt models, data lineage, and business rules
Participate in defect analysis, bug fixes, and enhancements during migration and stabilization phases
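To give a concrete sense of the dbt work described above, here is a minimal sketch of a staging model and a singular data test. The source definition ('erp', 'customers'), the model name stg_customers, and the column names are hypothetical placeholders, not part of this role's actual codebase; only the pattern (a CTE-based staging model plus a test that returns violating rows) follows standard dbt conventions.

```sql
-- models/staging/stg_customers.sql
-- Hypothetical staging model: assumes a source named ('erp', 'customers')
-- is declared in a sources YAML file; column names are illustrative only.
with source as (

    select * from {{ source('erp', 'customers') }}

),

renamed as (

    select
        customer_id,
        upper(trim(country_code))          as country_code,
        cast(created_at as timestamp_ntz)  as created_at_utc   -- Snowflake timestamp type
    from source

)

select * from renamed
```

```sql
-- tests/assert_no_future_created_at.sql
-- Singular dbt data test: dbt flags the test as failed if this query
-- returns any rows, i.e. any record with a created_at in the future.
select customer_id, created_at_utc
from {{ ref('stg_customers') }}
where created_at_utc > current_timestamp()
```

In a typical project, the same model would also carry schema tests (e.g. not_null and unique on customer_id) defined in YAML, and both kinds of tests would run together via dbt test in the CI/CD pipeline mentioned above.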
Requirements:
3–6 years of experience in Data Engineering / ETL / DW
1–3+ years working with dbt (Core or Cloud)
Strong SQL skills, especially on Snowflake or another modern cloud DW