Role: Lead Data Engineer (Snowflake, dbt, and Qlik)
Job Responsibilities:
Design, develop, and maintain robust and scalable data transformation pipelines using dbt on the Snowflake platform
Create and utilize Jinja-based dbt macros to promote code reusability, modularity, and dynamic SQL generation within dbt projects
Implement and manage data transformation and orchestration pipelines in dbt, integrating with various data sources and ensuring efficient data flow
Utilize advanced dbt concepts, including macros, materializations (e.g., incremental, view, table), snapshots, and configurations to build efficient data models
Write highly optimized and complex SQL queries for data manipulation, cleaning, aggregation, and transformation within dbt models
Implement and enforce best practices for dbt project structure, version control (Git), documentation, and testing
Collaborate with data analysts, engineers, and business stakeholders to understand data requirements and translate them into effective data models (e.g., star schema, snowflake schema)
Design and implement logical and physical data models within dbt to support analytical and reporting needs
Leverage Snowflake features and functionalities for performance optimization, including virtual warehouses, clustering, caching, and query optimization
Manage and optimize data ingestion and integration processes from various sources into Snowflake
Ensure data quality, integrity, and lineage throughout the data transformation process
Implement and maintain dbt tests to ensure data quality, integrity, and adherence to business rules
Implement and maintain data governance policies and procedures within the dbt environment
Develop and execute automated tests for dbt models to ensure data accuracy and reliability
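The macro, materialization, and incremental-model work described above can be sketched in a minimal dbt example; the macro, model, source, and column names here are hypothetical illustrations, not part of this role's actual codebase:

```sql
-- macros/cents_to_dollars.sql
-- Hypothetical Jinja macro: one place to define a conversion reused across models
{% macro cents_to_dollars(column_name, precision=2) %}
    round({{ column_name }} / 100.0, {{ precision }})
{% endmacro %}

-- models/fct_orders.sql
-- Hypothetical incremental model on Snowflake using the macro above
{{
    config(
        materialized='incremental',
        unique_key='order_id'
    )
}}

select
    order_id,
    customer_id,
    {{ cents_to_dollars('amount_cents') }} as amount_usd,
    updated_at
from {{ source('shop', 'raw_orders') }}

{% if is_incremental() %}
  -- On incremental runs, process only rows newer than those already loaded
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

Data-quality responsibilities in this list are typically enforced alongside such a model with dbt's built-in tests (e.g., `unique` and `not_null` on `order_id`) declared in the project's YAML schema files.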
Requirements:
Proven hands-on experience with dbt in a production environment, including extensive use of macros and advanced modeling techniques
Expert-level proficiency in SQL for data querying, manipulation, and transformation
Strong experience with Snowflake, including performance tuning and optimization
Solid understanding of data warehousing concepts and ETL/ELT processes
Experience with version control systems, particularly Git
Familiarity with data modeling principles (star schema, snowflake schema)
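As a sketch of the dimensional-modeling requirement, a fact table in a star schema might join staged transactions to conformed dimensions via dbt's `ref()`; all model and column names below are hypothetical:

```sql
-- models/marts/fct_sales.sql
-- Hypothetical star-schema fact model: one row per sale,
-- surrogate keys resolved against conformed dimension models
with sales as (
    select * from {{ ref('stg_sales') }}
),
customers as (
    select * from {{ ref('dim_customers') }}
),
products as (
    select * from {{ ref('dim_products') }}
)

select
    s.sale_id,
    c.customer_key,
    p.product_key,
    s.sale_date,
    s.quantity,
    s.quantity * s.unit_price as gross_revenue
from sales s
left join customers c on s.customer_id = c.customer_id
left join products p on s.product_id = p.product_id
```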