The Data Engineer role involves developing and maintaining DBT models, SQL scripts, and scalable ELT/ETL pipelines. Candidates should have strong proficiency in Snowflake and SQL, with experience in Airflow development. Responsibilities include optimizing data models, managing warehouse performance, and implementing data quality checks and security measures. This position is hybrid and based in Oaks, PA.
Job Responsibilities:
Develop and maintain DBT models, macros, and SQL scripts to transform data within Snowflake
Optimize data models, design star/snowflake schemas, manage warehouse performance, and implement clustering/materialized views
Create scalable ELT/ETL pipelines to ingest and transform data from diverse sources
Write modular, testable SQL code using version control and manage DBT project structures
Implement data quality checks, automated tests, anomaly detection, and data security, including RBAC, masking, and row-level access in Snowflake
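To illustrate the kind of DBT-on-Snowflake work described above, here is a minimal sketch of a dbt model; the model name, the upstream `stg_orders` model, and all column names are hypothetical, not part of this posting:

```sql
-- models/marts/fct_orders.sql
-- Hypothetical dbt model: materialized as a table in Snowflake and
-- clustered on order_date so large scans can prune partitions.
{{ config(
    materialized='table',
    cluster_by=['order_date']
) }}

select
    order_id,
    customer_id,
    order_date,
    sum(line_amount) as order_total
from {{ ref('stg_orders') }}
group by
    order_id,
    customer_id,
    order_date
```

Data-quality checks of the sort mentioned above would typically be declared alongside this model (for example, `unique` and `not_null` tests on `order_id` in the project's schema YAML) and run with `dbt test`.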
Requirements:
5+ years of experience with Snowflake and strong SQL proficiency
5+ years of hands-on experience developing with DBT