Role - Data Engineer
Skills - Snowflake, Apache Airflow, dbt, Spark/PySpark, SQL, and data engineering concepts
Experience - 4-6 years
Duration - 3 months
Job Responsibilities:
Design, build, test and operationalize scalable data pipelines and cloud-native data platforms leveraging Snowflake, Apache Airflow, dbt, and Spark/PySpark
Build scalable data processing frameworks using Spark/PySpark for large-volume structured datasets
Design, implement, and optimize cloud data warehouse solutions on Snowflake
Develop modular, testable transformations using dbt, implementing reusable models, snapshots, and data tests
Perform robust testing across multiple layers of the data processing pipeline
Enable CI/CD for data pipelines by integrating Git and deployment workflows