We are looking for an experienced Data Engineer (Python & Snowflake) to design, develop, and maintain scalable data pipelines and analytical data models. The role involves working extensively with Snowflake as the core data cloud platform and Python for automation and data transformation. The ideal candidate will have strong SQL skills, hands‑on experience with cloud data ecosystems, and the ability to collaborate with business and analytics teams to deliver reliable, high‑quality data solutions.
Job Responsibilities:
Design and develop end-to-end ETL/ELT data pipelines to ingest data from APIs, S3, and relational databases into Snowflake
Build and optimize data models (Star/Snowflake schemas) for reporting and analytics use cases
Leverage Python for data transformation, automation, and custom workflow development
Improve Snowflake performance using clustering keys, micro-partition pruning, query optimization, and warehouse sizing
Implement data quality, validation rules, and governance controls (including RBAC)
Collaborate with Data Scientists, Analysts, and cross-functional teams to deliver scalable data solutions
Manage workflow orchestration using tools such as Airflow, Prefect, or Dagster
Implement modular SQL transformations using dbt (data build tool)
Integrate Snowflake with AWS services such as S3, IAM, and Lambda
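As an illustration of the data-quality and validation work described above, here is a minimal sketch of rule-based row validation in Python. The column names and rules ("order_id", "amount", "currency") are hypothetical examples, not taken from this posting; in practice such checks would sit in a pipeline step before rows are loaded into Snowflake.

```python
from typing import Any, Callable

# Hypothetical validation rules keyed by column name; each rule returns
# True when the value is acceptable. These names are illustrative only.
RULES: dict[str, Callable[[Any], bool]] = {
    "order_id": lambda v: isinstance(v, int) and v > 0,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
    "currency": lambda v: v in {"USD", "EUR", "GBP"},
}

def validate_row(row: dict[str, Any]) -> list[str]:
    """Return the names of columns that fail validation (empty means valid)."""
    failures = []
    for column, rule in RULES.items():
        if column not in row or not rule(row[column]):
            failures.append(column)
    return failures

def split_valid_invalid(rows):
    """Partition rows into (valid, invalid) before loading into the warehouse."""
    valid, invalid = [], []
    for row in rows:
        (valid if not validate_row(row) else invalid).append(row)
    return valid, invalid
```

In a production pipeline the invalid partition would typically be routed to a quarantine table or dead-letter location rather than discarded.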
Requirements:
Strong hands-on experience with Snowflake architecture, including Snowpipe, Tasks, Streams, and the separation of storage and compute
High proficiency in SQL/SnowSQL, including complex joins, CTEs, window functions, and query optimization
Strong experience with Python for data processing, automation, and API integrations
Experience with orchestration tools such as Airflow, Prefect, or Dagster
Proficiency with dbt for SQL transformations and data modeling
Strong understanding of AWS cloud services (S3, IAM, Lambda) and integration with Snowflake
Experience using Git and CI/CD pipelines
Strong analytical and problem-solving abilities
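The SQL skills listed above (CTEs, window functions) can be sketched in a self-contained example. SQLite is used here only so the snippet runs locally without warehouse credentials; the same constructs carry over to Snowflake SQL. The table and column names are illustrative, not from the posting.

```python
import sqlite3

# In-memory database standing in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('EMEA', 100.0), ('EMEA', 250.0), ('APAC', 80.0), ('APAC', 40.0);
""")

query = """
WITH regional AS (                                -- CTE: pre-aggregate per region
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
)
SELECT region,
       total,
       RANK() OVER (ORDER BY total DESC) AS rnk   -- window function
FROM regional
ORDER BY rnk;
"""
rows = conn.execute(query).fetchall()
# rows -> [('EMEA', 350.0, 1), ('APAC', 120.0, 2)]
```

Window functions require SQLite 3.25+, which ships with current Python builds; in Snowflake the query would run unchanged apart from the connection setup.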
Nice to have:
Experience with additional cloud platforms (Azure, GCP)
Knowledge of data lake frameworks such as Delta Lake or Iceberg
Experience with Docker or containerized environments
Exposure to BI tools such as Tableau, Power BI, or Looker
Knowledge of Kafka, Kinesis, or other streaming technologies
Understanding of data security, compliance frameworks, and encryption standards