The Data Engineer role requires expertise in Python, Kafka, and Snowflake, with a focus on data modeling and performance optimization. Candidates should hold a Bachelor’s or Master’s degree in a related field and have 5–7 years of data engineering experience. Strong SQL troubleshooting skills and familiarity with CI/CD practices are essential. Join our innovative team to drive data initiatives forward.
Requirements:
Bachelor’s or Master’s degree in Computer Science, Applied Mathematics, Engineering, or a related quantitative field
5–7 years of professional hands-on coding experience in collaborative, team-based environments
Strong troubleshooting skills in SQL and scripting
Proficiency in Python or Java
Deep familiarity with SDLC, CI/CD best practices, and Kubernetes deployment
Expertise in temporal data modeling (e.g., SCD Type 2)
Schema management with a focus on schema evolution (Apache Iceberg)
Performance optimization through data partitioning and clustering
Solid grasp of architectural concepts: normalization/denormalization and natural vs. surrogate keys
Knowledge of technologies: Python, Kafka, ANSI SQL, FTP, Apache Spark, JSON, Avro, Parquet, Hadoop (HDFS/Hive), Snowflake, Apache Iceberg, Sybase IQ
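To illustrate the temporal data modeling mentioned in the requirements, here is a minimal SCD Type 2 sketch in plain Python (the `CustomerRow` type and `apply_scd2_update` helper are hypothetical, not part of any toolchain named above): rather than overwriting a changed attribute, the current row is expired and a new version is appended, preserving full history.

```python
from dataclasses import dataclass, replace
from datetime import date
from typing import List, Optional

# Hypothetical dimension row with SCD Type 2 validity columns.
@dataclass
class CustomerRow:
    customer_id: int
    city: str
    valid_from: date
    valid_to: Optional[date] = None  # None means this is the current version
    is_current: bool = True

def apply_scd2_update(history: List[CustomerRow], customer_id: int,
                      new_city: str, change_date: date) -> List[CustomerRow]:
    """Close the current row and append a new version (SCD Type 2)."""
    out = []
    for row in history:
        if row.customer_id == customer_id and row.is_current:
            # Expire the old version instead of overwriting it.
            out.append(replace(row, valid_to=change_date, is_current=False))
        else:
            out.append(row)
    out.append(CustomerRow(customer_id, new_city, valid_from=change_date))
    return out
```

In a warehouse such as Snowflake the same pattern is typically expressed with a `MERGE` statement over `valid_from`/`valid_to` columns; the sketch only shows the versioning logic itself.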
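The partitioning-based performance optimization listed above can also be sketched in a few lines of plain Python (all names here are illustrative): records are grouped by a partition key so that a query touching one key reads only its partition, which is the same pruning idea engines like Spark, Hive, or Snowflake apply to on-disk data.

```python
from collections import defaultdict
from typing import Dict, List

def partition_by(records: List[dict], key: str) -> Dict[str, List[dict]]:
    """Group records into partitions keyed by one column's value."""
    partitions = defaultdict(list)
    for rec in records:
        partitions[rec[key]].append(rec)
    return dict(partitions)

def scan(partitions: Dict[str, List[dict]], wanted: str) -> List[dict]:
    """Read only the matching partition; all others are skipped (pruned)."""
    return partitions.get(wanted, [])

# Hypothetical event data, partitioned by event_date.
events = [
    {"event_date": "2024-01-01", "user": "a"},
    {"event_date": "2024-01-02", "user": "b"},
    {"event_date": "2024-01-01", "user": "c"},
]
parts = partition_by(events, "event_date")
```

Clustering within a partition (e.g. Snowflake clustering keys) extends the same idea by keeping similar rows physically adjacent so range scans touch fewer micro-partitions.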