The DDSI Data Engineering (DE) team is seeking a Data Engineer to design, develop, enhance, and implement systems for multiple analytical applications. The work will involve various data engineering activities such as maintaining and enhancing data acquisition processes, data validation processes, ETL/ELT data pipelines, and overall process notification methods.
Job Responsibilities:
Participating in data requirements gathering
Source-to-target mapping
Data validation scripting and review
Developing and monitoring ETL/ELT data pipelines
Producing datasets as input to science models and visualizations
Participating in the team's technical reviews to ensure quality deliverables
Developing an understanding of the business domain and processes
Communicating data engineering progress to the project leadership team
Actively participating in meetings and discussions
Requirements:
Bachelor's degree (Computer Science, Mathematics, Software Engineering or related field, or equivalent experience)
5+ years of experience with ELT/ETL development using SQL and Python
2+ years of experience developing and maintaining data pipeline processing with a framework such as Apache Flink, Apache Beam, or Kafka Streams
Experience and understanding of one or more business domains to assist in gathering/refining data requirements and data design solutions
Experience developing in a multi-environment setup (Dev, QA, Prod, etc.) and with DevOps procedures for code deployment/promotion
Strong understanding of database design and proficiency utilizing various database platforms, such as PostgreSQL or Snowflake
Experience managing and deploying code using a source control product such as GitLab/GitHub
Able to effectively formulate solutions and communicate complex technical concepts to non-technical team members
Heavy hands-on use of Snowflake, SQL, Python, and Flink
Multi-environment DevOps experience
Code promotion and code management experience required
Nice to have:
Master's degree (Computer Science, Mathematics, Engineering, or related field) preferred
Experience with infrastructure as code, Docker and containerization, and elastic scaling with Kubernetes or a similar framework, as well as with AWS
Experience working with large datasets and big data technologies, preferably cloud-based, such as Snowflake, Databricks, or similar
Demonstrated proficiency with API development
Knowledgeable about cloud architecture and product offerings, preferably AWS