The DDSI Data Engineering (DE) team is seeking a Data Engineer to design, develop, enhance, and implement systems for multiple analytical applications. The work will involve various data engineering activities such as maintaining and enhancing data acquisition processes, data validation processes, ETL/ELT data pipelines, and overall process notification methods.
Job Responsibilities:
Work assignments may cover activities such as participation in data requirements gathering, source-to-target mapping, data validation scripting and review, developing and monitoring ETL/ELT data pipelines, and producing datasets as input to science models and visualizations
As part of a data engineering team, this role will be included in technical reviews of the team to ensure quality deliverables
In addition to technical capabilities, the role is responsible for developing an understanding of the business domain and processes, then applying that knowledge to the assigned deliverables
This role communicates data engineering progress to the project leadership team and actively participates in meetings and discussions
Requirements:
Bachelor's degree (Computer Science, Mathematics, Software Engineering or related field, or equivalent experience)
5+ years of experience with ETL/ELT development using SQL and Python
2+ years of experience developing and maintaining data pipeline processing with a framework such as Apache Flink, Beam or Kafka Streams
Experience with and understanding of one or more business domains to assist in gathering/refining data requirements and designing data solutions
Experience developing in a multi-environment setup (Dev, QA, Prod, etc.) and with DevOps procedures for code deployment/promotion
Strong understanding of database design and proficiency with various database platforms, such as PostgreSQL or Snowflake
Experience managing and deploying code using a source control product such as GitLab/GitHub
Able to formulate solutions effectively and communicate complex technical concepts to non-technical team members
Nice to have:
Master's degree (Computer Science, Mathematics, Engineering or related field preferred)
Experience with Infrastructure as Code, Docker and containerization, elastic scaling with Kubernetes or a similar framework, and AWS
Experience working with large datasets and big data technologies, preferably cloud-based, such as Snowflake, Databricks, or similar
Demonstrated proficiency with API development
Knowledgeable about cloud architecture and product offerings, preferably AWS