We are seeking a Senior Data Engineer – Ingest to help transform data into meaningful insights and power innovation across the organization. In this role, you will work with a collaborative team of technologists to build scalable data solutions, integrate diverse data sources, and strengthen the core data platform. Your engineering expertise will directly support analytics, data science, operations, and key business stakeholders. If you’re passionate about building high‑quality data systems that make a measurable impact, this role offers the opportunity to shape the future of a large, data‑driven organization.
Job Responsibilities:
Maintain, update, and expand configuration‑driven data pipelines within the core data platform
Build tools and services supporting data discovery, lineage, governance, and privacy
Partner with software engineers, data engineers, architects, and product managers to deliver reliable and scalable data solutions
Help define and document data standards, naming conventions, pipeline best practices, and system guidelines
Ensure the reliability, accuracy, and operational efficiency of datasets to meet SLAs
Participate in Agile/Scrum ceremonies and contribute to ongoing process improvements
Collaborate closely with users and stakeholders to understand needs and prioritize enhancements
Maintain detailed technical documentation to support data quality, governance, and compliance requirements
Requirements:
5+ years of Data Engineering experience
Strong SQL expertise
Hands-on experience with Databricks
Proficiency with Python and Snowflake
Experience working in Agile environments
Data analysis capabilities
Bachelor’s degree in Computer Science, Information Systems, or equivalent experience
Nice to have:
7+ years developing large-scale data pipelines
Experience with at least one major programming language (Python, Java, or Scala)
Experience with pipeline orchestration tools (Airflow preferred)
Cloud experience (AWS preferred), Kubernetes a plus
Experience with data ingestion tools (Fivetran, Airbyte, DataFlow, Matillion, etc.)
Strong understanding of cloud infrastructure and infrastructure-as-code (IaC) practices
Familiarity with data modeling and data warehousing methodologies
Strong algorithmic problem-solving skills
Excellent written and verbal communication skills
Experience with distributed data processing, data services, or data modeling
Familiarity with Scrum/Agile methodologies
Ability to learn quickly and work independently with high attention to detail