We are seeking a Data Engineer for an assignment delivered in a hybrid setup, where approximately half of the work is expected to be carried out on-site.
Job Responsibilities:
Work with technologies such as Databricks, including Apache Spark, Delta Lake, notebooks, job orchestration, and performance optimization, as well as Microsoft Azure services such as Azure Data Lake Storage
Work with DevOps and CI/CD practices, including version control, automated testing, deployment pipelines, and infrastructure as code
Work in the early stages of solution design and contribute throughout the full data engineering lifecycle, including ideation, high-level and low-level architecture, requirements specification, functional and technical design, estimation, sprint planning, development, testing, documentation, deployment, and operational follow-up
Create clean, scalable, resilient, and cost-efficient data solutions using modern cloud-based platforms, with a strong focus on maintainability and reusability
Clearly explain data architectures, pipelines, and trade-offs to non-technical stakeholders
Requirements:
Proven experience applying DevOps methodologies, including version control (for example Git), CI/CD workflows, automated testing, promotion across environments, and infrastructure as code within data platform contexts
Strong familiarity with Microsoft Azure, especially services typically used in data platforms such as Azure Data Lake Storage
Practical experience working with Databricks, including Apache Spark, Delta Lake, job orchestration, performance tuning, and environment management
Demonstrated experience in developing and delivering complete data pipelines, both batch and streaming, covering ingestion, transformation, and serving layers, following established practices in data modeling and lakehouse architecture
Solid understanding of data engineering principles and standards, including release workflows and quality requirements such as data validation, performance tuning, monitoring, and issue resolution in production environments