We are looking for a highly skilled Azure Data Engineer with expert knowledge in cloud infrastructure and DevOps automation. This critical hybrid role will be responsible for designing, building, optimizing, and automating our entire end-to-end data platform within the Microsoft Azure ecosystem. The ideal candidate will ensure our data solutions are scalable, reliable, and deployed using modern Infrastructure as Code (IaC) and CI/CD practices.
Job Responsibilities:
Design & Implement ETL/ELT: Develop, optimize, and maintain scalable data pipelines using Python, SQL, and core Azure data services
Azure Data Services Management: Architect and manage key Azure data components, including:
Data Lakes: Provisioning and structuring data within Azure Data Lake Storage (ADLS Gen2)
Data Processing: Implementing data transformation and analysis logic using Azure Data Factory (ADF), Azure Synapse Pipelines, and Azure Databricks (using Spark/PySpark)
Data Warehousing: Designing and optimizing the enterprise Data Warehouse in Azure Synapse Analytics (SQL Pool)
Data Modeling and Quality: Define and enforce data modeling standards and implement data quality checks within the pipelines
Infrastructure as Code (IaC): Design, manage, and provision all Azure data resources (ADLS, Synapse, ADF, Databricks Clusters) using Terraform or Azure Resource Manager (ARM) Templates/Bicep
CI/CD Implementation: Build and maintain automated Continuous Integration/Continuous Deployment (CI/CD) pipelines for all code (data, infrastructure, and application) using Azure DevOps or GitHub Actions
Containerization & Compute: Utilize Docker and manage deployment environments using Azure Kubernetes Service (AKS) or Azure Container Instances (ACI) when required for data applications
Monitoring, Logging, & Security: Configure comprehensive monitoring and alerting using Azure Monitor and Log Analytics. Implement network security and access controls (RBAC) across the data platform
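To illustrate the "data quality checks within the pipelines" responsibility above, here is a minimal, hypothetical Python sketch of a pipeline-side quality gate. All names (`validate_records`, the required fields) are illustrative assumptions, not part of any actual codebase for this role; in practice such checks would typically run inside a Databricks or ADF activity.

```python
# Hypothetical sketch: a simple quality gate splitting incoming records
# into valid rows (loaded downstream) and rejected rows (quarantined).
from typing import Iterable

REQUIRED_FIELDS = ("id", "event_time", "amount")  # illustrative schema

def validate_records(records: Iterable[dict]) -> tuple[list[dict], list[dict]]:
    """Return (valid, rejected) lists based on simple completeness rules."""
    valid, rejected = [], []
    for rec in records:
        missing = [f for f in REQUIRED_FIELDS if rec.get(f) is None]
        # Reject rows with missing required fields or a non-numeric amount
        if missing or not isinstance(rec.get("amount"), (int, float)):
            rejected.append(rec)
        else:
            valid.append(rec)
    return valid, rejected
```

In a real pipeline the rejected rows would be written to a quarantine location in ADLS and surfaced through Azure Monitor alerts rather than silently dropped.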
Requirements:
Strong hands-on experience designing and deploying end-to-end data solutions specifically within the Azure ecosystem
High proficiency in Python (including PySpark) and expert knowledge of SQL
Proven, production-level experience with Terraform (preferred) or ARM/Bicep for automating Azure infrastructure deployment
Experience setting up CI/CD workflows using Azure DevOps Pipelines or GitHub Actions
Deep working knowledge of Azure Data Factory, Azure Databricks, and Azure Synapse Analytics
Experience with workflow orchestration tools like Azure Data Factory or Apache Airflow
Nice to have:
Azure certifications such as Azure Data Engineer Associate (DP-203) or Azure DevOps Engineer Expert (AZ-400)
Familiarity with Data Governance tools such as Azure Purview
Experience with real-time data ingestion using Azure Event Hubs or Azure Stream Analytics