The Azure Data Factory Developer role focuses on designing and maintaining data pipelines using Azure services and SQL. Candidates should have a Bachelor's degree and at least 3 years of experience in data engineering. Strong SQL skills and familiarity with Azure Data Factory are essential. Azure certifications are a plus.
Job Responsibilities:
Design, develop, and deploy ETL (Extract, Transform, Load) pipelines and data-driven workflows using Azure Data Factory to ingest and process data from disparate sources
Write and optimize complex SQL queries, stored procedures, views, and functions for efficient data extraction, transformation, and loading across large datasets
Create and manage ADF linked services, datasets, data flows (mapping and wrangling), and integration runtimes to connect and transform data between various sources and targets, including SQL Server, Azure SQL Data Warehouse/Synapse, and Azure Data Lake Storage
Implement data workflows and orchestration logic in ADF, ensuring scalability and reliability
Monitor, troubleshoot, and optimize pipeline performance, resolving failures and implementing logging and retry mechanisms to meet service level agreements (SLAs)
Collaborate with data architects, analysts, and stakeholders to understand business requirements and deliver high-quality data solutions
Document ETL processes, pipeline configurations, and data flow diagrams
Ensure data security, compliance, and governance standards are met within the Azure environment, including secure coding practices and access controls
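The logging-and-retry responsibility above can be sketched generically. The following Python helper is illustrative only, not an ADF API: the function name `run_with_retry` and its parameters are hypothetical, and ADF itself configures retries declaratively on activities. It simply shows the pattern of logging each failure and re-invoking a transient activity.

```python
import logging
import time
from typing import Callable, TypeVar

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("pipeline")

T = TypeVar("T")

def run_with_retry(activity: Callable[[], T], name: str,
                   max_attempts: int = 3, delay_seconds: float = 0.0) -> T:
    """Run a pipeline activity, logging each failure and retrying.

    Hypothetical helper mirroring the retry policy an ADF activity
    would declare (retry count, retry interval).
    """
    for attempt in range(1, max_attempts + 1):
        try:
            result = activity()
            logger.info("activity %s succeeded on attempt %d", name, attempt)
            return result
        except Exception as exc:
            logger.warning("activity %s failed on attempt %d: %s",
                           name, attempt, exc)
            if attempt == max_attempts:
                # Exhausted retries: surface the failure so monitoring
                # and SLA alerting can pick it up.
                raise
            time.sleep(delay_seconds)
    raise RuntimeError("unreachable")
```

In ADF proper, the equivalent knobs are the activity's `retry` and `retryIntervalInSeconds` settings plus pipeline-run monitoring; the sketch just makes the control flow explicit.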
Requirements:
Bachelor's degree in a relevant field
3+ years in a data engineering role
Strong SQL proficiency
Expertise in Azure Data Factory
Knowledge of related Azure data components like Data Lake (Gen2) and Synapse Analytics
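As a rough illustration of the SQL proficiency expected, the snippet below runs an aggregation of the kind typically embedded in an ADF copy activity or data-flow source. It uses Python's built-in sqlite3 purely as a stand-in engine; the `sales` table and its columns are hypothetical.

```python
import sqlite3

# In-memory database as a stand-in for SQL Server / Azure SQL;
# table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('EMEA', 100.0), ('EMEA', 250.0), ('APAC', 75.0);
""")

# Per-region aggregate: the shape of transformation query an ADF
# developer writes when shaping source data before loading.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('APAC', 75.0), ('EMEA', 350.0)]
```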