The SQL + ADF Developer role involves designing and maintaining ETL pipelines using Azure Data Factory. Candidates should have strong SQL skills and experience with data integration best practices. Responsibilities include optimizing SQL queries, monitoring ADF pipelines, and ensuring compliance with data governance standards.
Job Responsibilities:
Design, develop, and maintain robust ETL pipelines using Azure Data Factory (ADF) for data integration
Write and optimize complex SQL queries for data extraction, transformation, and loading across large datasets
Implement data workflows and orchestrations in ADF, ensuring scalability and reliability
Collaborate with data architects and analysts to deliver high-quality data solutions aligned with business needs
Monitor and troubleshoot ADF pipelines, resolving failures and optimizing performance
Ensure data security, compliance, and governance standards are met within the Azure environment
Perform performance tuning for SQL queries and ADF activities to meet SLA requirements
Document ETL processes, pipeline configurations, and data flow diagrams for transparency and maintainability
Support production deployments and provide on-call assistance during the 2 PM–11 PM IST shift
Continuously explore and implement best practices for Azure-based data engineering solutions
Requirements:
Strong proficiency in SQL, including advanced query optimization and stored procedures
Hands-on experience with Azure Data Factory (ADF) for building and managing data pipelines
Exposure to Azure ecosystem components such as Data Lake, Synapse, and related services
Solid understanding of ETL concepts and data integration best practices
Ability to troubleshoot and optimize ADF workflows for performance and reliability
Familiarity with version control systems (Git) and CI/CD pipelines for data solutions
Knowledge of data governance, security, and compliance within cloud environments
Strong analytical and problem-solving skills with attention to detail
Excellent communication skills and the ability to work effectively during the 2 PM–11 PM IST shift
Bachelor's degree in Computer Science and 3–5 years of relevant experience
Nice to have:
Experience with Microsoft Fabric (Lakehouse, OneLake) is a strong plus