SQL + ADF developer

NTT DATA

Location:
India, Remote

Contract Type:
Not provided

Salary:
Not provided

Job Description:

The SQL + ADF Developer role involves designing and maintaining ETL pipelines using Azure Data Factory. Candidates should have strong SQL skills and experience with data integration best practices. Responsibilities include optimizing SQL queries, monitoring ADF pipelines, and ensuring compliance with data governance standards.

Job Responsibility:

  • Design, develop, and maintain robust ETL pipelines using Azure Data Factory (ADF) for data integration
  • Write and optimize complex SQL queries for data extraction, transformation, and loading across large datasets
  • Implement data workflows and orchestrations in ADF, ensuring scalability and reliability
  • Collaborate with data architects and analysts to deliver high-quality data solutions aligned with business needs
  • Monitor and troubleshoot ADF pipelines, resolving failures and optimizing performance
  • Ensure data security, compliance, and governance standards are met within the Azure environment
  • Perform performance tuning for SQL queries and ADF activities to meet SLA requirements
  • Document ETL processes, pipeline configurations, and data flow diagrams for transparency and maintainability
  • Support production deployments and provide on-call assistance during the 2 PM–11 PM IST shift
  • Continuously explore and implement best practices for Azure-based data engineering solutions

Requirements:

  • Strong proficiency in SQL, including advanced query optimization and stored procedures
  • Hands-on experience with Azure Data Factory (ADF) for building and managing data pipelines
  • Exposure to Azure ecosystem components such as Data Lake, Synapse, and related services
  • Solid understanding of ETL concepts and data integration best practices
  • Ability to troubleshoot and optimize ADF workflows for performance and reliability
  • Familiarity with version control systems (Git) and CI/CD pipelines for data solutions
  • Knowledge of data governance, security, and compliance within cloud environments
  • Strong analytical and problem-solving skills with attention to detail
  • Excellent communication skills and ability to work effectively in a 2 PM–11 PM IST shift
  • A bachelor's degree in Computer Science is required, along with 3-5 years of relevant experience

Nice to have:

Experience with Microsoft Fabric (Lakehouse, OneLake) is a strong plus

Additional Information:

Job Posted:
January 25, 2026

Employment Type:
Full-time
Work Type:
Remote work

Similar Jobs for SQL + ADF developer

Software Engineer-Snowflake

Join our Snowflake Managed Services team as a Software Engineer to work on data ...
Location:
India, Hyderabad
Salary:
Not provided
Genzeon
Expiration Date:
Until further notice
Requirements:
  • 4+ years of hands-on experience in Snowflake development and support
  • Strong SQL, data modeling, and performance tuning experience
  • Exposure to CI/CD pipelines and scripting languages (e.g., Python, Shell)
  • Understanding of Snowflake security (RBAC), warehouse sizing, cost controls
  • Experience with data pipelines and orchestration tools (Airflow, dbt, ADF)
Job Responsibility:
  • Design and develop Snowflake pipelines, data models, and transformations
  • Provide L2/L3 production support for Snowflake jobs, queries, and integrations
  • Troubleshoot failed jobs, resolve incidents, and conduct root cause analysis (RCA)
  • Tune queries, monitor warehouses, and help optimize Snowflake usage and cost
  • Handle service requests like user provisioning, access changes, and role management
  • Participate in code reviews, deployment pipelines, and continuous improvement
  • Document issues, enhancements, and standard procedures (runbooks)

Cloud Big-data Engineer

An expert with 4-5 years of experience in Hadoop ecosystem and cloud- (AWS ecosy...
Location:
United States, Starkville; Dover; Minneapolis
Salary:
45.00 USD / Hour
PhasorSoft Group
Expiration Date:
Until further notice
Requirements:
  • 4-5 years of experience in Hadoop ecosystem and cloud (AWS ecosystem/Azure)
  • Experience working with in-memory computing using R, Python, Spark, PySpark, Kafka, and Scala
  • Experience in parsing and shredding XML and JSON, shell scripting, and SQL
  • Experience working with Hadoop ecosystem - HDFS, Hive
  • Experience working with AWS ecosystem - S3, EMR, EC2, Lambda, CloudFormation, CloudWatch, SNS/SQS
  • Experience with Azure – Azure Data Factory (ADF)
  • Experience working with SQL and NoSQL databases
  • Experience designing and developing data sourcing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking, and matching
  • Work Authorization: H1, GC, US Citizen

Senior Core System Analyst

Allianz Technology is the global IT service provider for Allianz and delivers IT...
Location:
Malaysia, Kuala Lumpur
Salary:
Not provided
Allianz
Expiration Date:
Until further notice
Requirements:
  • Candidate must possess at least a Bachelor's Degree, Post Graduate Diploma, or Professional Degree in Computer Science/Information Technology or equivalent
  • At least 7 years of working experience with any related Insurance Core System and in leading a development team
  • At least 5 years of hands-on experience in Oracle ADF, ADF Faces (Taskflows), ADF Business Components (AM, VO, EO), or related technical knowledge
  • Experience with Oracle/DB2 databases as well as SQL programming such as PL/SQL
  • Experience using standard data interchange formats such as JSON or XML, and with message brokers (e.g., Kafka, MQ)
  • Ability to guide and review the code of the developers
  • Ability to provide application solutioning and detailed design, and to create component-based, reusable, and testable modules and pages
  • Familiarity with and/or experience in DevSecOps methodology
Job Responsibility:
  • Responsible for timely documentation of technical design and specifications in accordance with the business specifications and Allianz project standards
  • Ensure timely completion of software development in line with the project schedule
  • Ensure compliance of frontline development functions with System Development Guidelines and Standards
  • Ensure the quality of frontline systems developed and implemented in fulfilling system specifications and user requirements
  • Assimilate the Group's core competencies in carrying out his/her roles and responsibilities
  • Keep up to date with advances in computer technology and how these affect the business environment
  • Apply software design fundamentals in object-oriented design, data structures and algorithm design, and complexity analysis
  • Apply software development fundamentals, including testing, troubleshooting, and using version control
  • Maintain constant communication with team members, users, and vendors
  • Collaborate daily with frontend and backend developers and engage in the full software development lifecycle
Employment Type:
Full-time

Data Analytics Engineer

SDG Group is expanding its global Data & Analytics practice and is seeking a mot...
Location:
Egypt, Cairo
Salary:
Not provided
SDG
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field
  • Hands-on experience in DataOps / Data Engineering
  • Strong knowledge of Databricks or Snowflake (at least one is mandatory)
  • Proficiency in Python and SQL
  • Experience with Azure data ecosystem (ADF, ADLS, Synapse, etc.)
  • Understanding of CI/CD practices and DevOps for data
  • Knowledge of data modeling, orchestration frameworks, and monitoring tools
  • Strong analytical and troubleshooting skills
  • Eagerness to learn and grow in a global consulting environment
Job Responsibility:
  • Design, build, and maintain scalable and reliable data pipelines following DataOps best practices
  • Work with modern cloud data stacks using Databricks (Spark, Delta Lake) or Snowflake (Snowpipe, Tasks, Streams)
  • Develop and optimize ETL/ELT workflows using Python, SQL, and orchestration tools
  • Work with Azure data services (ADF, ADLS, Azure SQL, Azure Functions)
  • Implement CI/CD practices using Azure DevOps or Git-based workflows
  • Ensure data quality, consistency, and governance across all delivered data solutions
  • Monitor and troubleshoot pipelines for performance and operational excellence
  • Collaborate with international teams, architects, and analytics consultants
  • Contribute to technical documentation and solution design assets
What we offer:
  • Remote working model aligned with international project needs
  • Opportunity to work on European and global engagements
  • Mentorship and growth paths within SDG Group
  • A dynamic, innovative, and collaborative environment
  • Access to world-class training and learning platforms
Employment Type:
Full-time