Offshore ETL Engineer

NTT DATA

Location:
India, Bangalore

Contract Type:
Not provided

Salary:
Not provided

Job Description:

The Data Engineer is responsible for the implementation of Python-based extract/load scripts and dbt transformation models following established standards and designs. This role focuses on executing migration tasks at scale, supporting wave-based delivery, and ensuring data accuracy through unit testing and validation activities.

Job Responsibility:

  • Develop Python scripts to extract data from source systems and load it into the target warehouse
  • Implement dbt models based on approved transformation designs and patterns
  • Write basic dbt tests and SQL validation queries
  • Support data reconciliation and defect resolution activities
  • Participate in SIT and UAT support cycles
  • Maintain clear and consistent documentation for developed pipelines and models
  • Collaborate with Senior Data Engineers and Technical Leads to clarify requirements and resolve issues
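The first responsibility — a Python script that lands source rows as-is in the target warehouse — can be sketched in a few lines. This is a minimal illustration, not NTT DATA's actual tooling: in-memory `sqlite3` databases stand in for the real source system and warehouse, and the table and column names are invented.

```python
import sqlite3

def extract_load(source, target, table, columns):
    """Land source rows as-is into a staging table in the target warehouse."""
    col_list = ", ".join(columns)
    rows = source.execute(f"SELECT {col_list} FROM {table}").fetchall()
    # Target staging table mirrors the source shape (stg_ prefix is a common convention)
    target.execute(f"CREATE TABLE IF NOT EXISTS stg_{table} ({col_list})")
    placeholders = ", ".join("?" for _ in columns)
    target.executemany(f"INSERT INTO stg_{table} VALUES ({placeholders})", rows)
    target.commit()
    return len(rows)  # row count feeds later reconciliation checks

# In-memory databases stand in for the real source system and warehouse.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
source.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
source.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.5)])
loaded = extract_load(source, target, "orders", ["order_id", "amount"])
```

Returning the loaded row count is a small design choice that makes the later record-count validation step straightforward.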

Requirements:

  • 3–6 years of experience in data engineering or ETL/ELT development
  • Working knowledge of Python for data processing
  • Strong SQL skills
  • Hands-on or foundational experience with dbt
  • Familiarity with data warehouses and batch processing concepts
  • Experience working in offshore delivery teams

Additional Information:

Job Posted:
May 10, 2026

Employment Type:
Full-time
Work Type:
On-site work

Similar Jobs for Offshore ETL Engineer

Offshore ETL Sr Engineer

The Senior ETL Engineer at NTT DATA will be responsible for developing complex P...
Location:
India, Bangalore
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • 7–10 years of data engineering experience
  • Strong Python expertise for batch data ingestion and processing
  • Advanced SQL skills (complex joins, window functions, aggregations)
  • Hands-on experience with dbt Core or dbt Cloud
  • Experience working with large-scale data migration or ETL modernization programs
  • Familiarity with Informatica PowerCenter concepts (preferred)
  • Experience mentoring junior engineers in offshore delivery models
Job Responsibility:
  • Develop and maintain complex Python EL pipelines to land source data as-is into the target warehouse
  • Implement advanced dbt models, including incremental models, snapshot-based logic (non-CDC), and complex joins and aggregations
  • Translate high-complexity Informatica mappings into dbt SQL under guidance from SMEs
  • Design and implement reusable dbt macros and common SQL patterns
  • Perform peer code reviews and provide technical guidance to Data Engineers
  • Troubleshoot performance issues and data discrepancies during SIT and UAT
  • Support wave-based migration execution across SWP and IMS in parallel
  • Contribute to technical documentation, runbooks, and handover materials
Employment Type: Full-time
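The incremental-model pattern named in the senior role's responsibilities can be illustrated outside of dbt. The sketch below is a hypothetical Python/SQLite stand-in for a dbt incremental model with a `unique_key`: each run merges only rows newer than the target's high-water mark. All table and column names are invented, and the unique key is assumed to be the first column of each row.

```python
import sqlite3

def incremental_merge(conn, source, target, unique_key, updated_col):
    """Merge only rows newer than the target's high-water mark (dbt-style incremental)."""
    # First run creates an empty target with the source's shape
    conn.execute(
        f"CREATE TABLE IF NOT EXISTS {target} AS SELECT * FROM {source} WHERE 0"
    )
    # High-water mark: the newest timestamp already present in the target
    hwm = conn.execute(
        f"SELECT COALESCE(MAX({updated_col}), '') FROM {target}"
    ).fetchone()[0]
    new_rows = conn.execute(
        f"SELECT * FROM {source} WHERE {updated_col} > ?", (hwm,)
    ).fetchall()
    for row in new_rows:
        # Delete-then-insert emulates a MERGE on the unique key
        # (assumed here to be the first column of the row).
        conn.execute(f"DELETE FROM {target} WHERE {unique_key} = ?", (row[0],))
        placeholders = ", ".join("?" for _ in row)
        conn.execute(f"INSERT INTO {target} VALUES ({placeholders})", row)
    conn.commit()
    return len(new_rows)

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL, updated_at TEXT)")
db.executemany("INSERT INTO src_orders VALUES (?, ?, ?)",
               [(1, 10.0, "2026-01-01"), (2, 20.0, "2026-01-02")])
first = incremental_merge(db, "src_orders", "fct_orders", "order_id", "updated_at")
# A later run picks up only newly arrived rows
db.execute("INSERT INTO src_orders VALUES (3, 30.0, '2026-01-03')")
second = incremental_merge(db, "src_orders", "fct_orders", "order_id", "updated_at")
```

In dbt itself the same filter would live inside an `is_incremental()` block; the point here is only the high-water-mark idea.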
Dbt/Snowflake Junior Engineer

Location:
India, Remote, Karnātaka
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • 7+ years of experience in Data Engineering / ETL / DW, with 1–3+ years working with dbt (Core or Cloud)
  • Strong SQL skills, especially on Snowflake or another modern cloud DW
  • Experience with dbt concepts: models, tests, sources, seeds, snapshots, macros, exposures
  • Prior experience with Informatica (developer-level understanding of mappings/workflows) is highly desirable
  • Understanding of CI/CD practices and integrating dbt into automated pipelines
  • Knowledge of data modeling (dimensional models, SCDs, fact/dimension design)
  • Experience working in offshore delivery with onshore coordination
  • Good communication skills and ability to read/understand existing ETL logic and requirements documentation
Job Responsibility:
  • Translate Informatica mappings, transformations, and business rules into dbt models (SQL) on Snowflake
  • Design and implement staging, core, and mart layers using standard dbt patterns and folder structures
  • Develop and maintain dbt tests (schema tests, data tests, custom tests) to ensure data quality and integrity
  • Implement snapshots, seeds, macros, and reusable components where appropriate
  • Collaborate with Snowflake developers to ensure physical data structures support dbt models efficiently
  • Work with functional teams to ensure functional equivalence between legacy Informatica outputs and new dbt outputs
  • Participate in performance tuning of dbt models and Snowflake queries
  • Integrate dbt with CI/CD pipelines (e.g., Azure DevOps, GitHub Actions) for automated runs and validations
  • Contribute to documentation of dbt models, data lineage, and business rules
  • Participate in defect analysis, bug fixes, and enhancements during migration and stabilization phases
Employment Type: Full-time
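Several of the roles above call for developing dbt tests. dbt schema tests such as `not_null` and `unique` compile to SQL that selects failing rows — a test passes when the query returns zero rows. The hand-rolled equivalents below illustrate that idea on SQLite with invented table and column names; they are a sketch of the concept, not any project's actual tests.

```python
import sqlite3

def count_null_failures(conn, table, column):
    """Rows failing a not_null test: any row where the column is NULL."""
    return conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]

def count_unique_failures(conn, table, column):
    """Values failing a unique test: key values that appear more than once."""
    return conn.execute(
        f"SELECT COUNT(*) FROM (SELECT {column} FROM {table} "
        f"GROUP BY {column} HAVING COUNT(*) > 1)"
    ).fetchone()[0]

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE stg_customers (customer_id INTEGER, email TEXT)")
db.executemany("INSERT INTO stg_customers VALUES (?, ?)",
               [(1, "a@example.com"), (2, None), (2, "b@example.com")])
null_failures = count_null_failures(db, "stg_customers", "email")      # one NULL email
dup_keys = count_unique_failures(db, "stg_customers", "customer_id")   # key 2 duplicated
```

In a real dbt project these checks would simply be declared under `tests:` in a model's YAML file rather than written by hand.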

DBT/Snowflake Junior Engineer

NTT DATA strives to hire exceptional, innovative and passionate individuals who ...
Location:
India, Remote
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • 7+ years of experience in Data Engineering / ETL / DW
  • 1–3+ years working with dbt (Core or Cloud)
  • Strong SQL skills, especially on Snowflake or another modern cloud DW
  • Experience with dbt concepts: models, tests, sources, seeds, snapshots, macros, exposures
  • Prior experience with Informatica (developer-level understanding of mappings/workflows) is highly desirable
  • Understanding of CI/CD practices and integrating dbt into automated pipelines
  • Knowledge of data modeling (dimensional models, SCDs, fact/dimension design)
  • Experience working in offshore delivery with onshore coordination
  • Good communication skills and ability to read/understand existing ETL logic and requirements documentation
Job Responsibility:
  • Translate Informatica mappings, transformations, and business rules into dbt models (SQL) on Snowflake
  • Design and implement staging, core, and mart layers using standard dbt patterns and folder structures
  • Develop and maintain dbt tests (schema tests, data tests, custom tests) to ensure data quality and integrity
  • Implement snapshots, seeds, macros, and reusable components where appropriate
  • Collaborate with Snowflake developers to ensure physical data structures support dbt models efficiently
  • Work with functional teams to ensure functional equivalence between legacy Informatica outputs and new dbt outputs
  • Participate in performance tuning of dbt models and Snowflake queries
  • Integrate dbt with CI/CD pipelines (e.g., Azure DevOps, GitHub Actions) for automated runs and validations
  • Contribute to documentation of dbt models, data lineage, and business rules
  • Participate in defect analysis, bug fixes, and enhancements during migration and stabilization phases

DBT Junior Engineer

The DBT Junior Engineer role involves translating Informatica mappings into dbt ...
Location:
India, Remote
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • 3–6 years of experience in Data Engineering / ETL / DW
  • 1–3+ years working with dbt (Core or Cloud)
  • Strong SQL skills, especially on Snowflake or another modern cloud DW
  • Experience with dbt concepts: models, tests, sources, seeds, snapshots, macros, exposures
  • Prior experience with Informatica (developer-level understanding of mappings/workflows) is highly desirable
  • Understanding of CI/CD practices and integrating dbt into automated pipelines
  • Knowledge of data modeling (dimensional models, SCDs, fact/dimension design)
  • Experience working in offshore delivery with onshore coordination
  • Good communication skills and ability to read/understand existing ETL logic and requirements documentation
Job Responsibility:
  • Translate Informatica mappings, transformations, and business rules into dbt models (SQL) on Snowflake
  • Design and implement staging, core, and mart layers using standard dbt patterns and folder structures
  • Develop and maintain dbt tests (schema tests, data tests, custom tests) to ensure data quality and integrity
  • Implement snapshots, seeds, macros, and reusable components where appropriate
  • Collaborate with Snowflake developers to ensure physical data structures support dbt models efficiently
  • Work with functional teams to ensure functional equivalence between legacy Informatica outputs and new dbt outputs
  • Participate in performance tuning of dbt models and Snowflake queries
  • Integrate dbt with CI/CD pipelines (e.g., Azure DevOps, GitHub Actions) for automated runs and validations
  • Contribute to documentation of dbt models, data lineage, and business rules
  • Participate in defect analysis, bug fixes, and enhancements during migration and stabilization phases
Employment Type: Full-time

DBT Engineer

The DBT Engineer role requires 3-6 years of experience in Data Engineering, focu...
Location:
India, Remote
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • 3–6 years of experience in Data Engineering / ETL / DW
  • 1–3+ years working with dbt (Core or Cloud)
  • Strong SQL skills, especially on Snowflake or another modern cloud DW
  • Experience with dbt concepts: models, tests, sources, seeds, snapshots, macros, exposures
  • Prior experience with Informatica (developer-level understanding of mappings/workflows) is highly desirable
  • Understanding of CI/CD practices and integrating dbt into automated pipelines
  • Knowledge of data modeling (dimensional models, SCDs, fact/dimension design)
  • Experience working in offshore delivery with onshore coordination
  • Good communication skills and ability to read/understand existing ETL logic and requirements documentation
  • Bachelor's degree in Computer Science or a related field
Job Responsibility:
  • Translate Informatica mappings, transformations, and business rules into dbt models (SQL) on Snowflake
  • Design and implement staging, core, and mart layers using standard dbt patterns and folder structures
  • Develop and maintain dbt tests (schema tests, data tests, custom tests) to ensure data quality and integrity
  • Implement snapshots, seeds, macros, and reusable components where appropriate
  • Collaborate with Snowflake developers to ensure physical data structures support dbt models efficiently
  • Work with functional teams to ensure functional equivalence between legacy Informatica outputs and new dbt outputs
  • Participate in performance tuning of dbt models and Snowflake queries
  • Integrate dbt with CI/CD pipelines (e.g., Azure DevOps, GitHub Actions) for automated runs and validations
  • Contribute to documentation of dbt models, data lineage, and business rules
  • Participate in defect analysis, bug fixes, and enhancements during migration and stabilization phases
Employment Type: Full-time
Offshore Test Engineer

The Offshore Test Engineer position at NTT DATA involves developing Python scrip...
Location:
India, Bangalore
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • 3–6 years of experience in data engineering or ETL/ELT development
  • Working knowledge of Python for data processing
  • Strong SQL skills
  • Hands-on or foundational experience with dbt
  • Familiarity with data warehouses and batch processing concepts
  • Experience working in offshore delivery teams
Job Responsibility:
  • Develop Python scripts to extract data from source systems and load it into the target warehouse
  • Implement dbt models based on approved transformation designs and patterns
  • Write basic dbt tests and SQL validation queries
  • Support data reconciliation and defect resolution activities
  • Participate in SIT and UAT support cycles
  • Maintain clear and consistent documentation for developed pipelines and models
  • Collaborate with Senior Data Engineers and Technical Leads to clarify requirements and resolve issues
Employment Type: Full-time
Offshore Test Engineer

The Offshore Test Engineer role at NTT DATA involves executing test cases and va...
Location:
India, Bangalore
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • 3–5 years of experience in data testing or ETL testing roles
  • Strong SQL querying skills
  • Experience validating data pipelines and transformations
  • Basic familiarity with Python-based data pipelines (reading logs, outputs)
  • Understanding of ETL/ELT concepts and data reconciliation techniques
  • Experience working in Agile or wave-based delivery models
Job Responsibility:
  • Execute test cases for Python-based extract/load processes, validating record counts, field-level data accuracy, schema and data type consistency
  • Perform dbt transformation validation using SQL-based reconciliation queries
  • Compare legacy Informatica outputs with dbt-generated results based on agreed criteria
  • Log, track, and re-test defects through resolution
  • Support SIT and UAT test cycles across multiple migration waves
  • Maintain test artifacts, evidence, and execution logs
  • Assist in regression testing as new waves are onboarded
  • Collaborate with Data Engineers to clarify expected outputs and resolve issues
Employment Type: Full-time
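The record-count and field-level checks listed in the test engineer's responsibilities reduce to two reconciliation queries: a count comparison, and a join on the business key that flags rows whose field values differ. The sketch below is a hypothetical illustration on SQLite — `legacy_out` and `dbt_out` are invented names standing in for a legacy Informatica output and its migrated counterpart — not the project's actual test harness.

```python
import sqlite3

def reconcile(conn, legacy, new, key, columns):
    """Return (record-count difference, number of rows with field-level mismatches)."""
    legacy_count = conn.execute(f"SELECT COUNT(*) FROM {legacy}").fetchone()[0]
    new_count = conn.execute(f"SELECT COUNT(*) FROM {new}").fetchone()[0]
    # IS NOT gives a null-safe inequality in SQLite, so NULL vs value counts as a mismatch
    mismatch = " OR ".join(f"l.{c} IS NOT n.{c}" for c in columns)
    bad_rows = conn.execute(
        f"SELECT COUNT(*) FROM {legacy} l JOIN {new} n "
        f"ON l.{key} = n.{key} WHERE {mismatch}"
    ).fetchone()[0]
    return legacy_count - new_count, bad_rows

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE legacy_out (id INTEGER, amount REAL)")
db.execute("CREATE TABLE dbt_out (id INTEGER, amount REAL)")
db.executemany("INSERT INTO legacy_out VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
db.executemany("INSERT INTO dbt_out VALUES (?, ?)", [(1, 10.0), (2, 21.0)])
count_diff, mismatched = reconcile(db, "legacy_out", "dbt_out", "id", ["amount"])
```

A zero count difference with zero mismatched rows is the pass condition; any nonzero result becomes a logged defect to track and re-test.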