DBT/Snowflake Junior Engineer

NTT DATA

Location:
India, Remote

Contract Type:
Not provided

Salary:

Not provided

Job Description:

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a DBT/Snowflake Junior Engineer to join our team in Remote, Karnātaka (IN-KA), India (IN).

Job Responsibility:

  • Translate Informatica mappings, transformations, and business rules into dbt models (SQL) on Snowflake
  • Design and implement staging, core, and mart layers using standard dbt patterns and folder structures
  • Develop and maintain dbt tests (schema tests, data tests, custom tests) to ensure data quality and integrity
  • Implement snapshots, seeds, macros, and reusable components where appropriate
  • Collaborate with Snowflake developers to ensure physical data structures support dbt models efficiently
  • Work with functional teams to ensure functional equivalence between legacy Informatica outputs and new dbt outputs
  • Participate in performance tuning of dbt models and Snowflake queries
  • Integrate dbt with CI/CD pipelines (e.g., Azure DevOps, GitHub Actions) for automated runs and validations
  • Contribute to documentation of dbt models, data lineage, and business rules
  • Participate in defect analysis, bug fixes, and enhancements during migration and stabilization phases
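The staging/core/mart layering described above follows standard dbt patterns. As a rough illustration (the source, model, and column names here are hypothetical, not taken from the posting), a staging model looks like:

```sql
-- models/staging/stg_orders.sql  (hypothetical example)
-- dbt staging pattern: read from a declared source, rename and cast
-- columns, and expose a clean relation for core/mart models.
with source as (

    select * from {{ source('erp', 'orders') }}

),

renamed as (

    select
        order_id,
        customer_id,
        cast(order_date as date) as order_date,
        order_amount

    from source

)

select * from renamed
```

Downstream core and mart models would then select from this via {{ ref('stg_orders') }}, and schema tests (e.g. unique and not_null on order_id) would be declared in the model's YAML properties file.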

Requirements:

  • 7+ years of experience in Data Engineering / ETL / DW
  • 1–3+ years working with dbt (Core or Cloud)
  • Strong SQL skills, especially on Snowflake or another modern cloud DW
  • Experience with dbt concepts: models, tests, sources, seeds, snapshots, macros, exposures
  • Prior experience with Informatica (developer-level understanding of mappings/workflows) is highly desirable
  • Understanding of CI/CD practices and integrating dbt into automated pipelines
  • Knowledge of data modeling (dimensional models, SCDs, fact/dimension design)
  • Experience working in offshore delivery with onshore coordination
  • Good communication skills and ability to read/understand existing ETL logic and requirements documentation
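Among the dbt concepts listed in the requirements, snapshots are dbt's mechanism for capturing slowly changing dimensions (SCD Type 2). A minimal sketch using the timestamp strategy; the table and column names are hypothetical:

```sql
-- snapshots/customers_snapshot.sql  (hypothetical example)
-- dbt records a new row version whenever updated_at advances for a
-- given customer_id, preserving history SCD-2 style.
{% snapshot customers_snapshot %}

{{
    config(
        target_schema='snapshots',
        unique_key='customer_id',
        strategy='timestamp',
        updated_at='updated_at'
    )
}}

select * from {{ source('erp', 'customers') }}

{% endsnapshot %}
```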

Additional Information:

Job Posted:
April 19, 2026

Work Type:
Remote work

Similar Jobs for DBT/Snowflake Junior Engineer

Sr. Data Engineer - Snowflake

Data Ideology is seeking a Sr. Snowflake Data Engineer to join our growing team ...
Location:
Salary:
Not provided
Data Ideology
Expiration Date:
Until further notice
Requirements:
  • 7+ years of experience in data engineering, data warehousing, or data architecture
  • 3+ years of hands-on Snowflake experience (performance tuning, data sharing, Snowpark, Snowpipe, etc.)
  • Strong SQL and Python skills, with production experience using dbt
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and modern data tooling (Airflow, Fivetran, Power BI, Looker, Informatica, etc.)
  • Prior experience in a consulting or client-facing delivery role
  • Excellent communication skills, with the ability to collaborate across technical and business stakeholders
  • SnowPro Core Certification required (or willingness to obtain upon hire)
  • Advanced Snowflake certifications preferred
Job Responsibility:
  • Design and build scalable, secure, and cost-effective data solutions in Snowflake
  • Develop and optimize data pipelines using tools such as dbt, Python, CloverDX, and cloud-native services
  • Participate in discovery sessions with clients to gather requirements and translate them into solution designs and project plans
  • Collaborate with engagement managers and account teams to help scope work and provide technical input for Statements of Work (SOWs)
  • Serve as a Snowflake subject matter expert, guiding best practices in performance tuning, cost optimization, access control, and workload management
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake
  • Integrate Snowflake with BI tools, governance platforms, and AI/ML frameworks
  • Contribute to internal accelerators, frameworks, and proofs of concept
  • Mentor junior engineers and support knowledge sharing across the team
What we offer:
  • Flexible Time Off Policy
  • Eligibility for Health Benefits
  • Retirement Plan with Company Match
  • Training and Certification Reimbursement
  • Utilization Based Incentive Program
  • Commission Incentive Program
  • Referral Bonuses
  • Work from Home
  • Full-time

Senior Data Engineer

Senior Data Engineer – Dublin (Hybrid) Contract Role | 3 Days Onsite. We are see...
Location:
Ireland, Dublin
Salary:
Not provided
Solas IT Recruitment
Expiration Date:
Until further notice
Requirements:
  • 7+ years of experience as a Data Engineer working with distributed data systems
  • 4+ years of deep Snowflake experience, including performance tuning, SQL optimization, and data modelling
  • Strong hands-on experience with the Hadoop ecosystem: HDFS, Hive, Impala, Spark (PySpark preferred)
  • Oozie, Airflow, or similar orchestration tools
  • Proven expertise with PySpark, Spark SQL, and large-scale data processing patterns
  • Experience with Databricks and Delta Lake (or equivalent big-data platforms)
  • Strong programming background in Python, Scala, or Java
  • Experience with cloud services (AWS preferred): S3, Glue, EMR, Redshift, Lambda, Athena, etc.
Job Responsibility:
  • Build, enhance, and maintain large-scale ETL/ELT pipelines using Hadoop ecosystem tools including HDFS, Hive, Impala, and Oozie/Airflow
  • Develop distributed data processing solutions with PySpark, Spark SQL, Scala, or Python to support complex data transformations
  • Implement scalable and secure data ingestion frameworks to support both batch and streaming workloads
  • Work hands-on with Snowflake to design performant data models, optimize queries, and establish solid data governance practices
  • Collaborate on the migration and modernization of current big-data workloads to cloud-native platforms and Databricks
  • Tune Hadoop, Spark, and Snowflake systems for performance, storage efficiency, and reliability
  • Apply best practices in data modelling, partitioning strategies, and job orchestration for large datasets
  • Integrate metadata management, lineage tracking, and governance standards across the platform
  • Build automated validation frameworks to ensure accuracy, completeness, and reliability of data pipelines
  • Develop unit, integration, and end-to-end testing for ETL workflows using Python, Spark, and dbt testing where applicable

Senior Analytics Engineer II

Articulate is looking for a Sr. Analytics Engineer to join our amazing Data team...
Location:
United States
Salary:
137,700.00 – 206,500.00 USD / Year
Articulate
Expiration Date:
Until further notice
Requirements:
  • 8+ years of experience in data or analytics roles
  • At least 5 years in analytics engineering, preferably within a fast-paced tech company or data-driven organization
  • Expert in end-to-end data workflows, from data collection to analysis and presentation, with most expertise in data modeling
  • Confident in translating raw data to reusable, business-friendly data models
  • Proven experience owning and leading the creation of reliable, flexible data models in an enterprise data warehouse
  • Expertise in SQL (writing and analyzing complex queries)
  • Expertise in Data Build Tool (dbt)
  • Expertise in Looker/Tableau (defining data models and measures)
  • Experience building semantic layers using dbt, Looker, or Snowflake’s semantic tables
  • Ability to write complex SQL, run ad-hoc data discovery, and build data models
Job Responsibility:
  • Provide flexible, trustworthy data models by transforming data from multiple sources with dbt, testing for quality, deploying to visualization tools such as Looker or Tableau, and publishing documentation
  • Collaborate directly with stakeholders to define problems and determine requirements for a solution
  • Set and maintain best practices for data models and processes, including project architecture and QA
  • Proactively identify gaps or design flaws in our data models, and bring recommendations for how to fix them
  • Own documentation of our tools and data, both for our team and for external users
  • Lead discovery on data tools and infrastructure, including setting goals for tooling, continuous analysis of existing tooling, and exploring new tools and features we may adopt
  • Participate as a technical leader in project planning, helping to create well-defined tasks to address data initiatives
  • Mentor more junior members of the data team
  • Represent the data team in some technical architecture and planning conversations
  • Identify and share data risks and dependencies in these contexts
What we offer:
  • Bonus eligible
  • Robust suite of benefits
  • Full-time

DBT Junior Engineer

The DBT Junior Engineer role involves translating Informatica mappings into dbt ...
Location:
India
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • 3–6 years of experience in Data Engineering / ETL / DW
  • 1–3+ years working with dbt (Core or Cloud)
  • Strong SQL skills, especially on Snowflake or another modern cloud DW
  • Experience with dbt concepts: models, tests, sources, seeds, snapshots, macros, exposures
  • Prior experience with Informatica (developer-level understanding of mappings/workflows) is highly desirable
  • Understanding of CI/CD practices and integrating dbt into automated pipelines
  • Knowledge of data modeling (dimensional models, SCDs, fact/dimension design)
  • Experience working in offshore delivery with onshore coordination
  • Good communication skills and ability to read/understand existing ETL logic and requirements documentation
Job Responsibility:
  • Translate Informatica mappings, transformations, and business rules into dbt models (SQL) on Snowflake
  • Design and implement staging, core, and mart layers using standard dbt patterns and folder structures
  • Develop and maintain dbt tests (schema tests, data tests, custom tests) to ensure data quality and integrity
  • Implement snapshots, seeds, macros, and reusable components where appropriate
  • Collaborate with Snowflake developers to ensure physical data structures support dbt models efficiently
  • Work with functional teams to ensure functional equivalence between legacy Informatica outputs and new dbt outputs
  • Participate in performance tuning of dbt models and Snowflake queries
  • Integrate dbt with CI/CD pipelines (e.g., Azure DevOps, GitHub Actions) for automated runs and validations
  • Contribute to documentation of dbt models, data lineage, and business rules
  • Participate in defect analysis, bug fixes, and enhancements during migration and stabilization phases

DBT Junior Engineer

The DBT Junior Engineer role involves translating Informatica mappings into dbt ...
Location:
India, Remote
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • 3–6 years of experience in Data Engineering / ETL / DW
  • 1–3+ years working with dbt (Core or Cloud)
  • Strong SQL skills, especially on Snowflake or another modern cloud DW
  • Experience with dbt concepts: models, tests, sources, seeds, snapshots, macros, exposures
  • Prior experience with Informatica (developer-level understanding of mappings/workflows) is highly desirable
  • Understanding of CI/CD practices and integrating dbt into automated pipelines
  • Knowledge of data modeling (dimensional models, SCDs, fact/dimension design)
  • Experience working in offshore delivery with onshore coordination
  • Good communication skills and ability to read/understand existing ETL logic and requirements documentation
Job Responsibility:
  • Translate Informatica mappings, transformations, and business rules into dbt models (SQL) on Snowflake
  • Design and implement staging, core, and mart layers using standard dbt patterns and folder structures
  • Develop and maintain dbt tests (schema tests, data tests, custom tests) to ensure data quality and integrity
  • Implement snapshots, seeds, macros, and reusable components where appropriate
  • Collaborate with Snowflake developers to ensure physical data structures support dbt models efficiently
  • Work with functional teams to ensure functional equivalence between legacy Informatica outputs and new dbt outputs
  • Participate in performance tuning of dbt models and Snowflake queries
  • Integrate dbt with CI/CD pipelines (e.g., Azure DevOps, GitHub Actions) for automated runs and validations
  • Contribute to documentation of dbt models, data lineage, and business rules
  • Participate in defect analysis, bug fixes, and enhancements during migration and stabilization phases
  • Full-time

Snowflake Junior Engineer

The Snowflake Junior Engineer role involves designing and implementing Snowflake...
Location:
India, Remote
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • 2-4 years of experience in Data Engineering / DW development, with 2+ years on Snowflake
  • Strong hands-on SQL skills and experience with large-scale DW solutions
  • Solid understanding of Snowflake architecture (warehouses, databases, schemas, stages, virtual warehouses, security roles)
  • Experience with cloud platforms, ideally Azure and its integration with Snowflake (ADLS/Blob, ADF/Synapse, Key Vault)
  • Prior exposure to migration from MPP platforms (Yellowbricks, Teradata, Netezza, etc.) to Snowflake is a plus
  • Familiarity with dbt, Databricks, or ETL tools (Informatica) is an advantage
  • Experience working in offshore delivery models, collaborating with onshore teams
Job Responsibility:
  • Design and implement Snowflake schemas, tables, views, materialized views, and stages to support migrated workloads
  • Recreate/translate Yellowbricks tables, views, and logic into Snowflake with functional equivalence
  • Collaborate with dbt developers to ensure dbt models are aligned with Snowflake best practices (clustering, micro-partitioning, warehouses)
  • Develop and optimize SQL scripts, stored procedures (Snowflake Scripting), and views used by dbt, Databricks, and BI tools
  • Implement and manage Snowflake roles, grants, and security models in line with enterprise standards
  • Support performance tuning for complex queries, including warehouse sizing, result caching, clustering, and statistics
  • Assist with data migration and validation between Yellowbricks and Snowflake (row counts, aggregates, and spot checks)
  • Contribute to CI/CD implementation for Snowflake objects (using Azure DevOps or similar)
  • Work closely with onshore architects and leads, attending overlap meetings in US time zones as required
  • Full-time

Snowflake Junior Engineer

We are currently seeking a Snowflake Junior Engineer to join our team in Remote,...
Location:
India, Remote
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • 2-4 years of experience in Data Engineering / DW development, with 2+ years on Snowflake
  • Strong hands-on SQL skills and experience with large-scale DW solutions
  • Solid understanding of Snowflake architecture (warehouses, databases, schemas, stages, virtual warehouses, security roles)
  • Experience with cloud platforms, ideally Azure and its integration with Snowflake (ADLS/Blob, ADF/Synapse, Key Vault)
  • Prior exposure to migration from MPP platforms (Yellowbricks, Teradata, Netezza, etc.) to Snowflake is a plus
  • Familiarity with dbt, Databricks, or ETL tools (Informatica) is an advantage
  • Experience working in offshore delivery models, collaborating with onshore teams
Job Responsibility:
  • Design and implement Snowflake schemas, tables, views, materialized views, and stages to support migrated workloads
  • Recreate/translate Yellowbricks tables, views, and logic into Snowflake with functional equivalence
  • Collaborate with dbt developers to ensure dbt models are aligned with Snowflake best practices (clustering, micro-partitioning, warehouses)
  • Develop and optimize SQL scripts, stored procedures (Snowflake Scripting), and views used by dbt, Databricks, and BI tools
  • Implement and manage Snowflake roles, grants, and security models in line with enterprise standards
  • Support performance tuning for complex queries, including warehouse sizing, result caching, clustering, and statistics
  • Assist with data migration and validation between Yellowbricks and Snowflake (row counts, aggregates, and spot checks)
  • Contribute to CI/CD implementation for Snowflake objects (using Azure DevOps or similar)
  • Work closely with onshore architects and leads, attending overlap meetings in US time zones as required

Snowflake Junior Engineer

The Snowflake Junior Engineer role involves designing and implementing Snowflake...
Location:
India, Remote
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • 2-4 years of experience in Data Engineering / DW development
  • 2+ years of experience on Snowflake
  • Strong hands-on SQL skills and experience with large-scale DW solutions
  • Solid understanding of Snowflake architecture (warehouses, databases, schemas, stages, virtual warehouses, security roles)
  • Experience with cloud platforms, ideally Azure and its integration with Snowflake (ADLS/Blob, ADF/Synapse, Key Vault)
  • Prior exposure to migration from MPP platforms (Yellowbricks, Teradata, Netezza, etc.) to Snowflake is a plus
  • Familiarity with dbt, Databricks, or ETL tools (Informatica) is an advantage
  • Experience working in offshore delivery models, collaborating with onshore teams
Job Responsibility:
  • Design and implement Snowflake schemas, tables, views, materialized views, and stages to support migrated workloads
  • Recreate/translate Yellowbricks tables, views, and logic into Snowflake with functional equivalence
  • Collaborate with dbt developers to ensure dbt models are aligned with Snowflake best practices (clustering, micro-partitioning, warehouses)
  • Develop and optimize SQL scripts, stored procedures (Snowflake Scripting), and views used by dbt, Databricks, and BI tools
  • Implement and manage Snowflake roles, grants, and security models in line with enterprise standards
  • Support performance tuning for complex queries, including warehouse sizing, result caching, clustering, and statistics
  • Assist with data migration and validation between Yellowbricks and Snowflake (row counts, aggregates, and spot checks)
  • Contribute to CI/CD implementation for Snowflake objects (using Azure DevOps or similar)
  • Work closely with onshore architects and leads, attending overlap meetings in US time zones as required
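The migration validation work mentioned above (row counts, aggregates, spot checks) typically reduces to side-by-side SQL comparisons once the legacy extract has been staged in Snowflake. A sketch with hypothetical schema and table names:

```sql
-- Hypothetical validation query: compare the staged legacy extract
-- against the migrated Snowflake table on row count and an aggregate.
select
    (select count(*)    from legacy_stage.orders) as legacy_row_count,
    (select count(*)    from analytics.orders)    as target_row_count,
    (select sum(amount) from legacy_stage.orders) as legacy_amount_sum,
    (select sum(amount) from analytics.orders)    as target_amount_sum;
```

A row-count mismatch or aggregate drift between the two sides flags a table for deeper spot checks.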