
DBT Junior Engineer

NTT DATA

Location:
India, Remote

Contract Type:
Not provided

Salary:
Not provided

Job Description:

The DBT Junior Engineer role involves translating Informatica mappings into dbt models on Snowflake, ensuring data quality through various tests, and collaborating with teams for effective data solutions. Candidates should have 3-6 years of experience in Data Engineering, strong SQL skills, and familiarity with dbt concepts. This position offers an opportunity to work with cutting-edge technologies in a dynamic environment.

Job Responsibility:

  • Translate Informatica mappings, transformations, and business rules into dbt models (SQL) on Snowflake
  • Design and implement staging, core, and mart layers using standard dbt patterns and folder structures
  • Develop and maintain dbt tests (schema tests, data tests, custom tests) to ensure data quality and integrity
  • Implement snapshots, seeds, macros, and reusable components where appropriate
  • Collaborate with Snowflake developers to ensure physical data structures support dbt models efficiently
  • Work with functional teams to ensure functional equivalence between legacy Informatica outputs and new dbt outputs
  • Participate in performance tuning of dbt models and Snowflake queries
  • Integrate dbt with CI/CD pipelines (e.g., Azure DevOps, GitHub Actions) for automated runs and validations
  • Contribute to documentation of dbt models, data lineage, and business rules
  • Participate in defect analysis, bug fixes, and enhancements during migration and stabilization phases
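
The dbt testing duties above (schema tests, data tests) can be sketched as a minimal schema-test definition. This is an illustrative example only; the model and column names are hypothetical, not taken from the posting:

```yaml
# models/staging/stg_orders.yml — hypothetical model; names are illustrative
version: 2

models:
  - name: stg_orders
    description: "Staging model translated from a legacy Informatica mapping"
    columns:
      - name: order_id
        tests:
          - unique      # built-in generic test: no duplicate keys
          - not_null    # built-in generic test: no missing keys
      - name: status
        tests:
          - accepted_values:
              values: ['placed', 'shipped', 'returned']
```

Running `dbt test` compiles each declared test into a SQL query and fails the run if any rows violate the rule.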

Requirements:

  • 3–6 years of experience in Data Engineering / ETL / DW
  • 1–3+ years working with dbt (Core or Cloud)
  • Strong SQL skills, especially on Snowflake or another modern cloud DW
  • Experience with dbt concepts: models, tests, sources, seeds, snapshots, macros, exposures
  • Prior experience with Informatica (developer-level understanding of mappings/workflows) is highly desirable
  • Understanding of CI/CD practices and integrating dbt into automated pipelines
  • Knowledge of data modeling (dimensional models, SCDs, fact/dimension design)
  • Experience working in offshore delivery with onshore coordination
  • Good communication skills and ability to read/understand existing ETL logic and requirements documentation
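
The CI/CD requirement above (dbt in automated pipelines) can be sketched as a minimal GitHub Actions workflow. The dbt adapter, target name, and secret names are assumptions for illustration, not details from the posting; a matching `profiles.yml` is presumed to exist in the repository:

```yaml
# .github/workflows/dbt-ci.yml — illustrative workflow; secret names are hypothetical
name: dbt CI
on:
  pull_request:
    branches: [main]

jobs:
  dbt-build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install dbt-snowflake
      - run: dbt deps
      # dbt build runs models, tests, snapshots, and seeds in dependency order
      - run: dbt build --target ci
        env:
          SNOWFLAKE_ACCOUNT: ${{ secrets.SNOWFLAKE_ACCOUNT }}
          SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
```

The same structure maps directly onto an Azure DevOps pipeline, the other CI system named in the responsibilities.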

Additional Information:

Job Posted:
February 19, 2026

Employment Type:
Full-time

Work Type:
Remote work

Similar Jobs for DBT Junior Engineer

Sr. Data Engineer - Snowflake

Data Ideology is seeking a Sr. Snowflake Data Engineer to join our growing team ...
Location:
Not provided
Salary:
Not provided
Data Ideology
Expiration Date:
Until further notice
Requirements:
  • 7+ years of experience in data engineering, data warehousing, or data architecture
  • 3+ years of hands-on Snowflake experience (performance tuning, data sharing, Snowpark, Snowpipe, etc.)
  • Strong SQL and Python skills, with production experience using dbt
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and modern data tooling (Airflow, Fivetran, Power BI, Looker, Informatica, etc.)
  • Prior experience in a consulting or client-facing delivery role
  • Excellent communication skills, with the ability to collaborate across technical and business stakeholders
  • SnowPro Core Certification required (or willingness to obtain upon hire)
  • Advanced Snowflake certifications preferred
Job Responsibility:
  • Design and build scalable, secure, and cost-effective data solutions in Snowflake
  • Develop and optimize data pipelines using tools such as dbt, Python, CloverDX, and cloud-native services
  • Participate in discovery sessions with clients to gather requirements and translate them into solution designs and project plans
  • Collaborate with engagement managers and account teams to help scope work and provide technical input for Statements of Work (SOWs)
  • Serve as a Snowflake subject matter expert, guiding best practices in performance tuning, cost optimization, access control, and workload management
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake
  • Integrate Snowflake with BI tools, governance platforms, and AI/ML frameworks
  • Contribute to internal accelerators, frameworks, and proofs of concept
  • Mentor junior engineers and support knowledge sharing across the team
What we offer:
  • Flexible Time Off Policy
  • Eligibility for Health Benefits
  • Retirement Plan with Company Match
  • Training and Certification Reimbursement
  • Utilization Based Incentive Program
  • Commission Incentive Program
  • Referral Bonuses
  • Work from Home
  • Full-time

Lead Analytics Engineer

As an Analytics Engineer in the Data Infrastructure Team at Prolific, you'll be a...
Location:
United Kingdom
Salary:
Not provided
Prolific
Expiration Date:
Until further notice
Requirements:
  • Expertise in dbt & SQL: Deep experience with dbt and SQL to design, build, and maintain scalable data models
  • Cloud Technology Knowledge: Strong familiarity with cloud platforms like AWS, GCP etc
  • Data Accuracy Focus: Passion for ensuring high data quality through tests/assertions and robust documentation
  • Commercial Acumen: Ability to understand business needs and communicate effectively with non-technical stakeholders
  • Mentorship Ability: Advocate for best practices in logging and data modeling that supports robust and effective analysis, reporting, and experimentation
  • Collaboration: Skilled at working cross-functionally and translating complex technical concepts into actionable insights for the business
  • Process-Driven: Proficiency in designing repeatable and scalable workflows for data transformation
Job Responsibility:
  • Building Data Models: Create complex dbt models, custom macros, and reusable packages. Optimise transformations and implement robust testing strategies to ensure data integrity and model performance
  • Ownership: Monitoring and maintaining dbt workflow jobs, ensuring smooth data refreshes and up-to-date pipelines. You will also be responsible for data models for BI analytics & company-level reporting
  • Ensuring Data Accuracy: Writing tests and assertions to validate data integrity and consistency across models
  • Documenting and Standardizing: Creating and maintaining thorough documentation of dbt processes to ensure best practices within the BI team
  • Translating Complex Data Concepts: Acting as a key communicator, translating technical data issues into understandable business terms for stakeholders
  • Mentoring Team Members: Supporting junior analysts and data engineers, especially in setting up experimentation platforms and data best practices
  • Collaborating Across Teams: Working closely with the product, engineering, and BI teams to ensure data infrastructure supports evolving business needs

Senior Analytics Engineer II

Articulate is looking for a Sr. Analytics Engineer to join our amazing Data team...
Location:
United States
Salary:
137,700.00 - 206,500.00 USD / Year
Articulate
Expiration Date:
Until further notice
Requirements:
  • 8+ years of experience in data or analytics roles
  • At least 5 years in analytics engineering, preferably within a fast-paced tech company or data-driven organization
  • Expert in end-to-end data workflows, from data collection to analysis and presentation, with most expertise in data modeling
  • Confident in translating raw data to reusable, business-friendly data models
  • Proven experience owning and leading the creation of reliable, flexible data models in an enterprise data warehouse
  • Expertise in SQL (writing and analyzing complex queries)
  • Expertise in Data Build Tool (dbt)
  • Expertise in Looker/Tableau (defining data models and measures)
  • Experience building semantic layers using dbt, Looker, or Snowflake’s semantic tables
  • Ability to write complex SQL, run ad-hoc data discovery, and build data models
Job Responsibility:
  • Provide flexible, trustworthy data models by transforming data from multiple sources with DBT, testing for quality, deploying to visualization tools such as Looker or Tableau, and publishing documentation
  • Collaborate directly with stakeholders to define problems and determine requirements for a solution
  • Set and maintain best practices for data models and processes, including project architecture and QA
  • Proactively identify gaps or design flaws in our data models, and bring recommendations for how to fix them
  • Own documentation of our tools and data, both for our team and for external users
  • Lead discovery on data tools and infrastructure, including setting goals for tooling, continuous analysis of existing tooling, and exploring new tools and features we may adopt
  • Participate as a technical leader in project planning, helping to create well-defined tasks to address data initiatives
  • Mentor more junior members of the data team
  • Represent the data team in some technical architecture and planning conversations
  • Identify and share data risks and dependencies in these contexts
What we offer:
  • Bonus eligible
  • Robust suite of benefits
  • Full-time

Snowflake Junior Engineer

The Snowflake Junior Engineer role involves designing and implementing Snowflake...
Location:
Not provided
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • 2-4 years of experience in Data Engineering
  • Experience with Snowflake
  • Experience with SQL
Job Responsibility:
  • Designing and implementing Snowflake schemas
  • Optimizing SQL scripts
  • Collaborating with dbt developers

IT Cloud Data Engineer - HVR & DBT

Location:
United States, Oak Brook
Salary:
40.30 - 60.45 USD / Hour
Advocate Health Care
Expiration Date:
Until further notice
Requirements:
  • Bachelor's Degree in Computer Science or related field
  • Typically requires 5 years of experience in at least two IT disciplines, including database management, cloud engineering, data engineering and middleware technologies
  • Includes 2 years of work experience with cloud platforms, including experience with data integration, performance optimization, and platform administration
  • Experience defining, designing, and developing solutions with data integration platforms/tools
  • Proven experience building and optimizing data pipelines, and data sets
  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases
  • Must have experience in data transformation and data pipeline development using GUI based tools or programming languages like SQL and Python
  • Must have experience with DevOps tool chains and processes
Job Responsibility:
  • Drive scope definition, requirements analysis, data and technical design, pipeline build, product configuration, unit testing, and production deployment
  • Design scalable ingestion processes to integrate on-prem, API-driven, third-party, and end-user-generated data sources into common cloud infrastructure
  • Design reusable assets, components, standards, frameworks, and processes to accelerate and facilitate data integration projects
  • Develop data integration and transformation jobs using Python, SQL, and ETL/ELT tools
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management
  • Develop and implement scripts for data process maintenance, monitoring, and performance tuning
  • Test and document data processes through data validation and verification procedures
  • Ensure delivered solutions meet/perform to technical and functional/non-functional requirements
  • Provide technical guidance and mentorship to junior engineers, ensuring best practices in data engineering
What we offer:
  • Paid Time Off programs
  • Health and welfare benefits such as medical, dental, vision, life, and Short- and Long-Term Disability
  • Flexible Spending Accounts for eligible health care and dependent care expenses
  • Family benefits such as adoption assistance and paid parental leave
  • Defined contribution retirement plans with employer match and other financial wellness programs
  • Educational Assistance Program
  • Full-time