Informatica QA / Data Engineer

Signify Technology

Location:
United States

Contract Type:
Not provided

Salary:
75.00 USD / Hour

Job Description:

The client has a large data migration underway and needs someone to transfer the data over; the role is a combination of data engineering and QA.

Job Responsibility:

  • Transfer data over as part of a large migration
  • Perform a combination of data engineering and QA work

Requirements:

Data Engineering

Additional Information:

Job Posted:
January 20, 2026

Work Type:
Remote work

Similar Jobs for Informatica QA / Data Engineer

Data Architect

We are seeking a highly skilled and experienced Data Architect to join our team....
Location:
Warsaw, Poland

Salary:
Not provided

Inetum

Expiration Date:
Until further notice

Requirements:
  • 6+ years of professional experience in data-centric roles
  • Proven experience in a technical leadership position such as Solution Architect, Technical Lead, or Principal Engineer
  • Expert-level knowledge of Snowflake and Azure Data Factory (ADF)
  • Deep experience with SQL Server
  • Familiarity with Informatica and IBM DB2
  • Hands-on experience with Control-M
  • Experience with SAP Business Objects and Power BI
  • Expert-level SQL
  • Proficiency in Python
  • Experience with XML and JSON data formats
Job Responsibility
Job Responsibility
  • Design and document end-to-end solutions for new data integrations, major enhancements, and complex data products
  • Contribute to the technical roadmap for the Enterprise Data Warehouse (EDW)
  • Define and enforce data engineering and development standards across the program
  • Serve as the final technical approver for all significant code changes, pull requests, and architectural modifications
  • Act as the highest point of technical escalation for critical P1/P2 incidents and complex problems
  • Lead technical investigation during major outages
  • Drive technical resolution for systemic issues
  • Mentor Data Engineers and Data Visualization Engineers
  • Champion creation and maintenance of technical documentation
  • Work closely with client architects, source system owners, and other vendor teams
What we offer
What we offer
  • Flexible working hours
  • Hybrid work model
  • Cafeteria system
  • Referral bonuses up to PLN 6,000
  • Additional revenue sharing opportunities
  • Ongoing guidance from dedicated Team Manager
  • Tailored technical mentoring
  • Dedicated team-building budget
  • Opportunities to participate in charitable initiatives and local sports programs
  • Supportive and inclusive work culture
  • Full-time

Staff Engineer - Finance Data Engineer

GEICO is seeking Finance Data specialists to support the build out of a Finance ...
Location:
Palo Alto, United States

Salary:
110,000.00 - 260,000.00 USD / Year

Geico

Expiration Date:
Until further notice

Requirements:
  • 15+ years of Finance Systems experience with industry-leading ERP solutions (e.g. implementing and supporting Oracle, Workday, SAP, or PeopleSoft)
  • 3+ years of experience in implementing and supporting FP&A applications (preferred)
  • 3+ years working with or supporting a Finance Data Lake/Warehouse/Mart along with various Financial Reporting tools
  • Good understanding of Dimensional Data Modeling
  • Strong working knowledge of data processing/data transformation using ETL/ELT tools such as Informatica, DBT, etc.
  • Strong working knowledge of SQL and the ability to write, debug, and optimize SQL queries and ETL jobs
  • Experience working with Financials via the major vendors (e.g. PeopleSoft, Oracle EBS)
Job Responsibility:
  • Leverage your strong Functional and Technical systems expertise to drive towards the right solution to support the department and GEICO’s current and future needs
  • Engage in cross-functional collaboration throughout the entire software lifecycle
  • Support design sessions with peers to ensure systems are well designed, efficient, and meet Business expectations
  • Collaborate with Finance Leaders within the FP&A, Controllers, Financial Reporting, and Finance QA & Systems team along with the Data Engineering and Finance Technology leaders to ensure organizational goals are met
  • Have a deep understanding of a few Finance functions to support the establishment of a vision and technology roadmap to build out a Finance Data Lake/Warehouse
  • Have a good understanding of finance data lake/mart with Data processing/data transformation using ETL/ELT tools such as Informatica, DBT, etc.
  • Experience with designing, developing, implementing, and maintaining solutions for data ingestion and transformation projects
  • Experience working with cloud data solutions (Delta Lake, Iceberg, Hudi, Snowflake, Redshift or equivalent)
  • Support the development of a roadmap and then work towards implementing Revenue and Expense Analytical solutions
  • Support and try to influence customers and stakeholders, and work through divergent expectations
What we offer:
  • Comprehensive Total Rewards program that offers personalized coverage tailor-made for you and your family’s overall well-being
  • Financial benefits including market-competitive compensation, a 401K savings plan vested from day one that offers a 6% match, performance and recognition-based incentives, and tuition assistance
  • Access to additional benefits like mental healthcare as well as fertility and adoption assistance
  • Supports flexibility- We provide workplace flexibility as well as our GEICO Flex program, which offers the ability to work from anywhere in the US for up to four weeks per year
  • Full-time

Test Engineer

The Test Engineer role involves executing data validation queries, performing re...
Location:
India (Remote)

Salary:
Not provided

NTT DATA

Expiration Date:
Until further notice

Requirements:
  • 3–6 years of experience in Software Testing / QA
  • At least 2+ years in Data Warehouse / ETL testing
  • Practical experience in data migration / data platform modernization projects is strongly preferred
  • Strong SQL skills for writing complex queries and validations (joins, aggregations, window functions, filters)
  • Hands-on experience testing on at least one data platform such as: Snowflake, Databricks, or other cloud DW/Big Data platforms – experience with Snowflake/Databricks is a plus
  • Exposure to Informatica (understanding mappings/workflows and expected outputs)
  • Exposure to dbt or willingness to learn dbt-based validations (model outputs, dbt test results)
  • Good understanding of data warehouse concepts: facts/dimensions, SCDs, aggregations, grain, source-to-target mapping
  • Familiarity with test management and defect tracking tools (e.g., Azure DevOps, JIRA, TestRail, qTest)
  • Experience with Excel or similar tools for managing validation results and reconciliation artifacts
Job Responsibility:
  • Execute data validation queries to compare Source (Yellowbricks / Informatica outputs) vs Target (Databricks / Snowflake / dbt outputs)
  • Perform row-count checks between source and target tables
  • Perform aggregate validations (sums, counts, averages, distinct counts, min/max)
  • Perform sample and cell-level comparisons for critical entities and fields
  • Validate schema consistency (data types, lengths, precision/scale, nullable fields, constraints)
  • Log discrepancies, analyze patterns, and work with developers to support root-cause analysis
  • Understand Informatica mappings and workflows and their expected outputs
  • Validate dbt models (staging, core, marts) on Snowflake against legacy Informatica outputs
  • Test Databricks notebooks/jobs that replicate Yellowbricks logic — ensuring correct transformations, joins, filters, and aggregations
  • Validate CDC / incremental loads: new, changed, and deleted records are handled correctly
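The row-count and aggregate validations listed in the responsibilities above can be sketched as a small script. This is a minimal illustration only, using SQLite as a stand-in for the actual platforms named in the posting (Yellowbricks/Informatica source vs Databricks/Snowflake/dbt target); the `reconcile` function, table names, and column names are all hypothetical.

```python
import sqlite3

def reconcile(conn, source_table, target_table, amount_col):
    """Run matching checks against source and target tables.

    Returns a dict of check-name -> (source_value, target_value, match).
    """
    checks = {
        "row_count": "SELECT COUNT(*) FROM {t}",
        "sum": "SELECT COALESCE(SUM({c}), 0) FROM {t}",
        "distinct_count": "SELECT COUNT(DISTINCT {c}) FROM {t}",
    }
    results = {}
    for name, sql in checks.items():
        src = conn.execute(sql.format(t=source_table, c=amount_col)).fetchone()[0]
        tgt = conn.execute(sql.format(t=target_table, c=amount_col)).fetchone()[0]
        results[name] = (src, tgt, src == tgt)
    return results

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE src_orders (id INTEGER, amount REAL)")
    conn.execute("CREATE TABLE tgt_orders (id INTEGER, amount REAL)")
    rows = [(1, 10.0), (2, 25.5), (3, 10.0)]
    conn.executemany("INSERT INTO src_orders VALUES (?, ?)", rows)
    # Load only part of the data into the target to simulate a dropped row
    conn.executemany("INSERT INTO tgt_orders VALUES (?, ?)", rows[:2])
    for name, (src, tgt, ok) in reconcile(conn, "src_orders", "tgt_orders", "amount").items():
        print(f"{name}: source={src} target={tgt} {'OK' if ok else 'MISMATCH'}")
```

In a real migration the same queries would be issued to two different connections (one per platform) and the results logged into the reconciliation artifacts mentioned in the requirements.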

Junior Test Engineer

The Junior Test Engineer role involves executing data validation queries, perfor...
Location:
India (Remote)

Salary:
Not provided

NTT DATA

Expiration Date:
Until further notice

Requirements:
  • 2-4 years of experience in Software Testing / QA
  • At least 2+ years in Data Warehouse / ETL testing
  • Practical experience in data migration / data platform modernization projects is strongly preferred
  • Strong SQL skills for writing complex queries and validations (joins, aggregations, window functions, filters)
  • Hands-on experience testing on at least one data platform such as Snowflake, Databricks, or other cloud DW/Big Data platforms
  • Exposure to Informatica (understanding mappings/workflows and expected outputs)
  • Exposure to dbt or willingness to learn dbt-based validations
  • Good understanding of data warehouse concepts: facts/dimensions, SCDs, aggregations, grain, source-to-target mapping
  • Familiarity with test management and defect tracking tools (e.g., Azure DevOps, JIRA, TestRail, qTest)
  • Experience with Excel or similar tools for managing validation results and reconciliation artifacts
Job Responsibility:
  • Execute data validation queries to compare Source (Yellowbricks / Informatica outputs) vs Target (Databricks / Snowflake / dbt outputs)
  • Perform row-count checks between source and target tables
  • Perform aggregate validations (sums, counts, averages, distinct counts, min/max)
  • Perform sample and cell-level comparisons for critical entities and fields
  • Validate schema consistency (data types, lengths, precision/scale, nullable fields, constraints)
  • Log discrepancies, analyze patterns, and work with developers to support root-cause analysis
  • Understand Informatica mappings and workflows and their expected outputs
  • Validate dbt models (staging, core, marts) on Snowflake against legacy Informatica outputs
  • Test Databricks notebooks/jobs that replicate Yellowbricks logic — ensuring correct transformations, joins, filters, and aggregations
  • Validate CDC / incremental loads: new, changed, and deleted records are handled correctly
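The schema-consistency check in the responsibilities above (data types, nullability, missing columns) can be sketched in a few lines. This is a hedged illustration: it uses SQLite's `PRAGMA table_info` as a stand-in for the `information_schema` queries you would actually run on Snowflake or Databricks, and the function and table names are invented for the sketch.

```python
import sqlite3

def schema_diff(conn, source_table, target_table):
    """Return a list of human-readable differences between two table schemas."""
    def describe(table):
        # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
        return {row[1]: (row[2].upper(), bool(row[3]))
                for row in conn.execute(f"PRAGMA table_info({table})")}
    src, tgt = describe(source_table), describe(target_table)
    diffs = []
    for col in sorted(src.keys() | tgt.keys()):
        if col not in tgt:
            diffs.append(f"{col}: missing in target")
        elif col not in src:
            diffs.append(f"{col}: extra in target")
        elif src[col] != tgt[col]:
            diffs.append(f"{col}: source {src[col]} vs target {tgt[col]}")
    return diffs
```

Each discrepancy string can then be logged as a defect candidate in the tracking tools the posting lists (Azure DevOps, JIRA, etc.).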

Junior Test Engineer

The Junior Test Engineer will execute data validation queries and perform checks...
Location:
India (Remote)

Salary:
Not provided

NTT DATA

Expiration Date:
Until further notice

Requirements:
  • 2-4 years of experience in Software Testing / QA
  • At least 2+ years in Data Warehouse / ETL testing
  • Practical experience in data migration / data platform modernization projects is strongly preferred
  • Strong SQL skills for writing complex queries and validations (joins, aggregations, window functions, filters)
  • Hands-on experience testing on at least one data platform such as Snowflake, Databricks, or other cloud DW/Big Data platforms
  • Exposure to Informatica (understanding mappings/workflows and expected outputs)
  • Exposure to dbt or willingness to learn dbt-based validations
  • Good understanding of data warehouse concepts: facts/dimensions, SCDs, aggregations, grain, source-to-target mapping
  • Familiarity with test management and defect tracking tools (e.g., Azure DevOps, JIRA, TestRail, qTest)
  • Experience with Excel or similar tools for managing validation results and reconciliation artifacts
Job Responsibility:
  • Execute data validation queries to compare Source (Yellowbricks / Informatica outputs) vs Target (Databricks / Snowflake / dbt outputs)
  • Perform row-count checks between source and target tables
  • Perform aggregate validations (sums, counts, averages, distinct counts, min/max)
  • Perform sample and cell-level comparisons for critical entities and fields
  • Validate schema consistency (data types, lengths, precision/scale, nullable fields, constraints)
  • Log discrepancies, analyze patterns, and work with developers to support root-cause analysis
  • Understand Informatica mappings and workflows and their expected outputs
  • Validate dbt models (staging, core, marts) on Snowflake against legacy Informatica outputs
  • Test Databricks notebooks/jobs that replicate Yellowbricks logic — ensuring correct transformations, joins, filters, and aggregations
  • Validate CDC / incremental loads: new, changed, and deleted records are handled correctly

Junior Test Engineer

The Junior Test Engineer role involves executing data validation queries, perfor...
Location:
India

Salary:
Not provided

NTT DATA

Expiration Date:
Until further notice

Requirements:
  • 2-4 years of experience in Software Testing / QA
  • At least 2+ years in Data Warehouse / ETL testing
  • Practical experience in data migration / data platform modernization projects is strongly preferred
  • Strong SQL skills for writing complex queries and validations (joins, aggregations, window functions, filters)
  • Hands-on experience testing on at least one data platform such as Snowflake, Databricks, or other cloud DW/Big Data platforms
  • Exposure to Informatica (understanding mappings/workflows and expected outputs)
  • Exposure to dbt or willingness to learn dbt-based validations
  • Good understanding of data warehouse concepts: facts/dimensions, SCDs, aggregations, grain, source-to-target mapping
  • Familiarity with test management and defect tracking tools (e.g., Azure DevOps, JIRA, TestRail, qTest)
  • Experience with Excel or similar tools for managing validation results and reconciliation artifacts
Job Responsibility:
  • Execute data validation queries to compare Source (Yellowbricks / Informatica outputs) vs Target (Databricks / Snowflake / dbt outputs)
  • Perform row-count checks between source and target tables
  • Perform aggregate validations (sums, counts, averages, distinct counts, min/max)
  • Perform sample and cell-level comparisons for critical entities and fields
  • Validate schema consistency (data types, lengths, precision/scale, nullable fields, constraints)
  • Log discrepancies, analyze patterns, and work with developers to support root-cause analysis
  • Understand Informatica mappings and workflows and their expected outputs
  • Validate dbt models (staging, core, marts) on Snowflake against legacy Informatica outputs
  • Test Databricks notebooks/jobs that replicate Yellowbricks logic — ensuring correct transformations, joins, filters, and aggregations
  • Validate CDC / incremental loads: new, changed, and deleted records are handled correctly
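The CDC / incremental-load validation in the last bullet above amounts to classifying records as new, changed, or deleted between two snapshots and comparing that against what the load process reported. A minimal sketch, assuming key-to-row-content maps as input (in practice these would be built by hashing rows on the key columns in each system; all names here are illustrative):

```python
def classify_changes(previous, current):
    """Classify records between two snapshots.

    previous/current: dict mapping primary key -> row content (any comparable value).
    Returns a dict with sets of keys that are new, changed, or deleted.
    """
    new = {k for k in current if k not in previous}
    deleted = {k for k in previous if k not in current}
    changed = {k for k in current if k in previous and current[k] != previous[k]}
    return {"new": new, "changed": changed, "deleted": deleted}
```

A tester would assert that these sets match the inserts, updates, and deletes the incremental job claims to have applied.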

Software Developer

Location:
Alpharetta, United States

Salary:
Not provided

S2 IT Group

Expiration Date:
Until further notice

Requirements:
  • Master’s degree in Electrical Engineering or equivalent
  • Any suitable combination of education, training, or experience accepted
Job Responsibility:
  • Perform Functional Analysis of MDM Implementation
  • Generate reports on Customer Integrated Master Data using Power BI
  • Perform Data Analysis using IDQ
  • Perform Analytics and validate applications on AWS S3, SNS, RDS and validate connections, NoSQL DB in Microsoft MySQL workbenches
  • Analyze and Validate Integrated Hub of Hubs
  • Support Analysis and monitor SFDC Integration, Oracle RMB Integration and Oracle BPM Integration using MuleSoft
  • Analyze, Develop and test IDD application, Entity 360, Smart Search, BES, IDD/MDM User Exits
  • Perform web services/microservices testing using Spring Boot, JAX-RS and JAX-WS, JIRA or HPQC, and Postman
  • Analyze and Validate Event-driven JSON-based distribution model
  • Load data required for Functional or Regression testing using Informatica
What we offer:
  • This position is eligible for our employee referral program
  • Full-time