
Informatica QA / Data Engineer


Signify Technology

Location:
United States

Contract Type:
Not provided

Salary:
75.00 USD / Hour

Job Description:

The client has a large data migration underway and needs someone to transfer the data over; the role is a combination of data engineering and QA.

Job Responsibility:

  • Transfer the data for a large-scale migration
  • Combine data engineering and QA responsibilities

Requirements:

Data Engineering

Additional Information:

Job Posted:
January 20, 2026

Work Type:
Remote work

Similar Jobs for Informatica QA / Data Engineer

Data Architect

We are seeking a highly skilled and experienced Data Architect to join our team....
Location:
Poland, Warsaw
Salary:
Not provided
Inetum
Expiration Date:
Until further notice
Requirements:
  • 6+ years of professional experience in data-centric roles
  • Proven experience in a technical leadership position such as Solution Architect, Technical Lead, or Principal Engineer
  • Expert-level knowledge of Snowflake and Azure Data Factory (ADF)
  • Deep experience with SQL Server
  • Familiarity with Informatica and IBM DB2
  • Hands-on experience with Control-M
  • Experience with SAP Business Objects and Power BI
  • Expert-level SQL
  • Proficiency in Python
  • Experience with XML and JSON data formats
Job Responsibility:
  • Design and document end-to-end solutions for new data integrations, major enhancements, and complex data products
  • Contribute to the technical roadmap for the Enterprise Data Warehouse (EDW)
  • Define and enforce data engineering and development standards across the program
  • Serve as the final technical approver for all significant code changes, pull requests, and architectural modifications
  • Act as the highest point of technical escalation for critical P1/P2 incidents and complex problems
  • Lead technical investigation during major outages
  • Drive technical resolution for systemic issues
  • Mentor Data Engineers and Data Visualization Engineers
  • Champion creation and maintenance of technical documentation
  • Work closely with client architects, source system owners, and other vendor teams
What we offer:
  • Flexible working hours
  • Hybrid work model
  • Cafeteria system
  • Referral bonuses up to PLN6,000
  • Additional revenue sharing opportunities
  • Ongoing guidance from dedicated Team Manager
  • Tailored technical mentoring
  • Dedicated team-building budget
  • Opportunities to participate in charitable initiatives and local sports programs
  • Supportive and inclusive work culture
  • Full-time

Staff Engineer - Finance Data Engineer

GEICO is seeking Finance Data specialists to support the build out of a Finance ...
Location:
United States, Palo Alto
Salary:
110000.00 - 260000.00 USD / Year
Geico
Expiration Date:
Until further notice
Requirements:
  • 15+ years of Finance Systems experience implementing and supporting industry-leading ERP solutions (e.g. Oracle, Workday, SAP, or PeopleSoft)
  • 3+ years of experience in implementing and supporting FP&A applications (Preferred)
  • 3+ years working with or supporting a Finance Data Lake/Warehouse/Mart along with various Financial Reporting tools
  • Good understanding of Dimensional Data Modeling
  • Strong working knowledge of Data processing/data transformation using ETL/ELT tools such as Informatica, DBT, etc.
  • Strong working knowledge of SQL and the ability to write, debug and optimize SQL queries and ETL jobs
  • Experience working with Financials via the major vendors (e.g. PeopleSoft, Oracle EBS)
Job Responsibility:
  • Leverage your strong Functional and Technical systems expertise to drive towards the right solution to support the department and GEICO’s current and future needs
  • Engage in cross-functional collaboration throughout the entire software lifecycle
  • Support design sessions with peers to ensure systems are well designed, efficient, and meet Business expectations
  • Collaborate with Finance Leaders within the FP&A, Controllers, Financial Reporting, and Finance QA & Systems team along with the Data Engineering and Finance Technology leaders to ensure organizational goals are met
  • Have a deep understanding of a few Finance functions to support the establishment of a vision and technology roadmap to build out a Finance Data Lake/Warehouse
  • Have a good understanding of finance data lake/mart with Data processing/data transformation using ETL/ELT tools such as Informatica, DBT, etc.
  • Experience with designing, developing, implementing, and maintaining solutions for data ingestion and transformation projects
  • Experience working with cloud data solutions (Delta Lake, Iceberg, Hudi, Snowflake, Redshift or equivalent)
  • Support the development of a roadmap and then work towards implementing Revenue and Expense Analytical solutions
  • Support and try to influence customers and stakeholders, and work through divergent expectations
What we offer:
  • Comprehensive Total Rewards program that offers personalized coverage tailor-made for you and your family’s overall well-being
  • Financial benefits including market-competitive compensation, a 401K savings plan vested from day one that offers a 6% match, performance and recognition-based incentives, and tuition assistance
  • Access to additional benefits like mental healthcare as well as fertility and adoption assistance
  • Supports flexibility: workplace flexibility plus our GEICO Flex program, which offers the ability to work from anywhere in the US for up to four weeks per year
  • Full-time

Informatica MDM Senior Software Development Engineer 1

We are seeking a Senior Informatica MDM Developer to build, enhance, and support...
Location:
India, Noida
Salary:
Not provided
Sopra Steria
Expiration Date:
Until further notice
Requirements:
  • 5–8 years of IT experience
  • 3–5 years of Informatica MDM development experience
  • Strong hands-on experience with the Informatica MDM Hub: match rules, load jobs, cleanse rules, data staging, and data loads
  • Good knowledge of SQL, PL/SQL, and Informatica PowerCenter / IICS
  • Understanding of MDM concepts and data governance
  • Experience working with business analysts and QA teams
Job Responsibility:
  • Develop and configure Informatica MDM Hub components
  • Implement match & merge logic, validation rules, and survivorship rules
  • Build and maintain MDM data models and relationships
  • Develop integrations using Informatica PowerCenter / IICS, APIs, and batch processes
  • Support data loads, data quality checks, and issue resolution
  • Participate in design discussions and technical troubleshooting
  • Follow best practices for performance, scalability, and security
  • Work in Agile teams and support sprint deliveries
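
The match & merge responsibilities above follow a common MDM pattern: matched records from several systems are collapsed into one golden record by survivorship rules. A minimal, hypothetical sketch of one such rule ("highest source trust wins, ties broken by most recent update") — the source names, trust ranking, and sample records are all invented, and this is not Informatica MDM's actual engine:

```python
from datetime import date

# Hypothetical trust ranking: higher number = more trusted source system.
SOURCE_TRUST = {"CRM": 3, "ERP": 2, "WEB": 1}

def survive(records, field):
    """Pick the surviving value for one field across a matched cluster."""
    candidates = [r for r in records if r.get(field) is not None]
    # Prefer the most trusted source; break ties with the latest update date.
    best = max(candidates, key=lambda r: (SOURCE_TRUST[r["source"]], r["updated"]))
    return best[field]

def merge(records, fields):
    """Build a golden record field by field from matched records."""
    return {f: survive(records, f) for f in fields}

# Two matched records for the same entity, from different systems.
matched = [
    {"source": "WEB", "updated": date(2024, 6, 1), "email": "a@new.example", "phone": None},
    {"source": "CRM", "updated": date(2023, 1, 5), "email": "a@old.example", "phone": "555-0100"},
]
golden = merge(matched, ["email", "phone"])
```

Here the CRM value survives for email despite being older, because survivorship ranks source trust above recency; phone survives from CRM because it is the only non-null candidate.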

Software Development Advisor

We are currently seeking a Software Development Advisor to join our team in Cant...
Location:
United States, Canton
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • Recent experience and clear responsibilities in an ETL / Extracts / Data Engineering lead developer capacity
  • 8 plus years of experience in data integration (ETL) with 6 or more in solution development using Informatica suite
  • 6 or more years of experience in RDBMS / Data Warehousing platforms on Oracle
  • Must be an expert in Linux shell scripting to introduce automation as part of solution delivery
  • Strong SQL knowledge and ability to write most complex queries
  • Conceptual understanding of logical and physical data modelling is a must, and a working experience is a plus
  • Ability to develop and maintain data integration/ETL specifications that state integration rules and mappings
  • Ability to use documentation tools such as MS Visio and MS PowerPoint to diagram integration data flows
  • Ability to communicate effectively with business and technical staff members
  • Bachelor’s degree in Computer Engineering, Computer Science, Engineering, or related field of study
Job Responsibility:
  • Perform data analysis, collect business requirements, and identify business rules
  • Analyze, develop and support Data Integration, EDW and Data Management using Informatica, SQL, Oracle etc. platforms and tools
  • Collaborate with data modelling team to define data integration logic
  • Create data mappings necessary to fulfill the requirements
  • Build and test new ETL programs to fulfill the service requests
  • Automate the ETL processes and pipelines using scripting and Schedulers
  • Participate in Unit Testing, assist QA team in Integration Testing, and support Business team for UAT
  • Participate in code deployment and coordinate with different infrastructure teams during go-live, post-production, and continued support of end product
  • Debug data quality issues by analyzing upstream sources and guide the data integration team toward resolutions
  • Work closely with DBAs to fix performance bottlenecks

Test Engineer

The Test Engineer role involves executing data validation queries, performing re...
Location:
India, Remote
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • 3–6 years of experience in Software Testing / QA
  • At least 2+ years in Data Warehouse / ETL testing
  • Practical experience in data migration / data platform modernization projects is strongly preferred
  • Strong SQL skills for writing complex queries and validations (joins, aggregations, window functions, filters)
  • Hands-on experience testing on at least one data platform such as: Snowflake, Databricks, or other cloud DW/Big Data platforms – experience with Snowflake/Databricks is a plus
  • Exposure to Informatica (understanding mappings/workflows and expected outputs)
  • Exposure to dbt or willingness to learn dbt-based validations (model outputs, dbt test results)
  • Good understanding of data warehouse concepts: facts/dimensions, SCDs, aggregations, grain, source-to-target mapping
  • Familiarity with test management and defect tracking tools (e.g., Azure DevOps, JIRA, TestRail, qTest)
  • Experience with Excel or similar tools for managing validation results and reconciliation artifacts
Job Responsibility:
  • Execute data validation queries to compare source (Yellowbricks / Informatica outputs) against target (Databricks / Snowflake / dbt outputs)
  • Perform row-count checks between source and target tables
  • Perform aggregate validations (sums, counts, averages, distinct counts, min/max)
  • Perform sample and cell-level comparisons for critical entities and fields
  • Validate schema consistency (data types, lengths, precision/scale, nullable fields, constraints)
  • Log discrepancies, analyze patterns, and work with developers to support root-cause analysis
  • Understand Informatica mappings and workflows and their expected outputs
  • Validate dbt models (staging, core, marts) on Snowflake against legacy Informatica outputs
  • Test Databricks notebooks/jobs that replicate Yellowbricks logic — ensuring correct transformations, joins, filters, and aggregations
  • Validate CDC / incremental loads: new, changed, and deleted records are handled correctly
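
The row-count and aggregate reconciliations listed above can be sketched in a few lines of SQL driven from Python. A hypothetical, self-contained example — the table names and sample data are invented, and an in-memory SQLite database stands in for the real source and target platforms:

```python
import sqlite3

def rowcount_check(conn, source, target):
    """Row-count reconciliation between a source table and its target copy."""
    src = conn.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt = conn.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    return {"source": src, "target": tgt, "match": src == tgt}

def aggregate_check(conn, source, target, column):
    """Aggregate reconciliation: SUM, MIN, MAX, COUNT(DISTINCT) of one column."""
    q = "SELECT SUM({c}), MIN({c}), MAX({c}), COUNT(DISTINCT {c}) FROM {t}"
    src = conn.execute(q.format(c=column, t=source)).fetchone()
    tgt = conn.execute(q.format(c=column, t=target)).fetchone()
    return {"source": src, "target": tgt, "match": src == tgt}

# Invented tables simulating a migration with one row missed in the target.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_orders (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE tgt_orders (id INTEGER, amount REAL)")
rows = [(1, 10.0), (2, 25.5), (3, 40.0)]
conn.executemany("INSERT INTO src_orders VALUES (?, ?)", rows)
conn.executemany("INSERT INTO tgt_orders VALUES (?, ?)", rows[:-1])  # drop one row to force a mismatch

counts = rowcount_check(conn, "src_orders", "tgt_orders")
aggs = aggregate_check(conn, "src_orders", "tgt_orders", "amount")
```

In practice each side would run on its own connection (one per platform), and discrepancies like the one surfaced here would be logged against the source-to-target mapping for root-cause analysis.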

Junior Test Engineer

The Junior Test Engineer will execute data validation queries and perform checks...
Location:
India, Remote
Salary:
Not provided
NTT DATA
Expiration Date:
Until further notice
Requirements:
  • 2-4 years of experience in Software Testing / QA
  • At least 2+ years in Data Warehouse / ETL testing
  • Practical experience in data migration / data platform modernization projects is strongly preferred
  • Strong SQL skills for writing complex queries and validations (joins, aggregations, window functions, filters)
  • Hands-on experience testing on at least one data platform such as Snowflake, Databricks, or other cloud DW/Big Data platforms
  • Exposure to Informatica (understanding mappings/workflows and expected outputs)
  • Exposure to dbt or willingness to learn dbt-based validations
  • Good understanding of data warehouse concepts: facts/dimensions, SCDs, aggregations, grain, source-to-target mapping
  • Familiarity with test management and defect tracking tools (e.g., Azure DevOps, JIRA, TestRail, qTest)
  • Experience with Excel or similar tools for managing validation results and reconciliation artifacts
Job Responsibility:
  • Execute data validation queries to compare Source (Yellowbricks / Informatica outputs) vs Target (Databricks / Snowflake / dbt outputs)
  • Perform row-count checks between source and target tables
  • Perform aggregate validations (sums, counts, averages, distinct counts, min/max)
  • Perform sample and cell-level comparisons for critical entities and fields
  • Validate schema consistency (data types, lengths, precision/scale, nullable fields, constraints)
  • Log discrepancies, analyze patterns, and work with developers to support root-cause analysis
  • Understand Informatica mappings and workflows and their expected outputs
  • Validate dbt models (staging, core, marts) on Snowflake against legacy Informatica outputs
  • Test Databricks notebooks/jobs that replicate Yellowbricks logic — ensuring correct transformations, joins, filters, and aggregations
  • Validate CDC / incremental loads: new, changed, and deleted records are handled correctly
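
The schema-consistency validation above (data types, nullability, missing columns) can be sketched the same way. A hypothetical example using SQLite's PRAGMA table_info as a stand-in for a real platform's information_schema — table and column names are invented:

```python
import sqlite3

def schema_of(conn, table):
    """Return {column: (declared_type, not_null)} for one table."""
    # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
    rows = conn.execute(f"PRAGMA table_info({table})").fetchall()
    return {name: (ctype.upper(), bool(notnull)) for _, name, ctype, notnull, _, _ in rows}

def schema_diff(conn, source, target):
    """List source columns that are missing in the target or have drifted."""
    src, tgt = schema_of(conn, source), schema_of(conn, target)
    issues = []
    for col, spec in src.items():
        if col not in tgt:
            issues.append(f"{col}: missing in target")
        elif tgt[col] != spec:
            issues.append(f"{col}: source {spec} vs target {tgt[col]}")
    return issues

# Invented tables: the target has drifted nullability and a missing column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_cust (id INTEGER NOT NULL, name TEXT, balance REAL)")
conn.execute("CREATE TABLE tgt_cust (id INTEGER, name TEXT)")
diffs = schema_diff(conn, "src_cust", "tgt_cust")
```

Against real warehouses the same comparison would query information_schema.columns on each side; the per-column dictionary keeps the check independent of column ordering.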
