The Data Quality Engineer / Tester role involves validating data sources and ensuring data integrity through rigorous testing. The ideal candidate will have strong proficiency in SQL and Python, along with experience in ETL testing and automated testing frameworks. Responsibilities include designing test cases, managing defects, and conducting performance testing. A minimum of 2 years of experience is required, with a focus on data quality assurance and data governance.
Job Responsibility:
Data Validation (ETL Testing): Validating data sources, extraction, transformation logic, and loading (ETL) to ensure data integrity
Pipeline Testing: Designing and executing test cases for data pipelines (e.g., in AWS Glue, Azure Data Factory, or Apache Airflow) to ensure smooth, secure, and accurate data flow
Test Automation: Developing automated test scripts using Python, PySpark, or SQL to validate large datasets and reduce manual testing effort (a minimal sketch of such a check follows this list)
Data Quality Rules & Checks: Creating and enforcing data quality checks for completeness and correctness, derived from transformation logic and data mapping documents
Defect Management: Identifying, documenting, and tracking data-related defects, and working with engineers to troubleshoot root causes
Performance Testing: Testing for scalability and performance bottlenecks to ensure data systems handle high-volume data
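As a concrete illustration of the Test Automation point above, here is a minimal sketch of an automated load-validation check in Python. It uses an in-memory SQLite database as a stand-in for the real source and target stores, and the names (src_orders, tgt_orders, order_id) are hypothetical; a production suite would run equivalent checks against the actual warehouse.

```python
# Sketch of an automated ETL validation check; SQLite stands in for the
# real source/target stores and all table/column names are hypothetical.
import sqlite3


def validate_load(conn, source, target, key):
    """Compare a source table to its loaded target and return defects found."""
    defects = []

    # Completeness: every source row should arrive in the target.
    src_count = conn.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt_count = conn.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    if src_count != tgt_count:
        defects.append(f"row count mismatch: source={src_count}, target={tgt_count}")

    # Integrity: the business key must be populated and unique after the load.
    nulls = conn.execute(
        f"SELECT COUNT(*) FROM {target} WHERE {key} IS NULL"
    ).fetchone()[0]
    if nulls:
        defects.append(f"{nulls} NULL {key} values in {target}")

    dupes = conn.execute(
        f"SELECT COUNT(*) FROM "
        f"(SELECT {key} FROM {target} GROUP BY {key} HAVING COUNT(*) > 1)"
    ).fetchone()[0]
    if dupes:
        defects.append(f"{dupes} duplicated {key} values in {target}")

    return defects


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE src_orders (order_id INTEGER, amount REAL);
        CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
        INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
        INSERT INTO tgt_orders VALUES (1, 10.0), (1, 10.0);  -- deliberate defects
    """)
    for defect in validate_load(conn, "src_orders", "tgt_orders", "order_id"):
        print("DEFECT:", defect)
```

Returning defects as data, rather than asserting immediately, makes it easy to feed findings into the defect-management workflow described above.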
Requirements:
SQL Mastery (2-4 years): Strong proficiency in writing complex SQL queries for backend data validation, including joins, subqueries, and aggregations (see the sketch at the end of this list)
Scripting/Programming (2-4 years): Proficiency in Python or PySpark for data manipulation, automation, and testing
ETL & Database Knowledge (2-4 years): Understanding of data warehousing, data modeling (star/snowflake schemas), and ETL tools (e.g., Informatica, Talend)
Big Data Frameworks (2-4 years): Familiarity with distributed computing systems such as Apache Spark or Hadoop
Cloud Platforms (2-4 years): Familiarity with cloud data services such as AWS S3 and Redshift
Testing Tools (2-4 years): Experience with testing tools like Great Expectations, dbt, or automated testing frameworks
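As a sketch of the kind of complex SQL the requirements call for, the query below joins two aggregated subqueries to reconcile per-customer totals between a staging table and a loaded fact table. SQLite again stands in for the warehouse, and the names (stg_payments, fct_payments) are hypothetical.

```python
# Sketch of a join-plus-aggregation reconciliation query of the kind used for
# backend data validation; table and column names are hypothetical.
import sqlite3

RECONCILIATION_SQL = """
    SELECT s.customer_id, s.total AS staged_total, t.total AS loaded_total
    FROM (SELECT customer_id, SUM(amount) AS total
          FROM stg_payments GROUP BY customer_id) AS s
    LEFT JOIN (SELECT customer_id, SUM(amount) AS total
               FROM fct_payments GROUP BY customer_id) AS t
      ON t.customer_id = s.customer_id
    WHERE t.total IS NULL OR s.total <> t.total
"""

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_payments (customer_id INTEGER, amount REAL);
    CREATE TABLE fct_payments (customer_id INTEGER, amount REAL);
    INSERT INTO stg_payments VALUES (1, 5.0), (1, 7.0), (2, 3.0);
    INSERT INTO fct_payments VALUES (1, 12.0);  -- customer 2 dropped in the load
""")

# Any row returned is a defect: a customer whose aggregated amounts disagree.
for customer_id, staged, loaded in conn.execute(RECONCILIATION_SQL):
    print(f"DEFECT: customer {customer_id}: staged={staged}, loaded={loaded}")
```

The LEFT JOIN is deliberate: customers missing from the target entirely still surface as defects instead of silently dropping out of an inner join.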