The objective of this project is to enhance QA processes by introducing automation across the Data landscape. This includes developing and maintaining regression testing suites, as well as creating test scripts and automation plans to improve the overall efficiency and quality of data testing practices.
Job Responsibilities:
Design and execute data validation tests to ensure data completeness in Azure Data Lake Storage (ADLS), Azure Synapse, and Databricks (a sketch of such a check follows this list)
Verify data ingestion, transformation, and loading (ETL/ELT) processes in Azure Data Factory (ADF)
Validate data schema, constraints, and format consistency across different storage layers
Conduct performance testing on data pipelines
Optimize query performance by working with data engineers
Identify, log, and track defects in JIRA
Collaborate with Data Engineers and Business Analysts to resolve data inconsistencies
Generate detailed test reports, dashboards, and documentation for stakeholders
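
For illustration, here is a minimal sketch of the kind of completeness and schema check the responsibilities above describe, written in PySpark. The table names, key column, and expected columns are assumptions made for the example, not part of the actual project setup:

    # Illustrative only: table names, key column, and expected columns are assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("data-validation-sketch").getOrCreate()

    # Hypothetical source (raw) and target (curated) tables loaded by an ADF pipeline.
    source = spark.table("raw.customer_orders")
    target = spark.table("curated.customer_orders")

    # Completeness: row counts must match after the load.
    assert source.count() == target.count(), "Row count mismatch between source and target"

    # Key integrity: the business key must not be NULL after transformation.
    null_keys = target.filter(F.col("order_id").isNull()).count()
    assert null_keys == 0, f"{null_keys} rows have a NULL order_id"

    # Schema: the target must expose every column in the agreed contract.
    expected_columns = {"order_id", "customer_id", "order_date", "amount"}
    missing = expected_columns - set(target.columns)
    assert not missing, f"Missing columns in target: {missing}"

In practice, checks like these would be grouped into a regression suite and run automatically after each pipeline change.
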
Requirements:
3+ years of QA experience with a strong focus on Big Data testing, particularly with hands-on experience in Data Lake environments on any cloud platform (preferably Azure)
Experience with Azure, including hands-on experience with Azure Data Factory, Azure Synapse Analytics, or similar services
Proficiency in SQL, capable of writing and optimizing both simple and complex queries for data validation and testing purposes
Experience with PySpark for data manipulation and transformation, and a demonstrated ability to write and execute test scripts for data processing and validation, including the ability to read PySpark code and convert its logic to SQL (see the sketch after this list)
Hands-on experience with Functional & System Integration Testing in big data environments, ensuring seamless data flow and accuracy across multiple systems
Knowledge and ability to design and execute test cases in a behavior-driven development environment
Fluency in Agile methodologies, with active participation in Scrum ceremonies and a strong understanding of Agile principles
Familiarity with tools like Jira, including experience with X-Ray for defect management and test case management
Proven experience working on high-traffic and large-scale software products, ensuring data quality, reliability, and performance under demanding conditions
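
As an example of the PySpark-to-SQL skill mentioned above, here is a minimal sketch that re-expresses a PySpark aggregation as SQL and cross-checks the two results; the table and column names are assumptions made for the example:

    # Illustrative only: table and column names are assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("pyspark-to-sql-sketch").getOrCreate()

    orders = spark.table("curated.customer_orders")

    # Transformation under test, expressed in PySpark: total amount per customer.
    pyspark_result = (
        orders.groupBy("customer_id")
              .agg(F.sum("amount").alias("total_amount"))
    )

    # The same logic re-expressed in SQL and used as an independent cross-check.
    orders.createOrReplaceTempView("orders_v")
    sql_result = spark.sql(
        "SELECT customer_id, SUM(amount) AS total_amount "
        "FROM orders_v GROUP BY customer_id"
    )

    # Any difference between the two result sets is a defect to log in JIRA.
    assert pyspark_result.exceptAll(sql_result).count() == 0
    assert sql_result.exceptAll(pyspark_result).count() == 0
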
Nice to have:
Familiarity with database administration and optimization techniques
Experience with Databricks
What we offer:
Flexible working format: remote, office-based, or a mix of both
Competitive salary and good compensation package
Personalized career growth
Professional development tools (mentorship program, tech talks and training sessions, centers of excellence, and more)
Active tech communities with regular knowledge sharing