The Testing Analyst 2 is a developing professional role. The analyst applies specialty-area knowledge in monitoring, assessing, analyzing, and/or evaluating processes and data; identifies policy gaps and formulates policies; interprets data and makes recommendations; and researches and interprets factual information. The role identifies inconsistencies in data or results, defines business issues, and formulates recommendations on policies, procedures, or practices. It integrates established disciplinary knowledge within its own specialty area with a basic understanding of related industry practices, requires a good understanding of how the team interacts with others in accomplishing the objectives of the area, and develops working knowledge of industry practices and standards. The role has a limited but direct impact on the business through the quality of the tasks/services provided; the job holder's impact is restricted to their own team.
Job Responsibility:
Build Data Pipelines: Create programs using PySpark (a powerful data processing tool) to extract data from various sources (like databases and data lakes), clean and transform it, and load it into target systems
Testing and Validation: Develop automated tests to ensure the data pipelines are working correctly and the data is accurate
Work with Hive, HDFS, and Oracle data sources to extract, transform, and load large-scale datasets
Leverage AWS services such as S3, Lambda, and Airflow for data ingestion, event-driven processing, and orchestration
Create reusable frameworks, libraries, and templates to accelerate automation and testing of ETL jobs
Participate in code reviews, CI/CD pipelines, and maintain best practices in Spark and cloud-native development
Ensures tooling can run in CI/CD, providing real-time, on-demand test execution that shortens the feedback loop and fully supports hands-free execution
Performs regression, integration, and sanity testing; maintains automated regression suites; reports issues, provides solutions, and ensures timely completion
Own and drive automation in the Data and Analytics team to achieve 90% automation in the data/ETL space
Design and develop an integrated portal to consolidate utilities and cater to user needs
Supports automation initiatives for Data & Analytics testing requirements during process and product rollout into production
Works with the technology team to design and implement appropriate automation scripts/plans for application testing, meeting required KPIs and automation-effectiveness targets
Ensures new utilities are documented and transitioned to testers for execution, and supports troubleshooting when required
Monitors and reviews code check-ins from peers and helps maintain project repository
Ability to work independently as well as collaborate within groups on assigned projects
Ability to work in a fast-paced, dynamic environment and manage multiple priorities effectively
Experience and understanding of the Wealth domain, specifically private banking, lending services, and related technology applications
Supports and contributes to automated test data generation and sufficiency
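The validation responsibilities above (automated checks that a pipeline loaded data completely and accurately) can be sketched as a small, self-contained example. Pandas is used here so the sketch runs anywhere; the posting's production pipelines would apply the same row-count, key-uniqueness, and null checks through PySpark's closely analogous DataFrame API. All frame and column names are illustrative, not taken from any real system.

```python
import pandas as pd

def validate_load(source: pd.DataFrame, target: pd.DataFrame,
                  key: str, required_cols: list[str]) -> list[str]:
    """Run basic post-load checks and return a list of failure messages."""
    failures = []
    # Row-count reconciliation: every source row should reach the target.
    if len(source) != len(target):
        failures.append(f"row count mismatch: {len(source)} vs {len(target)}")
    # Key uniqueness: duplicate keys usually indicate a bad join in the transform.
    if target[key].duplicated().any():
        failures.append(f"duplicate keys in target column '{key}'")
    # Completeness: required columns must not contain nulls after the load.
    for col in required_cols:
        if target[col].isna().any():
            failures.append(f"nulls found in required column '{col}'")
    return failures

# Illustrative frames standing in for a source extract and the loaded target.
source = pd.DataFrame({"id": [1, 2, 3], "amount": [10.0, 20.5, 7.25]})
target = pd.DataFrame({"id": [1, 2, 3], "amount": [10.0, 20.5, 7.25]})
print(validate_load(source, target, key="id", required_cols=["amount"]))  # []
```

In a real regression suite, each check would typically be a separate test case so that a single failure message pinpoints the broken stage of the pipeline.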
Requirements:
2-4 years of experience in automation testing across UI, data analytics, and BI reports in the financial services industry
Hands-on experience with Selenium and BDD Cucumber using Java or Python
Extensive knowledge of developing and maintaining automation frameworks and AI/ML-related solutions
Experience automating BI report validation, e.g., Tableau dashboards and views
Hands-on experience in Python for developing data-analysis utilities using Pandas, NumPy, etc.
Exposure to and some experience with AI/ML-related solutions that can help automate faster
Detailed knowledge of data flows in relational database and big data systems
Strong knowledge of Oracle SQL and HiveQL and understanding of ETL/Data Testing
Experience using Bitbucket or equivalent code version control tool
Experience with CI/CD tools like Jenkins
Familiarity with Hadoop (a platform for processing massive datasets)
Strong experience with PySpark for batch and stream processing, including deploying PySpark workloads to AWS EKS (Kubernetes)
Proficiency in working with the Cloudera Hadoop ecosystem (HDFS, Hive, YARN)
Hands-on experience with ETL automation and validation frameworks
Solid understanding of AWS services like S3, Lambda, EKS, Airflow, and IAM
Scripting knowledge in Bash, Python, and YAML
Strong problem-solving and debugging skills
Excellent communication and collaboration abilities to lead and mentor a large techno-functional team across different geographical locations
Strong business acumen and presentation skills
Able to work in an Agile environment and deliver results independently
Bachelor’s/University degree or equivalent experience
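As an illustration of the kind of Python data-analysis utility the requirements describe (Pandas/NumPy for spotting inconsistencies in data), the sketch below flags rows whose values deviate sharply from the rest of a column. The z-score threshold, the `trades` frame, and the `notional` column are all hypothetical; a low threshold is used deliberately because small samples cap the maximum attainable z-score.

```python
import numpy as np
import pandas as pd

def profile_numeric(df: pd.DataFrame, col: str, z_thresh: float = 3.0) -> dict:
    """Summarize a numeric column and flag rows that look inconsistent."""
    values = df[col].to_numpy(dtype=float)
    mean, std = values.mean(), values.std()
    # Z-score-based outlier flagging; a std of 0 means no variation at all.
    z = np.zeros_like(values) if std == 0 else (values - mean) / std
    return {
        "count": int(len(values)),
        "mean": float(mean),
        "std": float(std),
        "outlier_rows": df.index[np.abs(z) > z_thresh].tolist(),
    }

# Illustrative data: four similar notionals and one that looks mis-keyed.
trades = pd.DataFrame({"notional": [100.0, 102.0, 98.0, 101.0, 5000.0]})
report = profile_numeric(trades, "notional", z_thresh=1.5)
print(report["outlier_rows"])  # [4]
```

A production utility would usually prefer robust statistics (median and MAD) over mean and standard deviation, since the outliers themselves inflate the std used to detect them.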
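The Oracle SQL / HiveQL data-testing requirement often reduces to reconciliation queries between a source and an ETL target. A minimal sketch using Python's built-in sqlite3 module (table names are hypothetical; SQLite's EXCEPT set operator corresponds to MINUS in Oracle):

```python
import sqlite3

# In-memory database standing in for a source system and an ETL target.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_accounts (id INTEGER, balance REAL);
    CREATE TABLE tgt_accounts (id INTEGER, balance REAL);
    INSERT INTO src_accounts VALUES (1, 100.0), (2, 250.0), (3, 75.5);
    INSERT INTO tgt_accounts VALUES (1, 100.0), (2, 250.0);  -- row 3 dropped
""")

# Reconciliation: rows present in the source but missing from the target.
# SQLite uses EXCEPT; Oracle's equivalent set operator is MINUS.
missing = conn.execute("""
    SELECT id, balance FROM src_accounts
    EXCEPT
    SELECT id, balance FROM tgt_accounts
""").fetchall()
print(missing)  # [(3, 75.5)]
```

The same pattern, run in both directions plus a row-count comparison, covers the bulk of routine ETL completeness testing; in Hive, full-outer-join comparisons are common where MINUS-style operators are unavailable or slow.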
Nice to have:
Experience with mobile testing using Perfecto, and API testing with SoapUI, Postman, or REST Assured