CrawlJobs

Test Engineer - Data & Analytics

Company:
Randstad (https://www.randstad.com)

Location:
Auckland, New Zealand

Contract Type:
Not provided

Salary:
Not provided

Job Description:

A leading New Zealand organisation is seeking a skilled Test Engineer to join a high-performing technology team. This role offers the opportunity to work at the forefront of data quality, contributing to modern analytics pipelines and ensuring the integrity of data that powers strategic decision-making.

Job Responsibility:

  • Design and execute data quality tests to ensure processed data meets functional and business requirements
  • Validate the accuracy and completeness of transformed datasets across ETL pipelines
  • Identify and escalate data anomalies, inconsistencies, and pipeline quality issues
  • Develop and run test cases for data pipelines, including business logic validation
  • Implement automated validation scripts using Python/PySpark within reusable frameworks
  • Contribute to CI/CD testing processes to enable continuous delivery of reliable data products
  • Document test plans and findings, collaborating with technical teams to resolve issues
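The automated validation work described above can be sketched as a small, reusable set of data-quality checks. This is an illustrative example only: the record layout, rule names, and thresholds are invented, and a production pipeline would typically run equivalent logic in PySpark against ETL output.

```python
# Minimal sketch of reusable data-quality checks of the kind this role
# describes. Records, field names, and thresholds are hypothetical.

def check_completeness(rows, required_fields):
    """Return rows missing any required field (empty list means pass)."""
    return [r for r in rows if any(r.get(f) in (None, "") for f in required_fields)]

def check_range(rows, field, lo, hi):
    """Return rows whose numeric field falls outside [lo, hi]."""
    return [r for r in rows if not (lo <= r[field] <= hi)]

# Sample "transformed" dataset, as it might land after an ETL step.
transformed = [
    {"order_id": 1, "amount": 120.0, "region": "NZ"},
    {"order_id": 2, "amount": -5.0,  "region": ""},   # two anomalies
]

failures = {
    "completeness": check_completeness(transformed, ["order_id", "amount", "region"]),
    "amount_range": check_range(transformed, "amount", 0.0, 10_000.0),
}

for rule, bad_rows in failures.items():
    print(rule, "FAIL" if bad_rows else "PASS", len(bad_rows))
```

In practice, checks like these would be wrapped in a test framework (e.g. Pytest) and wired into CI/CD so failing data blocks a release.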

Requirements:

  • Hands-on experience in data quality testing, QA, or related data-centric roles
  • Solid understanding of ETL concepts, data transformations, and pipeline logic
  • Proficiency in SQL and relational database principles for querying and validating datasets
  • Exposure to Python scripting for building automation and validation frameworks
  • Strong grasp of functional testing principles and a methodical, edge-case-driven approach
  • Proactive problem-solver with a mindset geared toward improving test reliability
  • Excellent collaboration skills, with the ability to work effectively across technical teams
  • Commitment to continuous learning and adapting to evolving data technologies

Nice to have:

  • Experience with Databricks, Delta Lake, or Lakeflow platforms
  • Exposure to data-centric CI/CD workflows and observability tooling

What we offer:
  • Competitive market rates
  • Chance to work on modern data stacks
  • Hybrid working (at least 3 days in-office)

Additional Information:

Job Posted:
February 28, 2026

Expiration:
March 20, 2026

Employment Type:
Full-time

Work Type:
Hybrid work

Similar Jobs for Test Engineer - Data & Analytics

Data Engineer, Enterprise Data, Analytics and Innovation

Are you passionate about building robust data infrastructure and enabling innova...
Location:
United States
Salary:
110000.00 - 125000.00 USD / Year
Company:
Vaniam Group (vaniamgroup.com)
Expiration:
Until further notice
Requirements:
  • 5+ years of professional experience in data engineering, ETL, or related roles
  • Strong proficiency in Python and SQL for data engineering
  • Hands-on experience building and maintaining pipelines in a lakehouse or modern data platform
  • Practical understanding of Medallion architectures and layered data design
  • Familiarity with modern data stack tools, including: Spark or PySpark
  • Workflow orchestration (Airflow, dbt, or similar)
  • Testing and observability frameworks
  • Containers (Docker) and Git-based version control
  • Excellent communication skills, problem-solving mindset, and a collaborative approach
Job Responsibility:
  • Design, build, and operate reliable ETL and ELT pipelines in Python and SQL
  • Manage ingestion into Bronze, standardization and quality in Silver, and curated serving in Gold layers of our Medallion architecture
  • Maintain ingestion from transactional MySQL systems into Vaniam Core to keep production data flows seamless
  • Implement observability, data quality checks, and lineage tracking to ensure trust in all downstream datasets
  • Develop schemas, tables, and views optimized for analytics, APIs, and product use cases
  • Apply and enforce best practices for security, privacy, compliance, and access control, ensuring data integrity across sensitive healthcare domains
  • Maintain clear and consistent documentation for datasets, pipelines, and operating procedures
  • Lead the integration of third-party datasets, client-provided sources, and new product-generated data into Vaniam Core
  • Partner with product and innovation teams to build repeatable processes for onboarding new data streams
  • Ensure harmonization, normalization, and governance across varied data types (scientific, engagement, operational)
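The Bronze, Silver, and Gold responsibilities above follow the Medallion pattern: ingest raw data as-is, standardize and quality-check it, then serve curated aggregates. The following is a hedged, plain-Python miniature of that flow; the field names and cleaning rules are invented, and a real implementation would use Spark/Delta tables rather than Python lists.

```python
# Toy illustration of Medallion layering: raw ingestion (Bronze),
# standardization/quality (Silver), curated serving (Gold).
# Fields and rules here are invented for illustration.

bronze = [  # raw records, ingested exactly as received
    {"patient": " A01 ", "visits": "3"},
    {"patient": "A02",   "visits": "x"},   # bad value survives in Bronze
]

def to_silver(raw):
    """Standardize types and drop records that fail quality checks."""
    out = []
    for r in raw:
        try:
            out.append({"patient": r["patient"].strip(), "visits": int(r["visits"])})
        except ValueError:
            pass  # a real pipeline would quarantine, not silently drop
    return out

def to_gold(silver):
    """Curated serving layer: one aggregate ready for analytics."""
    return {"total_visits": sum(r["visits"] for r in silver)}

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'total_visits': 3}
```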
What we offer:
  • 100% remote environment with opportunities for local meet-ups
  • Positive, diverse, and supportive culture
  • Passionate about serving clients focused on Cancer and Blood diseases
  • Investment in you with opportunities for professional growth and personal development through Vaniam Group University
  • Health benefits – medical, dental, vision
  • Generous parental leave benefit
  • Focused on your financial future with a 401(k) Plan and company match
  • Work-Life Balance and Flexibility
  • Flexible Time Off policy for rest and relaxation
  • Volunteer Time Off for community involvement
Employment Type:
Full-time

Data and Analytics Engineer Intern

The Data and Analytics Engineer Intern will assist in designing, building, and t...
Location:
Irvine, United States
Salary:
17.00 - 22.00 USD / Hour
Company:
Trace3 (trace3.com)
Expiration:
Until further notice
Requirements:
  • Enrollment in the Junior or Senior year of an undergraduate program or master’s program at an accredited college or university
  • Candidates should be pursuing a field of study applicable to the Data Intelligence internship
  • Cumulative grade point average (GPA) of 3.0 or better
  • Ability to work independently on assigned tasks and accept direction on given assignments
  • Self-motivated individuals with a customer mindset and desire to help people
  • Enthusiasm for technical problem solving with attention to detail and strong communication skills
  • Ability to learn and research in a dynamic and engaging environment
  • Availability to work 40 hours per week throughout the internship
Job Responsibility:
  • Assist in designing, building, and testing data platforms and analytics solutions to generate actionable insights for our customers
  • Partner with our Data Intelligence Team to determine the best approach around data ingestion, structure, and storage, then work with the team to ensure these are implemented accurately
  • Contribute ideas on how to make our customers’ data more valuable and work with members of Trace3’s Engineering Team to implement solutions
What we offer:
  • Comprehensive medical, dental and vision plans for you and your dependents
  • 401(k) Retirement Plan with Employer Match, 529 College Savings Plan, Health Savings Account, Life Insurance, and Long-Term Disability
  • Competitive Compensation
  • Training and development programs
  • Major offices stocked with snacks and beverages
  • Collaborative and cool culture
  • Work-life balance and generous paid time off
Employment Type:
Full-time

Data Analytics Engineer

Beacon Biosignals is seeking an experienced data analytics engineer to strengthe...
Location:
Boston, United States
Salary:
Not provided
Company:
Beacon Biosignals (beacon.bio)
Expiration:
Until further notice
Requirements:
  • Extensive experience developing, testing, and maintaining SQL transformations to deliver reproducible analytic results in a production setting atop data platforms such as Snowflake or Databricks
  • Proficiency in a scientific programming language such as Julia, Python, or R
  • A rigorous approach to documentation and automated testing
  • Experience deploying automated analytics pipelines to a production environment, monitoring their performance, and maintaining them over time
  • A collaborative mindset
  • Strong asynchronous communication skills and a knack for making the most of synchronous collaboration
  • Product-oriented intuition for turning individual stakeholder needs into configurable solutions and broadly applicable features
  • Familiarity with or interest in learning about analyses that power clinical trials, clinical diagnostics, or other medical applications that leverage data organization, transformation, and statistics
Job Responsibility:
  • Design, implement, and maintain data models and transformations that enrich biosignal metrics with clinical context, powered by Beacon's biosignal data warehouse
  • Ensure that data pipelines and products produce reproducible results by owning CI test suites and versioning strategies for data models, dashboards, and reports
  • Build reusable, scalable customer-facing analytics products, such as automated scientific reports and dashboards that can be configured and reused without custom engineering
  • Improve user documentation and tooling to enable customers and internal scientists to use the analytics layer to answer new scientific questions
  • Deploy and monitor analytics and feature computation services that transform raw biosignal data and clinical metadata to generate meaningful scientific results
  • Collaborate with scientific subject matter experts, product managers, and customers to identify high-impact analytics improvements that accelerate therapy development
What we offer:
  • Equity
  • PTO

Analytics Engineer, Product Analytics

As an Analytics Engineer at Airtable, you’ll play a pivotal role in shaping our ...
Location:
San Francisco; New York City, United States
Salary:
143600.00 - 177200.00 USD / Year
Company:
Airtable (airtable.com)
Expiration:
Until further notice
Requirements:
  • Bachelor’s degree in computer science, data science, mathematics/statistics, or a related field (or related experience)
  • 3–5 years of experience working with data, with at least 1 year partnering with product stakeholders
  • Curiosity and fluency with AI/LLM tools (ChatGPT, Claude, Cursor, etc.)
  • Experience in SaaS, consumer tech, or data-driven product environments
  • Proficiency with SQL and data modeling best practices (e.g., dbt, Databricks, Snowflake, BigQuery)
  • Experience with BI tools and BI modeling best practices (e.g., Looker, Omni Analytics, Tableau, Mode, Hex)
  • Understanding of user funnels, retention metrics, and growth analytics
  • Strong ability to ensure data accuracy, reliability, and consistency
  • Ability to translate business questions into analytical approaches, interpret results, and communicate actionable insights
  • Knowledge of product analytics tracking frameworks (e.g., Segment, Amplitude, Mixpanel, GA4) and event taxonomy design
Job Responsibility:
  • Own and maintain core product data pipelines across tools such as dbt, Databricks, Looker, and Omni Analytics, ensuring reliability and scalability
  • Build and refine dashboards that deliver self-serve, real-time insights for high-priority product areas
  • Partner with product and engineering teams to define tracking requirements, implement instrumentation, validate data, and deliver launch-specific dashboards or reports
  • Establish trusted partnerships with product managers, engineers, analysts, and leadership, serving as the go-to resource for product data insights and technical guidance
  • Lead analytics engineering efforts for high-impact product launches, including documentation of tracking plans, launch pipelines, and post-launch reporting
  • Participate in or lead cross-functional projects where analytics engineering contributions directly influence product strategy decisions
What we offer:
  • Restricted stock units
  • Incentive compensation
  • Comprehensive benefits package
Employment Type:
Full-time

Software Engineer - Data Engineering

Akuna Capital is a leading proprietary trading firm specializing in options mark...
Location:
Chicago, United States
Salary:
130000.00 USD / Year
Company:
AKUNA CAPITAL (akunacapital.com)
Expiration:
Until further notice
Requirements:
  • BS/MS/PhD in Computer Science, Engineering, Physics, Math, or equivalent technical field
  • 5+ years of professional experience developing software applications
  • Java/Scala experience required
  • Highly motivated and willing to take ownership of high-impact projects upon arrival
  • Prior hands-on experience with data platforms and technologies such as Delta Lake, Spark, Kubernetes, Kafka, ClickHouse, and/or Presto/Trino
  • Experience building large-scale batch and streaming pipelines with strict SLA and data quality requirements
  • Must possess excellent communication, analytical, and problem-solving skills
  • Recent hands-on experience with AWS Cloud development, deployment and monitoring necessary
  • Demonstrated experience working on an Agile team employing software engineering best practices, such as GitOps and CI/CD, to deliver complex software projects
  • The ability to react quickly and accurately to rapidly changing market conditions, including the ability to quickly and accurately respond and/or solve math and coding problems are essential functions of the role
Job Responsibility:
  • Work within a growing Data Engineering division supporting the strategic role of data at Akuna
  • Drive the ongoing design and expansion of our data platform across a wide variety of data sources, supporting an array of streaming, operational and research workflows
  • Work closely with Trading, Quant, Technology & Business Operations teams throughout the firm to identify how data is produced and consumed, helping to define and deliver high impact projects
  • Build and deploy batch and streaming pipelines to collect and transform our rapidly growing Big Data set within our hybrid cloud architecture utilizing Kubernetes/EKS, Kafka/MSK and Databricks/Spark
  • Mentor junior engineers in software and data engineering best practices
  • Produce clean, well-tested, and documented code with a clear design to support mission critical applications
  • Build automated data validation test suites that ensure data is processed and published in accordance with well-defined Service Level Agreements (SLAs) for data quality, availability, and correctness
  • Challenge the status quo and help push our organization forward, as we grow beyond the limits of our current tech stack
What we offer:
  • Discretionary performance bonus
  • Comprehensive benefits package that may encompass employer-paid medical, dental, vision, retirement contributions, paid time off, and other benefits
Employment Type:
Full-time

Pyspark Data Engineer

The Data Analytics Intmd Analyst is a developing professional role. Deals with m...
Location:
Chennai, India
Salary:
Not provided
Company:
Citi (https://www.citi.com/)
Expiration:
Until further notice
Requirements:
  • 4-8 years of relevant experience in Data Analytics and Big Data
  • Proficiency in SQL, Python, and PySpark, including Spark components
  • Minimum 4 years of experience as a Python developer, with expertise in automation testing to design, develop, and automate robust software solutions using testing frameworks such as Pytest and Behave
  • 2-4 years of experience as a Big Data Engineer developing, optimizing, and managing large-scale data processing systems and analytics platforms
  • 4 years of experience in distributed data processing & near real-time data analytics using PySpark
  • Strong understanding of PySpark execution plans, partitioning & optimization techniques
Job Responsibility:
  • Integrates in-depth data analysis knowledge with a solid understanding of industry standards and practices
  • Demonstrates a good understanding of how data analytics teams and areas integrate with others in accomplishing objectives
  • Applies project management skills
  • Applies analytical thinking and knowledge of data analysis tools and methodologies
  • Analyzes factual information to make accurate judgments and recommendations focused on local operations and broader impacts
  • Applies professional judgment when interpreting data and results breaking down information in a systematic and communicable manner
  • Employs developed communication and diplomacy skills to exchange potentially complex/sensitive information
  • Demonstrates attention to quality and timeliness of service to ensure the effectiveness of the team and group
  • Provides informal guidance or on-the-job-training to new team members
  • Appropriately assesses risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets by driving compliance with applicable laws, rules, and regulations, adhering to policy, applying sound ethical judgment regarding personal behavior, conduct, and business practices, and escalating, managing, and reporting control issues with transparency
Employment Type:
Full-time

Head of Data Infrastructure & Analytics

As our remote Head of Data Infrastructure & Analytics, you will take the helm of...
Location:
United States
Salary:
200000.00 USD / Year
Company:
Puffy (puffy.com)
Expiration:
Until further notice
Requirements:
  • 7+ years in digital analytics with proven track record of owning data quality in high-stakes B2C environments
  • Expert-level knowledge: GA4, GTM, server-side tracking, data pipeline architecture, e-commerce tracking
  • Can diagnose and fix tracking setup errors across the entire stack
  • Strong technical skills: SQL expert, Python or R proficiency for data validation
  • Data engineering knowledge: Understand ETL/ELT, data warehousing, API integrations, event-driven architecture
  • Experience building data quality systems: Automated validation checks, data observability, monitoring frameworks
  • Proven team leadership: Can manage 3+ person analytics team
  • Experience with data observability tools (Monte Carlo, Great Expectations, dbt tests)
  • Background in analytics engineering or data reliability engineering
  • Track record in high-growth e-commerce (sleep, wellness, DTC)
Job Responsibility:
  • Data Quality Infrastructure (Your #1 Priority)
  • Audit existing tracking setup (GA4, GTM, Shopify, Klaviyo, ad platforms) and fix every configuration error
  • Build automated validation systems that catch bugs before humans see them
  • Implement pre-leadership QA processes with hard gates
  • Create monitoring and alerting
  • Technical Architecture & Reliability
  • Design and maintain server-side tracking implementation for attribution accuracy
  • Architect data pipelines that handle $2M+ monthly ad spend tracking without errors
  • Own end-to-end data flow: website events → collection → transformation → warehouse → reporting
  • Establish technical standards
What we offer:
  • Continuous learning
  • 10% monthly bonus
  • Premium insurance
  • Achievement recognition
  • Free snacks and lunches
  • Generous annual leave
  • AI tools and tech stack
  • 18+ nationalities
  • Social events & activities
  • Learning and development support (we pay for courses you need to upskill)
Employment Type:
Full-time

Sr. Data Engineer

We are looking for a Sr. Data Engineer to join our team.
Location:
Not provided
Salary:
Not provided
Company:
Boston Data Pro (bostondatapro.com)
Expiration:
Until further notice
Requirements:
  • Data Engineering: 8 years (Preferred)
  • Data Programming languages: 5 years (Preferred)
  • Data Developers: 5 years (Preferred)
Job Responsibility:
  • Designs and implements standardized data management procedures around data staging, data ingestion, data preparation, data provisioning, and data destruction
  • Ensures quality of technical solutions as data moves across multiple zones and environments
  • Provides insight into the changing data environment, data processing, data storage and utilization requirements for the company, and offer suggestions for solutions
  • Ensures managed analytic assets to support the company’s strategic goals by creating and verifying data acquisition requirements and strategy
  • Develops, constructs, tests, and maintains architectures
  • Aligns architecture with business requirements and uses programming languages and tools
  • Identifies ways to improve data reliability, efficiency, and quality
  • Conducts research for industry and business questions
  • Deploys sophisticated analytics programs, machine learning, and statistical methods to efficiently implement solutions
  • Prepares data for predictive and prescriptive modeling and finds hidden patterns using data