Global Technology Solutions (GTS) at ResMed is a division dedicated to creating innovative, scalable, and secure platforms and services for patients, providers, and people across ResMed. The primary goal of GTS is to accelerate well-being and growth by transforming the core, enabling patient, people, and partner outcomes, and building future-ready operations. We are looking for a high-impact, hands-on Senior Data Analytics & AI Engineer to shape the next generation of ResMed’s data ecosystem. This is a senior-level technical role for someone who can collaborate with peers to solve complex challenges across data pipelines, data models, and classical and generative AI.
Job Responsibilities:
Design data and AI products and applications consisting of system integrations, data model transformations, and user interfaces, while aligning with engineering best practices across teams
Build and evolve data pipelines using Python, Spark, APIs, connector frameworks, and other ingestion technologies, with attention to automation, observability, and resilient design patterns (see the pipeline sketch after this list)
Design high-quality Snowflake/dbt models, implement governance and testing standards, and mentor less experienced engineers in scalable modeling and system design
Collaborate cross-functionally with product, engineering, and data science to shape impactful, scalable solutions
Drive future advanced analytics and ML capabilities by defining feature pipelines, supporting classical ML models, and enabling new AI-driven workloads including LLM-based and hybrid ML/AI architectures
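As a hedged illustration of the kind of pipeline work described above, here is a minimal PySpark sketch; the bucket paths, column names, and the notion of a "device usage" feed are invented placeholders, not descriptions of any actual ResMed system.

```python
# Minimal sketch of an ingest-and-transform step of the kind described above.
# All paths, column names, and table layouts are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("device-usage-ingest").getOrCreate()

# Ingest raw landing-zone files (path is a placeholder).
raw = spark.read.json("s3://example-bucket/landing/device_usage/")

# Light transformation: type casting, deduplication, and a derived column.
clean = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .dropDuplicates(["device_id", "event_ts"])
       .withColumn("usage_hours", F.col("usage_seconds") / 3600.0)
)

# Write to a curated zone for downstream Snowflake/dbt modeling.
clean.write.mode("overwrite").parquet("s3://example-bucket/curated/device_usage/")
```

In practice, the observability and resilience mentioned above would come from retries, data-quality checks, and orchestration (e.g., Dagster or Airflow) wrapped around steps like these.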
Requirements:
Extensive hands-on experience as a senior IC in data engineering, analytics engineering, or data architecture (typically 6+ years)
Expert-level SQL and data modeling skills on large-scale platforms (Snowflake preferred)
Strong experience building production data pipelines and models using Python, cloud services, and modern data stack tools
Proficiency with dbt or similar transformation frameworks
Demonstrated ability to set technical direction, define architectural patterns, and establish engineering best practices
Solid experience with Git/GitHub workflows, including branching strategies and collaborative development
Experience building and maintaining CI/CD pipelines in GitHub Actions, including automated testing and secure deployments (see the test sketch after this list)
Ability to operate across both analytics engineering and data engineering responsibilities
Experience with cloud platforms such as AWS or GCP
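To make the automated-testing expectation concrete, below is a minimal pytest sketch of the kind of unit test a GitHub Actions workflow might run on every pull request; the function under test, normalize_usage_hours, is hypothetical and exists only for illustration.

```python
# Minimal pytest sketch of an automated test that could run in CI.
# The transformation function and its rules are invented for illustration.
import pytest


def normalize_usage_hours(usage_seconds: int) -> float:
    """Hypothetical transformation: convert seconds to hours, rejecting negatives."""
    if usage_seconds < 0:
        raise ValueError("usage_seconds must be non-negative")
    return usage_seconds / 3600.0


def test_normalize_usage_hours_converts_seconds():
    assert normalize_usage_hours(7200) == 2.0


def test_normalize_usage_hours_rejects_negative_input():
    with pytest.raises(ValueError):
        normalize_usage_hours(-1)
```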
Nice to have:
Bachelor’s degree in a STEM field, Master’s degree in a computer science or engineering field, or equivalent experience
Experience with Dagster, Airflow, or similar orchestration tools
Familiarity with streaming or event-based processing (Kafka, Flink, Kinesis)
Familiarity with infrastructure-as-code (IaC) tools such as Terraform
Experience with classical machine learning, advanced analytics, and generative AI development
Experience supporting ML/AI workflows or integrating ML into data products (a minimal sketch follows this list)
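As a minimal sketch of the classical-ML experience mentioned above, the following trains a scikit-learn pipeline on synthetic data; no real dataset or ResMed model is implied.

```python
# Minimal sketch of a classical ML feature pipeline of the kind mentioned
# above. The dataset is synthetic and purely illustrative.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for features produced by an upstream data pipeline.
X, y = make_classification(n_samples=1000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Scaling + model in one pipeline keeps training and serving consistent.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.3f}")
```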
What we offer:
Bonus plan
Working from home flexibility
Referral bonus
Preferred shareholding programme
Competitive benefits (Pension, Long-Term Illness Protection, Health Insurance...)