This list contains only the countries for which job offers have been published in the selected language (e.g., in the French version, only job offers written in French are displayed, and in the English version, only those in English).
Join Air Canada, an iconic Canadian brand and North America's top-ranked airline, as we lead the next wave of digital transformation in aviation. With major IT initiatives underway and a strong commitment to innovation, this is a unique opportunity to help shape the future of travel technology.

We're looking for a highly skilled and experienced Data Engineering Technical Analyst to drive the design, development, and delivery of scalable data solutions. This role involves leading technical assessments, mentoring data engineers, and actively developing and reviewing data pipelines to ensure robustness, scalability, and performance across enterprise data platforms. The ideal candidate will possess deep expertise in cloud-based data engineering, DevOps practices, and modern data architecture, with a strong focus on the Azure and Snowflake ecosystems.

As a champion of technical excellence, the Technical Analyst contributes to a culture of innovation, collaboration, and continuous improvement. You'll work closely with architects, developers, and product teams to enhance operational efficiency, ensure platform stability, and explore emerging technologies, including AI and data-driven insights, that help keep our airline systems modern, scalable, and customer-focused. Let your career take flight by joining our diverse and vibrant team at the forefront of global aviation, redefining the digital passenger experience.
Job Responsibility:
Design & Implementation: Manage the implementation of solutions for exploratory data analysis and conduct implementation assessments based on Architecture Overview Documents (AOD) and Detailed Data Solutions
Provide accurate effort estimations and assess feasibility of proposed architectures and data models
Design and conduct data analyses to determine patterns and insights that can optimize data ingestion and data presentation
Pipeline Design & Development: Design and develop cost-effective, scalable ELT pipelines using reusable components and frameworks
Support the delivery of AI and data projects by designing and implementing streaming and/or batch pipelines
Design and document pipeline implementation strategies, including job and workflow scheduling, dependencies, error handling, monitoring, alerting, and unit testing
Monitor and resolve alerts/failures in prod/non-prod environments to ensure operational reliability
Participate in peer reviews and oversee the review and approval of code and pipelines
Environment Setup & Collaboration: Set up non-production and production environments, including MFT flows, RBAC, cloud services, and Git repositories, and define appropriate branching strategies
Design, test, and maintain scalable data architectures, including databases and distributed processing systems
Collaborate with stakeholders, including product owner(s), data science, and digital teams, to assist with data-related technical issues and to support data infrastructure needs and production release strategies
Technical Leadership & Mentorship: Mentor and guide data engineers on best practices, business logic implementation, and quality assurance
Identify and document technical debt
Differentiate between defects and scope changes, and communicate them effectively to the product team
Educate the organization, from both the IT (including vendors) and business perspectives, on data ingestion and data streaming tools, processes, and best practices
Establish strong relationships with vendor partners to ensure reliable delivery, innovation, and ongoing improvement in the high-value services received
Provide technology and services ownership direction to the associated functional lead and peers on all matters related to a key functional area
DevOps & Data Modeling: Lead deployment and transition-to-operations (TTO) activities for UAT and production
Apply strong understanding of DevOps processes, Star Schema, Data Vault, and data warehousing principles
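To give candidates a concrete sense of the pipeline concerns listed above (dependencies, error handling, retries, and alerting), here is a minimal, hypothetical orchestration sketch in plain Python. The names (`run_step`, `run_pipeline`) and the retry/alert behavior are illustrative assumptions only; in practice these responsibilities would live in tools such as Azure Data Factory or Databricks workflows rather than hand-rolled code.

```python
import logging
import time
from typing import Callable, List, Tuple

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")


def run_step(step: Callable[[], None], name: str,
             retries: int = 2, delay_s: float = 0.0,
             alert: Callable[[str], None] = log.error) -> bool:
    """Run one pipeline step, retrying on failure; fire an alert if all attempts fail."""
    for attempt in range(1, retries + 2):  # initial try + `retries` retries
        try:
            step()
            log.info("step %s succeeded on attempt %d", name, attempt)
            return True
        except Exception as exc:
            log.warning("step %s failed on attempt %d: %s", name, attempt, exc)
            time.sleep(delay_s)
    alert(f"step {name} exhausted retries")  # monitoring/alerting hook
    return False


def run_pipeline(steps: List[Tuple[str, Callable[[], None]]],
                 alert: Callable[[str], None] = log.error) -> bool:
    """Run steps in dependency order; stop at the first unrecoverable failure."""
    for name, step in steps:
        if not run_step(step, name, alert=alert):
            return False
    return True


if __name__ == "__main__":
    # Placeholder extract/transform/load steps for illustration only.
    run_pipeline([
        ("extract", lambda: None),
        ("transform", lambda: None),
        ("load", lambda: None),
    ])
```

The sketch keeps scheduling out of scope (a real deployment would delegate that to the platform's trigger/scheduler) and shows only the retry, failure-propagation, and alerting pattern the role is expected to design and document.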
Requirements:
3-5 years of experience leading enterprise data warehouse development teams
Proven success in Agile environments and cloud-based data platforms, especially Azure and Snowflake
Expertise in building robust, scalable data pipelines for batch and streaming data
Proficiency in SQL, Python, stored procedures, and scheduling tools
Hands-on experience with ETL/ELT tools such as Azure Data Factory (ADF), Databricks, Snowflake, dbt, and Talend
Skilled in implementing monitoring and alerting mechanisms for data pipelines
Strong capability in reviewing engineering deliverables for performance, scalability, and maintainability
Experience with prompt engineering and leveraging Generative AI (GenAI) to accelerate development and automate engineering workflows
Bachelor’s degree in Engineering, Computer Science, Mathematics, or a related field
Excellent communication, problem-solving, and analytical skills
Proven leadership and mentoring capabilities
Strong collaboration skills with cross-functional teams
Demonstrate punctuality and dependability to support overall team success in a fast-paced environment
Candidates must be eligible to work in the country of interest at the time any offer of employment is made and are responsible for obtaining any required work permits, visas, or other authorizations necessary for employment
Prior to their start date, candidates will also need to provide proof of their eligibility to work in the country of interest
Nice to have:
Based on equal qualifications, preference will be given to bilingual candidates