We are seeking a Principal‑level Data Architect with deep expertise in enterprise data architecture, AI enablement, and cloud‑native development paradigms. This role goes beyond hands‑on engineering: it requires a leader who can define platform strategy and security posture, influence senior clients, and guide teams across multiple concurrent programs.
Job Responsibilities:
Design and own complex, enterprise-scale data architectures across Microsoft Fabric, Azure, GCP, AWS, or Databricks, in serverless or hosted environments
Define and enforce architectural standards, patterns, and governance frameworks across ingestion, modeling, lineage, security, and orchestration
Shape AI‑enabled architecture approaches, including data foundations for ML, feature engineering, and low-latency operationalization pipelines
Act as a principal advisor to client technical leadership, helping shape long-term strategy, roadmaps, and modernization initiatives
Lead architectural direction during pre-sales cycles, including solutioning, scoping, estimation, and executive-level presentations
Anticipate downstream impacts of architectural decisions and maintain ownership when delivery teams or constraints require deviation from the original design
Architect highly available, distributed, fault‑tolerant data pipelines supporting batch and streaming workloads
Oversee migration and integration of complex, diverse data sources into Fabric, Azure, GCP, or Databricks platforms
Define medallion/lakehouse modeling patterns across Bronze/Silver/Gold zones or cloud equivalents
Set enterprise standards for ingestion → transformation → serving layers across multi-cloud environments
Optimize performance of large-scale data processing across Spark, Databricks, and Fabric-native engines
Provide leadership across 2–3 concurrent projects with variable allocation, ensuring architectural consistency and delivery quality while also contributing to in-depth technical work where needed
Develop advanced transformations, pipelines, and frameworks using SQL, Python, dbt, and Fabric/Databricks cloud-native tooling
Implement automation using REST APIs, Data Factory, orchestration services, and multi-cloud workflow tools
Guide engineers on distributed architecture, metadata design, data quality frameworks, and platform hygiene
Conduct root cause analysis on complex platform or performance issues, driving permanent and scalable remediation
Serve as the senior-most technical voice in architecture reviews, solution design, and cross-functional planning
Lead strategic conversations with director-level and executive stakeholders
Translate complex architecture into clear business implications for non-technical audiences
Represent architecture during pre-sales conversations, capability pitches, and proposal cycles
Requirements:
10+ years of experience in data engineering, data architecture, or platform engineering
Experience designing or building enterprise data platforms on Databricks and at least one of Azure, GCP, or AWS
Deep expertise in SQL, Python, distributed data processing, and cloud-native data design
Significant experience with medallion/lakehouse architecture patterns
Strong knowledge of modern data platforms: Databricks, Azure Synapse, Microsoft Fabric, Delta Lake, BigQuery, etc.
Proven experience leading architecture across large programs and multiple concurrent projects
Experience with enterprise automation and integration using REST APIs
Strong communication skills and ability to engage confidently with senior leadership and clients
Experience in pre-sales, technical solutioning, or client-facing architecture leadership
Nice to have:
Experience building AI‑enabled data architectures and ML data pipelines
Familiarity with Tableau, Power BI, and Fabric reporting tools
Demonstrated experience defining and operationalizing DevOps best practices
Experience with Spark, Hive, Impala, and other big data frameworks
Knowledge of multi-cloud architecture components (Azure, GCP, AWS)
Experience with pipeline automation tooling such as Airflow, Azure Data Factory, Prefect
Experience leading technical teams and managing data/analytics/ML projects
Familiarity with front- and back‑end application stacks and API design (REST/GraphQL)
What we offer:
A comprehensive insurance plan, where you can choose the module that best suits your needs—Gold, Silver, or Bronze. The employer may contribute up to 80% of your coverage depending on the selected module. This plan includes short- and long-term disability coverage
Dialogue via Sun Life provides virtual healthcare services, allowing you to consult with a healthcare professional for emergencies, prescription renewals, and more. You also have access to the Employee and Family Assistance Program, as well as a complete mental health support program
A $500 Personal Spending Account, which can be used for healthcare reimbursements, gym memberships, public transit passes, office supplies, or contributions to your RRSP through Valtech
A retirement plan where Valtech will match 100% of your RRSP contributions through a Deferred Profit Sharing Plan (DPSP), up to a maximum of 4%. You can start contributing to your RRSP immediately, and to the DPSP after 3 months. The DPSP vests after 24 months of service
Access to flexible vacation under Valtech's policy to support your work-life balance, with 5 days available during your probation period and a prorated amount for the remainder of the year
Personal Technology Reimbursement of $30/month for every employee, offered from day 1
We close during the winter holidays and offer flexible scheduling throughout the year, so you can enjoy those sunny Friday afternoons—provided your weekly hours are completed