Senior AI Data Engineer

Citi (https://www.citi.com/)

Location:
Canada, Mississauga

Contract Type:
Not provided

Salary:
120800.00 - 170800.00 USD / Year

Job Description:

The Senior AI Data Engineer (Applications Development Technology Lead Analyst - C13) is a senior-level position responsible for designing, developing, and deploying AI agents capable of understanding goals, planning actions, and executing tasks with minimal human intervention. This role requires a strong understanding of AI principles, agent-based systems, machine learning, and software engineering best practices. The ideal candidate will be able to translate research ideas into robust and scalable production systems. In coordination with the Technology team, this role will also be responsible for establishing and implementing new or revised application systems and programs. The overall objective of this role is to lead applications systems analysis and programming activities.

Job Responsibility:

  • Design and implement intelligent agents, including their perception, reasoning, planning, and action execution modules
  • Develop scalable and robust architectures for agentic systems
  • Integrate various machine learning models to enhance agent capabilities and decision-making
  • Develop agents that can automate complex tasks, optimize workflows, and solve real-world problems
  • Utilize and contribute to agentic AI frameworks and development tools
  • Design and implement metrics and evaluation strategies for agent performance
  • Stay abreast of the latest advancements in AI
  • Work closely with cross-functional teams
  • Create comprehensive technical documentation
  • Appropriately assess risk when business decisions are made

Requirements:

  • 6+ years of professional experience in software development with a focus on AI, machine learning, or agent-based systems
  • Strong proficiency in Python and SQL
  • Solid understanding of core AI concepts, including knowledge representation, automated planning, decision-making under uncertainty, and multi-agent systems
  • Experience with machine learning frameworks (e.g., TensorFlow, PyTorch) and relevant libraries (e.g., Scikit-Learn, NumPy, Pandas)
  • Familiarity with large language models (LLMs) and their application in agentic systems
  • Familiarity with agent frameworks (e.g., LangChain, AutoGen, CrewAI), retrieval-augmented generation (RAG), or research in multi-agent reinforcement learning
  • Experience in designing and implementing APIs for AI services
  • Experience with software development best practices, including version control (Git), CI/CD pipelines, testing, and code reviews
  • Excellent analytical and problem-solving skills
  • Strong written and verbal communication skills
  • Experience with cloud platforms (AWS, Azure, GCP) and containerization technologies (Docker, Kubernetes)
  • Bachelor's degree in computer science, artificial intelligence, robotics, or a related quantitative field, or equivalent experience

Nice to have:

  • Experience in the finance industry is a plus
  • Java is a plus
  • Master's degree preferred

Additional Information:

Job Posted:
January 22, 2026

Employment Type:
Full-time
Work Type:
Hybrid work

Similar Jobs for Senior AI Data Engineer

Senior ML Data Engineer

As a Senior Data Engineer, you will play a pivotal role in our AI/ML workstream,...
Location:

Salary:
Not provided
Awin Global (awin.com)
Expiration Date:
Until further notice
Requirements:
  • Bachelor's or Master's degree in data science, data engineering, or computer science with a focus on math and statistics; a Master's degree is preferred
  • At least 5 years of experience as an AI/ML data engineer carrying out the tasks and accountabilities described above
  • Strong foundation in computer science principles and statistical methods
  • Strong experience with cloud technology (AWS or Azure)
  • Strong experience with building data ingestion pipelines and ETL processes
  • Strong knowledge of big data tools such as Spark, Databricks, and Python
  • Strong understanding of common machine learning techniques and frameworks (e.g., MLflow)
  • Strong knowledge of natural language processing (NLP) concepts
  • Strong knowledge of Scrum practices and an agile mindset
Job Responsibility:
  • Design and maintain scalable data pipelines and storage systems for both agentic and traditional ML workloads
  • Productionise LLM- and agent-based workflows, ensuring reliability, observability, and performance
  • Build and maintain feature stores, vector/embedding stores, and core data assets for ML
  • Develop and manage end-to-end traditional ML pipelines: data prep, training, validation, deployment, and monitoring
  • Implement data quality checks, drift detection, and automated retraining processes
  • Optimise cost, latency, and performance across all AI/ML infrastructure
  • Collaborate with data scientists and engineers to deliver production-ready ML and AI systems
  • Ensure AI/ML systems meet governance, security, and compliance requirements
  • Mentor teams and drive innovation across both agentic and classical ML engineering practices
  • Participate in team meetings and contribute to project planning and strategy discussions
What we offer:
  • Flexi-Week and Work-Life Balance: We prioritise your mental health and well-being, offering you a flexible four-day Flexi-Week at full pay and with no reduction to your annual holiday allowance. We also offer a variety of different paid special leaves as well as volunteer days
  • Remote Working Allowance: You will receive a monthly allowance to cover part of your running costs. In addition, we will support you in setting up your remote workspace appropriately
  • Pension: Awin offers access to an additional pension insurance to all employees in Germany
  • Flexi-Office: We offer an international culture and flexibility through our Flexi-Office and hybrid/remote work possibilities to work across Awin regions
  • Development: We’ve built our extensive training suite Awin Academy to cover a wide range of skills that nurture you professionally and personally, with trainings conveniently packaged together to support your overall development
  • Appreciation: Thank and reward colleagues by sending them a voucher through our peer-to-peer program

Senior ML Data Engineer

As a Senior Data Engineer, you will play a pivotal role in our AI/ML workstream,...
Location:
Poland, Warsaw
Salary:
Not provided
Awin Global (awin.com)
Expiration Date:
Until further notice
Requirements:
  • Bachelor's or Master's degree in data science, data engineering, or computer science with a focus on math and statistics; a Master's degree is preferred
  • At least 5 years of experience as an AI/ML data engineer carrying out the tasks and accountabilities described above
  • Strong foundation in computer science principles and statistical methods
  • Strong experience with cloud technology (AWS or Azure)
  • Strong experience with building data ingestion pipelines and ETL processes
  • Strong knowledge of big data tools such as Spark, Databricks, and Python
  • Strong understanding of common machine learning techniques and frameworks (e.g., MLflow)
  • Strong knowledge of natural language processing (NLP) concepts
  • Strong knowledge of Scrum practices and an agile mindset
  • Strong analytical and problem-solving skills with attention to data quality and accuracy
Job Responsibility:
  • Design and maintain scalable data pipelines and storage systems for both agentic and traditional ML workloads
  • Productionise LLM- and agent-based workflows, ensuring reliability, observability, and performance
  • Build and maintain feature stores, vector/embedding stores, and core data assets for ML
  • Develop and manage end-to-end traditional ML pipelines: data prep, training, validation, deployment, and monitoring
  • Implement data quality checks, drift detection, and automated retraining processes
  • Optimise cost, latency, and performance across all AI/ML infrastructure
  • Collaborate with data scientists and engineers to deliver production-ready ML and AI systems
  • Ensure AI/ML systems meet governance, security, and compliance requirements
  • Mentor teams and drive innovation across both agentic and classical ML engineering practices
  • Participate in team meetings and contribute to project planning and strategy discussions
What we offer:
  • Flexi-Week and Work-Life Balance: We prioritise your mental health and well-being, offering you a flexible four-day Flexi-Week at full pay and with no reduction to your annual holiday allowance. We also offer a variety of different paid special leaves as well as volunteer days
  • Remote Working Allowance: You will receive a monthly allowance to cover part of your running costs. In addition, we will support you in setting up your remote workspace appropriately
  • Pension: Awin offers access to an additional pension insurance to all employees in Germany
  • Flexi-Office: We offer an international culture and flexibility through our Flexi-Office and hybrid/remote work possibilities to work across Awin regions
  • Development: We’ve built our extensive training suite Awin Academy to cover a wide range of skills that nurture you professionally and personally, with trainings conveniently packaged together to support your overall development
  • Appreciation: Thank and reward colleagues by sending them a voucher through our peer-to-peer program

Senior Data Engineer

At ANS, the Senior Data Engineer plays a key role in delivering robust, scalable...
Location:
United Kingdom, Manchester
Salary:
Not provided
ANS Group (ans.co.uk)
Expiration Date:
Until further notice
Requirements:
  • Experience in building and optimising pipelines in Azure Data Factory, Synapse, or Fabric
  • Strong knowledge of Python and SQL
  • Experience in using metadata frameworks in data engineering
  • Experience with best-practice data engineering principles, including CI/CD via Azure DevOps or GitHub
  • Understanding of Azure networking and security in relation to the data platform
  • Experience of data governance and regulation, including GDPR, principle of least privilege, classification etc.
  • Experience of lakehouse architecture, data warehousing principles, and data modelling
  • Familiarity with Microsoft Purview in a data platform context
  • Basic knowledge of Azure Foundry
Job Responsibility:
  • Build and optimise data pipelines, notebooks, and data flows in Microsoft Fabric and Synapse Analytics, connecting to a variety of on-premises and cloud based data sources
  • Support Data Architects and Cloud Engineers by implementing solutions based on provided designs and offering feedback where needed
  • Collaborate across disciplines to ensure high-quality delivery of data solutions, including working with presales, managed services, and customer teams
  • Mentor Data engineers and support their development through guidance and task distribution
  • Ensure best practice adherence in engineering processes, including CI/CD via Azure DevOps and secure data handling (e.g. Key Vault, private endpoints)
  • Contribute to Agile delivery by participating in standups, user story creation, and sprint planning
  • Document implemented solutions clearly and accurately for internal and customer use
  • Troubleshoot and resolve issues across subscriptions and environments
  • Work closely with the Project Manager (where applicable) to align on delivery timelines, report progress, and manage risks, while also acting as a key point of contact for customer SMEs and engineers to support collaboration and clarify technical requirements
  • Engage in continuous learning through certifications (e.g. DP-600 and/or DP-700, AI-900, AI-102, etc.) and development days
What we offer:
  • 25 days’ holiday, plus you can buy up to 5 more days
  • Birthday off
  • An extra celebration day
  • 5 days’ additional holiday in the year you get married
  • 5 volunteer days
  • Private health insurance
  • Pension contribution match and 4 x life assurance
  • Flexible working and work from anywhere for up to 30 days per year (some exceptions)
  • Maternity: 16 weeks’ full pay, Paternity: 3 weeks’ full pay, Adoption: 16 weeks’ full pay
  • Company social events
  • Full-time

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location:
United States, Lisle
Salary:
84835.61 - 149076.17 USD / Year
Adtalem Global Education (adtalem.com)
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field.
  • Master's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field.
  • Two (2) plus years of experience in Google Cloud with services such as BigQuery, Composer, GCS, Datastream, Dataflow, BQML, and Vertex AI.
  • Six (6) plus years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics.
  • Hands-on experience working with real-time, unstructured, and synthetic data.
  • Experience in real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar.
  • Expert knowledge of Python programming and SQL.
  • Experience with cloud platforms (AWS, GCP, Azure) and their data services
  • Experience working with Airflow as a workflow management tool, including building operators to connect, extract, and ingest data as needed.
  • Familiarity with synthetic data generation and unstructured data processing
Job Responsibility:
  • Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets
  • Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth.
  • Design, develop, deploy and support high performance data pipelines both inbound and outbound.
  • Model the data platform by applying business logic and building objects in its semantic layer.
  • Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics
  • Optimize data pipelines for performance, scalability, and reliability.
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products.
  • Ensure quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root.
  • Document the design and support strategy of the data pipelines
  • Capture, store and socialize data lineage and operational metadata
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Eligible to participate in an annual incentive program
  • Full-time

Senior Data Engineer

Senior Data Engineer – Dublin (Hybrid) Contract Role | 3 Days Onsite. We are see...
Location:
Ireland, Dublin
Salary:
Not provided
Solas IT Recruitment (solasit.ie)
Expiration Date:
Until further notice
Requirements:
  • 7+ years of experience as a Data Engineer working with distributed data systems
  • 4+ years of deep Snowflake experience, including performance tuning, SQL optimization, and data modelling
  • Strong hands-on experience with the Hadoop ecosystem: HDFS, Hive, Impala, Spark (PySpark preferred)
  • Oozie, Airflow, or similar orchestration tools
  • Proven expertise with PySpark, Spark SQL, and large-scale data processing patterns
  • Experience with Databricks and Delta Lake (or equivalent big-data platforms)
  • Strong programming background in Python, Scala, or Java
  • Experience with cloud services (AWS preferred): S3, Glue, EMR, Redshift, Lambda, Athena, etc.
Job Responsibility:
  • Build, enhance, and maintain large-scale ETL/ELT pipelines using Hadoop ecosystem tools including HDFS, Hive, Impala, and Oozie/Airflow
  • Develop distributed data processing solutions with PySpark, Spark SQL, Scala, or Python to support complex data transformations
  • Implement scalable and secure data ingestion frameworks to support both batch and streaming workloads
  • Work hands-on with Snowflake to design performant data models, optimize queries, and establish solid data governance practices
  • Collaborate on the migration and modernization of current big-data workloads to cloud-native platforms and Databricks
  • Tune Hadoop, Spark, and Snowflake systems for performance, storage efficiency, and reliability
  • Apply best practices in data modelling, partitioning strategies, and job orchestration for large datasets
  • Integrate metadata management, lineage tracking, and governance standards across the platform
  • Build automated validation frameworks to ensure accuracy, completeness, and reliability of data pipelines
  • Develop unit, integration, and end-to-end testing for ETL workflows using Python, Spark, and dbt testing where applicable

Senior Software Engineer, Data Platform

We are looking for a foundational member of the Data Team to enable Skydio to ma...
Location:
United States, San Mateo
Salary:
180000.00 - 240000.00 USD / Year
Skydio (skydio.com)
Expiration Date:
Until further notice
Requirements:
  • 5+ years of professional experience
  • 2+ years in software engineering
  • 2+ years in data engineering with a bias towards getting your hands dirty
  • Deep experience with Databricks, including building pipelines, managing datasets, and developing dashboards or analytical applications
  • Proven track record of operating scalable data platforms, defining company-wide patterns that ensure reliability, performance, and cost effectiveness
  • Proficiency in SQL and at least one modern programming language (we use Python)
  • Comfort working across the full data stack — from ingestion and transformation to orchestration and visualization
  • Strong communication skills, with the ability to collaborate effectively across all levels and functions
  • Demonstrated ability to lead technical direction, mentor teammates, and promote engineering excellence and best practices across the organization
  • Familiarity with AI-assisted data workflows, including tools that accelerate data transformations or enable natural-language interfaces for analytics
Job Responsibility:
  • Design and scale the data infrastructure that ingests live telemetry from tens of thousands of autonomous drones
  • Build and evolve our Databricks and Palantir Foundry environments to empower every Skydian to query data, define jobs, and build dashboards
  • Develop data systems that make our products truly data-driven — from predictive analytics that anticipate hardware failures, to 3D connectivity mapping, to in-depth flight telemetry analysis
  • Create and integrate AI-powered tools for data analysis, transformation, and pipeline generation
  • Champion a data-driven culture by defining and enforcing best practices for data quality, lineage, and governance
  • Collaborate with autonomy, manufacturing, and operations teams to unify how data flows across the company
  • Lead and mentor data engineers, analysts, and stakeholders across Skydio
  • Ensure platform reliability by implementing robust monitoring, observability, and contributing to the on-call rotation for critical data systems
What we offer:
  • Equity in the form of stock options
  • Comprehensive benefits packages
  • Relocation assistance may also be provided for eligible roles
  • Paid vacation time
  • Sick leave
  • Holiday pay
  • 401K savings plan
  • Full-time

Senior Data Engineer

We are looking for a foundational member of the Data Team to enable Skydio to ma...
Location:
United States, San Mateo
Salary:
170000.00 - 230000.00 USD / Year
Skydio (skydio.com)
Expiration Date:
Until further notice
Requirements:
  • 5+ years of professional experience
  • 2+ years in software engineering
  • 2+ years in data engineering with a bias towards getting your hands dirty
  • Deep experience with Databricks or Palantir Foundry, including building pipelines, managing datasets, and developing dashboards or analytical applications
  • Proven track record of operating scalable data platforms, defining company-wide patterns that ensure reliability, performance, and cost effectiveness
  • Proficiency in SQL and at least one modern programming language (for example, Python or Java)
  • Strong communication skills, with the ability to collaborate effectively across all levels and functions
  • Demonstrated ability to lead technical direction, mentor teammates, and promote engineering excellence and best practices across the organization
  • Familiarity with AI-assisted data workflows, including tools that accelerate data transformations or enable natural-language interfaces for analytics
Job Responsibility:
  • Design and scale the data infrastructure that ingests live telemetry from tens of thousands of autonomous drones
  • Build and evolve our Databricks and Palantir Foundry environments
  • Develop data systems that make our products truly data-driven
  • Create and integrate AI-powered tools for data analysis, transformation, and pipeline generation
  • Champion a data-driven culture by defining and enforcing best practices for data quality, lineage, and governance
  • Collaborate with autonomy, manufacturing, and operations teams to unify how data flows across the company
  • Lead and mentor data engineers, analysts, and stakeholders across Skydio
  • Ensure platform reliability by implementing robust monitoring, observability, and contributing to the on-call rotation for critical data systems
What we offer:
  • Equity in the form of stock options
  • Comprehensive benefits packages
  • Relocation assistance may also be provided for eligible roles
  • Group health insurance plans
  • Paid vacation time
  • Sick leave
  • Holiday pay
  • 401K savings plan
  • Full-time

Senior Data Engineer

Radix is building the most trusted data and analytics platform in multifamily. J...
Location:
United States, Scottsdale
Salary:
Not provided
Radix (AZ) (radix.com)
Expiration Date:
Until further notice
Requirements:
  • 5+ years of experience in data engineering or backend systems, with 2+ years collaborating closely with analytical or scientific practitioners
  • Experience designing at least one end-to-end data application (visualization, model pipeline, etc.) for production use
  • Strong understanding of data modeling, batch processing, and streaming systems
  • Hands-on experience with SQL/NoSQL databases, cloud storage, file-based datasets, and Infrastructure-as-code
  • Experience building or operating data pipelines on AWS cloud services (Lambda, S3, RDS, ECS)
  • Understanding of AI/LLM integration and prompt engineering fundamentals
  • Proficiency with Git/GitHub, and familiarity with Spark and Kubernetes
  • Strong problem-solving skills, ownership, and ability to identify root causes of technical debt and recurring problems
  • Demonstrated ability to translate Product/Science requirements into technical plans and hold teams accountable to delivery timelines
  • Undergraduate degree in computer science, computer engineering, software engineering, or equivalent
Job Responsibility:
  • Build scalable ETL/ELT pipelines for ingesting structured and unstructured data (Excel, JSON, PDFs, APIs)
  • Design and maintain data pipelines using SQL, Python, Node.js, or TypeScript
  • Work with distributed compute systems (Spark, Kubernetes), message queues, and streaming data
  • Manage and optimize data storage in MongoDB, PostgreSQL, Redis, Snowflake, and S3
  • Develop clean, standardized data schemas and event-driven transformations
  • Integrate AI-assisted parsers, mappers, and LLM-supported transformations
  • Collaborate with backend, analytics, product, and AI teams to break down requirements into well-defined engineering problems
  • Implement monitoring, data validation, and reliability checks (DQ rules, freshness, duplication)
  • Own production readiness, including on-call responsibilities and incident follow-ups
  • Mentor engineers, conduct thorough code reviews, and introduce patterns that raise the team's technical bar
What we offer:
  • Medical, dental and vision coverage designed to support your wellbeing
  • Pre-IPO Equity
  • Performance Bonus
  • Learn From the Best
  • Build Category-Defining Products