Senior AI Data Engineer

Citi

Location:
Canada, Mississauga

Contract Type:
Not provided

Salary:

120,800.00 - 170,800.00 USD / Year

Job Description:

The Senior AI Data Engineer (Applications Development Technology Lead Analyst - C13) is a senior-level position responsible for designing, developing, and deploying AI agents capable of understanding goals, planning actions, and executing tasks with minimal human intervention. This role requires a strong understanding of AI principles, agent-based systems, machine learning, and software engineering best practices. The ideal candidate will be able to translate research ideas into robust and scalable production systems. In coordination with the Technology team, this role will also be responsible for establishing and implementing new or revised application systems and programs. The overall objective of this role is to lead applications systems analysis and programming activities.

Job Responsibility:

  • Design and implement intelligent agents, including their perception, reasoning, planning, and action execution modules
  • Develop scalable and robust architectures for agentic systems, ensuring high performance, reliability, and security
  • Integrate various machine learning models (e.g., LLMs, reinforcement learning, predictive models) to enhance agent capabilities and decision-making
  • Develop agents that can automate complex tasks, optimize workflows, and solve real-world problems across various domains
  • Utilize and contribute to agentic AI frameworks and development tools
  • Design and implement metrics and evaluation strategies for agent performance, continuously optimizing and improving agent behavior
  • Stay abreast of the latest advancements in AI, particularly in agent-based systems, autonomous AI, and related fields, and propose innovative solutions
  • Work closely with cross-functional teams including AI researchers, data scientists, product managers, and software engineers to integrate agentic solutions into broader products and services
  • Create comprehensive technical documentation for agent designs, implementations, and operational procedures
  • Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency

Requirements:

  • 6+ years of professional experience in software development with a focus on AI, machine learning, or agent-based systems
  • Strong proficiency in Python and SQL
  • Java is a plus
  • Solid understanding of core AI concepts, including knowledge representation, automated planning, decision-making under uncertainty, and multi-agent systems
  • Experience with machine learning frameworks (e.g., TensorFlow, PyTorch) and relevant libraries (e.g., Scikit-Learn, NumPy, Pandas)
  • Familiarity with large language models (LLMs) and their application in agentic systems
  • Familiarity with specific agent frameworks (e.g., LangChain, AutoGen, CrewAI) and patterns such as retrieval-augmented generation (RAG), or research in multi-agent reinforcement learning
  • Experience in designing and implementing APIs for AI services
  • Experience with software development best practices, including version control (Git), CI/CD pipelines, testing, and code reviews
  • Excellent analytical and problem-solving skills with a creative approach to complex challenges
  • Strong written and verbal communication skills, with the ability to articulate complex technical concepts to diverse audiences
  • Experience with cloud platforms (AWS, Azure, GCP) and containerization technologies (Docker, Kubernetes)
  • Bachelor's degree in computer science, artificial intelligence, robotics, or a related quantitative field, or equivalent experience
  • Master's degree preferred

Nice to have:

Experience in the finance industry is a plus

Additional Information:

Job Posted:
December 28, 2025

Employment Type:
Full-time
Work Type:
Hybrid work

Similar Jobs for Senior AI Data Engineer

Senior AI Engineer

As part of Schwab’s AI Strategy & Transformation team (AI.x), you’ll join the ce...
Location:
United States, San Francisco
Salary:
211,600.00 - 317,400.00 USD / Year
Charles Schwab
Expiration Date
January 20, 2026
Requirements:
  • 8+ years of software development experience
  • 4+ years as a hands-on senior engineer in startups and/or large organizations
  • Bachelor’s degree in Computer Science or related field
  • 5+ years building complex products from scratch, running them in production, and ensuring operational reliability
  • 3+ years building applications leveraging AI models for measurable business impact
  • 3+ years developing applications and data pipelines interfacing with large datasets
  • 3+ years working with containers and cloud-native applications, operationalizing them in the public cloud with infrastructure as code
Job Responsibility:
  • Design, build, and deliver GenAI applications that elevate client experience and generate business impact
  • Champion reliability, monitoring, observability, and operational best practices for AI systems and data pipelines
  • Collaborate with cross-functional teams to align solutions with enterprise strategy and technical standards
  • Mentor and coach junior engineers, fostering strong practices and continuous learning
  • Lead by example in solving complex technical challenges and driving rapid iteration from concept to deployment
  • Implement and maintain monitoring, alerting, and incident response frameworks to ensure system health and reliability
  • Advance engineering standards, focusing on operational excellence and quality across all deliverables
What we offer:
  • 401(k) with company match
  • Employee stock purchase plan
  • Paid time for vacation, volunteering
  • 28-day sabbatical after every 5 years of service for eligible positions
  • Paid parental leave
  • Family building benefits
  • Tuition reimbursement
  • Health, dental, and vision insurance
  • Bonus or incentive opportunities
Employment Type: Full-time
Senior Data & AI Innovation Engineer

We are seeking a highly proactive, self-driven Senior Data & AI Engineer to serv...
Location:
Singapore, Singapore
Salary:
7,000.00 - 8,000.00 SGD / Month
Randstad
Expiration Date
January 08, 2026
Requirements:
  • Proven, hands-on experience in implementing and supporting practical AI use cases (beyond academic study), understanding how to embed AI components into existing services
  • 4+ years of hands-on experience in implementing and operating Snowflake Data Cloud in a production environment
  • Certification (e.g., SnowPro Data Engineer) is highly desirable
  • Familiarity with MLOps concepts and tools (e.g., Docker, MLflow, LangChain) and an understanding of LLMs, RAG pipelines, and generative AI deployment
  • Strong programming skills in Python for data manipulation, scripting, and AI model support
Job Responsibility:
  • Proactively identify, design, and implement initial AI Proof-of-Concepts (POCs) across the APAC region, focusing on quick-win solutions like AI-powered chatbots and intelligent inventory monitoring systems
  • Analyze business processes to identify areas where AI components can be effectively embedded to solve immediate business challenges
  • Partner with business stakeholders to understand AI data needs, perform data engineering/prep, and ensure data readiness to support and sustain deployed AI models
  • Stay ahead of technology trends, perform proactive research on Data and AI solutions, and evangelize new capabilities to regional teams
  • Act as the APAC SME, collaborating closely with cross-regional peers and global teams to contribute to and align with the company Global Data Platform roadmap (Snowflake)
  • Define and execute the complete migration strategy from legacy data warehouses/databases (e.g., PostgreSQL, MS SQL) to the Snowflake Data Cloud platform
  • Design, build, and optimize scalable, robust ETL/ELT data pipelines to curate raw data into BI and Advanced Analytics datasets
  • Implement and manage Snowflake governance, including access control, data security, usage monitoring, and performance optimization aligned with global best practices
Senior ML Data Engineer

As a Senior Data Engineer, you will play a pivotal role in our AI/ML workstream,...
Location:
Not provided
Salary:
Not provided
Awin Global
Expiration Date
Until further notice
Requirements:
  • Bachelor's or Master's degree in Data Science, Data Engineering, or Computer Science with a focus on math and statistics; Master's degree preferred
  • At least 5 years of experience as an AI/ML data engineer undertaking the above tasks and accountabilities
  • Strong foundation in computer science principles and statistical methods
  • Strong experience with cloud technology (AWS or Azure)
  • Strong experience creating data ingestion pipelines and ETL processes
  • Strong knowledge of big data tools such as Spark, Databricks, and Python
  • Strong understanding of common machine learning techniques and frameworks (e.g. MLflow)
  • Strong knowledge of natural language processing (NLP) concepts
  • Strong knowledge of Scrum practices and an agile mindset
Job Responsibility:
  • Design and maintain scalable data pipelines and storage systems for both agentic and traditional ML workloads
  • Productionise LLM- and agent-based workflows, ensuring reliability, observability, and performance
  • Build and maintain feature stores, vector/embedding stores, and core data assets for ML
  • Develop and manage end-to-end traditional ML pipelines: data prep, training, validation, deployment, and monitoring
  • Implement data quality checks, drift detection, and automated retraining processes
  • Optimise cost, latency, and performance across all AI/ML infrastructure
  • Collaborate with data scientists and engineers to deliver production-ready ML and AI systems
  • Ensure AI/ML systems meet governance, security, and compliance requirements
  • Mentor teams and drive innovation across both agentic and classical ML engineering practices
  • Participate in team meetings and contribute to project planning and strategy discussions
What we offer:
  • Flexi-Week and Work-Life Balance: We prioritise your mental health and well-being, offering you a flexible four-day Flexi-Week at full pay and with no reduction to your annual holiday allowance. We also offer a variety of different paid special leaves as well as volunteer days
  • Remote Working Allowance: You will receive a monthly allowance to cover part of your running costs. In addition, we will support you in setting up your remote workspace appropriately
  • Pension: Awin offers access to an additional pension insurance to all employees in Germany
  • Flexi-Office: We offer an international culture and flexibility through our Flexi-Office and hybrid/remote work possibilities to work across Awin regions
  • Development: We’ve built our extensive training suite Awin Academy to cover a wide range of skills that nurture you professionally and personally, with trainings conveniently packaged together to support your overall development
  • Appreciation: Thank and reward colleagues by sending them a voucher through our peer-to-peer program
Senior ML Data Engineer

As a Senior Data Engineer, you will play a pivotal role in our AI/ML workstream,...
Location:
Poland, Warsaw
Salary:
Not provided
Awin Global
Expiration Date
Until further notice
Requirements:
  • Bachelor's or Master's degree in Data Science, Data Engineering, or Computer Science with a focus on math and statistics; Master's degree preferred
  • At least 5 years of experience as an AI/ML data engineer undertaking the above tasks and accountabilities
  • Strong foundation in computer science principles and statistical methods
  • Strong experience with cloud technology (AWS or Azure)
  • Strong experience creating data ingestion pipelines and ETL processes
  • Strong knowledge of big data tools such as Spark, Databricks, and Python
  • Strong understanding of common machine learning techniques and frameworks (e.g. MLflow)
  • Strong knowledge of natural language processing (NLP) concepts
  • Strong knowledge of Scrum practices and an agile mindset
  • Strong analytical and problem-solving skills with attention to data quality and accuracy
Job Responsibility:
  • Design and maintain scalable data pipelines and storage systems for both agentic and traditional ML workloads
  • Productionise LLM- and agent-based workflows, ensuring reliability, observability, and performance
  • Build and maintain feature stores, vector/embedding stores, and core data assets for ML
  • Develop and manage end-to-end traditional ML pipelines: data prep, training, validation, deployment, and monitoring
  • Implement data quality checks, drift detection, and automated retraining processes
  • Optimise cost, latency, and performance across all AI/ML infrastructure
  • Collaborate with data scientists and engineers to deliver production-ready ML and AI systems
  • Ensure AI/ML systems meet governance, security, and compliance requirements
  • Mentor teams and drive innovation across both agentic and classical ML engineering practices
  • Participate in team meetings and contribute to project planning and strategy discussions
What we offer:
  • Flexi-Week and Work-Life Balance: We prioritise your mental health and well-being, offering you a flexible four-day Flexi-Week at full pay and with no reduction to your annual holiday allowance. We also offer a variety of different paid special leaves as well as volunteer days
  • Remote Working Allowance: You will receive a monthly allowance to cover part of your running costs. In addition, we will support you in setting up your remote workspace appropriately
  • Pension: Awin offers access to an additional pension insurance to all employees in Germany
  • Flexi-Office: We offer an international culture and flexibility through our Flexi-Office and hybrid/remote work possibilities to work across Awin regions
  • Development: We’ve built our extensive training suite Awin Academy to cover a wide range of skills that nurture you professionally and personally, with trainings conveniently packaged together to support your overall development
  • Appreciation: Thank and reward colleagues by sending them a voucher through our peer-to-peer program

Senior Data Engineer

At ANS, the Senior Data Engineer plays a key role in delivering robust, scalable...
Location:
United Kingdom, Manchester
Salary:
Not provided
ANS Group
Expiration Date
Until further notice
Requirements:
  • Experience in building and optimising pipelines in Azure Data Factory, Synapse, or Fabric
  • Strong knowledge of Python and SQL
  • Experience in using metadata frameworks in data engineering
  • Experience in best-practice data engineering principles, including CI/CD via Azure DevOps or GitHub
  • Understanding of Azure networking and security in relation to the data platform
  • Experience with data governance and regulation, including GDPR, the principle of least privilege, classification, etc.
  • Experience of lakehouse architecture, data warehousing principles, and data modelling
  • Familiarity with Microsoft Purview in a data platform context
  • Basic knowledge of Azure AI Foundry
Job Responsibility:
  • Build and optimise data pipelines, notebooks, and data flows in Microsoft Fabric and Synapse Analytics, connecting to a variety of on-premises and cloud based data sources
  • Support Data Architects and Cloud Engineers by implementing solutions based on provided designs and offering feedback where needed
  • Collaborate across disciplines to ensure high-quality delivery of data solutions, including working with presales, managed services, and customer teams
  • Mentor Data engineers and support their development through guidance and task distribution
  • Ensure best-practice adherence in engineering processes, including CI/CD via Azure DevOps and secure data handling (e.g. Azure Key Vault, private endpoints)
  • Contribute to Agile delivery by participating in standups, user story creation, and sprint planning
  • Document implemented solutions clearly and accurately for internal and customer use
  • Troubleshoot and resolve issues across subscriptions and environments
  • Work closely with the Project Manager (where applicable) to align on delivery timelines, report progress, and manage risks, while also acting as a key point of contact for customer SMEs and engineers to support collaboration and clarify technical requirements
  • Engage in continuous learning through certifications (e.g. DP-600 and/or DP-700, AI-900, AI-102, etc.) and development days
What we offer:
  • 25 days’ holiday, plus you can buy up to 5 more days
  • Birthday off
  • An extra celebration day
  • 5 days’ additional holiday in the year you get married
  • 5 volunteer days
  • Private health insurance
  • Pension contribution match and 4 x life assurance
  • Flexible working and work from anywhere for up to 30 days per year (some exceptions)
  • Maternity: 16 weeks’ full pay, Paternity: 3 weeks’ full pay, Adoption: 16 weeks’ full pay
  • Company social events
Employment Type: Full-time

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location:
United States, Lisle
Salary:
84,835.61 - 149,076.17 USD / Year
Adtalem Global Education
Expiration Date
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field
  • Master's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field
  • Two (2)+ years of experience in Google Cloud with services such as BigQuery, Composer, GCS, Datastream, Dataflow, BQML, and Vertex AI
  • Six (6)+ years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics
  • Hands-on experience working with real-time, unstructured, and synthetic data
  • Experience in real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar
  • Expert knowledge of Python programming and SQL
  • Experience with cloud platforms (AWS, GCP, Azure) and their data services
  • Experience working with Airflow as a workflow management tool, building operators to connect, extract, and ingest data as needed
  • Familiarity with synthetic data generation and unstructured data processing
Job Responsibility:
  • Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets
  • Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth.
  • Design, develop, deploy and support high performance data pipelines both inbound and outbound.
  • Model data platform by applying the business logic and building objects in the semantic layer of the data platform.
  • Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics
  • Optimize data pipelines for performance, scalability, and reliability.
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products.
  • Ensure the quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root
  • Document the design and support strategy of the data pipelines
  • Capture, store and socialize data lineage and operational metadata
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Eligible to participate in an annual incentive program
Employment Type: Full-time

Senior Data Engineer

Senior Data Engineer – Dublin (Hybrid) Contract Role | 3 Days Onsite. We are see...
Location:
Ireland, Dublin
Salary:
Not provided
Solas IT Recruitment
Expiration Date
Until further notice
Requirements:
  • 7+ years of experience as a Data Engineer working with distributed data systems
  • 4+ years of deep Snowflake experience, including performance tuning, SQL optimization, and data modelling
  • Strong hands-on experience with the Hadoop ecosystem: HDFS, Hive, Impala, Spark (PySpark preferred)
  • Oozie, Airflow, or similar orchestration tools
  • Proven expertise with PySpark, Spark SQL, and large-scale data processing patterns
  • Experience with Databricks and Delta Lake (or equivalent big-data platforms)
  • Strong programming background in Python, Scala, or Java
  • Experience with cloud services (AWS preferred): S3, Glue, EMR, Redshift, Lambda, Athena, etc.
Job Responsibility:
  • Build, enhance, and maintain large-scale ETL/ELT pipelines using Hadoop ecosystem tools including HDFS, Hive, Impala, and Oozie/Airflow
  • Develop distributed data processing solutions with PySpark, Spark SQL, Scala, or Python to support complex data transformations
  • Implement scalable and secure data ingestion frameworks to support both batch and streaming workloads
  • Work hands-on with Snowflake to design performant data models, optimize queries, and establish solid data governance practices
  • Collaborate on the migration and modernization of current big-data workloads to cloud-native platforms and Databricks
  • Tune Hadoop, Spark, and Snowflake systems for performance, storage efficiency, and reliability
  • Apply best practices in data modelling, partitioning strategies, and job orchestration for large datasets
  • Integrate metadata management, lineage tracking, and governance standards across the platform
  • Build automated validation frameworks to ensure accuracy, completeness, and reliability of data pipelines
  • Develop unit, integration, and end-to-end testing for ETL workflows using Python, Spark, and dbt testing where applicable

Senior Software Engineer, Data Platform

We are looking for a foundational member of the Data Team to enable Skydio to ma...
Location:
United States, San Mateo
Salary:
180,000.00 - 240,000.00 USD / Year
Skydio
Expiration Date
Until further notice
Requirements:
  • 5+ years of professional experience
  • 2+ years in software engineering
  • 2+ years in data engineering with a bias towards getting your hands dirty
  • Deep experience with Databricks building pipelines, managing datasets, and developing dashboards or analytical applications
  • Proven track record of operating scalable data platforms, defining company-wide patterns that ensure reliability, performance, and cost effectiveness
  • Proficiency in SQL and at least one modern programming language (we use Python)
  • Comfort working across the full data stack — from ingestion and transformation to orchestration and visualization
  • Strong communication skills, with the ability to collaborate effectively across all levels and functions
  • Demonstrated ability to lead technical direction, mentor teammates, and promote engineering excellence and best practices across the organization
  • Familiarity with AI-assisted data workflows, including tools that accelerate data transformations or enable natural-language interfaces for analytics
Job Responsibility:
  • Design and scale the data infrastructure that ingests live telemetry from tens of thousands of autonomous drones
  • Build and evolve our Databricks and Palantir Foundry environments to empower every Skydian to query data, define jobs, and build dashboards
  • Develop data systems that make our products truly data-driven — from predictive analytics that anticipate hardware failures, to 3D connectivity mapping, to in-depth flight telemetry analysis
  • Create and integrate AI-powered tools for data analysis, transformation, and pipeline generation
  • Champion a data-driven culture by defining and enforcing best practices for data quality, lineage, and governance
  • Collaborate with autonomy, manufacturing, and operations teams to unify how data flows across the company
  • Lead and mentor data engineers, analysts, and stakeholders across Skydio
  • Ensure platform reliability by implementing robust monitoring, observability, and contributing to the on-call rotation for critical data systems
What we offer:
  • Equity in the form of stock options
  • Comprehensive benefits packages
  • Relocation assistance may also be provided for eligible roles
  • Paid vacation time
  • Sick leave
  • Holiday pay
  • 401K savings plan
Employment Type: Full-time