Cloud Data Migration & AI Systems Engineer

Aquent

Location:
United States

Contract Type:
Not provided

Salary:
43.70 USD / Hour

Job Description:

As an Agentic Evaluation Specialist, you will work directly with researchers at a top-eight frontier large language model company to improve LLM agent performance, specifically for agentic storage management.

Job Responsibility:

  • Execute data migration workflows across cloud platforms (AWS, GCP, Azure)
  • Design data migration strategies, weighing trade-offs (online vs. offline, performance, scalability)
  • Create and maintain data pipelines and ETL/ELT workflows in distributed environments
  • Integrate and evaluate LLM/Generative AI components within workflows
  • Perform system testing and validation: execute test scenarios, identify edge cases and failure points, debug and analyze issues, and ensure data integrity, consistency, and performance across systems
  • Document findings, insights, and recommendations clearly for stakeholders
  • Collaborate with cross-functional teams across engineering, data, and product

Requirements:

  • 5+ years of relevant experience with cloud platforms (GCP, AWS, or Azure) and data/storage systems
  • Strong understanding of data migration processes and associated trade-offs
  • Experience with data pipelines, ETL/ELT workflows, or distributed systems
  • Hands-on exposure to LLMs / Generative AI (prompting, evaluation, or integration)
  • Experience in system testing, validation, or QA
  • Strong debugging and problem-solving skills
  • Excellent written communication skills

Nice to have:

  • Background in storage administration, SRE, or cloud infrastructure operations
  • Experience with MLOps, model evaluation frameworks, or AI testing workflows
  • Familiarity with Google Cloud Storage (GCS) or similar platforms
  • Experience working in Agile environments with cross-functional teams

Additional Information:

Job Posted:
May 04, 2026

Employment Type:
Full-time
Work Type:
Remote work

Similar Jobs for Cloud Data Migration & AI Systems Engineer

Principal Data Engineer

We are on the lookout for a Principal Data Engineer to help define and lead the ...
Location:
United Kingdom
Salary:
Not provided
Dotdigital
Expiration Date
Until further notice
Requirements
  • Extensive experience delivering Python-based projects in the data engineering space
  • Extensive experience working with SQL and NoSQL database technologies (e.g. SQL Server, MongoDB & Cassandra)
  • Proven experience with modern data warehousing and large-scale data processing tools (e.g. Snowflake, dbt, BigQuery, ClickHouse)
  • Hands-on experience with data orchestration tools like Airflow, Dagster, or Prefect
  • Experience using cloud environments (e.g. Azure, AWS, GCP) to process, store and surface large scale data
  • Experience using Kafka or similar event-based architectures (e.g. Pub/Sub, AWS SQS, Azure Event Hubs, AWS Kinesis)
  • Strong grasp of data architecture and data modelling principles for both OLAP and OLTP workloads
  • Competent across the wider software development lifecycle, including agile ways of working and continuous integration/deployment of data solutions
  • Experience as a Lead or Principal Engineer on large-scale data initiatives or product builds
  • Demonstrated ability to architect data systems and data structures for high volume, high throughput systems
Job Responsibility
  • Lead the design and implementation of scalable, secure and resilient data systems across streaming, batch and real-time use cases
  • Architect data pipelines, models, and storage solutions that power analytical and product use cases, using primarily Python and SQL via orchestration tooling that runs workloads in the cloud
  • Leverage AI to automate both data processing and engineering processes
  • Assure and drive best practices relating to data infrastructure, governance, security and observability
  • Work with technologists across multiple teams to deliver coherent features and data outcomes
  • Support the data team to help adopt data engineering principles
  • Identify, validate and promote new tools and technologies that improve the performance and stability of data services
What we offer
  • Parental leave
  • Medical benefits
  • Paid sick leave
  • Dotdigital day
  • Share reward
  • Wellbeing reward
  • Wellbeing Days
  • Loyalty reward

Data Migration Consultant

As a Data Migration Consultant you will help our clients with their digital tran...
Location:
Belgium, Flanders/Brussels
Salary:
Not provided
Sopra Steria
Expiration Date
Until further notice
Requirements
  • At least 3 years of experience as a Database Administrator, Data Engineer or Software Engineer
  • Affinity for data, data structures and data mapping
  • Knowledge and experience with data modeling and relational algebra
  • Good knowledge and experience with SQL
  • Knowledge of a programming language like Python is a plus
  • Practical experience with Microsoft cloud or comparable alternatives
  • Experience with using and implementing AI in your and the client's workflow is a plus
  • Work accurately and precisely
  • Analytical thinking, communication skills and a practical attitude
  • Proficient in English
Job Responsibility
  • Help clients with their digital transformation
  • Build data migration solutions
  • Ensure data from source systems is converted to fit the target system
  • Develop selections, conversion rules, controls
  • Ensure data migration is complete and correct
  • Ensure client can successfully deploy the new system filled with data
What we offer
  • Mobility options (including a company car)
  • Insurance coverage
  • Meal vouchers
  • Eco-cheques
  • Continuous learning opportunities through the Sopra Steria Academy
  • Team events

Senior Data Engineer

Senior Data Engineer – Dublin (Hybrid) Contract Role | 3 Days Onsite. We are see...
Location:
Ireland, Dublin
Salary:
Not provided
Solas IT Recruitment
Expiration Date
Until further notice
Requirements
  • 7+ years of experience as a Data Engineer working with distributed data systems
  • 4+ years of deep Snowflake experience, including performance tuning, SQL optimization, and data modelling
  • Strong hands-on experience with the Hadoop ecosystem: HDFS, Hive, Impala, Spark (PySpark preferred)
  • Experience with Oozie, Airflow, or similar orchestration tools
  • Proven expertise with PySpark, Spark SQL, and large-scale data processing patterns
  • Experience with Databricks and Delta Lake (or equivalent big-data platforms)
  • Strong programming background in Python, Scala, or Java
  • Experience with cloud services (AWS preferred): S3, Glue, EMR, Redshift, Lambda, Athena, etc.
Job Responsibility
  • Build, enhance, and maintain large-scale ETL/ELT pipelines using Hadoop ecosystem tools including HDFS, Hive, Impala, and Oozie/Airflow
  • Develop distributed data processing solutions with PySpark, Spark SQL, Scala, or Python to support complex data transformations
  • Implement scalable and secure data ingestion frameworks to support both batch and streaming workloads
  • Work hands-on with Snowflake to design performant data models, optimize queries, and establish solid data governance practices
  • Collaborate on the migration and modernization of current big-data workloads to cloud-native platforms and Databricks
  • Tune Hadoop, Spark, and Snowflake systems for performance, storage efficiency, and reliability
  • Apply best practices in data modelling, partitioning strategies, and job orchestration for large datasets
  • Integrate metadata management, lineage tracking, and governance standards across the platform
  • Build automated validation frameworks to ensure accuracy, completeness, and reliability of data pipelines
  • Develop unit, integration, and end-to-end testing for ETL workflows using Python, Spark, and dbt testing where applicable

AI Solution Engineer

This role exists to bring AI-powered automation into real-world use across our c...
Location:
Canada, Mississauga
Salary:
139500.00 - 150000.00 CAD / Year
PointClickCare
Expiration Date
Until further notice
Requirements
  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or equivalent practical experience
  • Experience building AI-powered workflows using Azure AI Foundry, Copilot Studio and/or Now Assist
  • 2–5 years of experience in software or cloud engineering
  • Exposure to AI agent orchestration or multi-agent systems
  • Hands-on experience with Microsoft Azure services (Azure AI, Azure Functions, Logic Apps, Cognitive Services)
  • Familiarity with retrieval-augmented generation (RAG) or vector database integration
  • Experience working with multiple LLMs (OpenAI GPT, Azure OpenAI, Gemini), prompt engineering, and AI-driven automation
  • Proficient in Python, C#, or JavaScript/TypeScript
  • 3-5 years' experience working in SaaS or enterprise environments
  • Familiar with CI/CD pipelines, Git, and cloud deployment practices
Job Responsibility
  • Build and support AI agents and intelligent workflows using Microsoft Azure tools such as Azure OpenAI, AI Foundry, and Copilot Studio
  • Design and implement AI-powered orchestration and automation for use cases such as configuration streamlining, onboarding automation, and data migration
  • Collaborate with cross-functional teams (integration, implementation, product, support) to deliver high-quality, scalable AI-driven solutions
  • Develop APIs, scripts, and tools to connect LLM-based agents with existing enterprise systems
  • Support testing, deployment, monitoring, and continuous improvement of AI workflows in production
  • Stay current with Microsoft’s AI platform roadmap and emerging industry trends
  • Contribute to the evolution of our internal AI delivery model and promote AI best practices across teams
What we offer
  • Benefits starting from Day 1
  • Retirement Plan Matching
  • Flexible Paid Time Off
  • Wellness Support Programs and Resources
  • Parental & Caregiver Leaves
  • Fertility & Adoption Support
  • Continuous Development Support Program
  • Employee Assistance Program
  • Allyship and Inclusion Communities
  • Employee Recognition

Forward Deployed Engineer - Data Migration & Data Consolidation Platforms

As a Forward Deployed Engineer (FDE) for Data Migration & Data Consolidation Pla...
Location:
United States
Salary:
Not provided
Rackspace
Expiration Date
Until further notice
Requirements
  • 7-10+ years of progressive experience in enterprise data engineering, data migration, or large-scale system integration roles within complex, multi-platform environments
  • 3-5+ years directly leading end-to-end data migration or multi-system consolidation programs for Global Enterprises and Industry Leaders, with full ownership of technical delivery and client outcomes
  • Demonstrated client-facing experience serving as a trusted technical advisor to C-level executives, enterprise architecture teams, and cross-functional business stakeholders
  • Proven industry depth in at least two of the following verticals: Healthcare, Financial Services, Manufacturing, Retail, Energy & Utilities, or Public Sector
  • Hands-on migration complexity: successfully delivered programs involving at least 3+ heterogeneous source systems, 100M+ records, complex master data harmonization, and multi-phase cutover execution
  • Advanced proficiency in Python and SQL with working experience in PySpark and TypeScript/JavaScript
  • Hands-on expertise with modern ETL/ELT and data integration platforms (Informatica, Talend, Matillion, Fivetran, AWS Glue, Azure Data Factory)
  • Proven ability to build scalable, version-controlled data pipelines with error handling, incremental loading, and Change Data Capture (CDC)
  • Strong working knowledge of at least one major cloud provider (AWS, Azure, or GCP), including core infrastructure, managed data services, and security configurations
  • Experience with enterprise data warehouse and lakehouse platforms (Snowflake, Databricks, BigQuery, Redshift, Synapse Analytics, Delta Lake)
Job Responsibility
  • Migration Execution & Cloud Architecture: Lead end-to-end delivery of enterprise data migrations from corporate systems (SAP, Oracle, Epic ERP) to target cloud data platforms, including the design of cloud landing zones, data governance frameworks, and system rationalization strategies. Establish migration compliance controls, automated rollback procedures, and operational readiness gates while owning full technical accountability for 12–18+ month migration roadmaps
  • Data Pipeline Engineering & Transformation: Build production-grade data connectors to SAP (RFC, IDoc, BAPI, OData), Oracle (AQ, GoldenGate, APIs), and SQL/non-relational sources. Develop ETL/ELT pipelines with LLM-enabled transformation logic, multi-layer validation and reconciliation frameworks, and optimized throughput for datasets scaling from tens of millions to billions of records with built-in CDC and incremental loading
  • Ontology Layer Development & Schema Automation: Construct semantic ontology layers translating raw ERP structures into business-consumable objects (Customer, Order, Invoice, Product, Vendor, Asset). Deploy automated schema mapping agents for source-to-target analysis and transformation logic generation. Build unified master data models with row/column-level security, cross-system lineage tracking, and AI-ready semantic structures
  • Application & Workflow Delivery: Build operational dashboards, migration control centers, and agent-driven workflows for automated validation, exception handling, and anomaly detection using low-code platform tools. Generate TypeScript/Python SDKs for custom integrations and deliver real-time monitoring and self-service interfaces for migration progress, data quality KPIs, and compliance tracking
  • Multi-System Consolidation & Master Data Management: Lead consolidation of 5–15+ fragmented ERP instances into standardized master data models. Resolve complex entity resolution challenges including customer matching, product harmonization, and chart of accounts unification. Establish golden record frameworks, data quality scorecards, survivorship rules, and data stewardship workflows for post-migration governance
  • Client Engagement, Discovery & Modernization Advisory: Serve as primary technical advisor to C-suite and enterprise architecture stakeholders across all engagement phases. Deploy discovery agents to analyze legacy data estates, conduct assessment workshops, facilitate solution design sessions, and deliver executive briefings, go/no-go readiness assessments, and prioritized modernization roadmaps
  • Knowledge Transfer, Enablement & IP Development: Build reusable migration accelerators, playbooks, and reference architectures that scale across engagements. Lead knowledge transfer to upskill client teams for post-migration ownership and collaborate with internal product and sales engineering teams to feed field insights back into platform development and delivery methodology
  • Leadership & Executive Engagement: Operate autonomously in ambiguous, high-stakes client environments, driving outcomes with minimal oversight; translate deeply technical concepts into clear, business-level narratives for C-suite audiences through executive briefings and stakeholder communications; navigate organizational complexity, competing stakeholder priorities, and enterprise change management dynamics to maintain momentum across multi-workstream engagements

Senior Software Engineer, Backend

As a Senior Software Engineer, Backend specializing in database architecture and...
Location:
United States, San Francisco
Salary:
150000.00 - 240000.00 USD / Year
Chef Robotics
Expiration Date
Until further notice
Requirements
  • Bachelor's degree in Computer Science, Engineering, or equivalent practical experience
  • 7+ years of professional experience in backend development roles with demonstrated leadership experience
  • Expert knowledge of relational databases (MySQL, PostgreSQL) including schema design, optimization, and administration
  • Strong proficiency with Python and JavaScript/TypeScript with advanced software engineering skills
  • Extensive experience leading projects with at least two web frameworks: Flask, FastAPI, Django, Node.js, or Next.js
  • Proven experience designing and implementing RESTful and GraphQL APIs at scale
  • Advanced understanding of containerization (Docker) and orchestration (Kubernetes) technologies
  • Experience with cloud infrastructure and deployment (AWS, GCP, or Azure) in production environments
  • Proven experience leading complex backend projects and mentoring junior engineers
  • Understanding of data requirements for robotics or automation systems
Job Responsibility
  • Lead the design, implementation, and optimization of database schemas to support robot operations, telemetry, recipe management, and system analytics
  • Develop robust data migration strategies and version control for database schema evolution
  • Implement efficient query optimization and indexing strategies to support high-throughput robot operations
  • Establish data integrity protocols and backup systems to ensure operational continuity across customer deployments
  • Create scalable data access layers that balance security, performance, and maintainability
  • Mentor team members on database design patterns and optimization techniques
  • Lead the development and maintenance of scalable APIs to serve robot control systems, dashboards, and monitoring tools
  • Design and implement secure authentication and authorization mechanisms across backend services
  • Develop robust middleware for processing and validating data between robotics subsystems
  • Create service interfaces that enable efficient communication between robotics components and cloud services
What we offer
  • Medical, dental, and vision insurance
  • Commuter benefits
  • Flexible paid time off (PTO)
  • Catered lunch
  • 401(k) matching
  • Early-stage equity

Lead Architect, Cloud Data Architecture

At Comcast Advertising, we're building a data-driven future, and we need a Senio...
Location:
United States, Remote
Salary:
90695.14 - 212566.73 USD / Year
Comcast Advertising
Expiration Date
Until further notice
Requirements
  • Implementing cloud data architecture and data integration patterns (AWS Glue, Azure Data Factory, Event Hub, Databricks, etc.), storage and processing (Redshift, Azure Synapse, BigQuery, Snowflake)
  • Infrastructure as code (CloudFormation, Terraform)
  • 3rd Party Integration (Salesforce, D365)
  • Understanding and experience with modern cloud data architectures and engineering for one or more of the following cloud providers - AWS, Azure, GCP
  • Experience building data models and semantic layers
  • Experience defining Master Data Management strategy and exposing data securely through a data-sharing architecture and APIs
  • Leading and supporting data architecture team and leads in creation of cloud data migration/integration/warehouse plans, roadmap, success metrics, and assessment of client’s enterprise (on-premise and on-cloud) data systems
  • Designing and developing using data modeling techniques for mixed workloads, such as OLTP, OLAP, streaming using any formats (structured, semi-structured, unstructured)
  • Strong understanding of data governance practices
  • Architecting and designing data implementation patterns and engineered solutions using native cloud capabilities that span data ingestion & integration (ingress and egress), data storage (raw & cleansed), data prep & processing, master & reference data management, data virtualization & semantic layer, data consumption & visualization
Job Responsibility
  • Collaborate across application and data teams to orchestrate data solutions that meet transactional, analytics, and data science needs
  • Propose solutions focused on tenets such as cost optimization, reducing data hops, and ensuring high availability and performance, supporting Comcast Advertising's strategic needs and future growth goals as it becomes a 'data-driven' organization
  • Ensure solutions adhere to policies outlined at Comcast level such as data privacy and security
  • Design and manage robust, scalable data models and semantic layer
  • Build prototypes, publish design patterns for use by the organization
  • Provide guidance on getting started in cloud and cloud migration techniques
  • Work with the data governance team to ensure critical data elements are identified and monitored
  • Work with partners like AWS, Databricks, Snowflake to ensure services meet the growing needs of Comcast Advertising
  • Facilitate in-depth architectural discussions and design exercises to create world-class cloud solutions
  • Work with internal and external stakeholders to gather requirements
What we offer
  • Paid Time Off
  • Physical Wellbeing benefits
  • Financial Wellbeing benefits
  • Emotional Wellbeing benefits
  • Life Events + Family Support benefits

Cloud Solutions Architect

We’re partnering with a highly respected, long-established London firm seeking a...
Location:
United Kingdom, London
Salary:
Not provided
Hunter Bond
Expiration Date
Until further notice
Requirements
  • Proven experience as a Cloud Architect or senior technical leader in AWS environments
  • Track record of delivering large-scale data transformation and AI-focused initiatives
  • Deep knowledge of AWS-native services for modern data and AI architectures (data lakes, analytics platforms, streaming, distributed processing)
  • Experience designing systems to handle high volume, velocity, and variety of data
  • Understanding of AI workflows and how to integrate machine learning pipelines into cloud architectures
Job Responsibility
  • Design and implement AWS-based cloud platforms for large-scale, data-intensive workloads
  • Drive cloud architecture for enterprise-wide data modernisation, migration, and AI initiatives
  • Define patterns and best practices for scalability, resilience, security, and cost efficiency on AWS
  • Collaborate with data engineering, platform, security, and AI teams to deliver end-to-end solutions
  • Balance hands-on architectural problem-solving with strategic cloud direction, including AI-enabled systems