
Lead Data Engineer - AI/ML


Stanford Health Care

Location:
United States of America, Palo Alto

Contract Type:
Not provided

Salary:
94.35 - 125.03 USD / Hour

Job Description:

The Lead Data Engineer will join the team building Stanford Health Care's (SHC) Artificial Intelligence solutions, which span patient care, medical research, and administrative services. The group is designed to bring Artificial Intelligence (AI) and other emerging machine learning (ML) innovations in data science into healthcare, and will partner closely with individuals across clinical specialties and operations areas to deploy algorithms that can lead to better patient outcomes. Reporting to the Data Science Director and working closely with Stanford Medicine's inaugural Chief Data Scientist, this role will be responsible for building, scaling, and maintaining the compute frameworks, analysis tooling, model implementations, and agentic solutions that form our core AI platform.

Job Responsibility:

  • Build end-to-end data pipelines and infrastructure for ML models used by the Data Science team and others at SHC
  • Understand the requirements of data processing and analysis pipelines and make appropriate technical design and interface decisions
  • Understand data flows among the SHC applications and use this knowledge to make recommendations and design decisions for languages, tools, and platforms used in software and data projects
  • Troubleshoot and debug environment and infrastructure problems found in production and non-production environments for projects by the Data Science Team
  • Work with other groups at SHC and the Technology and Digital Solutions (TDS) group to ensure server and system maintenance keeps pace with updates, system requirements, data usage, and security requirements

Requirements:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field, or equivalent work experience
  • 5+ years’ experience building data infrastructure for analytics teams, including the ability to write SQL, R, or Python code for processing large datasets in distributed cloud environments
  • Experience with cloud deployment strategies and CI/CD
  • Experience building and working with data infrastructure in a SaaS environment
  • Experience overseeing, developing or implementing machine learning operations (MLOps) processes
  • Experience mentoring junior engineers and enforcing best practices around code quality
  • Knowledge of multiple programming languages, commitment to choosing languages based on project-specific requirements, and willingness to learn new programming languages as necessary
  • Knowledge of resource management and automation approaches such as workflow runners
  • Collaborative mentality and excitement for iterative design working closely with the Data Science team.

Additional Information:

Job Posted:
March 04, 2026

Employment Type:
Fulltime
Work Type:
Hybrid work

Similar Jobs for Lead Data Engineer - AI/ML

Senior Data & AI/ML Engineer - GCP Specialization Lead

We are on a bold mission to create the best software services offering in the wo...
Location
United States, Menlo Park
Salary:
Not provided
techjays
Expiration Date
Until further notice
Requirements
  • GCP Services: BigQuery, Dataflow, Pub/Sub, Vertex AI
  • ML Engineering: End-to-end ML pipelines using Vertex AI / Kubeflow
  • Programming: Python & SQL
  • MLOps: CI/CD for ML, Model deployment & monitoring
  • Infrastructure-as-Code: Terraform
  • Data Engineering: ETL/ELT, real-time & batch pipelines
  • AI/ML Tools: TensorFlow, scikit-learn, XGBoost
  • Min Experience: 10+ Years
Job Responsibility
  • Design and implement data architectures for real-time and batch pipelines, leveraging GCP services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Vertex AI, and Cloud Storage
  • Lead the development of ML pipelines, from feature engineering to model training and deployment using Vertex AI, AI Platform, and Kubeflow Pipelines
  • Collaborate with data scientists to operationalize ML models and support MLOps practices using Cloud Functions, CI/CD, and Model Registry
  • Define and implement data governance, lineage, monitoring, and quality frameworks
  • Build and document GCP-native solutions and architectures that can be used for case studies and specialization submissions
  • Lead client-facing PoCs or MVPs to showcase AI/ML capabilities using GCP
  • Contribute to building repeatable solution accelerators in Data & AI/ML
  • Work with the leadership team to align with Google Cloud Partner Program metrics
  • Mentor engineers and data scientists toward achieving GCP certifications, especially in Data Engineering and Machine Learning
  • Organize and lead internal GCP AI/ML enablement sessions
What we offer
  • Best in class packages
  • Paid holidays and flexible paid time away
  • Casual dress code & flexible working environment
  • Medical Insurance covering self & family up to 4 lakhs per person

Technical Lead – AI/ML & Data Platforms

We are seeking a Technical Lead with strong managerial capabilities to drive the...
Location
United States, Sunnyvale
Salary:
Not provided
Thirdeye Data
Expiration Date
Until further notice
Requirements
  • Strong expertise in data pipelines, architecture, and analytics platforms (e.g., Snowflake, Tableau)
  • Experience reviewing and optimizing data transformations, aggregations, and business logic
  • Hands-on familiarity with LLMs and practical RAG implementations
  • Knowledge of AI/ML workflows, model lifecycle management, and experimentation frameworks
  • Proven experience in managing complex, multi-track projects
  • Skilled in project tracking and collaboration tools (Jira, Confluence, or equivalent)
  • Excellent communication and coordination skills with technical and non-technical stakeholders
  • Experience working with cross-functional, globally distributed teams
Job Responsibility
  • Coordinate multiple workstreams simultaneously, ensuring timely delivery and adherence to quality standards
  • Facilitate daily stand-ups and syncs across global time zones, maintaining visibility and accountability
  • Understand business domains and technical architecture to enable informed decisions and proactive risk management
  • Collaborate with data engineers, AI/ML scientists, analysts, and product teams to translate business goals into actionable plans
  • Track project progress using Agile or hybrid methodologies, escalate blockers, and resolve dependencies
  • Own task lifecycle — from planning through execution, delivery, and retrospectives
  • Perform technical reviews of data pipelines, ETL processes, and architecture, identifying quality or design gaps
  • Evaluate and optimize data aggregation logic while ensuring alignment with business semantics
  • Contribute to the design and development of RAG pipelines and workflows involving LLMs
  • Create and maintain Tableau dashboards and reports aligned with business KPIs for stakeholders

SAP BTP Data Engineer

At LeverX, we have had the privilege of working on over 950 SAP projects, includ...
Location
Salary:
Not provided
LeverX
Expiration Date
Until further notice
Requirements
  • 2+ years of experience designing and developing SAP data solutions within SAP and non-SAP enterprise landscapes
  • Strong knowledge of data modeling in Data Warehouses
  • Strong knowledge of the visualization patterns, approaches, and techniques in SAP and non-SAP landscapes
  • Understanding of Data Engineering solutions within the SAP BDC landscape, such as SAP Databricks
  • Proven experience in data transformation and integration with SAP ERP, S/4HANA, and equivalent external systems
  • Good understanding of SAP data integration techniques (SDI, SDA, APIs, ODP) and protocols (OData, REST, JDBC)
  • Bachelor’s degree in Computer Science, Information Systems, or equivalent
  • English B2+
Job Responsibility
  • Design, develop, and deploy enterprise data solutions on SAP BTP, integrating SAP and non-SAP systems
  • Analyze and resolve complex data and integration challenges, ensuring reliable and scalable solutions
  • Collaborate with data architects, functional analysts, and business stakeholders to translate requirements into data models, dashboards, and analytics
  • Lead small project teams (2–3 members) and contribute to cross-regional collaboration for consistent delivery
  • Facilitate client enablement through workshops, webinars, and hands-on sessions
  • Continuously grow expertise by staying current with SAP data and analytics technologies (e.g., SAP Business Data Cloud, AI/ML) and pursuing relevant certifications
What we offer
  • 89% of projects use the newest SAP technologies and frameworks
  • Expert communities and internal courses
  • Valuable perks to support your growth and well-being
  • Employment security: We hire for our team, not just a specific project. If your project ends, we will find you a new one
  • Healthy work atmosphere: On average, our employees stay in the company for 4+ years

Sr. Data Engineer - Snowflake

Data Ideology is seeking a Sr. Snowflake Data Engineer to join our growing team ...
Location
Salary:
Not provided
Data Ideology
Expiration Date
Until further notice
Requirements
  • 7+ years of experience in data engineering, data warehousing, or data architecture
  • 3+ years of hands-on Snowflake experience (performance tuning, data sharing, Snowpark, Snowpipe, etc.)
  • Strong SQL and Python skills, with production experience using dbt
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and modern data tooling (Airflow, Fivetran, Power BI, Looker, Informatica, etc.)
  • Prior experience in a consulting or client-facing delivery role
  • Excellent communication skills, with the ability to collaborate across technical and business stakeholders
  • SnowPro Core Certification required (or willingness to obtain upon hire); advanced Snowflake certifications preferred
Job Responsibility
  • Design and build scalable, secure, and cost-effective data solutions in Snowflake
  • Develop and optimize data pipelines using tools such as dbt, Python, CloverDX, and cloud-native services
  • Participate in discovery sessions with clients to gather requirements and translate them into solution designs and project plans
  • Collaborate with engagement managers and account teams to help scope work and provide technical input for Statements of Work (SOWs)
  • Serve as a Snowflake subject matter expert, guiding best practices in performance tuning, cost optimization, access control, and workload management
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake
  • Integrate Snowflake with BI tools, governance platforms, and AI/ML frameworks
  • Contribute to internal accelerators, frameworks, and proofs of concept
  • Mentor junior engineers and support knowledge sharing across the team
What we offer
  • Flexible Time Off Policy
  • Eligibility for Health Benefits
  • Retirement Plan with Company Match
  • Training and Certification Reimbursement
  • Utilization Based Incentive Program
  • Commission Incentive Program
  • Referral Bonuses
  • Work from Home

Data Engineer

Location: 100% remote; Years’ Experience: 10+ years professional experience; Edu...
Location
United States
Salary:
Not provided
Sparibis
Expiration Date
Until further notice
Requirements
  • 10+ years of IT experience focusing on enterprise data architecture and management
  • Experience with Databricks, Structured Streaming, Delta Lake concepts, and Delta Live Tables required
  • Experience with ETL and ELT tools such as SSIS, Pentaho, and/or Data Migration Services
  • Advanced level SQL experience (Joins, Aggregation, Windowing functions, Common Table Expressions, RDBMS schema design, Postgres performance optimization)
  • Must be able to obtain a Public Trust security clearance
  • Bachelor’s degree required
  • Experience in Conceptual/Logical/Physical Data Modeling & expertise in Relational and Dimensional Data Modeling
  • Additional experience with Spark, Spark SQL, Spark DataFrames and DataSets, and PySpark
  • Data Lake concepts such as time travel and schema evolution and optimization
  • Experience leading and architecting enterprise-wide initiatives specifically system integration, data migration, transformation, data warehouse build, data mart build, and data lakes implementation / support
Job Responsibility
  • Plan, create, and maintain data architectures, ensuring alignment with business requirements
  • Obtain data, formulate dataset processes, and store optimized data
  • Identify problems and inefficiencies and apply solutions
  • Determine tasks where manual participation can be eliminated with automation
  • Identify and optimize data bottlenecks, leveraging automation where possible
  • Create and manage data lifecycle policies (retention, backups/restore, etc)
  • Apply in-depth knowledge to create, maintain, and manage ETL/ELT pipelines
  • Create, maintain, and manage data transformations
  • Maintain/update documentation
  • Create, maintain, and manage data pipeline schedules

AI & Data Lead

We are seeking a highly skilled AI & Data Lead to spearhead our transformation i...
Location
United Kingdom, North West
Salary:
Not provided
Dynamic Search Solutions
Expiration Date
Until further notice
Requirements
  • Proven experience in a senior data/AI/analytics role (e.g., Data Lead, Analytics Lead, AI Specialist, BI Lead)
  • Strong background with Power BI, including DAX, modelling, governance, and enterprise deployments
  • Hands-on experience building AI and machine learning solutions using modern cloud tools (Azure preferred)
  • Strong understanding of data engineering concepts (ETL/ELT, pipelines, warehousing)
  • Ability to translate business challenges into technical AI/analytics solutions
  • Excellent stakeholder engagement and communication skills
  • Strategic thinker capable of setting direction and driving organisational change
Job Responsibility
  • Strategic Leadership & Planning: Develop and own the company’s AI, analytics, and data strategy
  • Create a roadmap for moving from traditional BI to a modern, AI-augmented data ecosystem
  • Identify opportunities where AI, ML, and automation can drive measurable value
  • Act as a trusted advisor to senior leaders and departments regarding data, BI, and AI
  • Translate complex data and AI concepts into clear business language and use cases
  • Facilitate workshops, run discovery sessions, and guide teams on AI best practices
  • Build prototypes and production-level AI/ML solutions, such as predictive models, natural language interfaces, and automation workflows
  • Develop robust data models, pipelines, and integrations to ensure scalable AI adoption
  • Establish data governance standards, security controls, and compliance practices
  • Own data quality frameworks to ensure trusted reporting and model reliability

Senior Engineering Manager - AI/ML

Hewlett Packard Enterprise is looking for a Senior Engineering Manager - AI/ML t...
Location
India, Bangalore
Salary:
Not provided
Hewlett Packard Enterprise
Expiration Date
Until further notice
Requirements
  • Bachelor’s degree in computer science, engineering, data science, artificial intelligence, machine learning, or closely related quantitative discipline
  • 7-15 years’ experience including 5 or more years of people management experience
  • Advanced Degree (Master’s or Ph.D.) strongly preferred
  • Strong problem-solving and analytical skills, with the ability to identify business opportunities, formulate strategies, and execute projects effectively
  • Excellent communication and presentation skills, with the ability to convey complex technical concepts to technical and non-technical stakeholders
  • Proven ability to manage multiple projects and priorities in a fast-paced environment, ensuring timely delivery and high-quality results
  • Experience with cloud platforms, big data technologies, and distributed computing frameworks is a plus
  • Strong understanding of data privacy, security, and ethical considerations in AI and machine learning
  • Strong technical expertise in AI and machine learning algorithms, models, and tools, with proficiency in programming languages such as Python or R
  • Demonstrated leadership and management skills, with experience in leading and mentoring AI and machine learning professional teams.
Job Responsibility
  • Develop software algorithms to structure, analyze and leverage structured and unstructured data
  • Use machine learning and statistical modeling techniques to improve product/system performance, data management, quality, and accuracy
  • Apply, optimize, and scale deep learning technologies and algorithms
  • Document procedures for installation and maintenance
  • Perform testing and debugging
  • Define and monitor performance metrics
  • Translate customer requirements and industry trends into AI/ML products and systems improvements
  • Develop and drive the organization’s AI and machine learning strategy
  • Identify new opportunities for AI and machine learning applications
  • Oversee complex AI and machine learning projects from conception to deployment
What we offer
  • Comprehensive suite of benefits supporting physical, financial, and emotional wellbeing
  • Personal and professional development programs
  • Career growth opportunities
  • Inclusive work environment.

Senior AI/ML Engineer

Barbaricum is seeking a highly experienced Senior AI/ML Engineer to support Soft...
Location
United States, Crane
Salary:
Not provided
Barbaricum
Expiration Date
Until further notice
Requirements
  • Active DoD Secret Clearance (Top Secret preferred)
  • Bachelor’s degree in Computer Science, Engineering, or related technical discipline (Master’s preferred)
  • 10+ years of progressive experience in AI/ML engineering, software development, or applied data science
  • Expertise in developing, deploying, and securing AI/ML applications within mission-critical or defense environments
  • Demonstrated experience with LLMs, MLOps pipelines, and modern ML frameworks (e.g., PyTorch, TensorFlow)
  • Strong background in software and cyber engineering principles, including system hardening, secure coding, and vulnerability mitigation
  • Proven ability to lead complex technical efforts, mentor junior engineers, and interface with government stakeholders
  • DoD 8570 Advanced certification (e.g., SecurityX, GCSA, CCSP, or equivalent) must be obtained and maintained
Job Responsibility
  • Partner with project managers and engineering teams to define objectives for AI/ML systems in support of maneuver, surveillance, and engagement missions
  • Develop and prototype AI/ML systems to address mission-specific requirements, including computer vision, sensor fusion, and decision-support applications
  • Conduct rigorous testing and evaluation of AI/ML performance against operational datasets
  • Analyze test data to identify model strengths, weaknesses, and mission relevance
  • Refine and optimize systems to ensure robustness, scalability, and cyber resilience
  • Troubleshoot complex system challenges and provide technical guidance for deployed solutions
  • Deliver comprehensive documentation and technical reports to stakeholders
  • Maintain awareness of emerging AI/ML technologies, software engineering practices, and cyber defense techniques relevant to mission-critical systems