AI and Data Engineer

Reingold, Inc.

Location:
United States, Raleigh, NC

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We’re looking for someone with real-world experience in cloud engineering and the confidence to work independently on core infrastructure tasks. You don’t need to be a domain expert, but you should be hungry to learn, comfortable navigating unfamiliar systems, and confident in your ability to figure things out. At Reingold, Google Cloud engineering isn’t just a support function; it’s a key part of how we deliver innovative, AI-powered solutions for our clients. You’ll use a code-first approach to build and manage Google Cloud infrastructure for advanced conversational AI applications, working with cross-functional teams to build systems that are secure, scalable, and repeatable. You’ll also implement robust CI/CD pipelines for deploying applications, and leverage Google Cloud’s monitoring and operations tooling to help teams own their services in production, spot issues early, and fix them fast.
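
As a rough illustration of the code-first approach described above (a sketch, not Reingold’s actual stack), the google-cloud-aiplatform Python client can drive Vertex AI resources from scripts or CI jobs; the project and region values below are placeholders.

    # Hypothetical sketch: inspect Vertex AI endpoints from code, for example
    # as a CI/CD verification step. Assumes the google-cloud-aiplatform
    # package and ambient gcloud credentials; all identifiers are placeholders.
    from google.cloud import aiplatform

    aiplatform.init(project="example-project", location="us-central1")

    # Enumerate deployed endpoints so a pipeline can confirm infrastructure
    # state before rolling out a new model or agent version.
    for endpoint in aiplatform.Endpoint.list():
        print(endpoint.display_name, endpoint.resource_name)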

Job Responsibility:

  • Design, build, and maintain conversational AI solutions using cloud services (e.g., Google Cloud Vertex AI, AWS Bedrock, Azure Cognitive Services) and chatbot frameworks such as Dialogflow CX, integrating them with relevant data sources
  • Research and identify opportunities for AI integration in both internal processes and client-facing products
  • Evaluate model performance and recommend improvements based on data analysis
  • Design and prototype AI-powered tools and workflows using Python and libraries such as Hugging Face Transformers, LangChain, or OpenAI APIs, including fine-tuning or customizing existing models for specific business needs (see the sketch after this list)
  • Collaborate with development teams to integrate, manage, and secure a suite of cloud APIs and services (e.g., search/discovery engines, object storage, and large-scale data analytics platforms) to ensure efficient data flow
  • Develop and manage API proxies, security policies, and integration pipelines between backend services and client-facing applications
  • Maintain documentation for scripts, workflows, integrations, and proof-of-concept applications aligned with operational standards
  • Stay current with emerging AI/ML frameworks, APIs, and techniques in cloud and AI ecosystems
  • Perform ad hoc technical tasks as needed, based on evolving priorities and your skills
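
For illustration only, here is a minimal prototype in the spirit of the fine-tuning/customization bullet above, using the Hugging Face Transformers pipeline API; the model name and prompt are arbitrary examples, not a prescribed stack.

    # Assumes the transformers package (plus a backend such as PyTorch).
    from transformers import pipeline

    # Small open model chosen purely for demonstration.
    generator = pipeline("text-generation", model="distilgpt2")
    result = generator("Draft a one-line status update:", max_new_tokens=40)
    print(result[0]["generated_text"])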

Requirements:

  • Active Public Trust clearance or ability to obtain one
  • Bachelor’s degree in computer science, engineering, web development, data science, or a related field — or equivalent formal training in AI/ML or software engineering through an accredited program or immersive training
  • Experience designing, building, and tuning chatbots or virtual agents (e.g., Dialogflow CX, AWS Lex, Azure Bot Service), integrating them with data sources, and deploying within major cloud platforms
  • Strong Python programming skills, including use of libraries such as pandas, numpy, transformers, and tools for API integration or workflow automation
  • Familiarity with building, integrating, or customizing AI/ML models using frameworks and APIs (e.g., OpenAI API, Hugging Face, LangChain, TensorFlow, or PyTorch)
  • Understanding of NLP fundamentals such as tokenization, embeddings, retrieval augmented generation (RAG), and fine-tuning (a toy retrieval example follows this list)
  • Proficiency with version control (Git) and effective collaboration in team-based development environments
  • Understanding of cloud administration fundamentals — including networking, IAM, and security best practices
  • Ability to quickly learn new tools and independently implement functional prototypes
  • Ability to work independently with minimal oversight while adapting to shifting priorities
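
To make the RAG requirement concrete: a toy retrieval step, assuming the sentence-transformers package; the model name and documents are placeholders, and the point is the embed-then-rank pattern.

    import numpy as np
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("all-MiniLM-L6-v2")  # example model choice
    docs = ["How to reset a VPN token", "Requesting new hardware", "Filing expenses"]
    doc_vecs = model.encode(docs, normalize_embeddings=True)

    query_vec = model.encode(["I forgot my VPN PIN"], normalize_embeddings=True)[0]
    scores = doc_vecs @ query_vec        # cosine similarity on unit vectors
    print(docs[int(np.argmax(scores))])  # best passage to hand to the generator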

Nice to have:

  • Familiarity with managing APIs through an API gateway, including setting up proxies and authentication policies
  • Experience working with Google Cloud Platform AI and data services such as Vertex AI Search, Dialogflow, and BigQuery
  • Excellent problem-solving instincts and a curiosity-driven mindset, especially around performance, reliability, and automation
  • Strong communication skills and the ability to collaborate effectively with developers, engineers, project managers, leadership, and clients

What we offer:
  • Competitive salaries
  • A comprehensive benefits package
  • A dynamic hybrid work environment
  • A vibrant workplace
  • Growth opportunities in a variety of specialty areas

Additional Information:

Job Posted:
January 02, 2026

Employment Type:
Fulltime
Work Type:
Hybrid work

Similar Jobs for AI and Data Engineer

Data Engineer – AI Insights

We are looking for an experienced Data Engineer with AI Insights to design and d...
Location:
United States
Salary:
Not provided
Thirdeye Data
Expiration Date:
Until further notice
Requirements:
  • 5+ years of Data Engineering experience with exposure to AI/ML workflows
  • Advanced expertise in Python programming and SQL
  • Hands-on experience with Snowflake (data warehousing, schema design, performance tuning)
  • Experience building scalable ETL/ELT pipelines and integrating structured/unstructured data
  • Familiarity with LLM and RAG workflows, and how data supports these AI applications
  • Experience with reporting/visualization tools (Tableau)
  • Strong problem-solving, communication, and cross-functional collaboration skills
Job Responsibility:
  • Develop and optimize ETL/ELT pipelines using Python, SQL, and Snowflake to ensure high-quality data for analytics, AI, and LLM workflows (see the sketch after this list)
  • Build and manage Snowflake data models and warehouses, focusing on performance, scalability, and security
  • Collaborate with AI/ML teams to prepare datasets for model training, inference, and LLM/RAG-based solutions
  • Automate data workflows, validation, and monitoring for reliable AI/ML execution
  • Support RAG pipelines and LLM data integration, enabling AI-driven insights and knowledge retrieval
  • Partner with business and analytics teams to transform raw data into actionable AI-powered insights
  • Contribute to dashboarding and reporting using Tableau, Power BI, or equivalent tools
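
As a hedged sketch of the first responsibility above, one incremental ELT step via the snowflake-connector-python package; the credentials, schemas, and table names are all placeholders.

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="example_account", user="etl_user", password="***",
        warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
    )
    cur = conn.cursor()
    # Incremental transform: copy only rows newer than the current watermark.
    cur.execute("""
        INSERT INTO CLEAN.EVENTS
        SELECT ID, PAYLOAD, EVENT_TS
        FROM RAW.EVENTS
        WHERE EVENT_TS >
              (SELECT COALESCE(MAX(EVENT_TS), '1970-01-01') FROM CLEAN.EVENTS)
    """)
    conn.close()
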
Employment Type: Fulltime

Senior Data & AI Innovation Engineer

We are seeking a highly proactive, self-driven Senior Data & AI Engineer to serv...
Location:
Singapore, Singapore
Salary:
7000.00 - 8000.00 SGD / Month
Randstad
Expiration Date:
January 08, 2026
Requirements:
  • Proven, hands-on experience in implementing and supporting practical AI use cases (beyond academic study), understanding how to embed AI components into existing services
  • 4+ years of hands-on experience in implementing and operating Snowflake Data Cloud in a production environment
  • Certification (e.g., SnowPro Data Engineer) is highly desirable
  • Familiarity with MLOps concepts and tools (e.g., Docker, MLflow, LangChain) and an understanding of LLMs, RAG pipelines, and generative AI deployment
  • Strong programming skills in Python for data manipulation, scripting, and AI model support
Job Responsibility:
  • Proactively identify, design, and implement initial AI Proof-of-Concepts (POCs) across the APAC region, focusing on quick-win solutions like AI-powered chatbots and intelligent inventory monitoring systems
  • Analyze business processes to identify areas where AI components can be effectively embedded to solve immediate business challenges
  • Partner with business stakeholders to understand AI data needs, perform data engineering/prep, and ensure data readiness to support and sustain deployed AI models
  • Stay ahead of technology trends, perform proactive research on Data and AI solutions, and evangelize new capabilities to regional teams
  • Act as the APAC SME, collaborating closely with cross-regional peers and global teams to contribute to and align with the company’s Global Data Platform roadmap (Snowflake)
  • Define and execute the complete migration strategy from legacy data warehouses/databases (e.g., PostgreSQL, MS SQL) to the Snowflake Data Cloud platform
  • Design, build, and optimize scalable, robust ETL/ELT data pipelines to curate raw data into BI and Advanced Analytics datasets
  • Implement and manage Snowflake governance, including access control, data security, usage monitoring, and performance optimization aligned with global best practices

Senior Data Engineer – Data Engineering & AI Platforms

We are looking for a highly skilled Senior Data Engineer (L2) who can design, bu...
Location:
India, Chennai, Madurai, Coimbatore
Salary:
Not provided
OptiSol Business Solutions
Expiration Date:
Until further notice
Requirements:
  • Strong hands-on expertise in cloud ecosystems (Azure / AWS / GCP)
  • Excellent Python programming skills with data engineering libraries and frameworks
  • Advanced SQL capabilities including window functions, CTEs, and performance tuning
  • Solid understanding of distributed processing using Spark/PySpark (see the sketch after this list)
  • Experience designing and implementing scalable ETL/ELT workflows
  • Good understanding of data modeling concepts (dimensional, star, snowflake)
  • Familiarity with GenAI/LLM-based integration for data workflows
  • Experience working with Git, CI/CD, and Agile delivery frameworks
  • Strong communication skills for interacting with clients, stakeholders, and internal teams
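
A minimal PySpark sketch of the window-function and Spark bullets above, assuming a local SparkSession; the data and column names are invented for the example.

    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.window import Window

    spark = SparkSession.builder.appName("window-demo").getOrCreate()
    df = spark.createDataFrame(
        [("a", 1, 10.0), ("a", 2, 12.0), ("b", 1, 7.0)],
        ["account", "month", "spend"],
    )
    # Running total per account ordered by month -- the same shape as a SQL
    # SUM(...) OVER (PARTITION BY ... ORDER BY ...) window.
    w = Window.partitionBy("account").orderBy("month")
    df.withColumn("running_spend", F.sum("spend").over(w)).show()
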
Job Responsibility:
  • Design, build, and maintain scalable ETL/ELT pipelines across cloud and big data platforms
  • Contribute to architectural discussions by translating business needs into data solutions spanning ingestion, transformation, and consumption layers
  • Work closely with solutioning and pre-sales teams for technical evaluations and client-facing discussions
  • Lead squads of L0/L1 engineers—ensuring delivery quality, mentoring, and guiding career growth
  • Develop cloud-native data engineering solutions using Python, SQL, PySpark, and modern data frameworks
  • Ensure data reliability, performance, and maintainability across the pipeline lifecycle—from development to deployment
  • Support long-term ODC/T&M projects by demonstrating expertise during technical discussions and interviews
  • Integrate emerging GenAI tools where applicable to enhance data enrichment, automation, and transformations
What we offer:
  • Opportunity to work at the intersection of Data Engineering, Cloud, and Generative AI
  • Hands-on exposure to modern data stacks and emerging AI technologies
  • Collaboration with experts across Data, AI/ML, and cloud practices
  • Access to structured learning, certifications, and leadership mentoring
  • Competitive compensation with fast-track career growth and visibility
Employment Type: Fulltime

Quality Engineer - AI and Data Platforms

This is a pioneering Quality Engineer role at the intersection of data engineeri...
Location:
United Kingdom, Manchester
Salary:
44000.00 - 66000.00 GBP / Year
Matillion
Expiration Date:
Until further notice
Requirements:
  • Solid foundation in data engineering: including SQL, ETL/ELT design, and specific experience building data pipelines and managing data movement
  • Strong practical AI experience: you have used, experimented with, and are an advocate for an AI-first approach to quality engineering
  • Proficiency in coding in Java or JavaScript to navigate the codebase and implement quality frameworks
  • Demonstrated Autonomy, Curiosity, and Problem-solving skills, with a willingness to look at challenges in a different way and ask for assistance as needed
  • Experience in managing end-to-end testing of SaaS applications, including developing and maintaining efficient test automation tooling
Job Responsibility:
  • Leveraging AI and agentic solutions, including our agentic AI product Maia, to accelerate investigation, generate test cases, and increase quality assurance across the Data Productivity Cloud
  • Performing root cause analysis on pipeline stability issues, particularly identifying why DPC pipelines run out of memory (OOM) within the agents
  • Building pipelines to automate every process and solving problems to increase overall team and product productivity
  • Acting as a crucial bridge by collaborating extensively with various teams, raising problems, and ensuring that fixes are implemented effectively
  • Adopting, implementing, and championing shift-left testing practices across the team, leading an automation-first approach
What we offer:
  • Company Equity
  • 30 days holiday + bank holidays
  • 5 days paid volunteering leave
  • Health insurance
  • Life Insurance
  • Pension
  • Access to mental health support
Employment Type: Fulltime

Data Engineer with Generative AI Expertise

We are looking for a skilled Data Engineer with expertise in Generative AI to jo...
Location:
India, Jaipur
Salary:
Not provided
InfoObjects
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or related fields
  • 2-6 years of hands-on experience in Data Engineering
  • Proficiency in Generative AI frameworks (e.g., GPT, DALL-E, Stable Diffusion)
  • Strong programming skills in Python, SQL, and familiarity with Java or Scala
  • Experience with data tools and platforms such as Apache Spark, Hadoop, or similar
  • Knowledge of cloud platforms like AWS, Azure, or GCP
  • Familiarity with MLOps practices and AI model deployment
  • Excellent problem-solving and communication skills
Job Responsibility:
  • Design, develop, and maintain robust data pipelines and workflows
  • Integrate Generative AI models into existing data systems to enhance functionality (a minimal sketch follows this list)
  • Collaborate with cross-functional teams to understand business needs and translate them into scalable data and AI solutions
  • Optimize data storage, processing, and retrieval systems for performance and scalability
  • Ensure data security, quality, and governance across all processes
  • Stay updated with the latest advancements in Generative AI and data engineering practices
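
To illustrate the integration bullet above, a hedged sketch of calling a hosted generative model from inside a data workflow, using the openai Python package (v1+ client); the model name and prompt are placeholders, not a prescribed choice.

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # example model name only
        messages=[{"role": "user", "content":
                   "Label this ticket as hardware, software, or network: 'VPN is down'"}],
    )
    print(resp.choices[0].message.content)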

Principal Consulting AI / Data Engineer

As a Principal Consulting AI / Data Engineer, you will design, build, and optimi...
Location:
Australia, Sydney
Salary:
Not provided
DyFlex Solutions
Expiration Date:
Until further notice
Requirements:
  • Proven expertise in delivering enterprise-grade data engineering and AI solutions in production environments
  • Strong proficiency in Python and SQL, plus experience with Spark, Airflow, dbt, Kafka, or Flink
  • Experience with cloud platforms (AWS, Azure, or GCP) and Databricks
  • Ability to confidently communicate and present at C-suite level, simplifying technical concepts into business impact
  • Track record of engaging senior executives and influencing strategic decisions
  • Strong consulting and stakeholder management skills with client-facing experience
  • Background in MLOps, ML pipelines, or AI solution delivery highly regarded
  • Degree in Computer Science, Engineering, Data Science, Mathematics, or a related field
Job Responsibility:
  • Design, build, and maintain scalable data and AI solutions using Databricks, cloud platforms, and modern frameworks
  • Lead solution architecture discussions with clients, ensuring alignment of technical delivery with business strategy
  • Present to and influence executive-level stakeholders, including boards, C-suite, and senior directors
  • Translate highly technical solutions into clear business value propositions for non-technical audiences
  • Mentor and guide teams of engineers and consultants to deliver high-quality solutions
  • Champion best practices across data engineering, MLOps, and cloud delivery
  • Build DyFlex’s reputation as a trusted partner in Data & AI through thought leadership and client advocacy
What we offer:
  • Work with SAP’s latest cloud technologies such as S/4HANA, BTP, and Joule, plus Databricks, ML/AI tools, and cloud platforms
  • A flexible and supportive work environment including work from home
  • Competitive remuneration and benefits including novated lease, birthday leave, salary packaging, wellbeing programme, additional purchased leave, and company-provided laptop
  • Comprehensive training budget and paid certifications (Databricks, SAP, cloud platforms)
  • Structured career advancement pathways with opportunities to lead large-scale client programs
  • Exposure to diverse industries and client environments, including executive-level engagement
Employment Type: Fulltime

Consulting AI / Data Engineer

As a Consulting AI / Data Engineer, you will design, build, and optimise enterpr...
Location:
Australia, Sydney
Salary:
Not provided
DyFlex Solutions
Expiration Date:
Until further notice
Requirements:
  • Hands-on data engineering experience in production environments
  • Strong proficiency in Python and SQL
  • Experience with at least one additional language (e.g., Java, TypeScript/JavaScript)
  • Experience with modern frameworks such as Apache Spark, Airflow, dbt, Kafka, or Flink
  • Background in building ML pipelines, MLOps practices, or feature stores is highly valued
  • Proven expertise in relational databases, data modelling, and query optimisation
  • Demonstrated ability to solve complex technical problems independently
  • Excellent communication skills with ability to engage clients and stakeholders
  • Degree in Computer Science, Engineering, Data Science, Mathematics, or a related field
Job Responsibility:
  • Build and maintain scalable data pipelines for ingesting, transforming, and delivering data (see the Airflow sketch after this list)
  • Manage and optimise databases, warehouses, and cloud storage solutions
  • Implement data quality frameworks and testing processes to ensure reliable systems
  • Design and deliver cloud-based solutions (AWS, Azure, or GCP)
  • Take technical ownership of project components and lead small development teams
  • Engage directly with clients, translating business requirements into technical solutions
  • Champion best practices including version control, CI/CD, and infrastructure as code
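
For the pipeline bullet above, a minimal Apache Airflow sketch (Airflow 2.4+ style `schedule` argument); the DAG id, schedule, and task body are illustrative assumptions.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def ingest():
        # Placeholder for the real extract/load logic.
        print("pulling yesterday's files into the warehouse")

    with DAG("daily_ingest", start_date=datetime(2026, 1, 1),
             schedule="@daily", catchup=False):
        PythonOperator(task_id="ingest", python_callable=ingest)
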
What we offer:
  • Work with SAP’s latest cloud technologies such as S/4HANA, BTP, and Joule, plus Databricks, ML/AI tools, and cloud platforms
  • A flexible and supportive work environment including work from home
  • Competitive remuneration and benefits including novated lease, birthday leave, remote working, salary packaging, wellbeing programme, additional purchased leave, and company-provided laptop
  • Comprehensive training budget and paid certifications (Databricks, SAP, cloud platforms)
  • Structured career advancement pathways with mentoring from senior engineers
  • Exposure to diverse industries and client environments
Employment Type: Fulltime

Data & AI Impact Consultant Engineer

As a Data Consultant, you are a cornerstone of our Data & AI Business Unit – tec...
Location:
Belgium
Salary:
Not provided
Inetum
Expiration Date:
Until further notice
Requirements:
  • 3+ years of experience in data engineering or analytics projects
  • Good knowledge of Dutch and English (French is an asset)
  • Familiar with Azure stack (Data Factory, Synapse, Storage, Purview, Functions) and/or Databricks
  • Structural and flexible thinking
  • Interest in AI and its role in modern data products (prompt engineering, GenAI, monitoring, automation)
  • Ability to create business impact and understand outcomes
  • Team-oriented mindset
Job Responsibility:
  • Design and build modern data platforms (Azure, Databricks, Data Fabric) with focus on reusability and AI-readiness
  • Deliver customer- and future-oriented value
  • Help colleagues grow through coaching, feedback, or knowledge sharing
  • Provide consultancy with solution-oriented approach
  • Take initiative in client development, talent growth, or community engagement
What we offer:
  • Company car
  • Fuel/charging card
  • Group insurance
  • Hospitalization coverage
  • 32 days of annual leave
  • Hybrid working options
  • Satellite offices
  • Continuous learning & development
  • Training and certification programs
Employment Type: Fulltime