
Intern, Data Engineering


Workato

Location:
Singapore

Contract Type:
Not provided

Salary:

Not provided

Job Description:

Workato transforms technology complexity into business opportunity. As the leader in enterprise orchestration, Workato helps businesses globally streamline operations by connecting data, processes, applications, and experiences. Its AI-powered platform enables teams to navigate complex workflows in real-time, driving efficiency and agility. We are looking for an exceptional Data Engineer Intern to join our growing team.

Job Responsibility:

  • Design and implement data pipelines that load data from multiple sources into our data warehouse, building efficient data models that drive analysis and key business metrics
  • Optimize analytical workflows using dbt Cloud, leveraging macros and packages to enhance productivity, while contributing to our CI/CD processes and DevOps practices
  • Participate in AI initiatives by contributing to the development of Data Genie in Workato
  • Collaborate with internal business stakeholders to understand various data pain points and develop solutions to resolve those issues
  • Support operational needs of various business units
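The pipeline work described above can be sketched as a minimal ELT step: extract rows from a source, land them raw in a warehouse table, then build a transformed model that drives a business metric. This is an illustrative sketch only, using Python's built-in sqlite3 as a stand-in for a warehouse such as Snowflake; the table and column names are hypothetical, and in practice the transform layer would live in dbt.

```python
import sqlite3

# Stand-in "warehouse": an in-memory SQLite database.
conn = sqlite3.connect(":memory:")

# Extract: rows pulled from a hypothetical source system.
source_rows = [
    ("2026-01-01", "sg", 120.0),
    ("2026-01-01", "us", 340.5),
    ("2026-01-02", "sg", 95.0),
]

# Load: land the data raw, without reshaping it.
conn.execute("CREATE TABLE raw_orders (order_date TEXT, region TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", source_rows)

# Transform: a model that drives a key metric (daily revenue).
conn.execute("""
    CREATE VIEW daily_revenue AS
    SELECT order_date, SUM(amount) AS revenue
    FROM raw_orders
    GROUP BY order_date
""")

for row in conn.execute("SELECT * FROM daily_revenue ORDER BY order_date"):
    print(row)
```

Keeping the load step raw and pushing all reshaping into a separate transform layer is the core ELT idea: the warehouse always holds an untouched copy of the source data, so models can be rebuilt or revised without re-extracting.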

Requirements:

  • Currently pursuing a degree in Business Analytics, Information Systems, or other related field
  • Able to commit for at least 6 months
  • Good understanding of Database concepts such as ERD
  • Intermediate SQL skills (e.g. joins, CTEs, and window functions)
  • Experience in using data warehouses such as Snowflake and data transformation tools such as dbt Cloud
  • Experience with using Extract-Load-Transform (ELT) applications or scripts
  • Fast mover and comfortable navigating a fast-paced environment
  • Excellent organizational skills and ability to juggle multiple projects simultaneously
  • Strong analytical skills and strong verbal and written communication skills to present actionable insights to business leaders
  • A thoughtful team player and keen learner with a “can do” attitude and a growth mindset
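The SQL skills listed above (joins, CTEs, and window functions) can be illustrated with a short query. A minimal sketch, assuming SQLite ≥ 3.25 (required for window functions, and bundled with recent Python builds); the table and data are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [
        ("alice", "2026-01-01", 50.0),
        ("alice", "2026-01-03", 70.0),
        ("bob", "2026-01-02", 30.0),
    ],
)

# A CTE filters the rows of interest; a window function then ranks
# each customer's orders by recency without collapsing the rows.
query = """
WITH recent AS (
    SELECT customer, order_date, amount
    FROM orders
    WHERE order_date >= '2026-01-01'
)
SELECT customer, order_date, amount,
       ROW_NUMBER() OVER (
           PARTITION BY customer ORDER BY order_date DESC
       ) AS rn
FROM recent
ORDER BY customer, rn
"""
for row in conn.execute(query):
    print(row)
```

`rn = 1` marks each customer's most recent order; unlike `GROUP BY`, the window function keeps every row while adding the per-partition ranking.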

Nice to have:

  • Experience in building and working with AI related solutions using LLMs is a plus
  • Understanding of CI/CD and DevOps is a plus

Additional Information:

Job Posted:
January 16, 2026

Similar Jobs for Intern, Data Engineering

Senior Data Engineer – Data Engineering & AI Platforms

We are looking for a highly skilled Senior Data Engineer (L2) who can design, bu...
Location:
India, Chennai, Madurai, Coimbatore
Salary:
Not provided
OptiSol Business Solutions
Expiration Date
Until further notice
Requirements
  • Strong hands-on expertise in cloud ecosystems (Azure / AWS / GCP)
  • Excellent Python programming skills with data engineering libraries and frameworks
  • Advanced SQL capabilities including window functions, CTEs, and performance tuning
  • Solid understanding of distributed processing using Spark/PySpark
  • Experience designing and implementing scalable ETL/ELT workflows
  • Good understanding of data modeling concepts (dimensional, star, snowflake)
  • Familiarity with GenAI/LLM-based integration for data workflows
  • Experience working with Git, CI/CD, and Agile delivery frameworks
  • Strong communication skills for interacting with clients, stakeholders, and internal teams
Job Responsibility
  • Design, build, and maintain scalable ETL/ELT pipelines across cloud and big data platforms
  • Contribute to architectural discussions by translating business needs into data solutions spanning ingestion, transformation, and consumption layers
  • Work closely with solutioning and pre-sales teams for technical evaluations and client-facing discussions
  • Lead squads of L0/L1 engineers—ensuring delivery quality, mentoring, and guiding career growth
  • Develop cloud-native data engineering solutions using Python, SQL, PySpark, and modern data frameworks
  • Ensure data reliability, performance, and maintainability across the pipeline lifecycle—from development to deployment
  • Support long-term ODC/T&M projects by demonstrating expertise during technical discussions and interviews
  • Integrate emerging GenAI tools where applicable to enhance data enrichment, automation, and transformations
What we offer
  • Opportunity to work at the intersection of Data Engineering, Cloud, and Generative AI
  • Hands-on exposure to modern data stacks and emerging AI technologies
  • Collaboration with experts across Data, AI/ML, and cloud practices
  • Access to structured learning, certifications, and leadership mentoring
  • Competitive compensation with fast-track career growth and visibility
  • Fulltime

Product Data Engineering Intern

Product Data Engineering Intern role at Hewlett Packard Enterprise. This is an o...
Location:
Puerto Rico, Aguadilla
Salary:
Not provided
Hewlett Packard Enterprise
Expiration Date
Until further notice
Requirements
  • Currently pursuing a Bachelor's degree in Systems Engineering, Industrial Engineering or Computer Engineering
  • Familiarity with SAP
  • Basic programming or scripting knowledge (e.g., Python, Java, C++)
  • Strong interest in high-tech and passion for learning
  • Excellent communication and interpersonal skills
  • Strong problem-solving and analytical skills
  • Time management skills and working with strict deadlines
  • A collaborative, solution-focused mindset and overall sense of urgency
Job Responsibility
  • Support senior team members on assigned technical projects
  • Help identify and troubleshoot technical issues, providing support and suggesting solutions
  • Assist with maintaining and updating hardware, software, and other technical systems
  • Participate in team activities by attending team meetings, learning about project methodologies, and collaborating effectively with colleagues
  • Actively engage in learning about new technologies and methodologies relevant to work
  • Fulfill tasks and responsibilities assigned by a supervisor in a timely and efficient manner
  • Participate in periodic reviews to share updates and incorporate feedback on assigned projects/initiatives
What we offer
  • Health & Wellbeing benefits
  • Personal & Professional Development programs
  • Unconditional Inclusion environment
  • Comprehensive suite of benefits supporting physical, financial and emotional wellbeing
  • Fulltime

Data Engineer Intern

We deliver sustainable, extraordinary growth by creating a new, unique, inspirin...
Location:
China, Shanghai
Salary:
Not provided
IKEA
Expiration Date
Until further notice
Requirements
  • Good interpersonal skills with the ability to collaborate, network, and build strong relations with team members and stakeholders
  • Good knowledge of advanced data structures and distributed computing
  • Good knowledge of AI and machine learning concepts and algorithms
  • Broad knowledge of programming languages (e.g., Python, Java, Go, or Scala), including concepts from functional and object-oriented programming paradigms
  • Experience with AI/ML frameworks such as TensorFlow and PyTorch
  • Project experience with prompt tuning or fine-tuning of mainstream large language models such as ChatGPT (3.5, 4.0) and Meta Llama 2
  • Fluent in English
  • At least 3 days working in office per week, and at least 6 months as the internship duration
Job Responsibility
  • Drive the transformation of IKEA into a more data-driven company by building and operating modern platforms and systems that are aligned with our constantly evolving data and AI landscape
  • Build the AI ecosystem at a top retail company
  • Work with PB-level data from IKEA ecosystems (online channels, retail, customer fulfillment, etc.)
  • Work with top talents and get a jumpstart to your career
  • Parttime

Data Engineer, 2025/2026 Intern

Join Atlassian as an intern and spend your summer with us having an impact on ho...
Location:
Australia, Sydney
Salary:
Not provided
Atlassian
Expiration Date
Until further notice
Requirements
  • Be currently enrolled in a Bachelor's or Master's program in Software Engineering, Computer Science, or another related technical field, completing your studies before January 2027
  • Experience programming with Python, or other related object-oriented programming languages
  • Knowledge of data structures, in particular how they are implemented and how to apply them to meet data challenges
  • Proficiency in SQL and relational databases experience
  • Demonstrated interest in the Data Engineering field through academic coursework, previous work or internship experience, or personal projects
Job Responsibility
  • Influence product teams
  • Inform Data Science and Analytics Platform teams
  • Partner with data consumers and products to ensure the quality and usefulness of data assets
  • Help strategize measurement
  • Collect data
  • Generate insights
What we offer
  • Health coverage
  • Paid volunteer days
  • Wellness resources
  • Fulltime

Data Analyst/CAE Engineering Intern

The Data Analyst/CAE Engineering Internship is designed to offer a technically f...
Location:
Poland, Krakow
Salary:
Not provided
BorgWarner
Expiration Date
Until further notice
Requirements
  • Student of Mechanical or Mechatronics Engineering, IT, Automotive Engineering or a related field
  • Availability of a minimum of 2.5-3 days per week
  • Very good written and spoken English is required
  • Willingness to learn new tools and processes
  • Self-directed / self-motivated personality
Job Responsibility
  • Data processing & analysis - extracting, cleaning, and organizing big sets of data
  • Developing scripts (e.g. in Python or MATLAB) to automate repetitive data processing tasks and reporting
  • Implementing ML algorithms to identify patterns in large sets of simulation data and predict component behaviour
  • Assisting senior CAE engineers in setting up and running structural, NVH or other simulations for iDMs
  • Preparing 3D models (meshing, cleaning geometry) and defining boundary conditions under supervision
  • Creating technical reports and visualizations to summarize simulation findings
  • Participating in design reviews and inspections of mechanical parts and assemblies
  • Assisting in conducting mechanical testing, including data collection and initial analysis
  • Help investigate component issues and support root cause analysis activities
  • Follow engineering processes and quality standards throughout assigned tasks
What we offer
  • A paid internship lasting 6 months at an international automotive company
  • Flexible working hours tailored to your needs
  • Hands-on experience and an opportunity to gain valuable professional experience in a high-paced environment
  • Comprehensive support and mentorship from experienced professionals in the field
  • Networking opportunities with industry leaders and peers to build your professional connections
  • Parttime

Data and Analytics Engineer Intern

The Data and Analytics Engineer Intern will assist in designing, building, and t...
Location:
United States, Irvine
Salary:
17.00 - 22.00 USD / Hour
Trace3
Expiration Date
Until further notice
Requirements
  • Enrollment in the Junior or Senior year of an undergraduate program or master’s program at an accredited college or university
  • Candidates should be pursuing a field of study applicable to the Data Intelligence internship
  • Cumulative grade point average (GPA) of 3.0 or better
  • Ability to work independently on assigned tasks and accept direction on given assignments
  • Self-motivated individuals with a customer mindset and desire to help people
  • Enthusiasm for technical problem solving with attention to detail and strong communication skills
  • Ability to learn and research in a dynamic and engaging environment
  • Availability to work 40 hours per week throughout the internship
Job Responsibility
  • Assist in designing, building, and testing data platforms and analytics solutions to generate actionable insights for our customers
  • Partner with our Data Intelligence Team to determine the best approach around data ingestion, structure, and storage, then work with the team to ensure these are implemented accurately
  • Contribute ideas on how to make our customers’ data more valuable and work with members of Trace3’s Engineering Team to implement solutions
What we offer
  • Comprehensive medical, dental and vision plans for you and your dependents
  • 401(k) Retirement Plan with Employer Match, 529 College Savings Plan, Health Savings Account, Life Insurance, and Long-Term Disability
  • Competitive Compensation
  • Training and development programs
  • Major offices stocked with snacks and beverages
  • Collaborative and cool culture
  • Work-life balance and generous paid time off
  • Fulltime

Data Engineer, Enterprise Data, Analytics and Innovation

Are you passionate about building robust data infrastructure and enabling innova...
Location:
United States
Salary:
110000.00 - 125000.00 USD / Year
Vaniam Group
Expiration Date
Until further notice
Requirements
  • 5+ years of professional experience in data engineering, ETL, or related roles
  • Strong proficiency in Python and SQL for data engineering
  • Hands-on experience building and maintaining pipelines in a lakehouse or modern data platform
  • Practical understanding of Medallion architectures and layered data design
  • Familiarity with modern data stack tools, including: Spark or PySpark; workflow orchestration (Airflow, dbt, or similar); testing and observability frameworks; containers (Docker) and Git-based version control
  • Excellent communication skills, problem-solving mindset, and a collaborative approach
Job Responsibility
  • Design, build, and operate reliable ETL and ELT pipelines in Python and SQL
  • Manage ingestion into Bronze, standardization and quality in Silver, and curated serving in Gold layers of our Medallion architecture
  • Maintain ingestion from transactional MySQL systems into Vaniam Core to keep production data flows seamless
  • Implement observability, data quality checks, and lineage tracking to ensure trust in all downstream datasets
  • Develop schemas, tables, and views optimized for analytics, APIs, and product use cases
  • Apply and enforce best practices for security, privacy, compliance, and access control, ensuring data integrity across sensitive healthcare domains
  • Maintain clear and consistent documentation for datasets, pipelines, and operating procedures
  • Lead the integration of third-party datasets, client-provided sources, and new product-generated data into Vaniam Core
  • Partner with product and innovation teams to build repeatable processes for onboarding new data streams
  • Ensure harmonization, normalization, and governance across varied data types (scientific, engagement, operational)
What we offer
  • 100% remote environment with opportunities for local meet-ups
  • Positive, diverse, and supportive culture
  • Passionate about serving clients focused on Cancer and Blood diseases
  • Investment in you with opportunities for professional growth and personal development through Vaniam Group University
  • Health benefits – medical, dental, vision
  • Generous parental leave benefit
  • Focused on your financial future with a 401(k) Plan and company match
  • Work-Life Balance and Flexibility
  • Flexible Time Off policy for rest and relaxation
  • Volunteer Time Off for community involvement
  • Fulltime

Crypto Data Scientist / Machine Learning - LLM Engineer Intern

Token Metrics is searching for a highly capable machine learning engineer to opt...
Location:
United States, Houston
Salary:
Not provided
Token Metrics
Expiration Date
Until further notice
Requirements
  • Bachelor's degree in computer science, data science, mathematics, or a related field
  • Master’s degree in computational linguistics, data science, data analytics, or similar will be advantageous
  • At least two years' experience as a machine learning engineer
  • Advanced proficiency with Python, Java, and R
  • Extensive knowledge of ML frameworks, libraries, data structures, data modeling, and software architecture
  • LLM fine-tuning experience and working with LLM Observability
  • In-depth knowledge of mathematics, statistics, and algorithms
  • Superb analytical and problem-solving abilities
  • Great communication and collaboration skills
  • Excellent time management and organizational abilities
Job Responsibility
  • Consulting with the manager to determine and refine machine learning objectives
  • Designing machine learning systems and self-running artificial intelligence (AI) to automate predictive models
  • Transforming data science prototypes and applying appropriate ML algorithms and tools
  • Ensuring that algorithms generate accurate user recommendations
  • Solving complex problems with multi-layered data sets, as well as optimizing existing machine learning libraries and frameworks
  • Developing ML algorithms to analyze huge volumes of historical data to make predictions
  • Stress testing, performing statistical analysis, and interpreting test results for all market conditions
  • Documenting machine learning processes
  • Keeping abreast of developments in machine learning