Data and Analytics Engineer Intern

Trace3

Location:
United States, Irvine

Contract Type:
Not provided

Salary:
17.00 - 22.00 USD / Hour

Job Description:

The Data and Analytics Engineer Intern will assist in designing, building, and testing data platforms and analytics solutions to generate actionable insights for our customers.

Job Responsibility:

  • Assist in designing, building, and testing data platforms and analytics solutions to generate actionable insights for our customers
  • Partner with our Data Intelligence Team to determine the best approach around data ingestion, structure, and storage, then work with the team to ensure these are implemented accurately
  • Contribute ideas on how to make our customers’ data more valuable and work with members of Trace3’s Engineering Team to implement solutions

Requirements:

  • Enrollment in the Junior or Senior year of an undergraduate program or master’s program at an accredited college or university
  • Candidates should be pursuing a field of study applicable to the Data Intelligence internship
  • Cumulative grade point average (GPA) of 3.0 or better
  • Ability to work independently on assigned tasks and accept direction on given assignments
  • Self-motivation with a customer mindset and a desire to help people
  • Enthusiasm for technical problem solving with attention to detail and strong communication skills
  • Ability to learn and research in a dynamic and engaging environment
  • Availability to work 40 hours per week throughout the internship

Nice to have:

  • Academic, professional, or internship experience working in a professional setting is a plus
  • A basic level of knowledge of data systems, data languages such as SQL or Python, or data visualization is a plus

What we offer:
  • Comprehensive medical, dental and vision plans for you and your dependents
  • 401(k) Retirement Plan with Employer Match, 529 College Savings Plan, Health Savings Account, Life Insurance, and Long-Term Disability
  • Competitive Compensation
  • Training and development programs
  • Major offices stocked with snacks and beverages
  • Collaborative and cool culture
  • Work-life balance and generous paid time off

Additional Information:

Job Posted:
December 13, 2025

Employment Type:
Full-time

Similar Jobs for Data and Analytics Engineer Intern

Data Engineer, 2025/2026 Intern

Join Atlassian as an intern and spend your summer with us having an impact on ho...
Location: Australia, Sydney
Salary: Not provided
Atlassian
Expiration Date: Until further notice
Requirements
  • Be currently enrolled in a Bachelor's or Master's program in Software Engineering, Computer Science, or another related technical field, and completing your studies before January 2027
  • Experience programming with Python, or other related object-oriented programming languages
  • Knowledge of data structures, in particular how they are implemented and how to apply them to meet data challenges
  • Proficiency in SQL and experience with relational databases
  • Demonstrated interest in the Data Engineering field through academic coursework, previous work or internship experience, or personal projects
Job Responsibility
  • Influence product teams
  • Inform Data Science and Analytics Platform teams
  • Partner with data consumers and products to ensure quality and usefulness of data assets
  • Help strategize measurement, collect data, and generate insights
What we offer
  • health coverage
  • paid volunteer days
  • wellness resources
Employment Type: Full-time

Product Data Engineering Intern

Product Data Engineering Intern role at Hewlett Packard Enterprise. This is an o...
Location: Puerto Rico, Aguadilla
Salary: Not provided
Hewlett Packard Enterprise
Expiration Date: Until further notice
Requirements
  • Currently pursuing a Bachelor's degree in Systems Engineering, Industrial Engineering or Computer Engineering
  • Familiarity with SAP
  • Basic programming or scripting knowledge (e.g., Python, Java, C++)
  • Strong interest in high-tech and passion for learning
  • Excellent communication and interpersonal skills
  • Strong problem-solving and analytical skills
  • Time management skills and working with strict deadlines
  • A collaborative, solution-focused mindset and overall sense of urgency
Job Responsibility
  • Support senior team members on assigned technical projects
  • Help identify and troubleshoot technical issues, providing support and suggesting solutions
  • Assist with maintaining and updating hardware, software, and other technical systems
  • Participate in team activities by attending team meetings, learning about project methodologies, and collaborating effectively with colleagues
  • Actively engage in learning about new technologies and methodologies relevant to work
  • Fulfill tasks and responsibilities assigned by a supervisor in a timely and efficient manner
  • Participate in periodic reviews to share updates and incorporate feedback on assigned projects/initiatives
What we offer
  • Health & Wellbeing benefits
  • Personal & Professional Development programs
  • Unconditional Inclusion environment
  • Comprehensive suite of benefits supporting physical, financial and emotional wellbeing
Employment Type: Full-time

Data Analytics Engineer

SDG Group is expanding its global Data & Analytics practice and is seeking a mot...
Location: Egypt, Cairo
Salary: Not provided
SDG
Expiration Date: Until further notice
Requirements
  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field
  • Hands-on experience in DataOps / Data Engineering
  • Strong knowledge in Databricks OR Snowflake (one of them is mandatory)
  • Proficiency in Python and SQL
  • Experience with Azure data ecosystem (ADF, ADLS, Synapse, etc.)
  • Understanding of CI/CD practices and DevOps for data
  • Knowledge of data modeling, orchestration frameworks, and monitoring tools
  • Strong analytical and troubleshooting skills
  • Eagerness to learn and grow in a global consulting environment
Job Responsibility
  • Design, build, and maintain scalable and reliable data pipelines following DataOps best practices
  • Work with modern cloud data stacks using Databricks (Spark, Delta Lake) or Snowflake (Snowpipe, tasks, streams)
  • Develop and optimize ETL/ELT workflows using Python, SQL, and orchestration tools
  • Work with Azure data services (ADF, ADLS, Azure SQL, Azure Functions)
  • Implement CI/CD practices using Azure DevOps or Git-based workflows
  • Ensure data quality, consistency, and governance across all delivered data solutions
  • Monitor and troubleshoot pipelines for performance and operational excellence
  • Collaborate with international teams, architects, and analytics consultants
  • Contribute to technical documentation and solution design assets
What we offer
  • Remote working model aligned with international project needs
  • Opportunity to work on European and global engagements
  • Mentorship and growth paths within SDG Group
  • A dynamic, innovative, and collaborative environment
  • Access to world-class training and learning platforms
Employment Type: Full-time

Data Engineer, Enterprise Data, Analytics and Innovation

Are you passionate about building robust data infrastructure and enabling innova...
Location: United States
Salary: 110000.00 - 125000.00 USD / Year
Vaniam Group
Expiration Date: Until further notice
Requirements
  • 5+ years of professional experience in data engineering, ETL, or related roles
  • Strong proficiency in Python and SQL for data engineering
  • Hands-on experience building and maintaining pipelines in a lakehouse or modern data platform
  • Practical understanding of Medallion architectures and layered data design
  • Familiarity with modern data stack tools, including: Spark or PySpark; workflow orchestration (Airflow, dbt, or similar); testing and observability frameworks; containers (Docker) and Git-based version control
  • Excellent communication skills, problem-solving mindset, and a collaborative approach
Job Responsibility
  • Design, build, and operate reliable ETL and ELT pipelines in Python and SQL
  • Manage ingestion into Bronze, standardization and quality in Silver, and curated serving in Gold layers of our Medallion architecture
  • Maintain ingestion from transactional MySQL systems into Vaniam Core to keep production data flows seamless
  • Implement observability, data quality checks, and lineage tracking to ensure trust in all downstream datasets
  • Develop schemas, tables, and views optimized for analytics, APIs, and product use cases
  • Apply and enforce best practices for security, privacy, compliance, and access control, ensuring data integrity across sensitive healthcare domains
  • Maintain clear and consistent documentation for datasets, pipelines, and operating procedures
  • Lead the integration of third-party datasets, client-provided sources, and new product-generated data into Vaniam Core
  • Partner with product and innovation teams to build repeatable processes for onboarding new data streams
  • Ensure harmonization, normalization, and governance across varied data types (scientific, engagement, operational)
What we offer
  • 100% remote environment with opportunities for local meet-ups
  • Positive, diverse, and supportive culture
  • Passionate about serving clients focused on Cancer and Blood diseases
  • Investment in you with opportunities for professional growth and personal development through Vaniam Group University
  • Health benefits – medical, dental, vision
  • Generous parental leave benefit
  • Focused on your financial future with a 401(k) Plan and company match
  • Work-Life Balance and Flexibility
  • Flexible Time Off policy for rest and relaxation
  • Volunteer Time Off for community involvement
Employment Type: Full-time

Senior Analytics Engineer

Join Qargo as a Senior Analytics Engineer and turn complex logistics data into c...
Location: Belgium, Ghent
Salary: Not provided
Qargo
Expiration Date: Until further notice
Requirements
  • Extensive experience in data analytics, BI engineering, or analytics engineering in a SaaS or data-driven environment
  • Strong proficiency in SQL, data modelling, and dashboarding tools (Lightdash, Superset, Tableau, Looker, or PowerBI)
  • Experience with analytics platforms such as Mixpanel and user behavior tracking methodologies is a plus
  • Proven ability to collaborate effectively with multidisciplinary teams (product, engineering, sales, finance)
  • Strong analytical and problem-solving skills, with an eye for scalability and data quality
  • A background in Computer Science, Data Engineering, Statistics, or a related field
  • A proactive mindset with the ability to take ownership of complex problems and guide them to completion
Job Responsibility
  • Own and evolve Qargo’s data architecture, ensuring scalable, reliable and high-quality data pipelines
  • Maintain, extend and optimise our in-app Lightdash dashboards, including operational, financial, and performance insights
  • Build and refine internal BI dashboards to support data-driven decision-making across departments
  • Develop account management dashboards that reveal feature adoption, revenue insights, and upsell opportunities
  • Create product and engineering analytics, including feature usage dashboards using Mixpanel or custom-built tracking solutions
  • Lead integration-related reporting, providing visibility on integration health, tenant usage, and performance
  • Build billing and cost-analysis dashboards, including API call cost allocation and tenant-level breakdowns
  • Mentor team members, set best practices, and raise the bar for analytics engineering within Qargo
What we offer
  • Real impact and ownership in a growing international scale-up
  • A supportive and collaborative team culture
  • Hybrid working setup with flexibility and trust
  • Opportunities to learn, grow, and expand your technical knowledge
  • Competitive salary and benefits package

Data Analytics Engineer

Beacon Biosignals is seeking an experienced data analytics engineer to strengthe...
Location: United States, Boston
Salary: Not provided
Beacon Biosignals
Expiration Date: Until further notice
Requirements
  • Extensive experience developing, testing, and maintaining SQL transformations to deliver reproducible analytic results in a production setting atop data platforms such as Snowflake or Databricks
  • Proficiency in a scientific programming language such as Julia, Python, or R
  • A rigorous approach to documentation and automated testing
  • Experience deploying automated analytics pipelines to a production environment, monitoring their performance, and maintaining them over time
  • A collaborative mindset
  • Strong asynchronous communication skills and a knack for making the most of synchronous collaboration
  • Product-oriented intuition for turning individual stakeholder needs into configurable solutions and broadly applicable features
  • Familiarity with or interest in learning about analyses that power clinical trials, clinical diagnostics, or other medical applications that leverage data organization, transformation, and statistics
Job Responsibility
  • Design, implement, and maintain data models and transformations that enrich biosignal metrics with clinical context, powered by Beacon's biosignal data warehouse
  • Ensure that data pipelines and products produce reproducible results by owning CI test suites and versioning strategies for data models, dashboards, and reports
  • Build reusable, scalable customer-facing analytics products, such as automated scientific reports and dashboards that can be configured and reused without custom engineering
  • Improve user documentation and tooling to enable customers and internal scientists to use the analytics layer to answer new scientific questions
  • Deploy and monitor analytics and feature computation services that transform raw biosignal data and clinical metadata to generate meaningful scientific results
  • Collaborate with scientific subject matter experts, product managers, and customers to identify high-impact analytics improvements that accelerate therapy development
What we offer
  • Equity
  • PTO

Head of Data Infrastructure & Analytics

As our remote Head of Data Infrastructure & Analytics, you will take the helm of...
Location: United States
Salary: 200000.00 USD / Year
Puffy
Expiration Date: Until further notice
Requirements
  • 7+ years in digital analytics with proven track record of owning data quality in high-stakes B2C environments
  • Expert-level knowledge: GA4, GTM, server-side tracking, data pipeline architecture, e-commerce tracking
  • Can diagnose and fix tracking setup errors across the entire stack
  • Strong technical skills: SQL expert, Python or R proficiency for data validation
  • Data engineering knowledge: Understand ETL/ELT, data warehousing, API integrations, event-driven architecture
  • Experience building data quality systems: Automated validation checks, data observability, monitoring frameworks
  • Proven team leadership: Can manage 3+ person analytics team
  • Experience with data observability tools (Monte Carlo, Great Expectations, dbt tests)
  • Background in analytics engineering or data reliability engineering
  • Track record in high-growth e-commerce (sleep, wellness, DTC)
Job Responsibility
  • Data Quality Infrastructure (Your #1 Priority)
  • Audit existing tracking setup (GA4, GTM, Shopify, Klaviyo, ad platforms) and fix every configuration error
  • Build automated validation systems that catch bugs before humans see them
  • Implement pre-leadership QA processes with hard gates
  • Create monitoring and alerting
  • Technical Architecture & Reliability
  • Design and maintain server-side tracking implementation for attribution accuracy
  • Architect data pipelines that handle $2M+ monthly ad spend tracking without errors
  • Own end-to-end data flow: website events → collection → transformation → warehouse → reporting
  • Establish technical standards
What we offer
  • Continuous learning
  • 10% monthly bonus
  • Premium insurance
  • Achievement recognition
  • Free snacks and lunches
  • Generous annual leave
  • AI tools and tech stack
  • 18+ nationalities
  • Social events & activities
  • Learning and development support (we pay for courses you need to upskill)
Employment Type: Full-time

Senior Analytics Engineer

Senior Analytics Engineer role at Fever, building a federated data organization ...
Location: Spain, Madrid
Salary: Not provided
Fever
Expiration Date: Until further notice
Requirements
  • Bachelor's, Master's, or PhD in Computer Engineering, Data Engineering, Data Science, or related field
  • Strong experience with SQL and data modeling (star/snowflake schemas, data vault, or similar)
  • Hands-on experience with DBT (or similar transformation frameworks)
  • Hands-on experience with Python and orchestration frameworks (Airflow, Dagster, Prefect)
  • Familiarity with modern cloud data warehouses (Snowflake, BigQuery, Redshift, etc.)
  • Experience with BI tools (Metabase, Superset, Tableau, etc.)
  • Understanding of data quality, governance, and observability practices
  • A collaborative mindset, comfortable working with both engineers and business stakeholders
  • Strong communication skills, adaptable to a multidisciplinary, international, fast-paced environment
Job Responsibility
  • Design, build, and maintain data models (DBT, SQL) that transform raw data into clean, trusted datasets
  • Collaborate with data engineering team to define and certify business-critical metrics
  • Work with business squads (B2B, Marketing, CRM, Product) to understand their needs and turn them into reusable data assets
  • Ensure data quality and consistency through testing frameworks, observability, and governance
  • Support self-service analytics by enabling stakeholders to explore data confidently
  • Contribute to Airflow pipelines in Python for automation and orchestration
  • Contribute to the data mesh vision by creating domain-owned datasets
What we offer
  • Attractive compensation package with base salary and performance bonus
  • Stock options
  • 40% discount on all Fever events and experiences
  • Home office friendly
  • Health insurance
  • Flexible remuneration with 100% tax exemption through Cobee
  • English Lessons
  • Gympass Membership
  • Possibility to receive salary in advance by Payflow
Employment Type: Full-time