
Data Engineer + Scientist Hybrid

Spoak Decor

Location:
Not provided

Contract Type:
Not provided

Salary:
115000.00 - 150000.00 USD / Year

Job Description:

Spoak is looking for a hybrid data engineer / data scientist to join us on our mission to build the world’s most loved interior design platform. We are committed to using data to drive our business and roadmap, and we need someone who can develop and maintain our data infrastructure and apply advanced analytics techniques to uncover insights that help us grow the business and achieve our mission.

Job Responsibilities:

  • Design, build and maintain our data infrastructure, including ETL pipelines and databases
  • Develop and implement advanced analytics models and algorithms to uncover insights that can be used to optimize our products and customer experience
  • Work closely with product managers, designers, and engineers to identify data needs and build out new data-driven features
  • Develop and maintain data documentation, ensuring that our data is accurate, consistent, and well-documented
  • Participate in cross-functional projects and collaborate with other teams to share insights and knowledge

Requirements:

  • Bachelor's degree in computer science, statistics, mathematics or a related field
  • Strong knowledge of data engineering and data science concepts and techniques, including ETL, data warehousing, statistical modeling, machine learning, and data visualization
  • Proficiency in programming languages such as Python, R, or SQL
  • Experience with cloud platforms such as AWS or GCP
  • Ability to work collaboratively in a fast-paced startup environment
  • Excellent communication skills and ability to explain technical concepts and insights to non-technical stakeholders

Nice to have:

  • Experience with data visualization tools such as Tableau or Power BI
  • Experience with distributed computing systems such as Hadoop or Spark
  • Experience with big data technologies such as Apache Kafka, BigQuery or Cassandra
  • Experience with containerization technologies such as Docker or Kubernetes
  • Experience with machine learning platforms like TensorFlow or PyTorch

What we offer:

  • To build an amazing company from scratch
  • To build tools that enable creativity
  • Remote-first team, EST hours
  • Medical, dental, and vision insurance
  • 401K
  • A four-day work week every other week
  • Flexible time-off
  • Monthly virtual team events
  • A close-knit team

Additional Information:

Job Posted:
January 20, 2026

Work Type:
Remote work

Similar Jobs for Data Engineer + Scientist Hybrid

Data Engineer II

We’re looking for a Data Engineer II to join our Decisions & Insights team — the...

Location:
United States, San Francisco

Salary:
101250.00 - 162000.00 USD / Year

Company:
Axon

Expiration Date:
Until further notice

Requirements:
  • 3+ years of experience as a Data Engineer, Analytics Engineer, or similar hybrid role
  • Advanced SQL skills with strong understanding of schema design and analytical database patterns
  • Strong Python experience for data manipulation, scripting, and automation
  • Strong analytical mindset, attention to detail, and a passion for turning complex datasets into clear insights
  • Solid Git experience for collaboration and version control
  • Experience integrating and analyzing data from AWS services (e.g., Redshift, S3, APIs)

Job Responsibilities:
  • Design, develop, and maintain automated ETL/ELT pipelines and analytical models in Redshift using SQL, dbt, or SQLMesh
  • Build clean, structured datasets that enable fast, self-serve insights for PMs, analysts, and operational agents
  • Create dashboards and analytical tools (e.g., Tableau) that make insights intuitive and actionable
  • Optimize SQL queries, schemas, and table designs for high-volume analytics workloads
  • Partner with product managers, data scientists, and engineering teams to deliver analysis that supports product and operational decisions
  • Use Python to automate processing, improve pipeline reliability, and support advanced analytical workflows
  • Apply best practices for Git-based version control, testing, documentation, and data quality monitoring
  • Participate in design reviews for new analytical features and data systems

What we offer:
  • Competitive salary and 401k with employer match
  • Discretionary paid time off
  • Paid parental leave for all
  • Medical, Dental, Vision plans
  • Fitness Programs
  • Emotional & Mental Wellness support
  • Learning & Development programs
  • Snacks in our offices
  • Fulltime

Data Engineer, Solutions Architecture

We are seeking a talented Data Engineer to design, build, and maintain our data ...

Location:
United States, Scottsdale

Salary:
90000.00 - 120000.00 USD / Year

Company:
Clearway Energy

Expiration Date:
Until further notice

Requirements:
  • 2-4 years of hands-on data engineering experience in production environments
  • Bachelor's degree in Computer Science, Engineering, or a related field
  • Proficiency in Dagster or Airflow for pipeline scheduling, dependency management, and workflow automation
  • Advanced-level Snowflake administration, including virtual warehouses, clustering, security, and cost optimization
  • Proficiency in dbt for data modeling, testing, documentation, and version control of analytical transformations
  • Strong Python and SQL skills for data processing and automation
  • 1-2+ years of experience with continuous integration and continuous deployment practices and tools (Git, GitHub Actions, GitLab CI, or similar)
  • Advanced SQL skills, database design principles, and experience with multiple database platforms
  • Proficiency in AWS/Azure/GCP data services, storage solutions (S3, Azure Blob, GCS), and infrastructure as code
  • Experience with APIs, streaming platforms (Kafka, Kinesis), and various data connectors and formats

Job Responsibilities:
  • Design, deploy, and maintain scalable data infrastructure to support enterprise analytics and reporting needs
  • Manage Snowflake instances, including performance tuning, security configuration, and capacity planning for growing data volumes
  • Optimize query performance and resource utilization to control costs and improve processing speed
  • Build and orchestrate complex ETL/ELT workflows using Dagster to ensure reliable, automated data processing for asset management and energy trading
  • Develop robust data pipelines that handle high-volume, time-sensitive energy market data and asset generation and performance metrics
  • Implement workflow automation and dependency management for critical business operations
  • Develop and maintain dbt models to transform raw data into business-ready analytical datasets and dimensional models
  • Create efficient SQL-based transformations for complex energy market calculations and asset performance metrics
  • Support advanced analytics initiatives through proper data preparation and feature engineering
  • Implement comprehensive data validation, testing, and monitoring frameworks to ensure accuracy and consistency across all energy and financial data assets

What we offer:
  • generous PTO
  • medical, dental & vision care
  • HSAs with company contributions
  • health FSAs
  • dependent daycare FSAs
  • commuter benefits
  • relocation
  • a 401(k) plan with employer match
  • a variety of life & accident insurances
  • fertility programs
  • Fulltime

Junior Data Scientist

Aramark Sports + Entertainment is hiring a Junior Data Scientist - Oracle Park, ...

Location:
United States, San Francisco

Salary:
70000.00 - 95000.00 USD / Year

Company:
Aramark

Expiration Date:
Until further notice

Requirements:
  • Must be legally authorized to work in the United States without the need for current or future employment-based sponsorship from Aramark
  • Bachelor’s degree in Mathematics, Statistics, Computer Science, Data Science, or a related field; equivalent practical experience may be considered
  • 1–3 years of experience in an analytical or data science role
  • Proficiency in data manipulation and transformation using Python or R
  • Familiarity with SQL for data querying and analysis
  • Knowledge of data science workflows including data cleaning, feature engineering, and predictive modeling
  • Effective organizational and time management skills, with the ability to manage multiple projects simultaneously
  • Solid understanding of statistics, experimental design, and core data science concepts
  • Strong communication skills with the ability to present findings and recommendations to technical and non-technical audiences

Job Responsibilities:
  • Analyze consumer behavior at Oracle Park by leveraging purchasing and dining data to identify key customer segments. Present insights using statistical methods and engaging visualizations tailored to diverse stakeholder audiences
  • Evaluate operational performance by integrating data from labor tracking, Point of Sale (POS), and inventory systems. Identify inefficiencies and recommend actionable improvements to enhance venue operations
  • Conduct ad-hoc analyses to assess the effectiveness of short-term strategies. Collaborate with cross-functional teams to define success metrics and deliver timely, data-backed evaluations
  • Support the development of automated reporting workflows that deliver key performance metrics to stakeholders, including Oracle Park operations and the San Francisco Giants
  • Assist in building scalable data pipelines using Python, R, and SQL to streamline data access and support analytics and reporting initiatives
  • Perform machine learning experiments and model evaluation tasks under the guidance of the team’s Lead Data Scientist

What we offer:
  • medical, dental, vision, and work/life resources
  • retirement savings plans like 401(k)
  • paid days off such as parental leave and disability coverage
  • Fulltime

Technical Lead – AI/ML & Data Platforms

We are seeking a Technical Lead with strong managerial capabilities to drive the...

Location:
United States, Sunnyvale

Salary:
Not provided

Company:
Thirdeye Data

Expiration Date:
Until further notice

Requirements:
  • Strong expertise in data pipelines, architecture, and analytics platforms (e.g., Snowflake, Tableau)
  • Experience reviewing and optimizing data transformations, aggregations, and business logic
  • Hands-on familiarity with LLMs and practical RAG implementations
  • Knowledge of AI/ML workflows, model lifecycle management, and experimentation frameworks
  • Proven experience in managing complex, multi-track projects
  • Skilled in project tracking and collaboration tools (Jira, Confluence, or equivalent)
  • Excellent communication and coordination skills with technical and non-technical stakeholders
  • Experience working with cross-functional, globally distributed teams

Job Responsibilities:
  • Coordinate multiple workstreams simultaneously, ensuring timely delivery and adherence to quality standards
  • Facilitate daily stand-ups and syncs across global time zones, maintaining visibility and accountability
  • Understand business domains and technical architecture to enable informed decisions and proactive risk management
  • Collaborate with data engineers, AI/ML scientists, analysts, and product teams to translate business goals into actionable plans
  • Track project progress using Agile or hybrid methodologies, escalate blockers, and resolve dependencies
  • Own task lifecycle — from planning through execution, delivery, and retrospectives
  • Perform technical reviews of data pipelines, ETL processes, and architecture, identifying quality or design gaps
  • Evaluate and optimize data aggregation logic while ensuring alignment with business semantics
  • Contribute to the design and development of RAG pipelines and workflows involving LLMs
  • Create and maintain Tableau dashboards and reports aligned with business KPIs for stakeholders
  • Fulltime

Senior AI Project Manager - Broker Dealer & Wealth Management

Seeking a Senior AI Project Manager to lead AI and data-driven initiatives withi...

Location:
United States, El Segundo

Salary:
85.00 USD / Hour

Company:
Beacon Hill

Expiration Date:
Until further notice

Requirements:
  • 7+ years of project or program management experience, including AI/ML or data-driven initiatives
  • Direct experience in Broker Dealer & Wealth Management environments (required)
  • Proven delivery experience in financial services or highly regulated environments
  • Hands-on experience managing large, complex programs with minimal supervision
  • Strong understanding of the AI/ML lifecycle: data pipelines, model training/testing, evaluation, deployment, and MLOps
  • Working knowledge of data quality, governance, bias, privacy, and ethical AI considerations
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and AI/engineering tools (e.g., Jira, Git, TensorFlow, PyTorch)
  • Define and manage scope, schedule, cost, risk, resources, and quality
  • Create and maintain detailed project plans, milestones, dependencies, and reporting
  • Lead cross-functional teams including data scientists, ML engineers, data engineers, and business stakeholders

Job Responsibilities:
  • Lead AI and data-driven initiatives within a Broker Dealer & Wealth Management environment
  • Own delivery end-to-end, from project definition through deployment
  • Ensure alignment between business objectives, data teams, and regulatory requirements
  • Fulltime

Data Engineer

Location:
Vietnam, Hà Nội

Salary:
Not provided

Company:
CMC Global Company Limited.

Expiration Date:
Until further notice

Requirements:
  • Data Engineer with strong Hadoop / Spark / Talend experience
  • Experience building and operating large-scale data lakes and data warehouses
  • Experience with Hadoop ecosystem and big data tools, including Spark and Kafka
  • Experience with Master Data Management (MDM) tools and platforms such as Informatica MDM, Talend Data Catalog, Semarchy xDM, IBM PIM & IKC, or Profisee
  • Familiarity with MDM processes such as golden record creation, survivorship, reconciliation, enrichment, and quality
  • Experience in data governance, including data quality management, data profiling, data remediation, and automated data lineage
  • Experience with stream-processing systems including Spark-Streaming
  • Experience working with Cloud services using one or more Cloud providers such as Azure, GCP, or AWS
  • Experience with Delta Lake and Databricks
  • Advanced working experience with relational SQL and NoSQL databases, including Hive, HBase, and Postgres

Job Responsibilities:
  • Create and manage a single master record for each business entity, ensuring data consistency, accuracy, and reliability
  • Implement data governance processes, including data quality management, data profiling, data remediation, and automated data lineage
  • Create and maintain multiple robust and high-performance data processing pipelines within Cloud, Private Data Centre, and Hybrid data ecosystems
  • Assemble large, complex data sets from a wide variety of data sources
  • Collaborate with Data Scientists, Machine Learning Engineers, Business Analysts, and Business users to derive actionable insights and reliable foresights into customer acquisition, operational efficiency, and other key business performance metrics
  • Develop, deploy, and maintain multiple microservices, REST APIs, and reporting services
  • Design and implement internal processes to automate manual workflows, optimize data delivery, and re-design infrastructure for greater scalability
  • Establish expertise in designing, analyzing, and troubleshooting large-scale distributed systems
  • Support and work with cross-functional teams in a dynamic environment

What we offer:
  • Attractive compensation package: 14-month salary scheme plus annual bonus and additional allowances
  • Annual bonus package tailored based on performance and contribution
  • Young, open, and dynamic working environment that promotes innovation and creativity
  • Ongoing learning and development with regular professional training and opportunities to enhance both technical and soft skills
  • Exposure to cutting-edge technologies and diverse real-world enterprise projects
  • Vibrant company culture with regular team-building activities, sports tournaments, arts events, Family Day, and more
  • Full compliance with Vietnamese labor laws, plus additional internal perks such as annual company trips, special holidays, and other corporate benefits
  • Fulltime

Senior AI Project Manager

Seeking a Senior AI Project Manager to lead AI and data-driven initiatives withi...

Location:
United States, Los Angeles

Salary:
90.00 USD / Hour

Company:
Beacon Hill

Expiration Date:
Until further notice

Requirements:
  • 7+ years of project or program management experience, including AI/ML or data-driven initiatives
  • Direct experience in Broker Dealer & Wealth Management environments (required)
  • Proven delivery experience in financial services or highly regulated environments
  • Hands-on experience managing large, complex programs with minimal supervision
  • Strong understanding of the AI/ML lifecycle: data pipelines, model training/testing, evaluation, deployment, and MLOps
  • Working knowledge of data quality, governance, bias, privacy, and ethical AI considerations
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and AI/engineering tools (e.g., Jira, Git, TensorFlow, PyTorch)
  • Define and manage scope, schedule, cost, risk, resources, and quality
  • Create and maintain detailed project plans, milestones, dependencies, and reporting
  • Lead cross-functional teams including data scientists, ML engineers, data engineers, and business stakeholders

Job Responsibilities:
  • Lead AI and data-driven initiatives within a Broker Dealer & Wealth Management environment
  • Own delivery end-to-end, from project definition through deployment
  • Ensure alignment between business objectives, data teams, and regulatory requirements
  • Fulltime

Senior AI Project Manager – Broker Dealer & Wealth Management

Seeking a Senior AI Project Manager to lead AI and data-driven initiatives withi...

Location:
United States, Los Angeles

Salary:
90.00 USD / Hour

Company:
Beacon Hill

Expiration Date:
Until further notice

Requirements:
  • 7+ years of project or program management experience, including AI/ML or data-driven initiatives
  • Direct experience in Broker Dealer & Wealth Management environments (required)
  • Proven delivery experience in financial services or highly regulated environments
  • Hands-on experience managing large, complex programs with minimal supervision
  • Strong understanding of the AI/ML lifecycle: data pipelines, model training/testing, evaluation, deployment, and MLOps
  • Working knowledge of data quality, governance, bias, privacy, and ethical AI considerations
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and AI/engineering tools (e.g., Jira, Git, TensorFlow, PyTorch)
  • Define and manage scope, schedule, cost, risk, resources, and quality
  • Create and maintain detailed project plans, milestones, dependencies, and reporting
  • Lead cross-functional teams including data scientists, ML engineers, data engineers, and business stakeholders

Job Responsibilities:
  • Lead AI and data-driven initiatives within a Broker Dealer & Wealth Management environment
  • Own delivery end-to-end, from project definition through deployment
  • Ensure alignment between business objectives, data teams, and regulatory requirements
  • Fulltime