Data Analyst - Operations & Engineering Support

Robert Half

Location:
United States, Salt Lake City

Contract Type:
Not provided

Salary:
Not provided

Job Description:

This isn’t a traditional “pull reports and make charts” analyst role. As an entry-level Data Analyst, you’ll operate at the intersection of data engineering, technical support, and operations. You’ll help troubleshoot data issues, support pipeline reliability, and fulfill ad-hoc data requests while gaining hands-on exposure to the full data ecosystem. This is a fast-paced, high-impact role where adaptability, curiosity, and problem-solving are key. You don’t need to know everything on day one; we’re looking for someone proactive, resourceful, and eager to learn. The ideal candidate takes initiative, explores problems independently, and uses available tools (including AI) to find solutions efficiently.

Job Responsibility:

  • Data Issue Triage: Investigate and resolve L2/L3 data-related issues by analyzing datasets, writing queries, and clearly documenting findings for internal teams
  • Pipeline Monitoring & Support: Develop an understanding of data pipelines, identify potential points of failure, and proactively flag issues
  • Data Quality & Validation: Perform quality checks on datasets (both spatial and tabular) to ensure accuracy before distribution
  • Data Integration: Assist with onboarding new data sources into ETL processes, including setup, testing, and validation
  • AI-Driven Problem Solving: Leverage AI tools to accelerate troubleshooting, learn new systems, and improve efficiency—while applying sound judgment

Requirements:

  • SQL: Comfortable writing queries using joins, filters, and aggregations to explore and analyze data
  • Programming Exposure: Familiarity with Python, R, or similar scripting languages
  • Ticketing Systems: Experience with tools like Jira, Asana, or similar for managing and tracking work
  • Version Control: Basic understanding of Git (commits, branches, pull requests)
  • 0–2 years of experience in data analytics, technical support, or a related field
  • Naturally curious and resourceful; able to research, test, and problem-solve independently
  • Strong attention to detail with a focus on data accuracy
  • Comfortable working in a dynamic environment with shifting priorities
  • Clear and concise communicator, especially in written formats (tickets, documentation, messaging tools)
  • Interested in growing toward a data engineering career path
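
The SQL bullet above (joins, filters, aggregations) is roughly the level of this minimal sketch; the `customers` and `orders` tables are hypothetical, invented purely for illustration and unrelated to any actual company schema:

```python
import sqlite3

# Build a tiny in-memory database with two illustrative tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'West'), (2, 'East');
    INSERT INTO orders VALUES (10, 1, 50.0), (11, 1, 30.0), (12, 2, 20.0);
""")

# Join orders to customers, filter to one region, aggregate per customer.
rows = conn.execute("""
    SELECT c.id, COUNT(*) AS n_orders, SUM(o.amount) AS total_spend
    FROM orders AS o
    JOIN customers AS c ON c.id = o.customer_id
    WHERE c.region = 'West'
    GROUP BY c.id
""").fetchall()
print(rows)  # [(1, 2, 80.0)]
```

Comfort with that shape of query (one join, one filter, one grouped aggregate) is the bar being described, not advanced query tuning.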

Nice to have:

  • Experience with spatial data (GeoJSON, Shapefiles) or GIS tools (QGIS, ArcGIS)
  • Exposure to ETL processes, data warehousing, or pipeline tools (e.g., Airflow, dbt)
  • Familiarity with BI tools such as Tableau, Looker, or Power BI
  • Background in SaaS, marketing analytics, tourism, or agency environments
  • Experience using AI tools in day-to-day workflows and the ability to articulate how they improve productivity

What we offer:
  • Medical, vision, dental, and life and disability insurance
  • Eligible to enroll in our company 401(k) plan
  • Free online training

Additional Information:

Job Posted:
March 21, 2026

Similar Jobs for Data Analyst - Operations & Engineering Support

Senior Support Operations Analyst

As a Senior Support Operations Analyst, you'll be the technical architect behind...
Location:
United States
Salary:
111220.00 - 133422.00 USD / Year
Babylist
Expiration Date:
Until further notice
Requirements:
  • 7+ years in Customer Support Operations, Business Intelligence/Analytics, or similar roles in high-growth, customer-facing environments
  • Strong analytical skills and operational instincts, with a track record of designing scalable processes that improve agent performance and user experience
  • Advanced SQL proficiency with experience writing complex queries, joins, and building data models for customer support analytics
  • 3+ years hands-on experience with Zendesk (or Salesforce Service Cloud), including building automation, workflows, triggers, and API integrations
  • Proven track record implementing and optimizing AI chatbot platforms such as Sierra, Decagon, Forethought, Ada, or similar enterprise solutions
  • Expert-level proficiency building dashboards and reports in Sigma, Tableau, or Power BI; able to work independently with minimal technical support
  • Experience integrating data across multiple systems and working with APIs
  • Experienced in building and maintaining reporting and performance analysis, including CSAT, cost per contact, and ROI-focused metrics
  • Proven ability to lead end-to-end change management for major tool rollouts and workflow changes, including stakeholder communication, training development, adoption tracking, and post-launch support
  • Familiar with AI support technologies and their practical integration into support systems (chatbots, knowledge base platforms)
Job Responsibility:
  • Ensure high-accuracy visibility into support performance through scalable tools (like dashboards) and internal reporting systems
  • Identify and work with our tech team to drive system improvements across the CS Tech stack (Zendesk, chatbots, phones platform)
  • Work cross-functionally with engineering, product, supply chain, and offshore support teams to identify root causes of user frustration and implement solutions
  • Partner with CS leadership to identify trends and key issues within our reporting frameworks and take active role in resolution
  • Support regular reporting cadences (weekly, monthly, quarterly), with summaries that guide cross-functional decisions
  • Surface actionable insights from data to support empathetic, user-centered decision-making and analyze trends across channels and issue types, supporting data-driven improvements to the support model
  • Drive adoption of AI-enabled workflows that reduce handle time and enhance support quality
  • Own and evolve self-service content, tools, and AI integrations such as Copilots and knowledge bases
  • Maintain ownership of CS tools and vendor relationships, ensuring we operate on a modern, AI-enabled stack that supports scale, agent effectiveness, and automation, while optimizing for quality, usability, and value
  • Identify inefficiencies and recommend solutions that improve agent effectiveness and reduce user friction
What we offer:
  • Competitive salary with equity and bonus opportunities
  • Company-paid medical, dental, and vision insurance
  • Retirement savings plan with company matching and flexible spending accounts
  • Generous paid parental leave and PTO
  • Remote work stipend to set up your office
  • Perks for physical, mental, and emotional health, parenting, childcare, and financial planning
  • Full-time

Data Engineering & Analytics Lead

Premium Health is seeking a highly skilled, hands-on Data Engineering & Analytic...
Location:
United States, Brooklyn
Salary:
Not provided
Premium Health
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Engineering, or a related field. Master's degree preferred
  • Proven track record and progressively responsible experience in data engineering, data architecture, or related technical roles; healthcare experience preferred
  • Strong knowledge of data engineering principles, data integration, ETL processes, and semantic mapping techniques and best practices
  • Experience implementing data quality management processes, data governance frameworks, cataloging, and master data management concepts
  • Familiarity with healthcare data standards (e.g., HL7, FHIR), health information management principles, and regulatory requirements (e.g., HIPAA)
  • Understanding of healthcare data, including clinical, operational, and financial data models, preferred
  • Advanced proficiency in SQL, data modeling, database design, optimization, and performance tuning
  • Experience designing and integrating data from disparate systems into harmonized data models or semantic layers
  • Hands-on experience with modern cloud-based data platforms (e.g., Azure, AWS, GCP)
Job Responsibility:
  • Collaborate with the CDIO and Director of Technology to define a clear data vision aligned with the organization's goals and execute the enterprise data roadmap
  • Serve as a thought leader for data engineering and analytics, guiding the evolution of our data ecosystem and championing data-driven decision-making across the organization
  • Build and mentor a small data team, providing technical direction and performance feedback, fostering best practices and continuous learning, while remaining a hands-on implementor
  • Define and implement best practices, standards, and processes for data engineering, analytics, and data management across the organization
  • Design, implement, and maintain a scalable, reliable, and high-performing modern data infrastructure, aligned with the organizational needs and industry best practices
  • Architect and maintain data lake/lakehouse, warehouse, and related platform components to support analytics, reporting, and operational use cases
  • Establish and enforce data architecture standards, governance models, naming conventions, and documentation
  • Develop, optimize, and maintain scalable ETL/ELT pipelines and data workflows to collect, transform, normalize, and integrate data from diverse systems
  • Implement robust data quality processes, validation, monitoring, and error-handling frameworks
  • Ensure data is accurate, timely, secure, and ready for self-service analytics and downstream applications
What we offer:
  • Paid Time Off, Medical, Dental and Vision plans, Retirement plans
  • Public Service Loan Forgiveness (PSLF)
  • Full-time

Manager, Ad Data Operations

As a Manager, Ad Data Operations, you’ll design, maintain and optimize technical...
Location:
United States, New York; Oakland
Salary:
97900.00 - 153300.00 USD / Year
SiriusXM
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related technical field
  • 5+ years of experience as a technical analyst or in data modeling on cloud data platforms
  • Expertise in SQL, dbt, and Snowflake; strong understanding of data warehousing and ELT architecture
  • Deep knowledge of dimensional modeling, metric standardization, and semantic layer principles
  • Experience working with modern data platforms (Snowflake preferred) and modeling tools like dbt
  • Strong stakeholder management skills; able to explain complex data logic to non-technical audiences
  • Familiarity with CI/CD, version control (Git), scripting (Python/Shell), automated testing, and analytics workflow orchestration
  • Excellent communication skills and a collaborative mindset
Job Responsibility:
  • Translate complex business and product KPIs into scalable semantic models using dbt, SQL, and Snowflake
  • Lead the design and maintenance of dimensional models that support standardized metric definitions across teams
  • Act as a key contributor in semantic layer architecture discussions and tool selection
  • Collaborate with product managers, architects, product analysts and data engineers to ensure clear and actionable KPIs and data requirements
  • Manage the modeling of KPIs and business logic through a semantic layer that supports self-service analytics and consistency across reports and dashboards
  • Act as a thought partner to analysts and business leads, helping them access and interpret data effectively
  • Own development standards including CI/CD workflows, model testing, performance optimization, and code reviews
  • Manage and analyze large-scale data volumes (1+ PB) across complex ecosystems that power streaming, podcasting, satellite, publisher, and advertiser use cases
What we offer:
  • discretionary short-term and long-term incentives
  • Full-time

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location:
United States, Lisle
Salary:
84835.61 - 149076.17 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field
  • Master's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field
  • Two (2) plus years of experience in Google Cloud with services like BigQuery, Composer, GCS, DataStream, Dataflows, BQML, Vertex AI
  • Six (6) plus years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics
  • Hands-on experience working with real-time, unstructured, and synthetic data
  • Experience in real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar
  • Expert knowledge of Python programming and SQL
  • Experience with cloud platforms (AWS, GCP, Azure) and their data services
  • Experience working with Airflow as a workflow management tool, building operators to connect, extract, and ingest data as needed
  • Familiarity with synthetic data generation and unstructured data processing
Job Responsibility:
  • Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets
  • Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth
  • Design, develop, deploy, and support high-performance data pipelines, both inbound and outbound
  • Model the data platform by applying business logic and building objects in the semantic layer of the data platform
  • Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics
  • Optimize data pipelines for performance, scalability, and reliability
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of data products
  • Ensure quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root
  • Document the design and support strategy of the data pipelines
  • Capture, store, and socialize data lineage and operational metadata
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Eligible to participate in an annual incentive program
  • Full-time

Senior F-35 Operations Analyst

Provide Advisory and Assistance Services (A&AS) to the F-35 United Operational T...
Location:
United States, Las Vegas
Salary:
Not provided
DCS Corporation
Expiration Date:
Until further notice
Requirements:
  • U.S. Citizenship
  • Bachelor’s degree plus 12 years or a Master’s Degree plus 10 years of relevant experience
  • Active Top Secret Clearance (SCI Eligible)
  • Experience in Air Force or Navy operations and tactics
  • Fighter aircraft knowledge and test experience
  • Experience developing test plans, reports and other formal technical documents
Job Responsibility:
  • Provide Advisory and Assistance Services (A&AS) to the F-35 United Operational Test Team (UOTT) with developing and managing Test Plans, Test Trials, Reports and developing/processing supporting documentation
  • Develop test plan and data analysis plan inputs
  • Develop data analysis tools and methodologies
  • Manage test data
  • Create products from test data to achieve test objectives
  • Provide written test report inputs that augment the generated technical products
  • Provide A&AS with developing and managing a unified AFOTEC Det 6 data analysis toolset shared across sites
  • Provide expertise in F-35 Operational Test (OT) flight data analysis techniques and software tool development
  • Participate in F-35 OT data analysis
  • Assist Mission Systems Engineer to develop appropriate analysis methodologies and analysis products to fuel test planning and reporting efforts
  • Full-time

Senior Data Engineer

As a Senior Data Engineer on our Core Engineering Data Team, you will design and...
Location:
United States, Boston
Salary:
111800.00 - 164000.00 USD / Year
SimpliSafe
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience
  • 4+ years of experience in software engineering, data engineering, or a related field, with at least 2 years focused on data operations or data infrastructure
  • Strong knowledge of AWS or other public cloud platforms (e.g., Azure, GCP)
  • Strong SQL knowledge and experience optimizing for data warehousing technologies like AWS Athena
  • Strong knowledge of Python for use in data transformation
  • Hands-on experience with ETL/ELT, schema design, and datalake technologies
  • Hands-on experience with data orchestration tools like Dagster, Airflow, or Prefect
  • Experience with CI/CD pipelines, Docker, Kubernetes, and infrastructure-as-code tools (e.g., Terraform, CloudFormation)
  • Familiarity with various data and table formats (JSON, Avro, Parquet, Iceberg)
  • Love of data and a passion for building reliable data products
Job Responsibility:
  • Collaborate with analysts, engineers, product managers, and stakeholders to design and implement solutions for Product and Engineering data workflows
  • Identify areas for improvement and contribute to centralized data platform
  • Manage data pipeline, orchestration, storage, and analytics infrastructure for Product and Engineering
  • Monitor performance and reliability of data pipelines, implementing solutions for scalability and efficiency
  • Optimize table structures to support query and usage patterns
  • Partner with producers of data across SimpliSafe to develop an understanding of data creation and meaning
  • Support data discovery, catalog, and analytics tooling
  • Implement and maintain data security measures and ensure compliance with data governance policies
  • Design and implement testing strategies and data quality validation to ensure the accuracy, reliability, and integrity of data pipelines
  • Contribute to the formation of our new team, assisting with the development of team norms, practices, and charter
What we offer:
  • A mission- and values-driven culture and a safe, inclusive environment where you can build, grow and thrive
  • A comprehensive total rewards package that supports your wellness and provides security for SimpliSafers and their families
  • Free SimpliSafe system and professional monitoring for your home
  • Employee Resource Groups (ERGs) that bring people together, give opportunities to network, mentor and develop, and advocate for change
  • Participation in our annual bonus program, equity, and other forms of compensation, in addition to a full range of medical, retirement, and lifestyle benefits
  • Full-time

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location:
United States, Lisle
Salary:
85000.00 - 150000.00 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field (required)
  • Master's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field (preferred)
  • 2+ years of experience in Google Cloud with services like BigQuery, Composer, GCS, DataStream, Dataflows (required)
  • 6+ years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics (required)
  • Expert knowledge of SQL and Python programming
  • Experience working with Airflow as a workflow management tool, building operators to connect, extract, and ingest data as needed
  • Experience in tuning queries for performance and scalability
  • Experience in real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar
  • Excellent organizational, prioritization, and analytical abilities
  • Proven experience working incrementally through successful launches
Job Responsibility:
  • Work closely with various business, IT, analyst, and data science groups to collect business requirements
  • Design, develop, deploy, and support high-performance data pipelines, both inbound and outbound
  • Model the data platform by applying business logic and building objects in the semantic layer of the data platform
  • Optimize data pipelines for performance, scalability, and reliability
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of data products
  • Ensure quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root
  • Document the design and support strategy of the data pipelines
  • Capture, store, and socialize data lineage and operational metadata
  • Troubleshoot and resolve data engineering issues as they arise
  • Develop REST APIs to expose data to other teams within the company
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Full-time

Staff Data Engineer

We are seeking a Staff Data Engineer to architect and lead our entire data infra...
Location:
United States, New York; San Francisco
Salary:
170000.00 - 210000.00 USD / Year
Taskrabbit
Expiration Date:
Until further notice
Requirements:
  • 7-10 years of experience in Data Engineering
  • Expertise in building and maintaining ELT data pipelines using modern tools such as dbt, Airflow, and Fivetran
  • Deep experience with cloud data warehouses such as Snowflake, BigQuery, or Redshift
  • Strong data modeling skills (e.g., dimensional modeling, star/snowflake schemas) to support both operational and analytical workloads
  • Proficient in SQL and at least one general-purpose programming language (e.g., Python, Java, or Scala)
  • Experience with streaming data platforms (e.g., Kafka, Kinesis, or equivalent) and real-time data processing patterns
  • Familiarity with infrastructure-as-code tools like Terraform and DevOps practices for managing data platform components
  • Hands-on experience with BI and semantic layer tools such as Looker, Mode, Tableau, or equivalent
Job Responsibility:
  • Design, build, and maintain scalable, reliable data pipelines and infrastructure to support analytics, operations, and product use cases
  • Develop and evolve dbt models, semantic layers, and data marts that enable trustworthy, self-serve analytics across the business
  • Collaborate with non-technical stakeholders to deeply understand their business needs and translate them into well-defined metrics and analytical tools
  • Lead architectural decisions for our data platform, ensuring it is performant, maintainable, and aligned with future growth
  • Build and maintain data orchestration and transformation workflows using tools like Airflow, dbt, and Snowflake (or equivalent)
  • Champion data quality, documentation, and observability to ensure high trust in data across the organization
  • Mentor and guide other engineers and analysts, promoting best practices in both data engineering and analytics engineering disciplines
What we offer:
  • Employer-paid health insurance
  • 401k match with immediate vesting
  • Generous and flexible time off with 2 company-wide closure weeks
  • Taskrabbit product stipends
  • Wellness + productivity + education stipends
  • IKEA discounts
  • Reproductive health support
  • Full-time