The Analytics Engineering team provides all parts of Intercom’s business with the right set of accurate and up-to-date data tables that form the basis of all of our most important operational and strategic data-driven decisions. The team uses their deep knowledge of how GTM teams work to anticipate their data needs. They use their deep technical expertise to design the right data models and reliably turn these models into high-quality core data assets in our data warehouse. The team are experts at driving data model efficiencies and solving data quality issues at their root cause. This team is the glue between all of our Analyst teams and our Data Infrastructure teams. They speak everyone's language and make everything faster.
Job Responsibilities:
Data Platform Development: Design, build, and manage scalable data pipelines and ELT processes to support a robust, analytics-ready data platform.
Cross-functional Collaboration: Partner with engineering, analytics, and business teams to understand data needs and ensure accurate, insightful data solutions.
Data Strategy & Governance: Lead initiatives in data model development, data quality ownership, warehouse management, and production support for critical workflows.
Advanced Analytics & Insights: Conduct in-depth data analysis and build custom models to support strategic business decisions and performance measurement.
Automation & Optimization: Streamline data collection and reporting processes to reduce manual effort and improve efficiency.
Innovation in Data Infrastructure: Create scalable solutions, such as unified data pipelines and access control systems, to meet evolving organizational needs.
Impact Measurement: Develop strategies and technical pipelines for evaluating research output impact using citation metadata.
Requirements:
You write advanced SQL with a preference for well-architected data models, optimized query performance, and clearly documented code
You’re familiar with the modern data stack; dbt and Snowflake experience is a big plus.
A growth mindset and eagerness to learn.
You exhibit great judgment and sharp business and product instincts that allow you to differentiate essential versus nice-to-have and to make good choices about trade-offs
You practice excellent communication skills, and you tailor explanations of technical concepts to a variety of audiences
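To make the SQL requirement concrete, here is a minimal sketch of the kind of well-structured, documented query the role describes. The schema (customers, orders) is invented for illustration, and an in-memory SQLite database stands in for a warehouse like Snowflake purely so the example runs.

```python
import sqlite3

# In-memory SQLite stands in for a real warehouse (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'EMEA'), (2, 'AMER');
    INSERT INTO orders VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0);
""")

# A small, documented aggregate: revenue per region.
# The CTE keeps the per-customer step explicit and easy to inspect on its own.
query = """
WITH customer_revenue AS (
    -- One row per customer with their lifetime order total.
    SELECT customer_id, SUM(amount) AS revenue
    FROM orders
    GROUP BY customer_id
)
SELECT c.region, SUM(cr.revenue) AS region_revenue
FROM customer_revenue cr
JOIN customers c ON c.id = cr.customer_id
GROUP BY c.region
ORDER BY c.region;
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('AMER', 75.0), ('EMEA', 150.0)]
```

Breaking the query into a named CTE with comments is the same habit that carries over to dbt models, where each step becomes its own documented, testable model.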
Nice to have:
Exposure to Apache Airflow or other DAG frameworks — we use Airflow to orchestrate and schedule all of our data workflows and transformations
Worked in Tableau, Looker, or similar visualization/business intelligence platform
Experience with operational tools and business systems such as Google Analytics, Marketo, Salesforce, Segment, or Stripe
Familiarity with Python
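Since Airflow is mentioned above, here is a toy, standard-library sketch of the core idea it formalizes: tasks declare their upstream dependencies, and the scheduler runs them in dependency (DAG) order. The task names are hypothetical and this is not Airflow's API, just the underlying topological-sort concept.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on,
# the way an Airflow DAG declares upstream dependencies.
dag = {
    "extract": set(),
    "load": {"extract"},
    "transform": {"load"},
    "report": {"transform"},
}

# graphlib resolves a valid execution order from the dependency graph.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'load', 'transform', 'report']
```

Airflow adds scheduling, retries, and observability on top of this ordering, but a DAG definition there plays the same role as the `dag` mapping here.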
What we offer:
Competitive salary and meaningful equity
Comprehensive medical, dental, and vision coverage
Regular compensation reviews - great work is rewarded!
Flexible paid time off policy
Paid Parental Leave Program
401k plan & match
In-office bicycle storage
Fun events for Intercomrades, friends, and family!