This is a new role and one of the most strategically important on the team. You will own the revenue data layer — the Snowflake-based foundation that every automation, AI workflow, and analytics system in our GTM stack depends on. You will build and maintain the pipelines that move data from Salesforce, Gong, Marketo, and other GTM sources into a clean, governed, and queryable form.

You are the quality gate. When an AI workflow produces a wrong answer, the cause is often a data problem upstream. Your job is to make sure that does not happen — by building pipelines that are reliable, schemas that are stable, and data contracts that every system can trust. Every workflow this team builds is only as good as the data underneath it. This role is the foundation everything else is built on.
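The "data contract" idea above can be sketched as a simple validation step. This is a minimal illustration in Python under assumed names — the contract fields (`opportunity_id`, `stage`, `amount`) and the `violations` helper are hypothetical, not part of any specific tool or schema:

```python
# Minimal sketch of a data-contract check: the required columns and types
# that downstream analytics and AI layers depend on. Field names are
# illustrative assumptions, not a real schema.
CONTRACT = {
    "opportunity_id": str,
    "stage": str,
    "amount": float,
}

def violations(rows, contract):
    """Return (row_index, field, reason) for every contract breach."""
    problems = []
    for i, row in enumerate(rows):
        for field, expected_type in contract.items():
            if field not in row:
                problems.append((i, field, "missing"))
            elif not isinstance(row[field], expected_type):
                problems.append((i, field, "wrong type"))
    return problems

good = {"opportunity_id": "006A", "stage": "Closed Won", "amount": 1200.0}
bad = {"opportunity_id": "006B", "amount": "1200"}  # no stage; amount is a string
issues = violations([good, bad], CONTRACT)
```

In practice a check like this would run as a pipeline test (for example, a dbt test) before data is published, so contract breaches are caught upstream rather than surfacing as wrong answers in an AI workflow.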
Job Responsibilities:
Design, build, and maintain the Snowflake data warehouse that serves as the single source of truth for all GTM analytics and automation
Define and enforce schema standards, naming conventions, and data governance policies across revenue data
Build and maintain dbt models (or equivalent) that transform raw GTM source data into clean, analytics-ready tables
Monitor pipeline health and proactively identify and resolve data quality issues before they surface in downstream workflows
Maintain data documentation and data dictionaries for all GTM data assets
Build and maintain data pipelines that ingest data from Salesforce, Gong, Marketo, Intercom, and other GTM platforms into Snowflake
Design reliable, observable pipeline architectures with appropriate alerting and failure handling
Implement change data capture (CDC) and incremental loading patterns to keep revenue data current
Partner with the GTM Engineers to ensure every automation workflow pulling from Snowflake has the data it needs in the right shape
Serve as the data contract owner between GTM source systems and the analytics and AI layers
Support the build of revenue analytics in Sigma, Tableau, or equivalent BI tooling
Enable self-serve querying by GTM leaders through well-designed, documented data models
Build data models that support forecasting and feed weekly pipeline and forecast intelligence workflows
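The incremental-loading pattern mentioned in the responsibilities above — keeping a high-water mark and upserting only rows changed since the last run — can be sketched as follows. The in-memory `SOURCE`, `warehouse` dict, and function names are illustrative stand-ins, not a real Salesforce client or Snowflake connector:

```python
# Hypothetical stand-ins for a SaaS source table and a warehouse target.
# ISO-8601 UTC timestamps compare correctly as strings.
SOURCE = [
    {"id": 1, "stage": "Prospecting", "modified_at": "2024-01-01T00:00:00Z"},
    {"id": 2, "stage": "Closed Won",  "modified_at": "2024-03-01T00:00:00Z"},
]

def fetch_changed_rows(source, high_water_mark):
    """Return only rows modified after the stored high-water mark --
    the core of an incremental (rather than full-refresh) load."""
    return [r for r in source if r["modified_at"] > high_water_mark]

def incremental_load(source, warehouse, state):
    """Upsert changed rows by primary key and advance the watermark."""
    rows = fetch_changed_rows(source, state["high_water_mark"])
    for row in rows:
        warehouse[row["id"]] = row  # upsert keyed on id
    if rows:
        state["high_water_mark"] = max(r["modified_at"] for r in rows)
    return len(rows)

warehouse = {}
state = {"high_water_mark": "2024-02-01T00:00:00Z"}
loaded = incremental_load(SOURCE, warehouse, state)
# Only the row modified after the watermark (id 2) is loaded.
```

A real pipeline would persist the watermark durably and handle late-arriving updates, but the shape is the same: filter by the watermark, upsert, advance the watermark only after a successful write.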
Requirements:
4–7+ years of data engineering experience building production data pipelines and warehouses
Strong Snowflake experience — schema design, performance optimization, data sharing, and query patterns
Experience with dbt for data transformation and modeling in a production environment
Proficiency in SQL and at least one pipeline language — Python preferred
Experience building pipelines from SaaS platforms (Salesforce, Gong, Marketo, or similar) via API or CDC
Familiarity with pipeline orchestration tools — Airflow, Prefect, dbt Cloud, Fivetran, or similar
Strong understanding of data modeling, normalization, and dimensional modeling patterns
Experience designing and implementing data governance standards in a multi-source environment
Familiarity with GTM data — CRM objects, pipeline stages, activity data, revenue metrics — strongly preferred
Experience with BI tools (Sigma, Tableau, Looker, or similar) a plus
Bachelor's degree in Computer Science, Data Engineering, Statistics, or related technical field