The Data Infrastructure team builds the distributed systems and tools that support Intercom and empower people with information. As the company grows, so do the volume and velocity of our data and the appetite for increasingly sophisticated and specialized, often AI-assisted, data solutions. Our team builds, maintains, evolves, and extends the data platform, enabling our partners to self-serve by creating their own end-to-end data workflows: from ingestion through data transformation and experiment evaluation to usage analysis and predictive modeling. We provide a solid data foundation for a variety of high-impact business and product projects.
Job Responsibilities:
Evolve the Data Platform by designing and building the next generation of the stack
Develop, run, and support our data pipelines using tools such as Airflow, PlanetScale, Kinesis, Snowflake, and Tableau, all on AWS
Collaborate with product managers, data engineers, analysts, and data scientists to build the tooling and infrastructure that support their needs
Develop automation and tooling to support the creation and discovery of high-quality analytics data in an environment where dozens of changes can be shipped daily
Implement systems to monitor our infrastructure, detect and surface data-quality issues, and ensure operational excellence (a minimal sketch of such a check follows this list)
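To give a flavor of the data-quality monitoring mentioned above, here is a minimal sketch in Python. It assumes a generic DB-API connection (for example, a Snowflake connector session, which uses format-style `%s` parameters); the table name and the `ds`/`id` columns are hypothetical placeholders, not Intercom's actual schema or system.

```python
from typing import Any


def check_daily_partition(conn: Any, table: str, ds: str) -> list[str]:
    """Return human-readable failures for one daily partition.

    `conn` is any DB-API connection using format-style parameters;
    `table` and the `ds`/`id` columns are illustrative placeholders.
    """
    failures: list[str] = []
    cur = conn.cursor()

    # Freshness: the partition should contain at least one row.
    cur.execute(f"SELECT COUNT(*) FROM {table} WHERE ds = %s", (ds,))
    (row_count,) = cur.fetchone()
    if row_count == 0:
        failures.append(f"{table}: no rows for partition {ds}")

    # Integrity: the primary key must never be NULL.
    cur.execute(
        f"SELECT COUNT(*) FROM {table} WHERE ds = %s AND id IS NULL", (ds,)
    )
    (null_ids,) = cur.fetchone()
    if null_ids:
        failures.append(f"{table}: {null_ids} NULL ids in partition {ds}")

    return failures
```

In practice, failures returned by checks like this would be surfaced through an alerting system rather than allowed to pass silently.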
Requirements:
3+ years of full-time, professional work experience in the data space using Python and SQL
Solid experience building and running data pipelines for large, complex datasets, including managing dependencies between them
Hands-on cloud provider experience (preferably AWS), including service integrations and automation via CLI and APIs
Solid understanding of data security practices and a passion for privacy
Some DevOps experience
Genuine care for your craft
Nice to have:
Experience with Apache Airflow: we use Airflow extensively to orchestrate and schedule all of our data workflows, so a good understanding of the quirks of operating Airflow at scale would be helpful (see the sketch after this list)
Experience with, or an understanding of, the tools and technologies of the modern data stack (Snowflake, dbt)
Awareness of up-and-coming technologies and vendors in the industry
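As a rough illustration of the orchestration described above, here is a minimal Airflow DAG sketch. It assumes Airflow 2.x; the DAG id, schedule, and the extract/transform/load callables are hypothetical placeholders, not one of Intercom's real workflows.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull one day's raw events from a source system.
    print(f"extracting data for {context['ds']}")


def transform(**context):
    # Placeholder: clean and reshape the extracted data.
    print(f"transforming data for {context['ds']}")


def load(**context):
    # Placeholder: write the transformed data to the warehouse.
    print(f"loading data for {context['ds']}")


with DAG(
    dag_id="example_daily_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run the steps in order: extract -> transform -> load.
    extract_task >> transform_task >> load_task
```

Declaring dependencies this way lets Airflow handle scheduling, retries, and backfills for each step independently.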
What we offer:
Competitive salary and equity in a fast-growing start-up
We serve lunch every weekday and keep a fully stocked kitchen with a variety of snacks
Regular compensation reviews - we reward great work!
Pension scheme with employer match of up to 4%
Peace of mind with life assurance, as well as comprehensive health and dental insurance for you and your dependents
Open vacation policy and flexible holidays so you can take time off when you need it
Paid maternity leave, as well as 6 weeks' paternity leave, so you can spend valuable time with your loved ones
If you cycle, we've got you covered with the Cycle-to-Work Scheme, plus secure bike storage
MacBooks are our standard, but we also offer Windows for certain roles when needed