As a Senior Data Engineer, you’ll be part of the team responsible for designing and delivering the data products that power Fever’s partner ecosystem — from real-time APIs and dashboards to data exports and automated reports. Our teams leverage modern data platforms (Snowflake, DBT, Airflow, Superset, DataHub, etc.) while developing our own technology to deliver high-quality, reliable, and scalable data products across multiple business domains. You’ll work at the crossroads of engineering, product, and business, translating complex requirements into elegant and maintainable solutions that create real impact for our clients and internal teams.
Job Responsibilities:
Deeply understand Fever’s backend systems and data models, identifying the logic and transformations that connect business processes to data representations
Collaborate fluidly with Product Managers, Project Managers, and business stakeholders to translate ideas and reporting needs into clear technical requirements and data structures
Develop and optimize ETL/ELT workflows using Python, DBT, Airflow, and Snowflake, ensuring scalability, maintainability, and reliability
Design, build, and maintain partner-facing data products, including APIs (real-time and batch), dashboards, webhooks, and file-based exports
Ensure that all data delivered to partners is accurate, consistent, and aligned with the underlying business context
Contribute to the architecture, testing, and observability of our data ecosystem, promoting best practices and automation wherever possible
Take ownership from idea to delivery — you’ll scope, design, implement, and ship solutions in close collaboration with cross-functional teams
Support the evolution of Fever’s data mesh, helping domain teams design data products that follow common engineering and governance standards
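The responsibilities above center on building ETL/ELT workflows in Python. As a purely illustrative sketch of the extract-transform-load pattern the role involves — the event names, fields, and revenue rollup below are hypothetical examples, not Fever’s actual data model — a minimal pipeline step might look like:

```python
# Illustrative ETL sketch. The source data, field names, and revenue
# aggregation are hypothetical, not Fever's actual schema.
from dataclasses import dataclass


@dataclass
class TicketSale:
    event_id: str
    tickets: int
    unit_price: float


def extract() -> list[dict]:
    # A real pipeline would pull from a source system or API;
    # hard-coded rows stand in for that here.
    return [
        {"event_id": "candlelight-001", "tickets": 2, "unit_price": 25.0},
        {"event_id": "candlelight-001", "tickets": 1, "unit_price": 25.0},
        {"event_id": "exhibit-042", "tickets": 4, "unit_price": 18.5},
    ]


def transform(rows: list[dict]) -> dict[str, float]:
    # Aggregate gross revenue per event -- the kind of rollup a
    # partner-facing report or dashboard might expose.
    revenue: dict[str, float] = {}
    for row in rows:
        sale = TicketSale(**row)
        revenue[sale.event_id] = (
            revenue.get(sale.event_id, 0.0) + sale.tickets * sale.unit_price
        )
    return revenue


def load(revenue: dict[str, float]) -> None:
    # A real pipeline would write to a warehouse table (e.g. Snowflake);
    # printing stands in for the load step.
    for event_id, total in sorted(revenue.items()):
        print(f"{event_id}: {total:.2f}")


if __name__ == "__main__":
    load(transform(extract()))
```

In production, each step would typically be an Airflow task and the transformations would live in DBT models rather than inline Python, but the extract/transform/load separation is the same.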
Requirements:
Bachelor’s or Master’s degree in Computer Engineering, Data Engineering, or a related field
Strong proficiency in Python, SQL, Airflow, and Snowflake (or similar technologies)
A deep understanding of data modeling and how complex backend systems generate and expose data across products and domains
Proven experience building data products for external consumption, such as APIs, dashboards, and partner-facing data feeds
Excellent communication and collaboration skills — you can work seamlessly with business stakeholders and product managers, translating abstract needs into clear data solutions
You communicate clearly, think in terms of impact, and make technical decisions grounded in business context
You thrive in fast-paced environments, adapt quickly to change, and maintain a strong sense of ownership from concept to delivery
Familiarity with modern data practices: version control (Git), CI/CD, testing, and observability
Fluent in Spanish and English: you’ll communicate in Spanish with local teams and in English with global teams
Nice to have:
Experience with real-time data platforms such as Tinybird or ClickHouse
Exposure to domains like B2B, Marketing, or Partner Reporting
Experience with streaming or near-real-time data (Kafka, Kinesis, etc.)
What we offer:
Attractive compensation package consisting of base salary and the potential to earn a significant bonus for top performance
Stock options
Opportunity to have a real impact in a high-growth global category leader
40% discount on all Fever events and experiences
Home office friendly
Responsibility from day one and professional and personal growth
Great work environment with a young, international team of talented people
Health insurance and other benefits, such as flexible remuneration with a 100% tax exemption through Cobee
English lessons
Gympass membership
Possibility to receive part of your salary in advance through Payflow