iCapital is seeking an exceptional Data Engineer to join our Lisbon team and help scale the data foundations that power our business. In this role, you will design, build, and optimize the data pipelines and infrastructure that enable data to be a strategic asset across the organization.

Data sits at the core of iCapital’s operations. We rely on clean, reliable, well‑structured data to drive decision‑making, enhance client experiences, and support a rapidly growing product suite. As a Data Engineer, you’ll not only be responsible for moving and transforming data—you’ll also develop a deep understanding of the business context behind it and help teams act on insights.

This role is highly collaborative. You will partner with product, engineering, analytics, and business stakeholders to quickly understand evolving challenges and deliver scalable, maintainable, and high‑quality solutions. Clean code, thoughtful architecture, and an eagerness to innovate are essential.

If you enjoy working with modern technologies, building robust data systems, and bringing creative ideas to life, we’d love to hear from you.
Job Responsibility:
Develop and automate large-scale, high-performance data platform pipelines and infrastructure to drive iCapital business growth and enable data-driven decision making
Design and develop reusable components and frameworks for ingestion, cleansing, and data quality
Streamline the ingestion of raw data from various sources into our Data Lake and Data Warehouse
Design data models for optimal storage and retrieval that represent the tangible business domains across iCapital’s ecosystem
Coordinate closely with operations, sales, and product development teams daily to advance iCapital’s FinTech strategy and improve the overall profitability of our business
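To give candidates a feel for the reusable ingestion and data-quality components described above, here is a minimal, hypothetical sketch in Python. All names (Record, cleanse, the sample payload) are invented for illustration and do not reflect iCapital's actual codebase:

```python
from dataclasses import dataclass
from typing import Iterable

# Hypothetical record type for a row landed from a raw source.
@dataclass
class Record:
    account_id: str
    amount: float

def cleanse(rows: Iterable[dict]) -> list[Record]:
    """Toy data-quality step: drop malformed rows and normalize fields."""
    out = []
    for row in rows:
        account_id = str(row.get("account_id", "")).strip()
        try:
            amount = float(row["amount"])
        except (KeyError, TypeError, ValueError):
            continue  # reject rows with a missing or non-numeric amount
        if account_id:  # reject rows with a blank identifier
            out.append(Record(account_id=account_id, amount=amount))
    return out

raw = [
    {"account_id": " A1 ", "amount": "100.5"},
    {"account_id": "", "amount": "7"},       # rejected: blank id
    {"account_id": "B2", "amount": "oops"},  # rejected: bad amount
]
clean = cleanse(raw)
print(clean)  # [Record(account_id='A1', amount=100.5)]
```

In a production pipeline, rejected rows would typically be routed to a quarantine table for review rather than silently dropped.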
Requirements:
6+ years of professional experience in a Data Engineering or data-driven Software Engineering role
Exceptional Python and SQL skills
Extensive experience with orchestration frameworks (Prefect, Airflow, Dagster, etc.)
Deep understanding of OLAP (Snowflake, Databricks) / OLTP (PostgreSQL, MongoDB) databases and ELT frameworks (dbt, dlthub)
Experience developing and deploying code to Cloud environments (AWS, GCP, Azure)
Strong grasp of object-oriented/functional programming and the ability to write scalable, high-quality code
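The orchestration frameworks named above (Prefect, Airflow, Dagster) all boil down to running tasks in dependency order with scheduling and retries on top. As a rough illustration of that core idea, here is a standard-library-only sketch using Python's `graphlib`; the task names and dependency graph are invented for this example:

```python
from graphlib import TopologicalSorter

# Hypothetical ELT tasks; a real framework (Airflow, Prefect, Dagster)
# would add scheduling, retries, and observability around them.
results = []

def extract():
    results.append("extract")

def transform():
    results.append("transform")

def load():
    results.append("load")

tasks = {"extract": extract, "transform": transform, "load": load}
# Each task maps to the set of tasks it depends on.
deps = {"transform": {"extract"}, "load": {"transform"}}

# static_order() yields tasks only after all their dependencies.
for name in TopologicalSorter(deps).static_order():
    tasks[name]()

print(results)  # ['extract', 'transform', 'load']
```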