
Data Engineer - Core Systems

N26

Location:
Germany (Berlin); Spain


Contract Type:
Not provided

Salary:

Not provided

Job Description:

We are seeking a highly skilled and experienced Data Engineer to join our Core Systems team. The ideal candidate will have a proven track record of building and maintaining robust data products and data mart infrastructure within the AWS Redshift environment. This role is critical in empowering our organization with actionable insights through efficient and reliable data solutions.

Job Responsibility:

  • Design, develop, and deploy scalable data products that provide meaningful insights to business stakeholders
  • Build and maintain a robust data mart infrastructure within AWS Redshift to support reporting needs
  • Design and implement efficient data models to optimize data storage, retrieval, and performance
  • Develop and maintain efficient ETL (Extract, Transform, Load) processes to ingest data from various sources into the data mart
  • Implement data quality checks and monitoring processes to ensure data accuracy and integrity
  • Continuously monitor and optimize data pipelines and queries for optimal performance
  • Work closely with data analysts, business intelligence teams, and other stakeholders to understand data requirements and deliver solutions
  • Stay abreast of emerging technologies and industry trends in data engineering and propose innovative solutions
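
The data-quality bullet above is the kind of check that is easy to sketch concretely. Below is a minimal, illustrative Python version, assuming batches represented as lists of dicts; in a real Redshift pipeline the equivalent checks would run as SQL against the warehouse, and the column names here are hypothetical.

```python
def null_rate(rows, column):
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def check_batch(rows, required_columns, max_null_rate=0.01, min_rows=1):
    """Return a list of human-readable failures; an empty list means pass."""
    failures = []
    if len(rows) < min_rows:
        failures.append(f"row count {len(rows)} below minimum {min_rows}")
    for col in required_columns:
        rate = null_rate(rows, col)
        if rate > max_null_rate:
            failures.append(f"column {col!r} null rate {rate:.2%} exceeds threshold")
    return failures

# Hypothetical batch: one of two rows is missing `amount`, so the check fails.
batch = [
    {"user_id": 1, "amount": 9.99},
    {"user_id": 2, "amount": None},
]
print(check_batch(batch, ["user_id", "amount"]))
```

A monitoring job would run checks like these after each load and alert (or halt downstream models) on any non-empty failure list.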

Requirements:

  • Background in Computer Science, Engineering, or a related field
  • 5+ years of experience in data engineering, with a strong emphasis on data product development and data warehousing
  • Extensive experience with AWS Redshift, including data modelling, performance tuning, and cluster management
  • Proficiency in ETL tools such as Apache Airflow, AWS Glue, or similar
  • Strong programming skills in Python or Java
  • Expertise in SQL for data manipulation and analysis
  • Experience with data modelling techniques and best practices
  • Excellent problem-solving and analytical skills
  • Strong communication and interpersonal skills to collaborate effectively with cross-functional teams

Nice to have:

  • Experience in the fintech industry
  • Familiarity with DBT
  • Knowledge of data governance and security practices
  • Experience with Infrastructure-as-Code tools such as Terraform

What we offer:
  • Accelerate your career growth by joining one of Europe’s most talked about disruptors
  • Employee benefits ranging from a competitive personal development budget and work-from-home budget to discounts on fitness & wellness memberships, language apps, and public transportation
  • As an N26 employee you will have access to a Premium subscription on your personal N26 bank account, as well as subscriptions for friends and family members
  • Additional day of annual leave for each year of service
  • A high degree of autonomy and access to cutting edge technologies - all while working with a friendly team of peers of diverse nationalities, life experiences and backgrounds
  • A relocation package with visa support for those who need it

Additional Information:

Job Posted:
December 10, 2025


Similar Jobs for Data Engineer - Core Systems

Senior Software Engineer, Core Data

As a Senior Software Engineer on our Core Data team, you will take a leading rol...
Location: United States
Salary: 190000.00 - 220000.00 USD / Year
Pomelo Care
Expiration Date: Until further notice

Requirements:
  • 5+ years of experience building high-quality, scalable data systems and pipelines
  • Expert-level proficiency in SQL and Python, with a deep understanding of data modeling and transformation best practices
  • Hands-on experience with dbt for data transformation and Dagster, Beam, Dataflow or similar tools for pipeline orchestration
  • Experience with modern data stack tools and cloud platforms, with a strong understanding of data warehouse design principles
  • A track record of delivering elegant and maintainable solutions to complex data problems that drive real business impact
Job Responsibility:
  • Build and maintain elegant data pipelines that orchestrate ingestion from diverse sources and normalize data for company-wide consumption
  • Lead the design and development of robust, scalable data infrastructure that enables our clinical and product teams to make data-driven decisions, using dbt, Dagster, Beam and Dataflow
  • Write clean, performant SQL and Python to transform raw data into actionable insights that power our platform
  • Architect data models and transformations that support both operational analytics and new data-driven product features
  • Mentor other engineers, providing technical guidance on data engineering best practices and thoughtful code reviews, fostering a culture of data excellence
  • Collaborate with product, clinical and analytics teams to understand data needs and ensure we are building infrastructure that unlocks the most impactful insights
  • Optimize data processing workflows for performance, reliability and cost-effectiveness
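
The ingestion-and-normalization bullet above can be sketched in a few lines. This is illustrative only: the source names ("ehr", "claims") and field names are invented, and a production pipeline would express the same mapping in dbt models or pipeline code rather than plain functions.

```python
def normalize(source, record):
    """Map a source-specific record onto one common schema (hypothetical fields)."""
    if source == "ehr":
        return {"patient_id": record["pid"], "event": record["type"]}
    if source == "claims":
        return {"patient_id": record["member_id"], "event": record["claim_type"]}
    raise ValueError(f"unknown source: {source}")

# Records from two differently-shaped sources land in one shape for downstream use.
events = [
    normalize("ehr", {"pid": "a1", "type": "visit"}),
    normalize("claims", {"member_id": "a1", "claim_type": "rx"}),
]
print(events)
```

The point of the common schema is that every downstream model joins and aggregates one shape, regardless of how many sources feed it.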
What we offer:
  • Competitive healthcare benefits
  • Generous equity compensation
  • Unlimited vacation
  • Membership in the First Round Network (a curated and confidential community with events, guides, thousands of Q&A questions, and opportunities for 1-1 mentorship)
  • Fulltime

Senior Principal Engineer Core Data Platform

As an engineer well into your career, we know you're an expert at what you do an...
Location: United States (Seattle; San Francisco; Mountain View)
Salary: 198300.00 - 318600.00 USD / Year
Atlassian
Expiration Date: Until further notice

Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related technical field
  • 12+ years of experience in backend software development, with a focus on distributed systems and large-scale storage solutions
  • 8+ years of experience designing and managing highly available, large-scale storage architectures in cloud environments
  • 5+ years of hands-on experience working with AWS storage services (S3, EBS, EFS, FSx, Glacier, DynamoDB)
  • Proficiency in system design, performance optimization, and cost-efficient architecture for exabyte-scale storage
  • Expertise in at least one major backend programming language (Kotlin, Java, Go, Rust, or Python)
  • Experience leading technical strategy and architectural decisions in large, multi-team engineering organizations
  • Strong understanding of distributed systems principles, including consistency models, replication, sharding, and consensus algorithms (Raft, Paxos)
  • Deep knowledge of security best practices, including encryption, access control (IAM), and compliance standards (SOC2, GDPR, HIPAA)
  • Experience mentoring senior engineers and driving high-impact engineering initiatives
Job Responsibility:
  • Collaborate with partner teams and internal customers to help define technical direction and OKRs for the Core Data platform organization
  • Regularly tackle the largest and most complex problems on the team, from technical design to implementation and launch
  • Partner across engineering teams to take on company-wide initiatives spanning multiple projects
  • Routinely tackle complex architecture challenges, applying architectural standards and adopting them on new projects
  • Work with senior engineering and product leaders to build strategy and design solutions that earn customers' trust and business
  • Own key OKRs and end-to-end outcomes of critical projects in a microservices environment
  • Champion best practices and innovative techniques for scalability, reliability, and performance optimizations
  • Own engineering and operational excellence for the health of our systems and processes
  • Proactively drive opportunities for continuous improvements and own key operational metrics
  • Continually drive developer productivity initiatives to ensure that we unleash the potential of our own teams
What we offer:
  • health coverage
  • paid volunteer days
  • wellness resources
  • Fulltime

Data Engineer, Enterprise Data, Analytics and Innovation

Are you passionate about building robust data infrastructure and enabling innova...
Location: United States
Salary: 110000.00 - 125000.00 USD / Year
Vaniam Group
Expiration Date: Until further notice

Requirements:
  • 5+ years of professional experience in data engineering, ETL, or related roles
  • Strong proficiency in Python and SQL for data engineering
  • Hands-on experience building and maintaining pipelines in a lakehouse or modern data platform
  • Practical understanding of Medallion architectures and layered data design
  • Familiarity with modern data stack tools, including: Spark or PySpark; workflow orchestration (Airflow, dbt, or similar); testing and observability frameworks; containers (Docker) and Git-based version control
  • Excellent communication skills, problem-solving mindset, and a collaborative approach
Job Responsibility:
  • Design, build, and operate reliable ETL and ELT pipelines in Python and SQL
  • Manage ingestion into Bronze, standardization and quality in Silver, and curated serving in Gold layers of our Medallion architecture
  • Maintain ingestion from transactional MySQL systems into Vaniam Core to keep production data flows seamless
  • Implement observability, data quality checks, and lineage tracking to ensure trust in all downstream datasets
  • Develop schemas, tables, and views optimized for analytics, APIs, and product use cases
  • Apply and enforce best practices for security, privacy, compliance, and access control, ensuring data integrity across sensitive healthcare domains
  • Maintain clear and consistent documentation for datasets, pipelines, and operating procedures
  • Lead the integration of third-party datasets, client-provided sources, and new product-generated data into Vaniam Core
  • Partner with product and innovation teams to build repeatable processes for onboarding new data streams
  • Ensure harmonization, normalization, and governance across varied data types (scientific, engagement, operational)
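
The Bronze/Silver/Gold flow described in the bullets above can be sketched as three small steps. This is a toy illustration only, assuming in-memory lists of dicts and invented field names; in practice each layer is a set of warehouse tables maintained by pipeline code, not Python functions.

```python
def to_bronze(raw_records):
    """Bronze: land raw records as-is, tagged with their source."""
    return [{"source": "mysql", **r} for r in raw_records]

def to_silver(bronze):
    """Silver: standardize values and drop records failing basic quality rules."""
    silver = []
    for r in bronze:
        if r.get("patient_id") is None:
            continue  # quality gate: reject rows missing the key
        silver.append({**r, "patient_id": str(r["patient_id"]).strip()})
    return silver

def to_gold(silver):
    """Gold: a curated aggregate served to analytics, e.g. records per patient."""
    counts = {}
    for r in silver:
        counts[r["patient_id"]] = counts.get(r["patient_id"], 0) + 1
    return counts

raw = [{"patient_id": " 42 "}, {"patient_id": None}, {"patient_id": "42"}]
print(to_gold(to_silver(to_bronze(raw))))  # the two valid rows both clean to "42"
```

The layering keeps raw data replayable (Bronze), quality rules auditable (Silver), and serving tables small and curated (Gold).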
What we offer:
  • 100% remote environment with opportunities for local meet-ups
  • Positive, diverse, and supportive culture
  • Passionate about serving clients focused on Cancer and Blood diseases
  • Investment in you with opportunities for professional growth and personal development through Vaniam Group University
  • Health benefits – medical, dental, vision
  • Generous parental leave benefit
  • Focused on your financial future with a 401(k) Plan and company match
  • Work-Life Balance and Flexibility
  • Flexible Time Off policy for rest and relaxation
  • Volunteer Time Off for community involvement
  • Fulltime

Staff Data Engineer

We’re looking for a Staff Data Engineer to own the design, scalability, and reli...
Location: United States (San Jose)
Salary: 150000.00 - 250000.00 USD / Year
Figure
Expiration Date: Until further notice

Requirements:
  • Experience owning or architecting large-scale data platforms — ideally in EV, autonomous driving, or robotics fleet environments, where telemetry, sensor data, and system metrics are core to product decisions
  • Deep expertise in data engineering and architecture (data modeling, ETL orchestration, schema design, transformation frameworks)
  • Strong foundation in Python, SQL, and modern data stacks (dbt, Airflow, Kafka, Spark, BigQuery, ClickHouse, or Snowflake)
  • Experience building data quality, validation, and observability systems to detect regressions, schema drift, and missing data
  • Excellent communication skills — able to understand technical needs from domain experts (controls, perception, operations) and translate complex data patterns into clear, actionable insights for engineers and leadership
  • First-principles understanding of electrical and mechanical systems, including motors, actuators, encoders, and control loops
Job Responsibility:
  • Architect and evolve Figure’s end-to-end platform data pipeline — from robot telemetry ingestion to warehouse transformation and visualization
  • Improve and maintain existing ETL/ELT pipelines for scalability, reliability, and observability
  • Detect and mitigate data regressions, schema drift, and missing data via validation and anomaly-detection frameworks
  • Identify and close gaps in data coverage, ensuring high-fidelity metrics coverage across releases and subsystems
  • Define the tech stack and architecture for the next generation of our data warehouse, transformation framework, and monitoring layer
  • Collaborate with robotics domain experts (controls, perception, Guardian, fall-prevention) to turn raw telemetry into structured metrics that drive engineering/business decisions
  • Partner with fleet management, operators, and leadership to design and communicate fleet-level KPIs, trends, and regressions in clear, actionable ways
  • Enable self-service access to clean, documented datasets for engineers
  • Develop tools and interfaces that make fleet data accessible and explorable for engineers without deep data backgrounds
  • Fulltime
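
The schema-drift detection mentioned in the responsibilities above reduces, at its simplest, to comparing observed columns against an expected schema. A minimal sketch, with an entirely hypothetical telemetry schema:

```python
# Hypothetical expected schema for an incoming telemetry batch.
EXPECTED_SCHEMA = {"robot_id", "timestamp", "joint_torque", "battery_pct"}

def schema_drift(batch_columns, expected=EXPECTED_SCHEMA):
    """Return (missing, unexpected) column sets for an incoming batch."""
    observed = set(batch_columns)
    return expected - observed, observed - expected

missing, unexpected = schema_drift({"robot_id", "timestamp", "battery_pct", "fw_version"})
print(sorted(missing), sorted(unexpected))  # joint_torque dropped; fw_version is new
```

Real systems layer type checks, null-rate thresholds, and anomaly detection on top of this, but a column-set diff like this catches the most disruptive class of drift cheaply.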

Software Engineer, Data Infrastructure

The Data Infrastructure team at Figma builds and operates the foundational platf...
Location: United States (San Francisco; New York)
Salary: 149000.00 - 350000.00 USD / Year
Figma
Expiration Date: Until further notice

Requirements:
  • 5+ years of Software Engineering experience, specifically in backend or infrastructure engineering
  • Experience designing and building distributed data infrastructure at scale
  • Strong expertise in batch and streaming data processing technologies such as Spark, Flink, Kafka, or Airflow/Dagster
  • A proven track record of impact-driven problem-solving in a fast-paced environment
  • A strong sense of engineering excellence, with a focus on high-quality, reliable, and performant systems
  • Excellent technical communication skills, with experience working across both technical and non-technical counterparts
  • Experience mentoring and supporting engineers, fostering a culture of learning and technical excellence
Job Responsibility:
  • Design and build large-scale distributed data systems that power analytics, AI/ML, and business intelligence
  • Develop batch and streaming solutions to ensure data is reliable, efficient, and scalable across the company
  • Manage data ingestion, movement, and processing through core platforms like Snowflake, our ML Datalake, and real-time streaming systems
  • Improve data reliability, consistency, and performance, ensuring high-quality data for engineering, research, and business stakeholders
  • Collaborate with AI researchers, data scientists, product engineers, and business teams to understand data needs and build scalable solutions
  • Drive technical decisions and best practices for data ingestion, orchestration, processing, and storage
What we offer:
  • equity
  • health, dental & vision
  • retirement with company contribution
  • parental leave & reproductive or family planning support
  • mental health & wellness benefits
  • generous PTO
  • company recharge days
  • a learning & development stipend
  • a work from home stipend
  • cell phone reimbursement
  • Fulltime

Big Data Platform Senior Engineer

Lead Java Data Engineer to guide and mentor a talented team of engineers in buil...
Location: Bahrain (Seef, Manama)
Salary: Not provided
Citi
Expiration Date: Until further notice

Requirements:
  • Significant hands-on experience developing high-performance Java applications (Java 11+ preferred) with strong foundation in core Java concepts, OOP, and OOAD
  • Proven experience building and maintaining data pipelines using technologies like Kafka, Apache Spark, or Apache Flink
  • Familiarity with event-driven architectures and experience in developing real-time, low-latency applications
  • Deep understanding of distributed systems concepts and experience with MPP platforms such as Trino (Presto) or Snowflake
  • Experience deploying and managing applications on container orchestration platforms like Kubernetes, OpenShift, or ECS
  • Demonstrated ability to lead and mentor engineering teams, communicate complex technical concepts effectively, and collaborate across diverse teams
  • Excellent problem-solving skills and data-driven approach to decision-making
Job Responsibility:
  • Provide technical leadership and mentorship to a team of data engineers
  • Lead the design and development of highly scalable, low-latency, fault-tolerant data pipelines and platform components
  • Stay abreast of emerging open-source data technologies and evaluate their suitability for integration
  • Continuously identify and implement performance optimizations across the data platform
  • Partner closely with stakeholders across engineering, data science, and business teams to understand requirements
  • Drive the timely and high-quality delivery of data platform projects
  • Fulltime

Sr Data Engineer

(Locals or Nearby resources only). You will work with technologies that include ...
Location: United States (Glendale)
Salary: Not provided
Enormous Enterprise
Expiration Date: Until further notice

Requirements:
  • 7+ years of data engineering experience developing large data pipelines
  • Proficiency in at least one major programming language (e.g. Python, Java, Scala)
  • Hands-on production environment experience with distributed processing systems such as Spark
  • Hands-on production experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines
  • Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology (Snowflake, Databricks, Big Query)
  • Experience in developing APIs with GraphQL
  • Advanced understanding of OLTP vs OLAP environments
  • Candidates must work on a W2 basis; no Corp-to-Corp
  • US Citizen, Green Card Holder, H4-EAD, TN-Visa
  • Airflow
Job Responsibility:
  • Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines
  • Build and maintain APIs to expose data to downstream applications
  • Develop real-time streaming data pipelines
  • Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform
  • Contribute to developing and documenting both internal and external standards and best practices for pipeline configurations, naming conventions, and more
  • Ensure high operational efficiency and quality of the Core Data platform datasets to ensure our solutions meet SLAs and project reliability and accuracy to all our stakeholders (Engineering, Data Science, Operations, and Analytics teams)
What we offer:
  • 3 levels of medical insurance for you and your family
  • Dental insurance for you and your family
  • 401k
  • Overtime
  • Sick leave policy: accrue 1 hour for every 30 hours worked up to 48 hours

Sr. Manager- Data Engineering

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location: United States (Lisle)
Salary: 96404.10 - 169021.85 USD / Year
Adtalem Global Education
Expiration Date: Until further notice

Requirements:
  • Bachelor's Degree in Computer Science, Computer Engineering, Software Engineering, or other related technical field
  • Master's Degree in Computer Science, Computer Engineering, Software Engineering, or other related technical field
  • 8+ years experience in data engineering solutions such as data platforms, data ingestion, data management, or publication/analytics
  • 3+ years of leadership experience building and managing high-performing teams
  • 2+ years experience in Google Cloud with services like BigQuery, Composer, GCS, DataStream, Dataflows
  • Progressively responsible experience, starting as a Data Engineer and advancing in complexity and level of responsibility
  • Must be highly analytical, with a proven ability to develop and reverse-engineer complex engineering solutions
  • Critical thinking and advanced problem-solving skills are core behaviors among the team
  • Excellent oral/written communication and presentation skills
  • Experience in objectively evaluating current processes for opportunities to optimize interdepartmental communication, collaboration, and end-to-end process performance
Job Responsibility:
  • Design and build trusted, reliable and timely datasets, metrics and data pipelines that are critical to the direction of the company
  • Build and lead a high-performing data engineering team in a hands-on technical capacity
  • Be responsible for shaping how we acquire, collect and leverage data
  • Define and manage SLAs for all data sets and processes running in production
  • Work closely with Product Managers, Analysts, Data Scientists to develop and own data-driven systems
  • Develop data democratization layer for self-served reporting
  • Assist development team in troubleshooting, coding, testing, implementation, and documenting solutions
  • Support, grow, mentor and inspire new and existing team members
  • Be a key leader of the Data and Analytics team, working to propel the business towards being more data-driven
  • Performs other duties as assigned
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Eligible to participate in an annual incentive program
  • Fulltime