As a Senior Data Engineer, you will take ownership of major parts of our data platform and help shape how data is collected, processed, modeled, and delivered across the company. You will design and build large-scale data architectures, guide engineering decisions, and work closely with cross-functional partners to enable data-driven products and operations. This role requires both technical excellence and strong soft skills, including clear communication, ownership, and the ability to drive alignment across functions.
Job Responsibilities:
Design data structures, batch and streaming pipelines, and processes for large-scale analytics and machine learning
Design, implement, and optimize robust ETL/ELT workflows that feed the data warehouse (Redshift) and other analytical layers for reporting, analytics, and operational use cases
Maintain, improve, and troubleshoot production pipelines, ensuring high reliability and performance
Own the architecture and development of core data systems end-to-end
Mentor Mid-Level and Junior Engineers and provide guidance on best practices, workflow standards, and troubleshooting
Collaborate closely with Data Scientists, Product, and Engineering teams to translate business requirements into scalable data solutions
Document systems, pipelines, and architecture; maintain runbooks and operational guides
Identify technical risks, propose solutions, and drive improvements in platform design, reliability, and performance
Participate in code reviews, provide technical mentorship, and help establish best practices
Requirements:
5+ years of hands-on experience in data engineering or a related field
Strong programming skills in Python and SQL (CTEs, window functions, complex joins, stored procedures)
Experience building batch and real-time data pipelines with Spark, Flink, or similar frameworks
Solid understanding of ETL/ELT workflows, CI/CD pipelines, and production-grade deployment practices
Strong knowledge of data modeling, Redshift or similar MPP data warehouses, and distributed systems
Experience with AWS or a comparable cloud provider (EC2, Lambda, Step Functions, Glue, Redshift, S3)
Ability to operate independently, solve complex problems, and make sound technical decisions
Excellent communication skills and ability to work with both technical and non-technical stakeholders
Solid communication skills in English
Nice to have:
Experience designing infrastructure as code using AWS CDK or Terraform
Experience with ML model training, evaluation, and inference pipelines
Knowledge of data governance, metadata management, and data quality frameworks
Track record of designing analytical layers and building scalable data platforms from scratch
What we offer:
Company breakfast and lunch every day in the office
Flexible working hours with home office opportunity
Medicover health insurance
Udemy business account for continuous learning and self-development
Office massage
Being part of a professional development team and sharing in international-market success you can be proud of
Supportive and inspiring team, with accepted and respected managers who share a track record of more than a decade
Recreation room with darts, ping pong, foosball, Xbox, and other games
Modern and fancy office in Buda close to Széll Kálmán tér