Our client is a global technology and digital transformation consultancy, partnering with enterprise organisations to deliver large-scale data, cloud, and AI initiatives. They operate across multiple industries, helping businesses modernise platforms, unlock the value of data, and drive measurable outcomes. With a strong engineering culture and an international delivery model, teams work on complex, high-impact programmes across multiple regions, combining technical excellence with a collaborative, people-focused environment.

We are looking for an experienced Data Architect to design, modernise, and support enterprise-scale data platforms. This role requires a hands-on technologist who can lead architectural decisions, deliver complex migrations, and collaborate closely with global stakeholders.
Job Responsibilities:
Design and implement scalable, enterprise-grade data architectures using modern cloud platforms, with a strong focus on Snowflake
Lead and contribute to large-scale data migrations from Hadoop, Spark, and other legacy big data platforms to cloud-native solutions
Architect and build production-grade ELT/ETL pipelines, ensuring reliability, performance, and data quality
Work closely with engineering, analytics, and business stakeholders to translate requirements into robust data solutions
Apply best practices in data modelling, ingestion, transformation, and orchestration within complex enterprise environments
Collaborate effectively within distributed global teams, contributing to design discussions and technical decision-making across regions
Ensure solutions meet enterprise standards for scalability, security, and maintainability
Requirements:
Strong hands-on experience with Snowflake in enterprise production environments
Proven experience delivering data migrations from Hadoop, Spark, or other large-scale big data platforms
Background working in large, enterprise-scale data environments (rather than small or start-up platforms)
Demonstrated experience building and maintaining production-ready ELT/ETL pipelines
Solid understanding of modern data architecture principles, cloud data platforms, and distributed systems
Comfortable working in globally distributed teams across time zones
Fluent Hungarian (spoken and written)
Willing and able to work three days per week from the Budapest office
Nice to have:
Experience in consulting or multi‑client delivery environments
Exposure to AWS, Azure, or GCP
Familiarity with enterprise data governance and security frameworks