Hunkemöller is looking for a Data Engineer to play a core role in our digital data transformation. This is a fantastic opportunity for a forward-thinking, adaptable data engineering professional to help build our next-generation, cloud-native data platform on GCP.
Job Responsibilities:
Build Data Pipelines: Develop, test, and maintain robust, scalable data pipelines using SQL, dbt, and cloud technologies (GCP), ensuring high standards of data quality and reliability
AI-Augmented Engineering: Actively leverage advanced AI coding assistants and LLMs to accelerate pipeline development, debug complex code, generate documentation, and automate repetitive tasks
Collaborate on Data Modeling: Assist in the implementation of scalable data models (e.g., star schemas, data vaults) within our enterprise data warehouse (BigQuery)
Develop on GCP: Build and maintain our Google Cloud Platform (GCP) data infrastructure, focusing on automation, security, and performance improvements
Collaborate and Learn: Partner with Product, Data, and Design teams to resolve technical data issues. Participate in code reviews and continuously learn and share new engineering best practices
Build for Analytics & AI: Build and optimize data platforms that power our BI, Data Science, and AI solutions, ensuring data is accessible, reliable, and ready for analysis
Requirements:
Solid Data Engineering Experience: 1 to 3 years of hands-on experience in data engineering, with a strong track record of building data pipelines and working with data warehouse solutions
Adaptability & Continuous Learning: A strong desire to learn quickly and adapt to new technologies. You embrace modern development practices and are comfortable using AI tools as a force multiplier in your daily work
Strong SQL Proficiency: Advanced skills in writing, optimizing, and debugging complex SQL queries for data manipulation and analysis
Cloud & Data Warehousing: Solid knowledge of cloud data services, preferably on Google Cloud Platform (BigQuery, Dataflow, etc.)
Programming & Frameworks: Experience with Python and applying software engineering principles to data solutions. Hands-on experience with dbt is highly preferred
Code Quality & Version Control: Proficient in writing clean, well-documented, and tested code, with strong experience using Git and CI/CD workflows
English Proficiency: Excellent written and verbal English communication skills
What we offer:
25 days of annual leave, with the option to buy or sell up to 4 additional days
Hybrid work model, combining office and remote working
Possibility to work from abroad for up to 2 weeks per year
An international work environment, working with teams across different countries
Travel allowance to support commuting costs
Access to the Hunkemöller Academy for professional development