Hunkemöller is looking for a Cloud Data Engineer to take a core role in our digital data transformation. This is a fantastic opportunity for a forward-thinking, adaptable data engineering professional to help build and scale the ingestion and infrastructure backbone of our next-generation data platform on GCP.
Job Responsibilities:
Build Ingestion Pipelines: Design, develop, and deploy robust data ingestion pipelines from various third-party APIs, webhooks, and source systems into Google Cloud
AI-Augmented Engineering: Actively leverage advanced AI coding assistants to accelerate pipeline development, generate boilerplate API connection code, debug complex scripts, and automate repetitive tasks
GCP Infrastructure & Orchestration: Build and manage data workflows using Cloud Composer (Airflow), and leverage Cloud Run and Dataflow for scalable, containerized data processing
Drive Reverse ETL: Architect and maintain the data pipelines that push refined data from BigQuery back into our operational platforms (marketing tools, CRM, etc.) to drive business action
Manage Operational Databases: Utilize Firestore and other NoSQL/relational databases to support operational data needs and microservices
Collaborate and Learn: Partner with our data modeling specialists to ensure smooth handoffs between ingestion and transformation
Participate in code reviews and continuously share new engineering best practices
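The ingestion work described above typically revolves around cursor-paginated third-party APIs. A minimal sketch of such a loop, with crude client-side rate limiting, is below; `fetch_page` and the `{"items", "next_cursor"}` page shape are hypothetical stand-ins for a real HTTP call, not any specific vendor's API:

```python
import time
from typing import Callable, Iterator, Optional

def paginate(
    fetch_page: Callable[[Optional[str]], dict],
    min_interval: float = 0.0,
) -> Iterator[dict]:
    """Yield every item from a cursor-paginated source, pacing requests.

    fetch_page(cursor) is assumed to return a page of the form
    {"items": [...], "next_cursor": "..." or None} — a hypothetical shape.
    """
    cursor = None
    last_call = 0.0
    while True:
        # Simple client-side rate limit: wait until min_interval has elapsed.
        wait = min_interval - (time.monotonic() - last_call)
        if wait > 0:
            time.sleep(wait)
        last_call = time.monotonic()
        page = fetch_page(cursor)
        yield from page["items"]
        cursor = page.get("next_cursor")
        if cursor is None:
            break
```

In a real pipeline the yielded items would be landed in Google Cloud (e.g. Cloud Storage or BigQuery) by a downstream step, keeping extraction and loading decoupled.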
Requirements:
3 to 5 years of hands-on experience in data engineering, with a strong focus on data integration, APIs, and pipeline architecture
A strong desire to learn quickly and adapt to new technologies
Strong Python programming skills with a proven track record of building custom API extractors, handling pagination, rate limiting, and working with REST/GraphQL endpoints
Hands-on experience with Google Cloud Platform's ecosystem, specifically Cloud Composer, Dataflow, Cloud Run, and Firestore
Proficient in writing clean, well-documented, and tested code (e.g., pytest), with strong experience using Git, Docker, and CI/CD pipelines
Excellent written and verbal English communication skills
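To illustrate the "clean, well-documented, and tested code" bar, here is a small pytest-style unit: a helper for backing off when an API returns a Retry-After header, plus its test. All names are illustrative, not from any particular codebase, and only the seconds form of the header is handled in this sketch:

```python
def retry_after_seconds(headers: dict, default: float = 1.0) -> float:
    """Return how long to back off after a 429, falling back to a default."""
    raw = headers.get("Retry-After")
    if raw is None:
        return default
    try:
        return max(float(raw), 0.0)  # clamp negative values to zero
    except ValueError:
        return default  # HTTP-date form is not handled in this sketch

def test_retry_after_seconds():  # discovered and run automatically by pytest
    assert retry_after_seconds({"Retry-After": "30"}) == 30.0
    assert retry_after_seconds({}) == 1.0
    assert retry_after_seconds({"Retry-After": "soon"}, default=2.0) == 2.0
```

Plain `assert`-based test functions like this need no extra framework code and slot directly into a CI/CD pipeline via `pytest`.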
Nice to have:
Experience managing Infrastructure as Code (specifically Terraform) or working with downstream data transformation tools (dbt)