The Cloud Architecture team in Enterprise Technology enables tech teams to build secure, modern, cloud-native solutions, whether on-premises or in the public cloud. We focus on filling the gaps between tools and platforms, accelerating adoption through proven patterns and hands-on support. We deliver reference architectures, code accelerators, and production-grade examples, while offering advisory services and collaborating with teams to help them deliver robust, scalable solutions.
Job Responsibilities:
Set up, test, and troubleshoot containerised development environments using Podman
Contribute to the development of code accelerators
Write small tools (mostly in Python, Bash or Terraform) to support developer workflows or demos
Participate in hands-on prototyping or demos of architectural patterns for internal teams
Create and update internal documentation and developer guides to make adoption easier for teams
Join pairing or working sessions with other engineers across the organisation to help smooth the path to adoption
Learn and explore new tools and cloud-native technologies (e.g. AWS, GCP, Snowflake, Databricks) under the guidance of senior team members
Requirements:
A strong academic background in Computer Science, Engineering, Data Science, or a related technical discipline
Proficiency in Java or Python
Strong written and verbal communication skills
Proactive learner with the ability to self-teach new tools and technologies
An ability to educate others
Familiarity with the Linux command line and shell scripting
Understanding of basic data concepts – file formats (e.g. CSV, Parquet), data pipelines, and storage layers
Exposure to containers and Podman or Docker
Comfortable using Git
Awareness of CI/CD practices and tools such as GitHub Actions or Azure DevOps
Nice to have:
Experience working with Apache Spark, Flink, or Kafka
Familiarity with object storage (e.g. AWS S3)
Knowledge of containerised development workflows (e.g. using VS Code)
Basic understanding of cloud platforms like AWS or GCP
Experience contributing to open-source or internal code templates, demos, or accelerators
Familiarity with data catalog services (e.g. Hive, Polaris, Glue)
What we offer:
27 days annual leave (plus bank holidays)
A discretionary annual performance-related bonus
Private Medical Care & Life Insurance
Employee Assistance Program
Pension Plan
Paid Parental Leave
Special discounts for employees, family, and friends
Access to an array of learning and development resources