
Cloud Technical Architect / Data DevOps Engineer


Hewlett Packard Enterprise


Location:
United Kingdom, Bristol

Category:
IT - Software Development


Contract Type:
Employment contract


Salary:

Not provided

Job Description:

The role involves designing, implementing, and optimizing scalable Big Data and cloud solutions while collaborating with internal and external teams. It requires expertise in a range of technologies including AWS, Kubernetes, containerization, and Infrastructure as Code. The position focuses on delivering client outcomes and technical excellence, aligned with HPE's culture of innovation and inclusion.

Job Responsibility:

  • Detailed development and implementation of scalable clustered Big Data solutions, with a specific focus on automated dynamic scaling and self-healing systems
  • Participating in the full lifecycle of data solution development, from requirements engineering through to continuous optimisation engineering and all the typical activities in between
  • Providing technical thought-leadership and advisory on technologies and processes at the core of the data domain, as well as data-domain-adjacent technologies
  • Engaging and collaborating with both internal and external teams, acting as a confident participant as well as a leader
  • Assisting with solution improvement activities driven either by the project or the service
  • Supporting the design and development of new capabilities: preparing solution options, investigating technology, designing and running proofs of concept, providing assessments, advice, and solution options, and producing high-level and low-level design documentation
  • Leveraging public cloud platforms through automated build processes deployed using Infrastructure as Code
  • Providing technical challenge and assurance throughout development and delivery of work
  • Developing reusable common solutions and patterns to reduce development lead times, improve commonality, and lower Total Cost of Ownership
  • Working independently and/or within a team using a DevOps way of working

Requirements:

  • An organised and methodical approach
  • Excellent time keeping and task prioritisation skills
  • An ability to provide clear and concise updates
  • An ability to convey technical concepts to all levels of audience
  • Data engineering skills – ETL/ELT
  • Technical implementation skills – application of industry best practices and design patterns
  • Technical advisory skills – experience in researching technological products/services with the intent to provide advice on system improvements
  • Experience of working in hybrid environments with both classical and DevOps ways of working
  • Excellent written and spoken English skills
  • Excellent knowledge of Linux operating system administration and implementation
  • Broad understanding of containerisation and adjacent technologies/services, such as Docker, OpenShift, Kubernetes, etc.
  • Infrastructure as Code and CI/CD paradigms and systems, such as Ansible, Terraform, Jenkins, Bamboo, Concourse, etc.
  • Monitoring utilising products such as Prometheus, Grafana, ELK, Filebeat, etc.
  • Observability and SRE practices
  • Big Data solutions (ecosystems) and technologies, such as Apache Spark and the Hadoop ecosystem
  • Edge technologies, e.g. NGINX, HAProxy, etc.
  • Excellent knowledge of YAML or similar languages
  • Experience with cloud-native technologies in AWS
  • Experience deploying IaaS/PaaS in multi-cloud environments
  • Experience in cloud and infrastructure engineering: building and testing new capabilities, and supporting the development of new solutions and common templates
  • Experience acting as a bridge from the infrastructure through to user-facing systems
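As a purely illustrative sketch of the ETL/ELT skills listed above (the file layout and field names are invented for the example and are not part of this role), a minimal extract-transform-load step in Python might look like:

```python
import csv
import io
import json

def etl(csv_text: str) -> str:
    """Extract rows from CSV text, transform them, and load them as JSON."""
    # Extract: parse the raw CSV input into dictionaries
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    # Transform: normalise names and cast the numeric field, dropping malformed rows
    cleaned = [
        {"name": r["name"].strip().title(), "score": int(r["score"])}
        for r in rows
        if r.get("score", "").strip().isdigit()
    ]
    # Load: serialise to JSON (a stand-in for a warehouse or object-store write)
    return json.dumps(cleaned)

raw = "name,score\n alice ,10\nBOB,twenty\ncarol,7\n"
print(etl(raw))
```

In a real pipeline the load step would write to a warehouse or object store rather than returning a JSON string, but the extract/transform/load shape is the same.
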

Nice to have:

  • JupyterHub awareness
  • MinIO or similar S3-compatible storage technology
  • Trino / Presto
  • RabbitMQ or other common queue technology, e.g. ActiveMQ
  • NiFi
  • Rego
  • Familiarity with code development and shell scripting in Python, Bash, etc.
  • Experience with containers on Kubernetes
  • Experience with automation tools, e.g. Terraform, Ansible, Foreman, Puppet, and Python
  • Experience with different flavours of Linux platforms and services
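The scripting familiarity mentioned above can also be sketched in Python. The following toy log summariser (the `LEVEL: message` log format is invented for illustration) is a hypothetical example of the kind of small automation script such work involves:

```python
from collections import Counter

def summarise_log(lines):
    """Count log lines by severity level (assumes 'LEVEL: message' lines)."""
    levels = Counter()
    for line in lines:
        # Split on the first colon; everything before it is the level
        level, _, _ = line.partition(":")
        levels[level.strip().upper()] += 1
    return dict(levels)

log = [
    "INFO: service started",
    "ERROR: disk quota exceeded",
    "info: heartbeat ok",
    "ERROR: connection reset",
]
print(summarise_log(log))  # prints {'INFO': 2, 'ERROR': 2}
```

The same job is often done with `grep -c` or `awk` in Bash; the point is comfort moving between shell one-liners and small scripts.
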

What we offer:
  • Extensive social benefits
  • Flexible working hours
  • Competitive salary
  • Shared values
  • Equal opportunities
  • Work-life balance
  • Evolving career opportunities
  • Comprehensive suite of benefits that supports physical, financial and emotional wellbeing

Additional Information:

Job Posted:
March 20, 2025

Employment Type:
Fulltime
Work Type:
On-site work



Similar Jobs for Cloud Technical Architect / Data DevOps Engineer

Senior Java Architect & Cloud Engineer

The Equity Middle Office technology group is actively transforming its technolog...
Location:
Singapore, Singapore
Salary:
Not provided
Citi
Expiration Date
Until further notice
Requirements:
  • Degree in Computer Science or Electronic/Electrical Engineering
  • ~15 years of banking software development experience, including management experience or equivalent
  • Knowledge of low-latency frameworks such as Chronicle / garbage-free programming in Java
  • Knowledge of IT infrastructure (i.e. IT networks, communications, and data centre management) and infra support operations
  • Working experience in Linux, Windows, Groovy, Python, JavaScript, Java, ELK, Bitbucket, Jenkins, Confluence, SonarQube, and Nexus, and scripting experience for integrations through APIs and CLIs to extract data and perform automated operations
  • Very strong experience in shell scripting and batch scripting for automation, command-line integration, and invoking REST APIs using Postman is mandatory
  • Must have hands-on experience in building microservices using Java and the Spring Boot framework stack
  • Working experience in messaging platforms such as AMPS, TIBCO, SOLACE, and MQ
  • Experience with relational SQL and NoSQL databases
  • Strong knowledge and experience in DevOps automation, containerization, and orchestration using tools such as Gradle, Maven, Docker, Kubernetes, Terraform, and Artifactory
Job Responsibility:
  • Be recognized as a trusted partner for business application owners and other technology teams seeking to make use of Cloud-based infrastructure
  • Define the technology roadmap and prioritize technical resources to achieve maximum success
  • Ensure the platform conforms to security best practices and is fully consistent with banking audit and compliance requirements, as well as with the design ethos and technical requirements of external cloud providers
  • Support adoption of containers and container control frameworks for internal Cloud Services, including container platform selection and design, ensuring that self-service design/deployment/control of web containers is appropriate for requirements
  • Ensure lifecycle management artefacts such as test cases and source code repositories are actively used and maintained
  • Recommend new services to complement and enhance infrastructure elements to streamline and support application development and deployment
  • Develop highly available infrastructures in a cloud services environment, preferably with platforms such as OpenShift or AWS
  • Implement Continuous Integration / Continuous Deployment practice, tooling, and techniques, particularly evidence of leading organizational and cultural change to adopt CI/CD practices (Jira, Confluence, BitBucket, Git, Jenkins, Artifactory, Terraform, Packer, Rundeck, Ansible, AWS, ELK, AppDynamics)
  • Enable AI-based monitoring automation to effectively detect/predict/prevent issues in the environment and code base
Employment Type: Fulltime

Data Architect

The Data Architect will be responsible for designing, developing, and implementi...
Location:
United States, Andover, Massachusetts
Salary:
155500.00 - 376000.00 USD / Year
Hewlett Packard Enterprise
Expiration Date
Until further notice
Requirements:
  • 15+ years of relevant experience in the industry delivering technical and business strategy at an advanced/strategist level
  • Bachelor's, Master's, or PhD degree in Computer Science, Information Systems, Engineering, or equivalent
  • Strong understanding of data architecture, cloud infrastructure, and data management technologies
  • Proven experience driving innovations in data solutions and the productization of advanced development activities
  • Must have a track record of architecting, building, and deploying mission-critical, highly distributed, data-centric applications and solutions
  • Experience with at least one major IaaS and/or PaaS technology (OpenStack, AWS, Azure, VMware, etc.), including defining and scripting full topologies
  • Must be able to work in a global, complex, and diverse environment
Job Responsibility:
  • Designing, developing, and implementing robust data solutions to support business objectives
  • Collaborating with cloud and data architects to design and set standards for the HPE GreenLake Hybrid Cloud platform and data solution portfolio
  • Driving evaluation of data storage and integration solutions and conducting research on platform behavior under different workloads
  • Designing and implementing data pipelines
  • Optimizing data storage solutions
  • Establishing best practices for data integration and analysis
  • Leading advanced development teams building proof-of-concept implementations for data platforms and solutions
  • Acting as a cross-functional product and technical expert for hybrid cloud and data technologies
  • Providing consultation, design input, and feedback for product development and design reviews across multiple organizations
  • Guiding and mentoring less-experienced staff members
What we offer:
  • Comprehensive suite of benefits that supports physical, financial and emotional wellbeing
  • Specific programs catered to career goals
  • Unconditionally inclusive work culture that celebrates individual uniqueness
  • Flexibility to manage work and personal needs
  • Opportunities for professional growth and development
Employment Type: Fulltime

Senior Solutions Architect & Customer Advisor – Hybrid Cloud, Data and AI

We are seeking an exceptional Senior Solutions Architect & Customer Advisor to s...
Location:
United States
Salary:
161000.00 - 268300.00 USD / Year
Rackspace
Expiration Date
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Information Technology, Engineering or related field
  • MBA or relevant graduate degree preferred
  • 15+ years of hands-on experience in cloud architecture, enterprise infrastructure or solutions architecture
  • Proven track record designing and implementing large-scale cloud solutions for enterprise clients
  • Deep expertise in Hybrid Cloud and Data Center architecture with focus on at least two major platforms (AWS, Azure, GCP) and experience with private cloud platforms (VMware, OpenStack)
  • Strong technical foundation in infrastructure, networking, security, and cloud-native development
  • Demonstrated ability to architect solutions from requirements through implementation
  • Experience working with Banking & Financial Services or Healthcare clients on cloud initiatives
  • Information security, risk management, and regulatory expertise
  • Understanding of industry-specific operational models and applications
Job Responsibility:
  • Lead technical discovery engagements to assess current cloud maturity, identify hybrid cloud opportunities, and understand infrastructure and application modernization requirements
  • Conduct comprehensive assessments of customer technology environments to identify gaps, risks, and transformation opportunities
  • Translate customer business challenges into technical requirements and cloud architecture specifications
  • Design end-to-end cloud solutions across private cloud, hybrid cloud, and data center ecosystems, including network and cybersecurity
  • Develop technical roadmaps that align cloud strategy with business objectives
  • Create proofs-of-concept, reference architectures, and migration strategies
  • Architect and optimize hybrid cloud solutions across AWS, Azure, and/or GCP platforms, ensuring resilience, scalability, and operational excellence
  • Design cloud-native architectures leveraging infrastructure-as-code (Terraform, CloudFormation), automation, and DevOps practices
  • Provide technical leadership on data center modernization, VMware environments, and private cloud platforms
  • Advise on AI/ML integration, intelligent automation, and emerging cloud technologies
What we offer:
  • Incentive compensation opportunities in the form of annual bonuses or incentives, equity awards, and an Employee Stock Purchase Plan (ESPP)
Employment Type: Fulltime

Senior Data Engineer

Our Senior Data Engineers enable public sector organisations to embrace a data-d...
Location:
United Kingdom, Bristol; London; Manchester; Swansea
Salary:
60000.00 - 80000.00 GBP / Year
Made Tech
Expiration Date
Until further notice
Requirements:
  • Enthusiasm for learning and self-development
  • Proficiency in Git (inc. Github Actions) and able to explain the benefits of different branch strategies
  • Gathering and meeting the requirements of both clients and users on a data project
  • Strong experience in IaC and able to guide how one could deploy infrastructure into different environments
  • Owning the cloud infrastructure underpinning data systems through a DevOps approach
  • Knowledge of handling and transforming various data types (JSON, CSV, etc) with Apache Spark, Databricks or Hadoop
  • Good understanding of the possible architectures involved in modern data system design (e.g. Data Warehouse, Data Lakes and Data Meshes) and the different use cases for them
  • Ability to create data pipelines in a cloud environment and integrate error handling within these pipelines, with an understanding of how to create reusable libraries to encourage uniformity of approach across multiple data pipelines
  • Able to document and present an end-to-end diagram to explain a data processing system in a cloud environment, with some knowledge of how you would present diagrams (C4, UML, etc.)
  • Ability to provide guidance on how one would implement a robust DevOps approach in a data project, and to discuss the tools needed for DataOps in areas such as orchestration, data integration, and data analytics
Job Responsibility:
  • Enable public sector organisations to embrace a data-driven approach by providing data platforms and services that are high-quality, cost-efficient, and tailored to clients’ needs
  • Develop, operate, and maintain these services
  • Provide maximum value to data consumers, including analysts, scientists, and business stakeholders
  • Play one or more roles according to our clients' needs
  • Support as a senior contributor for a project, focusing on both delivering engineering work as well as upskilling members of the client team
  • Play more of a technical architect role and work with the larger Made Tech team to identify growth opportunities within the account
  • Have a drive to deliver outcomes for users
  • Make sure that the wider context of a delivery is considered and maintain alignment between the operational and analytical aspects of the engineering solution
What we offer:
  • 30 days of paid annual leave + bank holidays
  • Flexible Parental Leave
  • Part time remote working for all our staff
  • Paid counselling as well as financial and legal advice
  • Flexible benefit platform which includes a Smart Tech scheme, Cycle to work scheme, and an individual benefits allowance which you can invest in a Health care cash plan or Pension plan
  • Optional social and wellbeing calendar of events
Employment Type: Fulltime

Azure Data Architect

We are offering an exciting opportunity for an Azure Data Architect within the O...
Location:
United States, Houston
Salary:
Not provided
Robert Half
Expiration Date
Until further notice
Requirements:
  • Comprehensive understanding of Azure products and Data platforms such as Databricks
  • Strong collaboration skills with Enterprise Architects, Cloud Engineering, Business Architects, and Product teams
  • Proficient in Azure Databricks, Azure DevOps, Azure Blob Storage, Azure Data Lake
  • Experienced in scripting languages like Python, Bash, or PowerShell
  • Knowledge of cloud security principles and best practices
  • Familiarity with Azure Resource Manager, Virtual Networks, Azure Blob Storage, Azure Automation, Azure Active Directory, and Azure Site Recovery
Job Responsibility:
  • Collaborate with multiple teams to design and implement solutions
  • Ensure solutions are optimized for performance, cost, and compliance
  • Operate hands-on with Azure products and Data platforms
  • Develop and deploy Cloud Native Applications using Azure PaaS Capabilities
  • Manage cloud deployment, technical and security architecture, database architecture, virtualization, software design, networking, DevOps, and DevSecOps
  • Employ Azure data services
  • Utilize scripting languages to automate routine tasks
  • Implement IAM, Authentication and Authorization of applications
  • Utilize knowledge of cloud security principles and best practices
  • Handle Azure Resource Manager, Virtual Networks, Azure Blob Storage, Azure Automation, Azure Active Directory, and Azure Site Recovery
What we offer:
  • Medical, vision, dental, life and disability insurance
  • Eligibility to enroll in company 401(k) plan
Employment Type: Fulltime

Data Architect

We are seeking an experienced Data Architect with deep technical expertise and a...
Location:
United States
Salary:
Not provided
InData Labs
Expiration Date
Until further notice
Requirements:
  • 7+ years of experience in data architecture, data engineering, or database design
  • Proven experience designing large-scale data systems in cloud environments (AWS, Azure, or GCP)
  • Strong expertise in relational and non-relational databases (e.g., PostgreSQL, SQL Server, MongoDB, Snowflake, Redshift, BigQuery)
  • Proficiency in data modeling tools (e.g., ER/Studio, ERwin, dbt, Lucidchart)
  • Hands-on experience with ETL frameworks, data pipelines, and orchestration tools (e.g., Apache Airflow, Fivetran, Talend)
  • Solid understanding of data governance, metadata management, and data lineage tools
  • Experience working with modern data stack technologies (e.g., Databricks, Kafka, Spark, dbt)
  • Strong SQL and at least one programming language (Python, Scala, or Java)
  • Excellent communication and leadership skills
  • Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or related field
Job Responsibility:
  • Design and implement enterprise-grade data architectures to support analytics, reporting, and operational needs
  • Define data standards, data flows, and governance frameworks across systems and departments
  • Collaborate with data engineers, analysts, and business stakeholders to translate business requirements into technical data solutions
  • Develop and maintain logical and physical data models using modern modeling tools
  • Oversee data integration strategies including ETL/ELT pipelines, APIs, and real-time data ingestion
  • Evaluate, recommend, and implement new data technologies and tools aligned with industry best practices
  • Ensure data quality, security, and compliance across all platforms
  • Act as a technical mentor to engineering and analytics teams, promoting architectural consistency and knowledge sharing
  • Partner with DevOps and infrastructure teams to ensure optimal deployment, scalability, and performance of data systems
  • Lead initiatives in data warehousing, master data management, and data lakes (on-premise and cloud)
What we offer:
  • 100% remote with flexible hours
  • Work from anywhere in the world
  • Be part of a senior, talented, and supportive team
  • Flat structure – your input is always welcome
  • Clients in the US and Europe, projects with real impact
  • Room to grow and experiment with cutting-edge AI solutions

Data Engineer

We are looking for a seasoned Data Engineer to join our MarTech and Data Strateg...
Location:
United States
Salary:
Not provided
Zion & Zion
Expiration Date
Until further notice
Requirements:
  • 4-6 years of experience in a data engineering role
  • Significant experience and (preferably) certified in public cloud products: GCP Cloud Architect / Data Engineer or equivalent on AWS
  • Experience with tools like dbt
  • Familiar with ETL/ELT and (nice to have) reverse ETL platforms
  • Experience on a cloud-based or Business Intelligence project (as a technical project manager, developer or architect)
  • DevOps capabilities (CI/CD, Infrastructure as code, Docker, etc.)
  • Experience leveraging digital data in the cloud for marketing activations
  • Excellent verbal and written communication skills and comfortable working with both marketing and technical teams
  • Client-facing experience for detailed technical specifications discussions
  • Fluent in SQL, Python or R
Job Responsibility:
  • Work with internal and external teams to design and implement technical architecture in order to facilitate the advanced activation of data
  • Interacting with 3rd party MarTech solutions (Google Analytics, Google Marketing Platform, Data Management Platforms, Tag Management Solutions, Adobe Analytics, Clouds, Customer Data Platforms, etc.)
  • Come up with creative solutions to integrate data from a variety of sources into platforms and data warehouses
  • Work with internal teams to specify data processing pipelines (database schemas, integrity constraints, delivery throughput) for use case activations and implement it in the cloud
  • Work with internal data science team to scope and check typical machine learning and AI project requirements
  • Work with data visualization teams to design and implement tables to help power complex dashboards

Senior Data Solutions Architect with AWS

Provectus, a leading AI consultancy and solutions provider specializing in Data ...
Location:
Poland, Wroclaw
Salary:
Not provided
Provectus
Expiration Date
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
  • Minimum of 8 years of experience in data solution architecture, with at least 3 years focused on AWS
  • Proven experience in designing and implementing large-scale data engineering solutions on AWS
  • Experience with Databricks is a plus
  • Deep expertise in AWS platform services, including S3, EC2, Lambda, EMR, Glue, Redshift, AWS MSK, and EKS
  • Proficient in programming languages like Python, SQL, and Scala
  • Experience with data warehousing, ETL processes, and real-time data streaming
  • Familiarity with open-source technologies and tools commonly used in data engineering
  • AWS Certified Solutions Architect – Professional or similar AWS certifications are a plus
  • Excellent communication and presentation skills, with the ability to articulate complex technical concepts to non-technical stakeholders
Job Responsibility:
  • Lead complex, high-impact customer engagements focused on AWS Data Platform solutions
  • Define and drive technical strategies that align AWS capabilities with customer business objectives, incorporating Databricks (as a plus) solutions where appropriate
  • Architect and design scalable data platforms using AWS, ensuring optimal performance, reliability, security, and cost efficiency
  • Evaluate and select appropriate technologies and tools to meet customer needs, integrating AWS services with other solutions such as Databricks, and Snowflake as necessary
  • Establish, fulfill, and maintain comprehensive architectural documentation to ensure alignment with technical standards and best practices across the organization
  • Collaborate with the sales team during the pre-sales process by providing technical expertise to position AWS-based data solutions effectively
  • Participate in customer meetings to assess technical needs, scope potential solutions, and identify opportunities for growth
  • Create technical proposals, solution architectures, and presentations to support sales efforts and ensure alignment with customer expectations
  • Assist in responding to RFPs/RFIs by providing accurate technical input and aligning solutions to client requirements
  • Demonstrate AWS capabilities through POCs (Proof of Concepts) and technical demonstrations to help customers evaluate the proposed solutions