Cloud Technical Architect / Data DevOps Engineer

Hewlett Packard Enterprise

Location:
United Kingdom, Bristol

Contract Type:
Employment contract

Salary:

Not provided

Job Description:

The role involves designing, implementing, and optimizing scalable Big Data and cloud solutions while collaborating with internal and external teams. It requires expertise in a range of technologies including AWS, Kubernetes, containerization, and Infrastructure as Code. The position focuses on delivering client outcomes and technical excellence, aligned with HPE's culture of innovation and inclusion.

Job Responsibility:

  • Detailed development and implementation of scalable clustered Big Data solutions, with a specific focus on automated dynamic scaling and self-healing systems
  • Participating in the full lifecycle of data solution development, from requirements engineering through to continuous optimisation engineering and all the typical activities in between
  • Providing technical thought-leadership and advisory on technologies and processes at the core of the data domain, as well as data-domain-adjacent technologies
  • Engaging and collaborating with both internal and external teams, as a confident participant as well as a leader
  • Assisting with solution improvement activities driven either by the project or the service
  • Supporting the design and development of new capabilities: preparing solution options, investigating technologies, designing and running proofs of concept, providing assessments and advice, and producing high-level and low-level design documentation
  • Leveraging public cloud platforms through automated build processes deployed using Infrastructure as Code
  • Providing technical challenge and assurance throughout the development and delivery of work
  • Developing reusable common solutions and patterns to reduce development lead times, improve commonality, and lower Total Cost of Ownership
  • Working independently and/or within a team using a DevOps way of working

Requirements:

  • An organised and methodical approach
  • Excellent time keeping and task prioritisation skills
  • An ability to provide clear and concise updates
  • An ability to convey technical concepts to all levels of audience
  • Data engineering skills – ETL/ELT
  • Technical implementation skills – application of industry best practices & design patterns
  • Technical advisory skills – experience in researching technological products / services with the intent to provide advice on system improvements
  • Experience of working in hybrid environments with both classical and DevOps ways of working
  • Excellent written & spoken English skills
  • Excellent knowledge of Linux operating system administration and implementation
  • Broad understanding of containerisation and adjacent technologies/services, such as Docker, OpenShift, Kubernetes, etc.
  • Infrastructure as Code and CI/CD paradigms and systems such as: Ansible, Terraform, Jenkins, Bamboo, Concourse etc.
  • Monitoring utilising products such as: Prometheus, Grafana, ELK, filebeat etc.
  • Observability and SRE practices
  • Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop Ecosystem
  • Edge technologies e.g. NGINX, HAProxy etc.
  • Excellent knowledge of YAML or similar languages
  • Experienced in Cloud native technologies in AWS
  • Experienced in deploying IaaS/PaaS in Multi Cloud Environments
  • Experienced in Cloud and Infrastructure Engineering building and testing new capabilities, and supporting the development of new solutions and common templates
  • Experienced in acting as a bridge from the infrastructure through to user-facing systems
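As an illustration of the ETL/ELT data engineering skill listed above, here is a minimal extract-transform-load sketch; the CSV source, the "region" and "amount" fields, and the in-memory "warehouse" are hypothetical stand-ins for real source and target systems.

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Read rows from a CSV string (stand-in for a real source system)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Normalise the hypothetical 'amount' field to a float; drop bad rows."""
    out = []
    for row in rows:
        try:
            out.append({**row, "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # skip malformed rows rather than failing the whole load
    return out

def load(rows: list[dict]) -> dict[str, float]:
    """Aggregate into a tiny in-memory 'warehouse' table keyed by region."""
    table: dict[str, float] = {}
    for row in rows:
        table[row["region"]] = table.get(row["region"], 0.0) + row["amount"]
    return table

raw = "region,amount\nuk,10.5\nuk,4.5\nus,3.0\nus,not-a-number\n"
warehouse = load(transform(extract(raw)))
print(warehouse)  # {'uk': 15.0, 'us': 3.0}
```

In a real pipeline each stage would read from and write to durable storage, but the three-stage structure is the same.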

Nice to have:

  • Awareness of JupyterHub
  • MinIO or similar S3-compatible storage technology
  • Trino / Presto
  • RabbitMQ or other common queue technology e.g. ActiveMQ
  • NiFi
  • Rego
  • Familiarity with code development and shell scripting in Python, Bash, etc.
  • Experience with Kubernetes containers
  • Experienced in the use of Automation tools e.g. Terraform, Ansible, Foreman, Puppet and Python
  • Experienced in different flavours of Linux platform and services
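The scripting and automation items above (Python, Bash, Ansible, Terraform) often amount to small glue utilities. As a minimal sketch, the following renders an Ansible-style INI inventory from a host dictionary; the group and host names are hypothetical.

```python
def render_inventory(groups: dict[str, list[str]]) -> str:
    """Render {group: [hosts]} as an INI-style Ansible inventory string."""
    lines = []
    for group, hosts in sorted(groups.items()):
        lines.append(f"[{group}]")      # group header, e.g. [web]
        lines.extend(sorted(hosts))     # one host per line
        lines.append("")                # blank line between groups
    return "\n".join(lines).rstrip() + "\n"

inventory = render_inventory({
    "web": ["web01.example.net", "web02.example.net"],
    "db": ["db01.example.net"],
})
print(inventory)
```

The same idea scales up to generating inventories or variable files from a CMDB or cloud API instead of a hard-coded dictionary.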

What we offer:
  • Extensive social benefits
  • Flexible working hours
  • Competitive salary
  • Shared values
  • Equal opportunities
  • Work-life balance
  • Evolving career opportunities
  • Comprehensive suite of benefits that supports physical, financial and emotional wellbeing

Additional Information:

Job Posted:
March 20, 2025

Employment Type:
Full-time
Work Type:
On-site work

Similar Jobs for Cloud Technical Architect / Data DevOps Engineer

Senior Java Architect & Cloud Engineer

The Equity Middle Office technology group is actively transforming its technolog...
Location:
Singapore, Singapore
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • Degree in Computer Science or Electronic/Electrical Engineering
  • ~15 years of Banking Software development experience, including management experiences or equivalent
  • Knowledge of low-latency frameworks such as Chronicle / garbage-free programming in Java
  • Knowledge of IT Infrastructure (i.e. IT Networks, Communications, and Data Centre Management) and Infra Support Operations
  • Working experience in Linux operating system, Windows, Groovy, Python, JavaScript, Java, ELK, Bitbucket, Jenkins, Confluence, SonarQube, Nexus, and scripting experience for integrations through APIs and CLIs to extract data and perform automated operations
  • Very strong experience in shell scripting and batch scripting for automation, command-line integration, and invoking REST APIs using Postman is mandatory
  • Must have hands-on experience building microservices using Java and the Spring Boot framework stack
  • Working experience with messaging platforms such as AMPS, TIBCO, SOLACE, and MQ
  • Experience with relational SQL and NoSQL database
  • Strong knowledge and experience in DevOps automation, containerization and orchestration using tools such as Gradle, Maven, Docker, Kubernetes, Terraform, Artifactory
Job Responsibility:
  • Be recognized as a trusted partner for business application owners and other technology teams who seek to make use of Cloud based infrastructure
  • Define the technology roadmap and prioritize technical resources to achieve maximum success
  • Ensuring the platform conforms to security best practices and is fully consistent with banking audit and compliance requirements, as well as with the design ethos and technical requirements of external cloud providers
  • Supporting adoption of containers and container control frameworks for internal Cloud Services, including container platform selection and design, and ensuring that self-service design, deployment, and control of web containers is appropriate for requirements
  • Ensuring lifecycle management artefacts such as documentation, test cases, and source code repositories are actively used and maintained
  • Recommend new services to complement and enhance infrastructure elements to stream-line and support applications development and deployment
  • Developing highly available infrastructures in a cloud services environment, preferably with cloud providers such as OpenShift or AWS
  • Implement Continuous Integration / Continuous Deployment practice, tooling, and techniques, particularly evidence of leading organizational and cultural change to adopt CI/CD practices (Jira, Confluence, BitBucket, Git, Jenkins, Artifactory, Terraform, Packer, Rundeck, Ansible, AWS, ELK, AppDynamics)
  • Enable AI based monitoring automation to effectively detect/predict/prevent issues in the environment and code base
Employment Type: Full-time

Data Architect

The Data Architect will be responsible for designing, developing, and implementi...
Location:
United States, Andover, Massachusetts
Salary:
155500.00 - 376000.00 USD / Year
Hewlett Packard Enterprise
Expiration Date:
Until further notice
Requirements:
  • 15+ years of relevant experience in the industry delivering technical and business strategy at an advanced/strategist level
  • Bachelor's, Master's, or PhD degree in Computer Science, Information Systems, Engineering, or equivalent
  • Strong understanding of data architecture, cloud infrastructure, and data management technologies
  • Proven experience driving innovations in data solutions and the productization of advanced development activities
  • Must have a track record of architecting, building, and deploying mission-critical, highly distributed, data-centric applications and solutions
  • Experience with at least one major IaaS and/or PaaS technology (OpenStack, AWS, Azure, VMware, etc.), including defining and scripting full topologies
  • Must be able to work in a global, complex, and diverse environment
Job Responsibility:
  • Designing, developing, and implementing robust data solutions to support business objectives
  • Collaborating with cloud and data architects to design and set standards for the HPE GreenLake Hybrid Cloud platform and data solution portfolio
  • Driving evaluation of data storage and integration solutions and conducting research on platform behavior under different workloads
  • Designing and implementing data pipelines
  • Optimizing data storage solutions
  • Establishing best practices for data integration and analysis
  • Leading advanced development teams building proof-of-concept implementations for data platforms and solutions
  • Acting as a cross-functional product and technical expert for hybrid cloud and data technologies
  • Providing consultation, design input, and feedback for product development and design reviews across multiple organizations
  • Guiding and mentoring less-experienced staff members
What we offer:
  • Comprehensive suite of benefits that supports physical, financial and emotional wellbeing
  • Specific programs catered to career goals
  • Unconditionally inclusive work culture that celebrates individual uniqueness
  • Flexibility to manage work and personal needs
  • Opportunities for professional growth and development
Employment Type: Full-time

Senior Data Engineer

Our Senior Data Engineers enable public sector organisations to embrace a data-d...
Location:
United Kingdom, Bristol; London; Manchester; Swansea
Salary:
60000.00 - 80000.00 GBP / Year
Made Tech
Expiration Date:
Until further notice
Requirements:
  • Enthusiasm for learning and self-development
  • Proficiency in Git (inc. Github Actions) and able to explain the benefits of different branch strategies
  • Gathering and meeting the requirements of both clients and users on a data project
  • Strong experience in IaC and able to guide how one could deploy infrastructure into different environments
  • Owning the cloud infrastructure underpinning data systems through a DevOps approach
  • Knowledge of handling and transforming various data types (JSON, CSV, etc) with Apache Spark, Databricks or Hadoop
  • Good understanding of the possible architectures involved in modern data system design (e.g. Data Warehouse, Data Lakes and Data Meshes) and the different use cases for them
  • Ability to create data pipelines in a cloud environment and integrate error handling within these pipelines, with an understanding of how to create reusable libraries to encourage a uniform approach across multiple data pipelines
  • Able to document and present an end-to-end diagram explaining a data processing system in a cloud environment, with some knowledge of diagramming conventions (C4, UML, etc.)
  • Ability to provide guidance on implementing a robust DevOps approach in a data project, and to discuss the tools needed for DataOps in areas such as orchestration, data integration, and data analytics
Job Responsibility:
  • Enable public sector organisations to embrace a data-driven approach by providing data platforms and services that are high-quality, cost-efficient, and tailored to clients’ needs
  • Develop, operate, and maintain these services
  • Provide maximum value to data consumers, including analysts, scientists, and business stakeholders
  • Play one or more roles according to our clients' needs
  • Support as a senior contributor for a project, focusing on both delivering engineering work as well as upskilling members of the client team
  • Play more of a technical architect role and work with the larger MadeTech team to identify growth opportunities within the account
  • Have a drive to deliver outcomes for users
  • Make sure that the wider context of a delivery is considered and maintain alignment between the operational and analytical aspects of the engineering solution
What we offer:
  • 30 days of paid annual leave + bank holidays
  • Flexible Parental Leave
  • Part time remote working for all our staff
  • Paid counselling as well as financial and legal advice
  • Flexible benefit platform which includes a Smart Tech scheme, Cycle to work scheme, and an individual benefits allowance which you can invest in a Health care cash plan or Pension plan
  • Optional social and wellbeing calendar of events
Employment Type: Full-time

Azure Data Architect

We are offering an exciting opportunity for an Azure Data Architect within the O...
Location:
United States, Houston
Salary:
Not provided
Robert Half
Expiration Date:
Until further notice
Requirements:
  • Comprehensive understanding of Azure products and Data platforms such as Databricks
  • Strong collaboration skills with Enterprise Architects, Cloud Engineering, Business Architects, and Product teams
  • Proficient in Azure Databricks, Azure DevOps, Azure Blob Storage, Azure Data Lake
  • Experienced in scripting languages like Python, Bash, or Powershell
  • Knowledge of cloud security principles and best practices
  • Familiarity with Azure Resource Manager, Virtual Networks, Azure Blob Storage, Azure Automation, Azure Active Directory, and Azure Site Recovery
Job Responsibility:
  • Collaborate with multiple teams to design and implement solutions
  • Ensure solutions are optimized for performance, cost, and compliance
  • Operate hands-on with Azure products and Data platforms
  • Develop and deploy Cloud Native Applications using Azure PaaS Capabilities
  • Manage cloud deployment, technical and security architecture, database architecture, virtualization, software design, networking, DevOps, and DevSecOps
  • Employ Azure data services
  • Utilize scripting languages to automate routine tasks
  • Implement IAM, Authentication and Authorization of applications
  • Utilize knowledge of cloud security principles and best practices
  • Handle Azure Resource Manager, Virtual Networks, Azure Blob Storage, Azure Automation, Azure Active Directory, and Azure Site Recovery
What we offer:
  • Medical, vision, dental, life and disability insurance
  • Eligibility to enroll in company 401(k) plan
Employment Type: Full-time

Data Architect

We are seeking an experienced Data Architect with deep technical expertise and a...
Location:
United States
Salary:
Not provided
InData Labs
Expiration Date:
Until further notice
Requirements:
  • 7+ years of experience in data architecture, data engineering, or database design
  • Proven experience designing large-scale data systems in cloud environments (AWS, Azure, or GCP)
  • Strong expertise in relational and non-relational databases (e.g., PostgreSQL, SQL Server, MongoDB, Snowflake, Redshift, BigQuery)
  • Proficiency in data modeling tools (e.g., ER/Studio, ERwin, dbt, Lucidchart)
  • Hands-on experience with ETL frameworks, data pipelines, and orchestration tools (e.g., Apache Airflow, Fivetran, Talend)
  • Solid understanding of data governance, metadata management, and data lineage tools
  • Experience working with modern data stack technologies (e.g., Databricks, Kafka, Spark, dbt)
  • Strong SQL and at least one programming language (Python, Scala, or Java)
  • Excellent communication and leadership skills
  • Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or related field
Job Responsibility:
  • Design and implement enterprise-grade data architectures to support analytics, reporting, and operational needs
  • Define data standards, data flows, and governance frameworks across systems and departments
  • Collaborate with data engineers, analysts, and business stakeholders to translate business requirements into technical data solutions
  • Develop and maintain logical and physical data models using modern modeling tools
  • Oversee data integration strategies including ETL/ELT pipelines, APIs, and real-time data ingestion
  • Evaluate, recommend, and implement new data technologies and tools aligned with industry best practices
  • Ensure data quality, security, and compliance across all platforms
  • Act as a technical mentor to engineering and analytics teams, promoting architectural consistency and knowledge sharing
  • Partner with DevOps and infrastructure teams to ensure optimal deployment, scalability, and performance of data systems
  • Lead initiatives in data warehousing, master data management, and data lakes (on-premise and cloud)
What we offer:
  • 100% remote with flexible hours
  • Work from anywhere in the world
  • Be part of a senior, talented, and supportive team
  • Flat structure – your input is always welcome
  • Clients in the US and Europe, projects with real impact
  • Room to grow and experiment with cutting-edge AI solutions

Data Engineer

We are looking for a seasoned Data Engineer to join our MarTech and Data Strateg...
Location:
United States
Salary:
Not provided
Zion & Zion
Expiration Date:
Until further notice
Requirements:
  • 4–6 years of experience in a data engineering role
  • Significant experience and (preferably) certified in public cloud products: GCP Cloud Architect / Data Engineer or equivalent on AWS
  • Experience with tools like dbt
  • Familiar with ETL/ELT and (nice to have) reverse ETL platforms
  • Experience on a cloud-based or Business Intelligence project (as a technical project manager, developer or architect)
  • DevOps capabilities (CI/CD, Infrastructure as code, Docker, etc.)
  • Experience leveraging digital data in the cloud for marketing activations
  • Excellent verbal and written communication skills and comfortable working with both marketing and technical teams
  • Client-facing experience for detailed technical specifications discussions
  • Fluent in SQL, Python or R
Job Responsibility:
  • Work with internal and external teams to design and implement technical architecture in order to facilitate the advanced activation of data
  • Interacting with 3rd party MarTech solutions (Google Analytics, Google Marketing Platform, Data Management Platforms, Tag Management Solutions, Adobe Analytics, Clouds, Customer Data Platforms, etc.)
  • Come up with creative solutions to integrate data from a variety of sources into platforms and data warehouses
  • Work with internal teams to specify data processing pipelines (database schemas, integrity constraints, delivery throughput) for use case activations and implement them in the cloud
  • Work with internal data science team to scope and check typical machine learning and AI project requirements
  • Work with data visualization teams to design and implement tables to help power complex dashboards

Senior Azure Data Engineer

Seeking a Lead AI DevOps Engineer to oversee design and delivery of advanced AI/...
Location:
Poland
Salary:
Not provided
Lingaro
Expiration Date:
Until further notice
Requirements:
  • At least 6 years of professional experience in the Data & Analytics area
  • 1+ years of experience (or acting) in a Senior Consultant or above role with a strong focus on data solutions built in Azure and Databricks/Synapse (MS Fabric is nice to have)
  • Proven experience in Azure cloud-based infrastructure, Databricks and one of SQL implementation (e.g., Oracle, T-SQL, MySQL, etc.)
  • Proficiency in programming languages such as SQL, Python, PySpark is essential (R or Scala nice to have)
  • Very good level of communication including ability to convey information clearly and specifically to co-workers and business stakeholders
  • Working experience in the agile methodologies – supporting tools (JIRA, Azure DevOps)
  • Experience in leading and managing a team of data engineers, providing guidance, mentorship, and technical support
  • Knowledge of data management principles and best practices, including data governance, data quality, and data integration
  • Good project management skills, with the ability to prioritize tasks, manage timelines, and deliver high-quality results within designated deadlines
  • Excellent problem-solving and analytical skills, with the ability to identify and resolve complex data engineering issues
Job Responsibility:
  • Act as a senior member of the Data Science & AI Competency Center, AI Engineering team, guiding delivery and coordinating workstreams
  • Develop and execute a cloud data strategy aligned with organizational goals
  • Lead data integration efforts, including ETL processes, to ensure seamless data flow
  • Implement security measures and compliance standards in cloud environments
  • Continuously monitor and optimize data solutions for cost-efficiency
  • Establish and enforce data governance and quality standards
  • Leverage Azure services, as well as tools like dbt and Databricks, for efficient data pipelines and analytics solutions
  • Work with cross-functional teams to understand requirements and provide data solutions
  • Maintain comprehensive documentation for data architecture and solutions
  • Mentor junior team members in cloud data architecture best practices
What we offer:
  • Stable employment
  • “Office as an option” model
  • Workation
  • Great Place to Work® certified employer
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs
  • Upskilling support

Cloud Solutions Architect

Location:
Canada, Burlington; Toronto
Salary:
Not provided
JK Contracting and Consulting Inc.
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Information Technology, or a related field
  • Proven experience as a Cloud Solutions Architect, Cloud Engineer, or a similar role
  • In-depth knowledge of cloud platforms such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP)
  • Strong understanding of cloud services, including compute, storage, databases, networking, security, and DevOps tools
  • Experience in designing cloud-based architectures with a focus on scalability, availability, and cost-efficiency
  • Familiarity with containerization technologies (e.g., Docker, Kubernetes) and infrastructure-as-code tools (e.g., Terraform, CloudFormation)
  • Experience with cloud security, compliance, and data protection standards (e.g., GDPR, HIPAA)
  • Strong problem-solving and troubleshooting skills
  • Excellent communication and collaboration skills, with the ability to work with both technical and non-technical teams
Job Responsibility:
  • Design and implement cloud architecture solutions based on business needs, leveraging cloud services like AWS, Microsoft Azure, and Google Cloud
  • Develop strategies for cloud migration, scalability, and disaster recovery
  • Create and maintain architecture documentation, including system designs, technical specifications, and implementation plans
  • Collaborate with software developers, network engineers, and other stakeholders to ensure seamless integration of cloud solutions with existing systems
  • Optimize cloud infrastructure for performance, cost-efficiency, and security
  • Ensure compliance with industry standards, regulatory requirements, and best practices for cloud security and data privacy
  • Monitor and evaluate the performance of cloud systems to ensure reliability and minimize downtime
  • Provide guidance and technical expertise on cloud technologies, best practices, and cloud-native design patterns
  • Conduct security assessments and recommend security measures for cloud-based applications and services
  • Assist in the implementation of automation tools and DevOps practices to improve system efficiency and delivery