This is a DevOps and Infrastructure Engineer position in the Federal Risk Infrastructure group, an exciting role for a self-starter with a thirst for new challenges and new technologies. It involves close collaboration with development, infrastructure, and platform teams across multiple regions. The candidate will build high-quality DevOps solutions spanning cloud and on-prem systems, CI/CD workflows, process automation, release management, and observability. The role offers a strong career development path with many opportunities to learn and grow.
Job Responsibilities:
Leverage expertise throughout the DevOps lifecycle to build and maintain CI/CD solutions
Onboard and maintain infrastructure components for the ORS platform
Develop and maintain DevOps tooling and automation solutions to reduce toil and optimize processes (a minimal sketch of such a script follows this list)
Drive environment stability initiatives and lay the foundations for automation adoption, meeting the Firm's infrastructure and software hygiene requirements
Implement observability and telemetry solutions for complex systems, leveraging both cloud-native and open-source tools
Liaise with stakeholders, global team members, and RPE, fostering an inclusive and positive working environment within the team
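To make the tooling responsibility above concrete, here is a minimal sketch of the kind of toil-reducing automation script this role would own, assuming a hypothetical artifact-pruning task; the directory layout, retention window, and flags are illustrative, not part of the posting.

import argparse
import time
from pathlib import Path

def prune_old_artifacts(root: Path, max_age_days: int, dry_run: bool) -> int:
    """Delete files under root older than max_age_days; return how many were removed."""
    cutoff = time.time() - max_age_days * 86400
    removed = 0
    for path in root.rglob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            if not dry_run:
                path.unlink()
            removed += 1
    return removed

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Prune stale build artifacts")
    parser.add_argument("root", type=Path, help="artifact directory to scan (hypothetical)")
    parser.add_argument("--max-age-days", type=int, default=30)
    parser.add_argument("--dry-run", action="store_true", help="report without deleting")
    args = parser.parse_args()
    count = prune_old_artifacts(args.root, args.max_age_days, args.dry_run)
    print(f"{'Would remove' if args.dry_run else 'Removed'} {count} file(s)")

Running with --dry-run first is the safe pattern here: it reports what would be deleted before any destructive pass.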
Requirements:
8-12 years' experience
Strong understanding of Linux systems, OS fundamentals & networking
Strong scripting experience in Python and Bash
Experience with modern DevOps tools and practices, including Git, Jenkins, CI/CD, Autosys, observability tooling (Splunk, Prometheus, Loki, Grafana, etc.), and infrastructure as code (IaC); see the instrumentation sketch after this list
Understanding of modern distributed systems architecture (Kubernetes, Docker, etc.)
Experience in procuring and maintaining both cloud and on-prem infrastructure for large platforms
Experience with any of the public cloud platforms (AWS, Azure, GCP)
Good analytical capability, logical problem-solving skills, and the ability to work in a fast-paced environment
Excellent communication, collaboration, and interpersonal skills
Willingness to learn new systems and technologies
Bachelor's degree in Computer Science or a related field
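As a hedged illustration of the observability tooling named in the requirements, here is a minimal sketch of Prometheus-style instrumentation in Python using the prometheus_client library; the metric names and the handle_request workload are hypothetical stand-ins, not part of the platform.

import random
import time

from prometheus_client import Counter, Histogram, start_http_server

# Hypothetical metrics; the names are illustrative only.
REQUESTS = Counter("app_requests_total", "Total requests handled")
LATENCY = Histogram("app_request_latency_seconds", "Request latency in seconds")

@LATENCY.time()  # record how long each call takes
def handle_request() -> None:
    time.sleep(random.uniform(0.01, 0.1))  # stand-in for real work
    REQUESTS.inc()

if __name__ == "__main__":
    start_http_server(8000)  # expose /metrics for Prometheus to scrape
    while True:
        handle_request()

Prometheus would scrape the /metrics endpoint on port 8000, and the resulting series could feed the Grafana dashboards mentioned above.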
Nice to have:
Experience in data domains and ETL workflows
Familiarity with data stacks such as Kafka, Databricks, and Snowflake (a minimal Kafka consumer sketch follows below)
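As a hedged illustration of the Kafka familiarity above, here is a minimal consumer-driven ETL sketch using the kafka-python package; the topic name, broker address, and load_row step are hypothetical.

import json

from kafka import KafkaConsumer  # pip install kafka-python

def load_row(row: dict) -> None:
    # Stand-in for a real ETL load step (e.g., writing to a warehouse).
    print(row)

if __name__ == "__main__":
    consumer = KafkaConsumer(
        "events",  # hypothetical topic name
        bootstrap_servers="localhost:9092",  # hypothetical broker address
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
        auto_offset_reset="earliest",
    )
    for message in consumer:
        load_row(message.value)  # transform and load each record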