
Senior DevOps Lead Data Engineering

Citi

Location:
India, Pune

Contract Type:
Not provided

Salary:
Not provided
Job Description:

We are seeking an experienced, highly skilled Senior DevOps Engineering Lead with 7-10 years of progressive experience to drive our data platform infrastructure automation, CI/CD excellence, and operational efficiency. This role is pivotal in building and scaling resilient, performant systems, with a strong focus on supporting a microservices architecture. The ideal candidate combines deep technical expertise across a wide range of DevOps tools and practices with strong leadership and problem-solving abilities.

Job Responsibility:

  • CI/CD Pipeline Ownership: Design, implement, and maintain robust, scalable, and secure CI/CD pipeline architecture for microservices applications, ensuring continuous integration, delivery, and deployment
  • Infrastructure Planning & Management: Lead the design, provisioning, optimization, and management of scalable data infrastructure (compute, storage, networking) across cloud, ECS, and/or on-premise environments, specifically supporting a data mesh paradigm
  • Elastic Stack Expertise: Manage and optimize Elastic Stack (Elasticsearch, Kibana, Logstash, Beats) for centralized logging, monitoring, and analytics
  • Automation & Scripting: Design, develop, and maintain automation scripts and tools using shell scripts, Python, Java, or other relevant languages to streamline operational tasks and improve efficiency
  • Infrastructure Procurement & Lifecycle: Oversee the end-to-end Solution (SLTN) process for infrastructure procurement, ensuring timely and compliant acquisition of resources
  • Capacity Estimation & Planning: Conduct thorough capacity planning and performance analysis for microservices and underlying infrastructure to ensure scalability and reliability
  • Access Management & Security: Design and implement secure machine-to-machine communication strategies and manage infrastructure access, adhering to security best practices
  • Microservices Operations: Provide operational expertise and support for highly distributed microservices architectures, including troubleshooting, performance tuning, and incident response
  • Governance & Observability: Implement and enforce data governance policies through automation, and establish comprehensive observability (monitoring, logging, alerting) for data pipelines and infrastructure
  • Mentorship & Best Practices: Mentor junior DevOps engineers, promote DevOps best practices (e.g., IaC, GitOps, observability), and foster a culture of continuous improvement
  • Cross-Functional Collaboration: Work closely with partner development, QA, and infra security teams to ensure seamless integration and deployment processes
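The capacity estimation duty above is typical of the automation scripting this role describes. As a minimal, hypothetical sketch (all function names, sizing formulas, and numbers below are illustrative assumptions, not Citi's actual methodology), a back-of-the-envelope sizing helper for a log-ingestion cluster might look like:

```python
# Hypothetical capacity estimate for a log-ingestion cluster (illustrative only).

def estimate_storage_gb(events_per_sec: float, avg_event_kb: float,
                        retention_days: int, replicas: int = 1,
                        overhead: float = 1.3) -> float:
    """Total storage needed, including replica copies and index overhead."""
    daily_gb = events_per_sec * avg_event_kb * 86_400 / 1_048_576  # KB/day -> GB/day
    return daily_gb * retention_days * (1 + replicas) * overhead

def estimate_data_nodes(total_gb: float, disk_per_node_gb: float,
                        target_utilization: float = 0.75) -> int:
    """Smallest node count that keeps each disk under the utilization target."""
    usable = disk_per_node_gb * target_utilization
    return max(1, -(-int(total_gb) // int(usable)))  # ceiling division

storage = estimate_storage_gb(events_per_sec=5_000, avg_event_kb=1.5,
                              retention_days=30, replicas=1)
nodes = estimate_data_nodes(storage, disk_per_node_gb=2_000)
print(f"{storage:,.0f} GB total -> {nodes} data nodes")
```

In practice such estimates would be validated against observed ingest rates and the cluster's own disk-watermark settings rather than computed once.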

Requirements:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field
  • 7-10 years of hands-on experience in DevOps, Site Reliability Engineering (SRE), or a similar role
  • Proven leadership expertise in designing and implementing DevOps solutions
  • In-depth experience with container orchestration platforms such as AWS ECS, OpenShift, or Kubernetes
  • Strong practical experience with Elastic Stack (Elasticsearch, Kibana, Logstash, Beats) for transaction logging and monitoring
  • Proficiency in scripting languages (shell scripting and Python are a must) and strong Java development skills, particularly for tooling and automation
  • Demonstrated knowledge of microservices architecture principles and operational challenges
  • Familiarity with machine-to-machine authentication and authorization mechanisms
  • Must have knowledge of automation principles and practices
  • Experience with job scheduling tools like Autosys
  • Familiarity with Helix Blueprint or similar automation frameworks
  • Excellent problem-solving, communication, and collaboration skills
  • Modern Engineering Practices: Familiarity with modern engineering practices such as Trunk-Based Development, Test-Driven Development (TDD), Behavior-Driven Development (BDD), Contract Testing, and Agile methodologies

Nice to have:

  • AWS Certifications (e.g., Solutions Architect, DevOps Engineer, SysOps Administrator)
  • Experience with Kafka and Tibco messaging infrastructure, including best practices for messaging design and operations

Additional Information:

Job Posted:
February 17, 2026

Employment Type:
Fulltime
Work Type:
Hybrid work

Similar Jobs for Senior DevOps Lead Data Engineering

Senior SSE Data Engineer

Designs, develops, troubleshoots and debugs software programs for software enhan...
Location:
Israel, Tel Aviv
Salary:
Not provided
Hewlett Packard Enterprise
Expiration Date:
Until further notice
Requirements:
  • Bachelor's or Master's degree in Computer Science, Information Systems, or equivalent
  • Typically 6-10 years experience
  • Extensive experience with multiple software systems design tools and languages
  • Excellent analytical and problem solving skills
  • Experience in overall architecture of software systems for products and solutions
  • Designing and integrating software systems running on multiple platform types into overall architecture
  • Evaluating forms and processes for software systems testing and methodology, including writing and execution of test plans, debugging, and testing scripts and tools
  • Excellent written and verbal communication skills, with mastery of English and the local language
  • Ability to effectively communicate product architectures, design proposals and negotiate options at senior management levels
Job Responsibility:
  • Leads multiple project teams of other software systems engineers and internal and outsourced development partners responsible for all stages of design and development for complex products and platforms, including solution design, analysis, coding, testing, and integration
  • Manages and expands relationships with internal and outsourced development partners on software systems design and development
  • Reviews and evaluates designs and project activities for compliance with systems design and development guidelines and standards, and provides tangible feedback to improve product quality and mitigate failure risk
  • Provides domain-specific expertise and overall software systems leadership and perspective to cross-organization projects, programs, and activities
  • Drives innovation and integration of new technologies into projects and activities in the software systems design organization
  • Provides guidance and mentoring to less- experienced staff members
What we offer:
  • Health & Wellbeing
  • Personal & Professional Development
  • Unconditional Inclusion
  • Fulltime

Senior Data Engineer

At Blue Margin, we are on a mission to build the go-to data platform for PE-back...
Location:
United States, Fort Collins
Salary:
110000.00 - 140000.00 USD / Year
Blue Margin
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or related field
  • 5+ years of professional experience in data engineering, with emphasis on Python & PySpark/Apache Spark
  • Proven ability to manage large datasets and optimize for speed, scalability, and reliability
  • Strong SQL skills and understanding of relational and distributed data systems
  • Experience with Azure Data Factory, Synapse Pipelines, Fivetran, Delta Lake, Microsoft Fabric, or Snowflake
  • Knowledge of data modeling, orchestration, and Delta/Parquet file management best practices
  • Familiarity with CI/CD, version control, and DevOps practices for data pipelines
  • Experience leveraging AI-assisted tools to accelerate engineering workflows
  • Strong communication skills, with the ability to convey complex technical details to both engineers and business stakeholders
Job Responsibility:
  • Architect, design, and optimize large-scale data pipelines using tools like PySpark, SparkSQL, Delta Lake, and cloud-native tools
  • Drive efficiency in incremental/delta data loading, partitioning, and performance tuning
  • Lead implementations across Azure Synapse, Microsoft Fabric, and/or Snowflake environments
  • Collaborate with stakeholders and analysts to translate business needs into scalable data solutions
  • Evaluate and incorporate AI/automation to improve development speed, testing, and data quality
  • Oversee and mentor junior data engineers, establishing coding standards and best practices
  • Ensure high standards for data quality, security, and governance
  • Participate in solution design for client engagements, balancing technical depth with practical outcomes
What we offer:
  • Competitive pay
  • strong benefits
  • flexible hybrid work setup
  • Fulltime

Senior Azure Data Engineer

Seeking a Lead AI DevOps Engineer to oversee design and delivery of advanced AI/...
Location:
Poland
Salary:
Not provided
Lingaro
Expiration Date:
Until further notice
Requirements:
  • At least 6 years of professional experience in the Data & Analytics area
  • 1+ years of experience in (or acting in) a Senior Consultant or above role, with a strong focus on data solutions built in Azure and Databricks/Synapse (MS Fabric is nice to have)
  • Proven experience with Azure cloud-based infrastructure, Databricks, and at least one SQL implementation (e.g., Oracle, T-SQL, MySQL)
  • Proficiency in programming languages such as SQL, Python, PySpark is essential (R or Scala nice to have)
  • Very good level of communication including ability to convey information clearly and specifically to co-workers and business stakeholders
  • Working experience with agile methodologies and supporting tools (JIRA, Azure DevOps)
  • Experience in leading and managing a team of data engineers, providing guidance, mentorship, and technical support
  • Knowledge of data management principles and best practices, including data governance, data quality, and data integration
  • Good project management skills, with the ability to prioritize tasks, manage timelines, and deliver high-quality results within designated deadlines
  • Excellent problem-solving and analytical skills, with the ability to identify and resolve complex data engineering issues
Job Responsibility:
  • Act as a senior member of the Data Science & AI Competency Center, AI Engineering team, guiding delivery and coordinating workstreams
  • Develop and execute a cloud data strategy aligned with organizational goals
  • Lead data integration efforts, including ETL processes, to ensure seamless data flow
  • Implement security measures and compliance standards in cloud environments
  • Continuously monitor and optimize data solutions for cost-efficiency
  • Establish and enforce data governance and quality standards
  • Leverage Azure services, as well as tools like dbt and Databricks, for efficient data pipelines and analytics solutions
  • Work with cross-functional teams to understand requirements and provide data solutions
  • Maintain comprehensive documentation for data architecture and solutions
  • Mentor junior team members in cloud data architecture best practices
What we offer:
  • Stable employment
  • “Office as an option” model
  • Workation
  • Great Place to Work® certified employer
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs
  • Upskilling support

Lead Data Engineer

As a Lead Data Engineer or architect at Made Tech, you'll play a pivotal role in...
Location:
United Kingdom, Any UK Office Hub (Bristol / London / Manchester / Swansea)
Salary:
80000.00 - 96000.00 GBP / Year
Made Tech
Expiration Date:
Until further notice
Requirements:
  • Proficiency in Git (incl. GitHub Actions) and able to explain the benefits of different branching strategies
  • Strong experience in IaC and able to guide how one could deploy infrastructure into different environments
  • Knowledge of handling and transforming various data types (JSON, CSV, etc) with Apache Spark, Databricks or Hadoop
  • Good understanding of possible architectures involved in modern data system design (Data Warehouse, Data Lakes, Data Meshes)
  • Ability to create data pipelines on a cloud environment and integrate error handling within these pipelines
  • You understand how to create reusable libraries to encourage uniformity of approach across multiple data pipelines
  • Able to document and present end-to-end diagrams to explain a data processing system on a cloud environment
  • Some knowledge of how you would present diagrams (C4, UML, etc.)
  • Enthusiasm for learning and self-development
  • You have experience of working on agile, delivery-led projects and can apply agile practices such as Scrum, XP, Kanban
Job Responsibility:
  • Define, shape and perfect data strategies in central and local government
  • Help public sector teams understand the value of their data, and make the most of it
  • Establish yourself as a trusted advisor in data driven approaches using public cloud services like AWS, Azure and GCP
  • Contribute to our recruitment efforts and take on line management responsibilities
  • Help implement efficient data pipelines & storage
What we offer:
  • 30 days of paid annual leave
  • Flexible parental leave options
  • Part time remote working for all our staff
  • Paid counselling as well as financial and legal advice
  • 7% employer matched pension
  • Flexible benefit platform which includes a Smart Tech scheme, Cycle to work scheme, and an individual benefits allowance which you can invest in a Health care cash plan or Pension plan
  • Optional social and wellbeing calendar of events
  • Fulltime

Senior Data Engineer

The mission of the business intelligence team is to create a data-driven culture...
Location:
India, Hyderabad
Salary:
Not provided
Randstad
Expiration Date:
February 28, 2026
Requirements:
  • Master’s degree in Computer Science / Information Technology or related field, highly preferred
  • Extensive knowledge of BI concepts and related technologies that help drive sustainable technical solutions
  • Extensive Experience with data lakes, ETL and data warehouses
  • Advanced experience of building data pipelines
  • Passion for building quality BI software
  • Project Management and/or process improvement experience highly preferred
  • Polyglot coder with expert-level skills in multiple languages, including Python, R, Java, and SQL, plus relational databases, ERP, and data visualization tools such as DOMO or Tableau
  • Advanced, proven experience with Google Cloud Platform (GCP) is preferred, but experience with Microsoft Azure / Amazon will be considered
  • Any exposure to Kafka, Spark, and Scala will be an added advantage
  • Should demonstrate a strong understanding of OOP concepts and methodologies
Job Responsibility:
  • Architect and build complex data pipelines using advanced cloud data technologies
  • Lead efforts to optimize data pipelines for performance, scalability, and cost-efficiency
  • Define industry best practices for building data pipelines
  • Ensure data security, compliance, and governance standards are met
  • Partner with leadership team to define and implement agile and DevOps methodologies
  • Serve as subject matter expert and define data architecture and infrastructure requirements
  • Partner with business analysts to plan project execution including appropriate product and technical specifications, direction, resources, and establishing realistic completion times
  • Understand data technology trends and identify opportunities to implement new technologies and provide forward-thinking recommendations
  • Proactively partner with internal stakeholders to bridge gaps, provide historical references, and design the appropriate processes
  • Design and implement a robust data observability process
  • Fulltime

Senior DevOps Cloud Engineer

A Senior DevOps Cloud Engineer in the HPE Networking Business designs, develops ...
Location:
United States, Roseville
Salary:
133500.00 - 307000.00 USD / Year
Hewlett Packard Enterprise
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in computer science, engineering, information systems, or closely related quantitative discipline
  • typically 10+ years’ experience
  • proven track record of designing, implementing, and supporting multi-tier architectures in an enterprise scale organization
  • experience leading teams of engineers, providing technical direction and oversight
  • strong programming skills in Python
  • experience with Tcl, C/C++, and JavaScript a plus
  • good understanding of distributed systems, event-driven programming paradigms, and designing for scale and performance
  • experience with cloud-native applications, developer tools, managed services, and next-generation databases
  • knowledge of DevOps practices like CI/CD, infrastructure as code, containerization, and orchestration using Kubernetes, Redis, Kafka
  • good written and verbal communication skills and agile in a changing environment.
Job Responsibility:
  • Analyzes new feature and enhancement requests and determines the required coding, testing, and integration activities
  • designs and develops moderate to complex software modules per feature specifications adhering to quality and security policies
  • identifies, debugs, and creates solutions for issues with code and integration into application architecture
  • develops and executes comprehensive test plans for features adhering to performance, scale, usability, and security requirements
  • deploys cloud-based systems and application code using continuous integration/deployment (CI/CD) pipelines to automate cloud applications' management, scaling, and deployment
  • contributes towards innovation and integration of new technologies into projects
  • analyzes science, engineering, business, and other data processing problems to develop and implement solutions to complex application problems, system administration issues, or network concerns.
What we offer:
  • Comprehensive suite of benefits that supports physical, financial and emotional wellbeing
  • programs catered to helping you reach career goals
  • unconditional inclusion in the workplace.
  • Fulltime

Senior AI Data Engineer

We are looking for a Senior AI Data Engineer to join an exciting project for our...
Location:
Poland, Warsaw
Salary:
Not provided
Inetum
Expiration Date:
Until further notice
Requirements:
  • Degree in Computer Science, Data Science, Artificial Intelligence, or a related field
  • Several years of experience in AI and Machine Learning development, preferably in Customer Care solutions
  • Strong proficiency in Python and NLP frameworks
  • Hands-on experience with Azure AI services (e.g., Azure Machine Learning, Cognitive Services, Bot Services)
  • Solid understanding of cloud architectures and microservices on Azure
  • Experience with CI/CD pipelines and MLOps
  • Excellent leadership and communication skills
  • Analytical mindset with strong problem-solving abilities
  • Polish and English at a minimum B2 level.
Job Responsibility:
  • Lead the development and implementation of AI-powered features for a Customer Care platform
  • Design and deploy Machine Learning and NLP models to automate customer inquiries
  • Collaborate with DevOps and cloud architects to ensure a high-performance, scalable, and secure Azure-based architecture
  • Optimize AI models to enhance customer experience
  • Integrate Conversational AI, chatbots, and language models into the platform
  • Evaluate emerging technologies and best practices in Artificial Intelligence
  • Mentor and guide a team of AI/ML developers.
What we offer:
  • Flexible working hours
  • Hybrid work model, allowing employees to divide their time between home and modern offices in key Polish cities
  • A cafeteria system that allows employees to personalize benefits by choosing from a variety of options
  • Generous referral bonuses, offering up to PLN6,000 for referring specialists
  • Additional revenue sharing opportunities for initiating partnerships with new clients
  • Ongoing guidance from a dedicated Team Manager for each employee
  • Tailored technical mentoring from an assigned technical leader, depending on individual expertise and project needs
  • Dedicated team-building budget for online and on-site team events
  • Opportunities to participate in charitable initiatives and local sports programs
  • A supportive and inclusive work culture with an emphasis on diversity and mutual respect.
  • Fulltime

Senior Data Solutions Architect with AWS

Provectus, a leading AI consultancy and solutions provider specializing in Data ...
Location:
Serbia, Belgrade, Novi Sad
Salary:
Not provided
Provectus
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
  • Minimum of 8 years of experience in data solution architecture, with at least 3 years focused on AWS
  • Proven experience in designing and implementing large-scale data engineering solutions on AWS
  • Experience with Databricks is a plus
  • Deep expertise in AWS platform services, including S3, EC2, Lambda, EMR, Glue, Redshift, AWS MSK, and EKS
  • Proficient in programming languages like Python, SQL, and Scala
  • Experience with data warehousing, ETL processes, and real-time data streaming
  • Familiarity with open-source technologies and tools commonly used in data engineering
  • AWS Certified Solutions Architect – Professional or similar AWS certifications are a plus
  • Excellent communication and presentation skills, with the ability to articulate complex technical concepts to non-technical stakeholders
Job Responsibility:
  • Lead complex, high-impact customer engagements focused on AWS Data Platform solutions
  • Define and drive technical strategies that align AWS capabilities with customer business objectives, incorporating Databricks (as a plus) solutions where appropriate
  • Architect and design scalable data platforms using AWS, ensuring optimal performance, reliability, security, and cost efficiency
  • Evaluate and select appropriate technologies and tools to meet customer needs, integrating AWS services with other solutions such as Databricks, and Snowflake as necessary
  • Establish, fulfill, and maintain comprehensive architectural documentation to ensure alignment with technical standards and best practices across the organization
  • Collaborate with the sales team during the pre-sales process by providing technical expertise to position AWS-based data solutions effectively
  • Participate in customer meetings to assess technical needs, scope potential solutions, and identify opportunities for growth
  • Create technical proposals, solution architectures, and presentations to support sales efforts and ensure alignment with customer expectations
  • Assist in responding to RFPs/RFIs by providing accurate technical input and aligning solutions to client requirements
  • Demonstrate AWS capabilities through POCs (Proof of Concepts) and technical demonstrations to help customers evaluate the proposed solutions