
DataOps Engineer (Database)


OptiSol Business Solutions


Location:
India , Chennai City Corporation

Contract Type:
Employment contract

Salary:

Not provided

Job Description:

DataOps Engineer
Experience: 3–5 Years
Location: Chennai | Madurai | Coimbatore (Hybrid)
Employment Type: FTE (Full-time)

We are looking for a skilled DataOps Engineer with strong Database Administration expertise to manage, optimize, and scale enterprise data systems. This role focuses on performance tuning, automation, and ensuring high availability of databases across on-premise and cloud environments.

Job Responsibility:

  • Manage and administer Oracle, SQL Server, or PostgreSQL databases
  • Optimize performance through query tuning, indexing, and execution plans
  • Troubleshoot issues like deadlocks, blocking, and slow queries
  • Implement backup, restore, and disaster recovery strategies
  • Ensure high availability using Always On Availability Groups, failover clusters, and replication
  • Monitor and maintain database health and performance
  • Automate DBA tasks using T-SQL, PowerShell, and DataOps practices
  • Manage and monitor cloud databases (Azure SQL, AWS RDS)
  • Collaborate with engineering teams for scalable data solutions
  • Support CI/CD and database deployment processes
  • Work with monitoring tools for performance and anomaly detection
  • Ensure reliability, scalability, and security of data systems
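A couple of the duties above (index maintenance, automating DBA tasks) are concrete enough to sketch. The thresholds below follow the widely documented SQL Server guideline of reorganizing an index at 5–30% fragmentation and rebuilding above 30%; the table and index names are invented for the example and are not part of the posting:

```python
# Sketch of an automated index-maintenance decision, assuming the commonly
# documented SQL Server thresholds: reorganize at 5-30% fragmentation,
# rebuild above 30%, do nothing below 5%. Table/index names are hypothetical.

def index_action(fragmentation_pct: float) -> str:
    """Return the maintenance action for a given fragmentation percentage."""
    if fragmentation_pct > 30.0:
        return "REBUILD"
    if fragmentation_pct >= 5.0:
        return "REORGANIZE"
    return "NONE"

def maintenance_statements(indexes: dict[str, float]) -> list[str]:
    """Build ALTER INDEX statements for every index that needs work."""
    statements = []
    for name, frag in sorted(indexes.items()):
        action = index_action(frag)
        if action != "NONE":
            statements.append(f"ALTER INDEX {name} ON dbo.Orders {action};")
    return statements

if __name__ == "__main__":
    sample = {"IX_Orders_Date": 42.5, "IX_Orders_Customer": 12.0, "PK_Orders": 1.3}
    for stmt in maintenance_statements(sample):
        print(stmt)
```

In practice the fragmentation figures would come from `sys.dm_db_index_physical_stats`, and the generated statements would run from a SQL Server Agent job or a PowerShell wrapper; the thresholds are starting points, not hard rules.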

Requirements:

  • Strong experience in Oracle, SQL Server, or PostgreSQL administration
  • Expertise in performance tuning, query optimization, and indexing
  • Proficiency in T-SQL scripting and database automation
  • Hands-on experience troubleshooting database issues
  • Knowledge of cloud platforms (Azure SQL, AWS RDS)
  • Strong analytical, problem-solving, and communication skills

Nice to have:

  • Exposure to AI-driven monitoring tools (AIOps, Dynatrace, SolarWinds)
  • Experience with CI/CD pipelines and DataOps practices
  • Knowledge of Docker, Kubernetes, or containerized environments
  • Familiarity with NoSQL or Big Data platforms
  • Understanding of predictive analytics and capacity planning

What we offer:
  • Opportunity to work on modern DataOps and database systems
  • Exposure to cloud, AI, and advanced data technologies
  • Collaborative and growth-focused work environment
  • Learning, certifications, and career development opportunities

Additional Information:

Job Posted:
May 05, 2026

Employment Type:
Full-time
Work Type:
Hybrid work

Similar Jobs for DataOps Engineer (Database)

DataOps Engineer

Looking for DataOps Engineer to lead database performance management for SaaS he...
Salary:
Not provided
Hivex
Expiration Date
Until further notice
Requirements:
  • 3.5+ years of professional database management, development, and/or DataOps experience in a SaaS product environment
  • Experience in database performance engineering for large-scale systems through periods of high growth
  • Experience leading data quality management activities
  • Ability to collaborate with Java and Python developers on best practices for database performance and data quality
  • Deep knowledge of database internals and best practices for transactional and analytical processing
  • Ability to problem-solve collaboratively and independently
Job Responsibility:
  • Define and build an automated database performance engineering process and framework
  • Collect and manage deterministic, well-known, and representative test sets
  • Optimize database performance using configuration, best practices, and effective models
  • Engage with developers to collaborate on requirements and performance engineering
  • Create and manage ETL processes
  • Detect and respond to operational and customer problems

SQL Ops Engineer

We are seeking a SQL Ops Engineer to ensure the performance, reliability, securi...
Location:
India , Remote
Salary:
Not provided
Augusta Hitech Soft Solutions
Expiration Date
Until further notice
Requirements:
  • 8 years of hands-on experience as a SQL Database Administrator, SQL Ops Engineer, or equivalent
  • Deep expertise in Microsoft SQL Server administration, tuning, and high availability features
  • Strong command of T-SQL, indexing, execution plans, and performance troubleshooting
  • Experience with backup/recovery strategies, security hardening, and compliance in regulated industries
  • Proficiency in automation scripting (PowerShell, T-SQL) and monitoring tools (SQL Server Management Studio, Extended Events, Azure Monitor, or similar)
  • Solid understanding of healthcare data environments and regulatory requirements (HIPAA, PHI)
  • Strong analytical, problem-solving, and documentation skills
Job Responsibility:
  • Install, configure, upgrade, patch, and maintain SQL Server (and other RDBMS) instances in on-prem, cloud, and hybrid environments
  • Perform performance tuning — query optimization, indexing strategies, execution plan analysis, and resource governance
  • Design and implement high availability and disaster recovery solutions (Always On Availability Groups, clustering, replication, backups)
  • Monitor database health, capacity, and performance using native and third-party tools, and proactively resolve bottlenecks
  • Manage security, encryption, auditing, access controls, and compliance requirements (HIPAA, data privacy, PHI protection)
  • Automate routine operations (maintenance plans, monitoring scripts, alerting) using PowerShell, T-SQL, or Python
  • Troubleshoot and resolve database incidents; conduct root-cause analysis and implement permanent fixes
  • Support data migration, integration, and replication projects across systems
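The automation bullet above names PowerShell, T-SQL, or Python for monitoring scripts and alerting. A minimal Python sketch of one such check, backup freshness against an assumed 24-hour recovery-point policy with invented database names, might look like:

```python
# Minimal sketch of a backup-freshness (RPO) check, the kind of monitoring
# script the posting describes. The 24-hour policy and database names are
# illustrative assumptions; real data would come from msdb.dbo.backupset.
from datetime import datetime, timedelta

RPO = timedelta(hours=24)  # assumed policy: every database backed up daily

def stale_backups(last_backups: dict[str, datetime],
                  now: datetime, rpo: timedelta = RPO) -> list[str]:
    """Return databases whose most recent backup violates the RPO."""
    return sorted(db for db, taken in last_backups.items() if now - taken > rpo)

if __name__ == "__main__":
    now = datetime(2026, 5, 5, 12, 0)
    backups = {
        "Billing": datetime(2026, 5, 5, 1, 0),   # 11 hours old: fine
        "Claims": datetime(2026, 5, 3, 23, 0),   # 37 hours old: alert
    }
    for db in stale_backups(backups, now):
        print(f"ALERT: {db} has no backup within the RPO window")
```

A production version would query the backup catalog directly and push alerts into the monitoring stack rather than printing.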

Senior DataOps Engineer

Drive optimisations, upgrades and maintenance of a Kubernetes based data and mod...
Salary:
Not provided
SNI sp. z o.o.
Expiration Date
Until further notice
Requirements:
  • 5+ years of experience as a DataOps Engineer or in a similar role covering most of the required skills
  • Expertise in Cloud architecture and key technologies (Kubernetes, Airflow, Managed Airflow)
  • Expertise in modern development tools and practices (e.g. CI/CD, DevOps, Observability, Pair Programming, TDD)
  • Knowledge of infrastructure-as-code tools (CloudFormation)
  • Experience with databases (Redshift)
  • Programming proficiency in Python
  • Expertise in choosing and applying design patterns
  • Experience developing software with scale, security, and reliability in mind
  • Knowledge of software development principles, design patterns and best practices
  • Experience with Test-Driven Development and testing practices
Job Responsibility:
  • Drive optimisations, upgrades and maintenance of a Kubernetes based data and modelling platform
  • Support access management, field questions around Airflow, and deliver minor feature enhancements
  • Assist with migration of data pipelines

Graduate Data Engineer

As a Graduate Data Engineer, you will build and maintain scalable data pipelines...
Location:
United Kingdom , Marlow
Salary:
Not provided
SRG
Expiration Date
Until further notice
Requirements:
  • Degree in Computer Science, Engineering, Mathematics, or a similar field, or equivalent work experience
  • Up to 2 years of experience building data pipelines at work or through internships
  • Can write clear and reliable Python/PySpark code
  • Familiar with popular analytics tools (like pandas, numpy, matplotlib), big data frameworks (like Spark), and cloud services (like Palantir, AWS, Azure, or Google Cloud)
  • Deep understanding of data models, relational and non-relational databases, and how they are used to organize, store, and retrieve data efficiently for analytics and machine learning
  • Knowledge about software engineering methods, including DevOps, DataOps, or MLOps is a plus
  • Master's degree in engineering (such as AI/ML, Data Systems, Computer Science, Mathematics, Biotechnology, Physics), or minimum 2 years of relevant technology experience
  • Experience with Generative AI (GenAI) and agentic systems will be considered a strong plus
  • Have a proactive and adaptable mindset: willing to take initiative, learn new skills, and contribute to different aspects of a project as needed to drive solutions from start to finish, even beyond the formal job description
  • Show a strong ability to thrive in situations of ambiguity, taking initiative to create clarity for yourself and the team, and proactively driving progress even when details are uncertain or evolving
Job Responsibility:
  • Build and maintain data pipelines, leveraging PySpark and/or Typescript within Foundry, to transform raw data into reliable, usable datasets
  • Assist in preparing and optimizing data pipelines to support machine learning and AI model development, ensuring datasets are clean, well-structured, and readily usable by Data Science teams
  • Support the integration and management of feature engineering processes and model outputs into Foundry's data ecosystem, helping enable scalable deployment and monitoring of AI/ML solutions
  • Engage in gathering and translating stakeholder requirements for key data models and reporting, with a focus on Palantir Foundry workflows and tools
  • Participate in developing and refining dashboards and reports in Foundry to visualize key metrics and insights
  • Collaborate with Product, Engineering, and GTM teams to align data architecture and solutions, learning to support scalable, self-serve analytics across the organization
  • Apply prompt engineering experience with large language models, including writing and evaluating complex multi-step prompts
  • Continuously develop your understanding of the company's data landscape, including Palantir Foundry's ontology-driven approach and best practices for data management

Azure DataOps Lead

The Azure DataOps Lead will be responsible for leading the operational delivery,...
Location:
India
Salary:
Not provided
Rackspace
Expiration Date
Until further notice
Requirements:
  • 8–12 years of total IT experience with at least 3–5 years in Azure DataOps or Data Engineering leadership
  • Hands-on expertise with key Azure Data Services, including: Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, Azure SQL Database / SQL Managed Instance, Azure Data Lake Storage Gen2 (ADLS)
  • Strong understanding of DataOps concepts
  • Experience in monitoring and alerting using Log Analytics, Application Insights, and Azure Monitor
  • Working knowledge of incident management, RCA documentation, and operational reporting
  • Strong analytical skills for troubleshooting performance issues and identifying optimization opportunities
Job Responsibility:
  • Lead and manage the Azure DataOps function, ensuring smooth daily operations, incident resolution, and performance stability across production data platforms
  • Oversee data pipeline orchestration and automation using Azure Data Factory (ADF), Synapse Analytics, Databricks, and Logic Apps
  • Implement CI/CD pipelines for data workflows using Azure DevOps or equivalent automation tools
  • Drive incident, problem, change, and request management processes aligned with ITIL best practices
  • Coordinate with L1/L2 support teams for escalations, RCA preparation, and client communication
  • Maintain governance for data quality, access control, and compliance using Azure Purview, Key Vault, and RBAC
  • Collaborate with Data Architects and Cloud Engineers to design scalable, resilient, and cost-efficient Azure data solutions
  • Ensure 24/7 operational readiness through proactive alert monitoring, performance tuning, and preventive maintenance
  • Contribute to automation initiatives using PowerShell, Python, or ARM templates to reduce manual efforts and improve system reliability
  • Partner with customer stakeholders to report on SLAs, KPIs, RCA summaries, and provide technical recommendations for improvement
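The SLA-reporting duty in the last bullet can be sketched as a small calculation. Everything concrete here (severity labels, resolution targets, sample incidents) is an assumption for illustration, not content from the posting:

```python
# Hypothetical sketch of SLA compliance reporting: given resolved incidents
# and assumed per-severity resolution targets, compute the fraction resolved
# within target. Severity names and target hours are invented for the example.
from datetime import timedelta

SLA_TARGETS = {  # assumed targets by severity
    "P1": timedelta(hours=4),
    "P2": timedelta(hours=8),
    "P3": timedelta(hours=24),
}

def sla_compliance(incidents: list[tuple[str, timedelta]]) -> float:
    """Return the fraction of incidents resolved within their severity target."""
    if not incidents:
        return 1.0
    met = sum(1 for sev, took in incidents if took <= SLA_TARGETS[sev])
    return met / len(incidents)

if __name__ == "__main__":
    monthly = [
        ("P1", timedelta(hours=3)),   # within target
        ("P2", timedelta(hours=12)),  # breached
        ("P3", timedelta(hours=20)),  # within target
    ]
    print(f"SLA compliance: {sla_compliance(monthly):.0%}")  # prints "SLA compliance: 67%"
```

A real report would pull incident timestamps from the ITSM tool and break the figure down by severity and by month.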

Azure DataOps Lead

The Azure DataOps Lead will be responsible for leading the operational delivery,...
Location:
India , Gurgaon
Salary:
Not provided
Rackspace
Expiration Date
Until further notice
Requirements:
  • 8–12 years of total IT experience with at least 3–5 years in Azure DataOps or Data Engineering leadership
  • Hands-on expertise with key Azure Data Services, including: Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, Azure SQL Database / SQL Managed Instance, Azure Data Lake Storage Gen2 (ADLS)
  • Strong understanding of DataOps concepts
  • Experience in monitoring and alerting using Log Analytics, Application Insights, and Azure Monitor
  • Working knowledge of incident management, RCA documentation, and operational reporting
  • Strong analytical skills for troubleshooting performance issues and identifying optimization opportunities
  • Excellent communication and stakeholder management skills across global teams
Job Responsibility:
  • Lead and manage the Azure DataOps function, ensuring smooth daily operations, incident resolution, and performance stability across production data platforms
  • Oversee data pipeline orchestration and automation using Azure Data Factory (ADF), Synapse Analytics, Databricks, and Logic Apps
  • Implement CI/CD pipelines for data workflows using Azure DevOps or equivalent automation tools
  • Drive incident, problem, change, and request management processes aligned with ITIL best practices
  • Coordinate with L1/L2 support teams for escalations, RCA preparation, and client communication
  • Maintain governance for data quality, access control, and compliance using Azure Purview, Key Vault, and RBAC
  • Collaborate with Data Architects and Cloud Engineers to design scalable, resilient, and cost-efficient Azure data solutions
  • Ensure 24/7 operational readiness through proactive alert monitoring, performance tuning, and preventive maintenance
  • Contribute to automation initiatives using PowerShell, Python, or ARM templates to reduce manual efforts and improve system reliability
  • Partner with customer stakeholders to report on SLAs, KPIs, RCA summaries, and provide technical recommendations for improvement

Data Engineer - AWS

We are seeking an AWS Data Engineer with 4–7 years of experience to design and b...
Location:
India , Mumbai
Salary:
Not provided
NEC Software Solutions
Expiration Date
Until further notice
Requirements:
  • Bachelor’s Degree in Computer Science, Engineering, or related field
  • 4–7 years of experience in application development and data engineering
  • 3+ years of experience with big data technologies
  • 3+ years of experience with cloud platforms (AWS preferred; Azure or GCP also acceptable)
  • Proficiency in Python, SQL, Scala, or Java (3+ years)
  • Experience with distributed computing tools such as Hadoop, Hive, EMR, Kafka, or Spark (3+ years)
  • Hands-on experience with real-time data and streaming applications (3+ years)
  • NoSQL database experience (MongoDB, Cassandra) – 3+ years
  • Data warehousing expertise (Redshift or equivalent) – 3+ years
Job Responsibility:
  • Develop, test, deploy, orchestrate, monitor, and troubleshoot cloud-based data pipelines and automation workflows in alignment with best practices and security standards
  • Collaborate with data scientists, architects, ETL developers, and business stakeholders to capture, format, and integrate data from internal systems, external sources, and data warehouses
  • Research and experiment with batch and streaming data technologies to evaluate their business impact and suitability for current use cases
  • Contribute to the definition and continuous improvement of data engineering processes and procedures
  • Ensure data integrity, accuracy, and security across corporate data assets
  • Maintain high data quality standards for Data Services, Analytics, and Master Data Management
  • Build automated, scalable, and test-driven data pipelines
  • Apply software development practices including Git-based version control, CI/CD, and release management to enhance AWS CI/CD pipelines
  • Partner with DevOps engineers and architects to improve DataOps tools and frameworks
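The "automated, scalable, and test-driven data pipelines" bullet can be illustrated with a tiny pure-Python transform. The field names and the skip-malformed-rows behaviour are invented for the sketch; a real pipeline would read from a source like Kafka or S3 and write to a warehouse such as Redshift:

```python
# Illustrative sketch of a small, test-driven pipeline step: a pure transform
# that normalizes raw event records. Field names are hypothetical.

def normalize(record: dict) -> dict:
    """Lower-case and trim the event name, coerce the amount to float."""
    return {
        "event": record["event"].strip().lower(),
        "amount": float(record["amount"]),
    }

def run_batch(records: list[dict]) -> list[dict]:
    """Apply the transform to a batch, skipping malformed rows."""
    out = []
    for rec in records:
        try:
            out.append(normalize(rec))
        except (KeyError, ValueError):
            continue  # in production: route to a dead-letter queue instead
    return out

if __name__ == "__main__":
    raw = [{"event": " Purchase ", "amount": "19.99"},
           {"event": "refund"}]  # malformed: missing amount
    print(run_batch(raw))
```

Keeping the transform a pure function is what makes the pipeline test-driven: it can be asserted against fixed inputs without any infrastructure.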

Multi-Cloud SQL/Oracle Database Administrator

We are looking for a Multi-Cloud Lead SQL/Oracle Database Administrator (DBA) to...
Location:
United States , Austin
Salary:
68000.00 - 78202.00 USD / Year
NTT DATA
Expiration Date
Until further notice
Requirements:
  • Must be a US citizen or Green card holder
  • 5+ years of experience as a SQL DBA in enterprise environments
  • 3+ years of hands-on experience managing SQL databases in AWS, Azure, GCP, and OCI
  • 2+ years of experience as an Oracle DBA
  • Microsoft SQL DBA or any SQL DBA certification
  • Bachelor's degree in Computer Science
  • Strong experience with AWS RDS/Aurora, Azure SQL Database, GCP Cloud SQL/AlloyDB, and OCI Autonomous Database
  • Working knowledge of DynamoDB, Cosmos DB, Firestore/Bigtable, Oracle NoSQL Database
  • Proficiency in writing and optimizing SQL queries, stored procedures, and triggers
  • Solid understanding of cloud security principles, IAM, and encryption mechanisms
Job Responsibility:
  • Provision, configure, and maintain SQL Server, PostgreSQL, MySQL, and Oracle DB instances in AWS RDS/Aurora, Azure SQL Database, GCP Cloud SQL/AlloyDB, and OCI Autonomous Database
  • Design and implement high-availability (HA) and disaster recovery (DR) solutions
  • Perform database backup, restore, patching, and upgrade operations across all cloud platforms
  • Monitor and tune database performance using cloud-native monitoring tools and traditional database performance tools
  • Manage and optimize cloud-native object databases: AWS DynamoDB, Azure Cosmos DB, Google Cloud Firestore / Bigtable, Oracle NoSQL Database Cloud Service
  • Implement best practices for partitioning, indexing, throughput provisioning, and capacity management
  • Establish lifecycle management policies for object data, backup strategies, and encryption
  • Automate database provisioning and configuration using Terraform, AWS CloudFormation, Azure ARM/Bicep templates, GCP Deployment Manager, and OCI Resource Manager
  • Implement CI/CD pipelines for database schema changes using GitHub Actions, Azure DevOps, or Cloud Build
  • Enforce database security policies, encryption (at rest and in transit), and access controls via IAM roles/policies across all clouds
What we offer:
  • Medical, dental, and vision insurance
  • Flexible spending or health savings account
  • Life and AD&D insurance
  • Short and long term disability coverage
  • Paid time off
  • Employee assistance
  • Participation in a 401k program with company match
  • Additional voluntary or legally-required benefits
  • Incentive compensation based on individual and/or company performance