Dataops Analyst

Collance

Location:
Noida Sector 62, India

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We are seeking a detail-oriented and analytical individual with strong Excel and SQL skills to join our team as a Dataops Analyst. The ideal candidate will be responsible for managing, analyzing, and interpreting large datasets to support business decisions.

Job Responsibility:

  • Collect, clean, and analyze data from various sources to generate insights
  • Create, maintain, and optimize reports and dashboards using Excel and SQL
  • Develop and execute SQL queries to extract and manipulate data from databases
  • Perform data validation and ensure the accuracy of insights and reports
  • Work collaboratively with cross-functional teams to understand data requirements
  • Identify trends, patterns, and actionable insights to support decision-making
  • Automate recurring data processes and reports to improve efficiency
  • Troubleshoot and resolve issues in datasets and reporting systems
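The responsibilities above amount to a repeatable extract-validate-report loop. As a rough, hypothetical sketch of that loop (the `orders` table, its columns, and the report path are invented for illustration; a real source database would replace the in-memory one), in Python using only the standard library:

```python
import csv
import sqlite3


def build_sample_db() -> sqlite3.Connection:
    """Create an in-memory database standing in for a production source."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?, ?)",
        [(1, "North", 120.0), (2, "South", 80.0), (3, "North", 200.0)],
    )
    return conn


def extract_summary(conn: sqlite3.Connection) -> list[tuple[str, float]]:
    """SQL step: aggregate order amounts by region."""
    return conn.execute(
        "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
    ).fetchall()


def validate(rows: list[tuple[str, float]], conn: sqlite3.Connection) -> None:
    """Validation step: regional totals must reconcile with the grand total."""
    grand_total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
    assert abs(sum(total for _, total in rows) - grand_total) < 1e-9, "totals do not reconcile"
    assert all(region is not None for region, _ in rows), "null region found"


def write_report(rows: list[tuple[str, float]], path: str) -> None:
    """Reporting step: emit a CSV that Excel or a dashboard can consume."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["region", "total_amount"])
        writer.writerows(rows)


conn = build_sample_db()
summary = extract_summary(conn)
validate(summary, conn)
write_report(summary, "regional_summary.csv")
print(summary)  # [('North', 320.0), ('South', 80.0)]
```

Automating the recurring version of this report would typically mean wrapping the same three steps in a scheduled job (cron, Task Scheduler, or an orchestrator), with the validation step failing loudly before a bad report ships.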

Requirements:

  • Strong Excel Skills: Advanced knowledge of Excel functions, formulas, pivot tables, and data visualization
  • SQL Proficiency: Strong ability to write, debug, and optimize complex SQL queries
  • Experience with relational databases (e.g., MySQL, PostgreSQL, SQL Server)
  • Proficiency in data visualization tools
  • Familiarity with ETL (Extract, Transform, Load) processes
  • Bachelor’s degree in Computer Science, Data Analytics, Statistics, or a related field
  • 2+ years of experience in a data-driven role
  • Excellent analytical and problem-solving skills
  • Strong attention to detail and organizational skills
  • Effective communication and teamwork abilities

Nice to have:

Experience with macros and VBA

Additional Information:

Job Posted:
February 20, 2026

Similar Jobs for Dataops Analyst

Senior DataOps/Cloud Data Engineer

Our client is looking for a Senior DataOps/Cloud Data Engineer for a 6 month con...
Location: Guelph, Canada
Salary: Not provided
Company: Randstad
Expiration Date: April 27, 2026

Requirements:
  • Minimum 7 years of demonstrated hands-on experience building and maintaining end-to-end data pipelines in production
  • Minimum 7 years of strong, engineering-focused expertise in SQL and Python
  • Minimum 7 years of experience implementing data quality controls, monitoring, and troubleshooting with minimal escalation
  • Minimum 7 years of practical experience with modern data platforms (e.g., Microsoft Fabric or similar enterprise environments)
Job Responsibility:
  • Design, build, and maintain end-to-end data pipelines using Microsoft Fabric, including Lakehouse, Data Factory, Dataflows, and notebooks
  • Develop and optimize SQL-based transformations, data models, and curated datasets for enterprise reporting and analytics
  • Build and maintain Python-based data engineering logic for ingestion, transformation, validation, and automation
  • Implement and operate data quality controls, including validation rules, reconciliation checks, and exception handling
  • Monitor data pipelines, investigate failures or data quality issues, and implement fixes with minimal escalation
  • Integrate data from multiple enterprise systems, including CRM, ticketing systems, telephony platforms, and operational databases
  • Maintain technical documentation, data lineage, and operational runbooks for owned pipelines and datasets
  • Work closely with the Manager and Data Analysts to ensure data assets meet defined standards and reporting requirements
What we offer:
  • Earn a competitive rate within the industry
  • Potential for extension

Azure Data Engineer

We are seeking a visionary Azure Data Engineer to join an ambitious, cloud-first...
Location: Athens, Greece
Salary: Not provided
Company: Randstad
Expiration Date: Until further notice

Requirements:
  • Proven professional experience with Azure Databricks, Azure Data Factory, and Azure SQL
  • Deep understanding of software engineering principles, including Git, CI/CD, and automated testing
  • Strong knowledge of Data Vault or Kimball techniques to design robust, business-aligned models
  • Proficiency in SQL, Python, and YAML is essential
  • A 'you build it, you own it' attitude, characterized by a proactive and entrepreneurial approach to problem-solving
Job Responsibility:
  • Build and maintain high-performance ETL pipelines using Medallion Architecture, ensuring data moves seamlessly from source to insight
  • Take the lead in transitioning legacy workflows (SQL Server, ADF, SSRS/SSAS) into a sophisticated Databricks environment
  • Enhance platform capabilities by optimizing ingestion, transformation, and publishing processes
  • Champion 'DataOps' by identifying opportunities for process improvement and increasing overall platform efficiency
  • Act as a bridge between analysts and architects, ensuring all data products are optimized for observability, quality, and performance
  • Full-time

Data Ops Engineer

We are looking for a skilled DataOps Engineer to operationalize and industrializ...
Location: India
Salary: Not provided
Company: Augusta Hitech Soft Solutions
Expiration Date: Until further notice

Requirements:
  • 8 years of experience in data engineering or DataOps roles
  • Strong expertise in building and operating data pipelines (ETL/ELT) and orchestration tools
  • Proficiency in Python, SQL, and at least one cloud platform (Azure Synapse, AWS Glue, GCP Dataflow preferred)
  • Hands-on experience with dbt, Airflow, Docker, Kubernetes, and Git
  • Experience with data quality, observability, and testing frameworks
  • Solid understanding of healthcare data concepts, compliance (HIPAA), and regulated environments
  • Excellent problem-solving, collaboration, and communication skills
Job Responsibility:
  • Design, build, deploy, and optimize end-to-end data pipelines (ingestion, transformation, orchestration, and delivery) using modern DataOps practices
  • Implement CI/CD pipelines for data workflows, including version control, automated testing, and deployment of transformations (e.g., using dbt)
  • Orchestrate complex workflows with tools like Apache Airflow, Prefect, or cloud-native orchestrators
  • Establish monitoring, alerting, and observability for data pipelines — ensuring data freshness, quality, and lineage
  • Perform root-cause analysis on pipeline failures and implement preventive measures
  • Collaborate with data engineers, analysts, data scientists, and business stakeholders to translate requirements into reliable data products
  • Enforce data governance, quality frameworks, and compliance controls (HIPAA, PHI security) in all data processes
  • Automate infrastructure provisioning and support cloud data platforms (Azure, AWS, or GCP)
  • Contribute to continuous improvement of DataOps processes, tools, and standards in the managed services environment
  • Participate in on-call rotation and maintain SLAs for data availability and performance
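One small, concrete piece of the monitoring and data-freshness work described above can be sketched as a freshness check against a pipeline's last successful load time. The pipeline names and the 24-hour SLA below are hypothetical, and a real system would read load times from pipeline metadata rather than hard-code them:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical pipeline metadata: name -> last successful load time (UTC).
PIPELINES = {
    "crm_ingest": datetime.now(timezone.utc) - timedelta(minutes=30),
    "ticketing_ingest": datetime.now(timezone.utc) - timedelta(hours=26),
}


def stale_pipelines(pipelines, max_age, now=None):
    """Return the names of pipelines whose last load exceeds the freshness SLA."""
    now = now or datetime.now(timezone.utc)
    return sorted(name for name, loaded in pipelines.items() if now - loaded > max_age)


# With a 24-hour SLA, only the 26-hour-old pipeline is flagged.
alerts = stale_pipelines(PIPELINES, max_age=timedelta(hours=24))
print(alerts)  # ['ticketing_ingest']
```

In practice a check like this would run on a schedule inside the orchestrator, with flagged pipelines feeding the alerting channel rather than being printed.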

Senior Data Engineer

As a Senior Data Engineer, you’ll design, build, and operate scalable, reliable ...
Location: Sofia or Varna, Bulgaria
Salary: Not provided
Company: myPOS
Expiration Date: Until further notice

Requirements:
  • Bachelor’s degree in Computer Science, Engineering, or a related technical field (or equivalent practical experience)
  • 6+ years of experience as a Data Engineer, building and maintaining production-grade pipelines and datasets
  • Strong Python and SQL skills with a solid understanding of data structures, performance, and optimization strategies
  • Hands-on experience with orchestration (like Airflow, Dagster, Databricks Workflows) and distributed processing in a cloud environment
  • Experience with analytical data modeling (star and snowflake schemas), DWH, ETL/ELT patterns, and dimensional concepts
  • Experience building reliable incremental data ingestion pipelines from DBs and APIs
  • Familiarity with at least one major cloud provider (GCP, AWS, Azure) and deploying data solutions in the cloud
  • Familiarity with CI/CD for data pipelines, IaC (Terraform), and/or DataOps practices
  • Strong troubleshooting mindset: ability to debug issues across data, infra, pipelines, and deployments
  • Collaborative mindset and clear communication across engineering, analytics, and business stakeholders
Job Responsibility:
  • Build and maintain data pipelines for ingestion, transformation, and export across multiple sources and destinations
  • Develop and evolve scalable data architecture to meet business and performance requirements
  • Partner with analysts and data scientists to deliver curated, analysis-ready datasets and enable self-service analytics
  • Implement best practices for data quality, testing, monitoring, lineage, and reliability
  • Optimize workflows for performance, cost, and scalability (e.g., tuning Spark jobs, query optimization, partitioning strategies)
  • Ensure secure data handling and compliance with relevant data protection standards and internal policies
  • Contribute to documentation, standards, and continuous improvement of the data platform and engineering processes
  • Ensure secure, compliant handling of data and models, including access controls, auditability, and governance practices
  • Build and maintain MLOps automation: CI/CD for ML, environment management, artifact handling, versioning of data/models/code
What we offer:
  • Vibrant international team operating in hi-tech environment
  • Annual salary reviews, promotions and performance bonuses
  • myPOS Academy for upskilling and training
  • Unlimited access to courses on LinkedIn Learning
  • Annual individual training and development budget
  • Refer a friend bonus
  • Teambuilding, social activities and networks on a multi-national level
  • Excellent compensation package
  • 25 days annual paid leave (+1 day per year up to 30)
  • Full “Luxury” package health insurance including dental care and optical glasses
  • Full-time

Scorecard Design and Management Lead Analyst

Citi is looking for a Hands-on Senior Data Engineer and Architect to join our te...
Location: Pune, India
Salary: Not provided
Company: Citi
Expiration Date: Until further notice

Requirements:
  • 15 years of experience; Banking or Finance industry preferred
  • Strong understanding of business analysis, project management, and solution design
  • Communicates effectively; develops and delivers multi-mode communications that convey a clear understanding of the unique needs of different audiences
  • Able to drive consensus and influence relationships at all levels
  • Broad understanding of the business technology landscape, the ability to design reports, and strong analytical skills
  • Excellent understanding of SQL, Python, and Tableau
  • Ability to read from and write to database systems such as SQL, NoSQL (graph, key-value, columnar), and object storage, working with various data formats
  • Familiarity with ETL processes and task/job schedulers, with the ability to troubleshoot technical issues independently
  • Experience building and consuming APIs
  • Experience with DataOps, CI/CD, and automation of ETL pipelines
Job Responsibility:
  • Design, build, and enhance high-performance, data-intensive solutions, from data acquisition through rendering, visualization, and distribution
  • Build and enhance existing data quality processes to ensure the accuracy, timeliness, and completeness of dashboards and reports
  • Determine the appropriateness of, and guide revisions/refinement to, presentation content and materials for governance forums
  • Monitor and improve the reliability, performance, and scalability of data services
  • Coordinate work within the team or project group, escalating where required on a timely basis
  • Liaise with multiple data teams/departments and serve as subject matter expert for planning and analysis of project deliverables and processes
  • Prepare the required governance agenda, highlight relevant dependencies and risks, and work with the team to bring information up to date before senior management review
  • Full-time

Senior Data Engineer

Our Senior Data Engineers enable public sector organisations to embrace a data-d...
Location: Bristol; London; Manchester; Swansea, United Kingdom
Salary: 60,000.00 - 80,000.00 GBP / Year
Company: Made Tech
Expiration Date: Until further notice

Requirements:
  • Enthusiasm for learning and self-development
  • Proficiency in Git (incl. GitHub Actions) and able to explain the benefits of different branch strategies
  • Gathering and meeting the requirements of both clients and users on a data project
  • Strong experience in IaC and able to guide how one could deploy infrastructure into different environments
  • Owning the cloud infrastructure underpinning data systems through a DevOps approach
  • Knowledge of handling and transforming various data types (JSON, CSV, etc) with Apache Spark, Databricks or Hadoop
  • Good understanding of the possible architectures involved in modern data system design (e.g. Data Warehouse, Data Lakes and Data Meshes) and the different use cases for them
  • Ability to create data pipelines in a cloud environment and integrate error handling within these pipelines, with an understanding of how to create reusable libraries to encourage a uniform approach across multiple data pipelines
  • Able to document and present an end-to-end diagram to explain a data processing system on a cloud environment, with some knowledge of how you would present diagrams (C4, UML etc.)
  • Able to provide guidance on how one would implement a robust DevOps approach in a data project, and to talk about the tools needed for DataOps in areas such as orchestration, data integration, and data analytics
Job Responsibility:
  • Enable public sector organisations to embrace a data-driven approach by providing data platforms and services that are high-quality, cost-efficient, and tailored to clients’ needs
  • Develop, operate, and maintain these services
  • Provide maximum value to data consumers, including analysts, scientists, and business stakeholders
  • Play one or more roles according to our clients' needs
  • Support as a senior contributor for a project, focusing on both delivering engineering work as well as upskilling members of the client team
  • Play more of a technical architect role and work with the larger MadeTech team to identify growth opportunities within the account
  • Have a drive to deliver outcomes for users
  • Make sure that the wider context of a delivery is considered and maintain alignment between the operational and analytical aspects of the engineering solution
What we offer:
  • 30 days of paid annual leave + bank holidays
  • Flexible Parental Leave
  • Part time remote working for all our staff
  • Paid counselling as well as financial and legal advice
  • Flexible benefit platform which includes a Smart Tech scheme, Cycle to work scheme, and an individual benefits allowance which you can invest in a Health care cash plan or Pension plan
  • Optional social and wellbeing calendar of events
  • Full-time

Big Data / Scala / Python Engineering Lead

The Applications Development Technology Lead Analyst is a senior level position ...
Location: Chennai, India
Salary: Not provided
Company: Citi
Expiration Date: Until further notice

Requirements:
  • At least two years of experience building and leading highly complex technical data engineering teams (10+ years of hands-on data engineering experience overall)
  • Lead the data engineering team, from sourcing to closing
  • Drive the strategic vision for the team and product
  • Experience managing a data-focused product or ML platform
  • Hands-on experience designing, developing, and optimizing scalable distributed data processing pipelines using Apache Spark and Scala
  • Experience managing, hiring, and coaching software engineering teams
  • Experience with large-scale distributed web services and the processes around testing, monitoring, and SLAs to ensure high product quality
  • 7 to 10+ years of hands-on experience in big data development, focusing on Apache Spark, Scala, and distributed systems
  • High proficiency in Scala-based functional programming for developing robust and efficient data processing pipelines
  • Strong experience with Apache Spark and Hadoop ecosystem tools such as Hive, HDFS, and YARN
Job Responsibility:
  • Partner with multiple management teams to ensure appropriate integration of functions to meet goals as well as identify and define necessary system enhancements to deploy new products and process improvements
  • Resolve a variety of high-impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards
  • Provide expertise in area and advanced knowledge of applications programming and ensure application design adheres to the overall architecture blueprint
  • Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
  • Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions
  • Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary
  • Full-time